Monitoring a job means observing the outward behaviour of the web application under test while a test executes. This is different from monitoring the infrastructure of the application under test; AgileLoad Starter cannot be used to set up infrastructure monitoring, which is done using AgileLoad Center. While running a test with AgileLoad Starter, the job can be stopped and some results can be examined as the test executes.
The HTTP Statistics View shows the Global HTTP View, where the number of Active, Started, Finished and Aborted users can be seen. This is useful for understanding how well the application under test is handling concurrency.
The HTTP request status graph shows the response classes that AgileLoad is receiving from the server (2xx: OK, 3xx: redirection or cached responses, 4xx: client errors such as 404 Not Found or 403 Forbidden, 5xx: server errors).
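As a minimal sketch of what the status graph aggregates, the snippet below buckets raw HTTP status codes into those four classes. The sample codes and the function name are illustrative, not part of AgileLoad itself.

```python
# Bucket raw HTTP status codes into the classes shown in the
# HTTP request status graph. Sample codes are illustrative only.
from collections import Counter

def status_class(code: int) -> str:
    """Map a numeric status code to its graph category."""
    if 200 <= code < 300:
        return "2xx OK"
    if 300 <= code < 400:
        return "3xx Redirection/Cache"
    if 400 <= code < 500:
        return "4xx Client error"
    return "5xx Server error"

# A hypothetical run of responses seen during a test:
observed = [200, 200, 304, 404, 500, 200, 302]
histogram = Counter(status_class(c) for c in observed)
print(histogram)
```

A rising 4xx or 5xx count in such a histogram is usually the first sign that the server is rejecting or failing requests under load.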
The Hits and Connections Opened graph is a tell-tale for whether keep-alive (persistent connections) is enabled on the target web server. Most modern browsers make requests over persistent connections, as it is easier and faster to issue subsequent requests over an existing connection than to establish a new connection for each request.
TCP Connections should equal the number of running users multiplied by the number of connections that browser type makes to a web server (for example, Opera browsers make 8 connections per user, while Safari 4 browsers make 2). This is an important figure, as the target must be able to handle that number of simultaneous connections.
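The multiplication above can be sketched as a small helper. The per-browser connection counts follow the figures quoted in this section; treat them as assumptions to verify against the browser profiles configured in your own job.

```python
# Rough estimate of the concurrent TCP connections the target must
# accept. Per-browser counts follow the figures quoted in the text
# (Opera: 8, Safari 4: 2) and should be verified for your job setup.
CONNECTIONS_PER_BROWSER = {"Opera": 8, "Safari 4": 2}

def expected_tcp_connections(users: int, browser: str) -> int:
    """Users running concurrently x connections per browser type."""
    return users * CONNECTIONS_PER_BROWSER[browser]

print(expected_tcp_connections(100, "Opera"))  # 100 Opera users -> 800
```

If the observed TCP Connections figure falls well below this estimate, the target (or an intermediary such as a load balancer) may be refusing or queueing connections.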
The Statistics Tab has a tabular presentation of the information in the graphs, while the Audit Log will show any error messages or information messages.
Clicking Transaction and Pages Statistics opens the Global Timer / SLA View. This shows how long the actions between each timer start and end are taking (the server response time). If the test was recorded without inserting additional timers, either via the script capture bar or directly in the script interface, there will be one timer per script plus one timer per page in that script. The screenshot below shows two timers: one is the script timer, the other is the only page timer in this script (there was only one page in the recording; more pages would mean more page timers). The Job_starter timer takes around 10 s (10,000 ms) to execute; this is the load time, including user time, for the target. The timer above it shows a 3-4 s page load time. It becomes apparent that this test was run on a very slow network (refer to the design phase: the load must be generated from a location with enough bandwidth to handle the test in order to obtain accurate results).
If Service Level Agreements (SLAs) were specified in the job, the SLA tab will have the details. SLAs cannot be specified with AgileLoad Starter; to use them, refer to the advanced section.
Transactions Summary opens the Timers Summary dialog box. Checking the timer(s) of interest opens the Timers Summary page in AgileLoad (see the Timers (milliseconds) Summary screenshot below). There are also parameters governing where in the test the timing samples are taken from: the whole test, the plateau only, or (using ramp-up detection) each step individually (this was set up when the job was created in the VU tab).
A quick way to get the transaction rate (other than using the Transactions per Second or Pages per Second graphs) is to divide the sample size by the time interval. In the example below, the sample size is 100 (10 iterations with 10 users were run in the test) and the period is 16:59:14 to 17:00:52, i.e. 1 m 38 s or 98 s, so the transaction rate achieved is 100/98 = 1.02 transactions per second. This is of course a very simple scenario hitting one page; a timer could also wrap a transaction such as 'checkout' or 'search' to give more granular information.
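The arithmetic above can be reproduced as a short helper. The timestamps mirror the worked example in the text; the function name is illustrative.

```python
# Worked version of the transaction-rate arithmetic: divide the
# sample size by the elapsed interval between two clock times.
from datetime import datetime

def transaction_rate(samples: int, start: str, end: str) -> float:
    """Transactions per second over an HH:MM:SS interval."""
    fmt = "%H:%M:%S"
    elapsed = (datetime.strptime(end, fmt)
               - datetime.strptime(start, fmt)).total_seconds()
    return samples / elapsed

rate = transaction_rate(100, "16:59:14", "17:00:52")
print(round(rate, 2))  # 100 transactions over 98 s -> 1.02 per second
```

This matches the figure derived by hand in the example: 100 samples over 98 seconds gives roughly 1.02 transactions per second.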
Pages Summary works in much the same way, except that the statistics relate to page timers rather than script timers.
Check Statistics shows the Checks View. Any checks inserted in the script, whether automatically by AgileLoad or manually by editing the script, appear here.
Next: Result Analysis with AgileLoad Starter