Posted 16 Sep 2010



Visual Studio 2008 Load Test Gotchas

CPOL · 17 min read
Load testing features available in Visual Studio 2008 through the Team Suite extension or in the Visual Studio Tester Edition.



Before I start to explain my experience with Visual Studio 2008 load testing, I want to give a huge thanks to Edgars Ozolins, without whom this article would still be lying somewhere on my hard disk half written. Not only did he trick me into finishing this article, he also corrected all my spelling mistakes. So if you still manage to find one, that means I tried to introduce something new to the article after his correction. Thanks Edgars.


Today, I want to talk about the load testing feature available in Visual Studio 2008 through the Team Suite extension (or in the Visual Studio Tester Edition). The Microsoft Load Testing Framework allows recording user actions (web tests) and later executing them multiple times over a specified time interval while collecting test results (load test). In other words, load test (a.k.a. stress test) is nothing other than web tests repeated multiple times to simulate multiple user loads.

It’s relatively easy to start load testing, but at the same time, it may become quite a challenge to do it right. I tend to think that the reason for that is the load testing implementation. Because data collected from a load test doesn’t contain every single request (only averages, maximums, minimums, and other aggregated values), you have to set up your load test right in the first place.


Before going any further, we need a site we can run our tests against. A simple site with three buttons will be used in this article. What is special about this site is that it contains only a single page; therefore, by default, VS will calculate a common average for clicks on any of the buttons. This is fine in some cases, but you will soon realize that there are situations when you might want to analyze how different features on the same page behave under stress conditions. In our site, we will simulate the different features by putting the thread to sleep for different amounts of time when the user clicks one of these three buttons:

  • 3 seconds – when clicking on the first button (Full callback)
  • 0.3 seconds – when clicking on the second button (AJAX postback)
  • 9 + 3 seconds – when clicking on the last button, the user will wait 9 seconds to get the HTTP redirect code, and then another 3 seconds to get the content of the redirected page.
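The article doesn’t include the page’s source code, but the delays above can be sketched in a code-behind like the following (handler names and the redirect target page are my assumptions; only the sleep durations come from the article):

```csharp
// Hypothetical code-behind for the demo page; handler and page names
// are assumptions -- the article only specifies the delays.
using System;
using System.Threading;
using System.Web.UI;

public partial class _Default : Page
{
    // "Full callback" button: simulate a slow feature (3 s).
    protected void FullCallbackButton_Click(object sender, EventArgs e)
    {
        Thread.Sleep(3000);
    }

    // "AJAX postback" button: simulate a fast feature (0.3 s).
    protected void AjaxPostbackButton_Click(object sender, EventArgs e)
    {
        Thread.Sleep(300);
    }

    // "Redirect" button: wait 9 s before returning the redirect code;
    // the target page (assumed to be Redirected.aspx) would sleep
    // another 3 s in its Page_Load.
    protected void RedirectButton_Click(object sender, EventArgs e)
    {
        Thread.Sleep(9000);
        Response.Redirect("~/Redirected.aspx");
    }
}
```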


Figure 1: Web page to be tested

Web Test

Recording a Web Test

Before creating a load test, you should have at least one web test (or unit test) you can execute repeatedly. Let’s create one by right-clicking on the project and selecting “Add –> Web test…”. Doing so will open the “Web Test Recorder”. While recording is on, all actions the user performs are recorded into the new web test.

Let’s create a web test where the user first clicks on the first button (sleep 3s) and then clicks on the second (sleep 0.3s). See Figure 2:


Figure 2: Setting up a web test

Remove the initial page opening and both favicon.ico requests (recorded by VS2008 automatically) from the test so that only the tested features are left. You can also remove the “Response URL” validation rule because it’s not needed.

Next, we must clean up the mess we made by removing the recorded request from the web test. By default, Visual Studio tries to reuse POST parameter values using data from the previous response with the help of context variables (they look like this: {{$Variable}}). To be more precise, Visual Studio finds all the hidden fields in the received HTML, creates context variables with the prefix HIDDENX (e.g., {{$HIDDEN1._VIEWSTATE}}), and then binds the following request’s POST variables to the created context variables.

Because we removed the first request, the bindings in the second request are no longer valid, and we must fall back to using the recorded values instead. To do so, iterate through each parameter in “Form Post Parameters” and unbind the value as shown in Figure 3:


Figure 3: Process form POST parameters

Here is what we are left with:


Figure 4: Cleaned-up web test

Running a Web Test

It’s time to run our web test to see whether it is working. To run the test, click the “Play” button inside the web test window.


Figure 5: Successful run of a web test

You can see that all the requests have a green check mark next to them, and the test has a “Passed” status. If we had a failed request, that would mean that the server returned a 5XX status or one of the validators associated with the request had failed. It’s useful to know that the test can be configured either to stop or to continue when one of its requests fails, but the result of the whole test will be “Failed” in both cases.

Let’s temporarily add a validation rule to see what a failure looks like:


Figure 6: Setting up a validation rule

Let’s also make the web test stop on error by enabling the web test property with the matching name.


Figure 7: Setting up a web test property

Running the test now will fail, and you can see why in the “Details” tab. Also notice that the requests following the failed one did not run at all.


Figure 8: Web test failed on validation rule

Load Test

Creating a Load Test

Now let’s create a simple load test by right-clicking on the project and selecting “Add –> Load test…”. The setup wizard will open, but we will skip all the steps except “Test Mix” for now.


Figure 9: Setting up the test mix

In the “Test Mix” step, you will need to add our web test to the list of web tests which will be repeated while doing the load test.

If you haven’t modified the data in the other steps, you should have created a load test which will simulate 25 users clicking on the first and then on the second button for 10 minutes.

We will also turn off caching for this example. To turn off page caching, you must set the percentage of new users to 100% (see Figure 10). This will make every simulated user act as if he were visiting the page for the first time, meaning that he has to download ScriptResource.axd and WebResource.axd every time.


Figure 10: Turning off caching in the load test

Running a Load Test

Finally, we have a working load test we can run, but before we do, let’s think about what data we should get under perfect conditions. Assuming an optimal IIS configuration (in my case, the easiest way is to increase “Maximum Worker Processes” in the application pool settings), and because our test contains two actions, we should get an average response time equal to (3s + 0.3s) / 2 = 1.65s.

Here is what the actual test result yields. See Figure 11:


Figure 11: Load test results

Average Response Time (Total) – Displays the average response time of every response. As you will remember, there are a total of 5 requests made during the whole test: Default.aspx (3s), Default.aspx (0.3s), WebResource.axd, and ScriptResource.axd (called twice).

Average Page Time (Total) – Displays the average time it took to perform an operation from the web test. In our test, there are two operations: click on the 3s button (results in downloading WebResource.axd and 2xScriptResource.axd, therefore making it a little more than 3s), and click on the 0.3s button.

Average Response Time (default.aspx) – Average time it took to download default.aspx. No other requests are included.

Average Response Time (WebResource.axd) – Average time it took to download WebResource.axd.

Average Response Time (ScriptResource.axd) – Average time it took to download ScriptResource.axd.

As you can see, the “Average Response Time” average value is a little bigger than 1.65s. This means that even though this number is really close, it doesn’t show how much time it took to perform an action on average. What this number really shows is an average between downloading Default.aspx, WebResource.axd, and 2 x ScriptResource.axd when we click on the first button, and downloading Default.aspx when we click on the second.

To prove this, we will disable downloading of the referenced files (WebResource.axd and ScriptResource.axd) and run the test again. This can be achieved by disabling “Parse Dependent Requests” in the web test configuration.


Figure 12: Disabling Parse Dependent Requests


Figure 13: Load test results with Parse Dependent Requests disabled

As you can see in Figure 13, with Parse Dependent Requests disabled, the “Average Response Time” counter now displays more accurate data, though still not exactly what it should be under perfect conditions.

Request/Page/Transaction/Test Counters

It’s now time to propose a clear definition of the different kinds of counters. There are four types of counters, each representing a different scope.

Request – Any request to the server (CSS, images, JavaScript, and so on). For example, if testing a page with a single picture, the average request time will be calculated from every downloaded file (the page itself, images, CSS, JavaScript, and so on).

Page – A page is a recorded user action. It is different from a request because Visual Studio usually performs more requests than the user explicitly recorded. For example, when opening a page, the user enters the address and downloads the HTML content, but behind the scenes, the browser also downloads CSS, images, JavaScript, and so on. The average page time is the total time required to download all these files.

Transaction – Transactions are a way of grouping different pages into one logical operation. In the load test results, it is possible to see the average of every transaction separately. This makes transactions a very powerful tool which we will come back to later.

Test – Test counters count the time it took to perform all the actions in a web test. For example, the average test time shows how much time it took, on average, to replay one recorded web test for a simulated user.


Using “Transactions” to Collect Data on Feature Performance

The first trick to collecting correct data is to use transactions. You can insert a new transaction by using the context menu in the web test. In our test application, we will use two transactions, “AJAX postback” and “Full callback”, to distinguish the performance of the first button (Full callback) from the second (AJAX postback). Also keep in mind that even if you don’t need to group multiple requests, you may still make use of transactions to give meaningful names to separate user actions.


Figure 14: Setting up a transaction
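The same grouping can also be expressed in a coded web test (Visual Studio can generate one from the declarative test via “Generate Code”). The sketch below is an assumption: the URL is a placeholder and the form POST parameters are omitted; only the transaction names come from the article.

```csharp
// Sketch of transactions in a coded web test. The URL and the omitted
// POST parameters are assumptions; only the transaction names are real.
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class ButtonsWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Everything between Begin/EndTransaction gets its own
        // "Avg. Transaction Time" counter in the load test results.
        this.BeginTransaction("Full callback");
        yield return new WebTestRequest("http://localhost/Default.aspx");
        this.EndTransaction("Full callback");

        this.BeginTransaction("AJAX postback");
        yield return new WebTestRequest("http://localhost/Default.aspx");
        this.EndTransaction("AJAX postback");
    }
}
```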

Now let’s see what results we will get after running this test under a constant load of 25 users. We will still be able to see the common response time for default.aspx, but now, we will also be able to see how different features behave using counters from the “Scenario1 –> WebTest2 –> Transactions” category.


Figure 15: Transaction counters in load test results

In this screenshot, you can see that the “Full callback” average response is 3.02s, and the “AJAX postback” average response is 0.3s, but the average total response is somewhere in the middle (1.67s). This means we were able to distinguish the performance of different features in one web test using transactions.

Step Load

Very often, you will need load testing to find out how many concurrent users your system can support. If you use a constant load, you will need to repeatedly start the load test with different simultaneous user counts until you find where it starts breaking. While this is possible, it is unnecessary overhead, given that you can create tests that increase the number of users over time.

You can configure an incremental load in the load test tree, under the “… Load Pattern” node. You may not see the settings from the following screenshot at first; change “Pattern” to “Step” to enable incremental loading. Once you have done this, you can configure the size of each step and the number of seconds it lasts.


Figure 16: Configuring the step load pattern


Figure 17: Step load configuration parameters

In the above picture, you can see which settings modify which parts of the step graph.

Redirect Performance

By default, Visual Studio 2008 counts a redirect as a single request although technically it’s two requests. If the page responds with a redirect, then the response time is calculated by adding the original request time (the one which responds with a 301 or 302 HTTP status; 9s in our case) to the time it took to get the contents of the redirected page (the actual page presented to the user; 3s in our case). This may seem like a small issue, but if you want to find out exactly which part struggles, you will have to take this into account as well.

Let’s create another test where the user clicks on the Redirect button and see how fast the user is presented with the actual content.


Figure 18: Creating a test for testing redirect

When we run these user actions through a load test (see Figure 19), the result is an average page time somewhere close to 12 seconds (9 seconds to get the HTTP redirect code, and 3 to get the redirected page). You may also notice that the average response time isn’t very pretty. This is because two users aren’t enough to calculate a good average, but it shows us that sometimes the average response is 3s, sometimes it is 9s, and sometimes it is the average of these two (6s). This means that there were intervals when one user was waiting for a response while another got a response with a redirect status (9s, remember?). There were also intervals when one user was waiting for a redirect code while another got a response with page contents (3s). And of course, there were intervals when both actions occurred within the same 5s sampling interval, making an average of 6s.


Figure 19: Redirect load test results

Luckily, both numbers (9s and 3s) were visible on the above graph because of the small number of simulated users, but if we simulated more users, we would eventually see only the average of the two. To distinguish the time it took to get the redirect code from the time it took to get the contents, the first thing you need to do is disable the “follow redirects” feature in the web test so that Visual Studio doesn’t automatically follow the redirect header. What is left to do is to manually request the redirected page and put the two requests into different transactions to separate their counters.


Figure 20: Disabling "follow redirects" in a load test
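In a coded web test, the same setup looks roughly like the sketch below. The URLs are assumptions (the article never names the redirect target); `FollowRedirects` is the `WebTestRequest` property the designer setting toggles.

```csharp
// Sketch: measure the redirect code and the redirected page separately.
// URLs are assumptions; transaction names mirror the split we want.
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class RedirectWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // First hop: only the 9 s wait for the redirect status is measured.
        this.BeginTransaction("Redirect code");
        WebTestRequest first = new WebTestRequest("http://localhost/Default.aspx");
        first.FollowRedirects = false; // don't let VS fold both hops into one timing
        yield return first;
        this.EndTransaction("Redirect code");

        // Second hop: request the redirected page manually (the 3 s part).
        this.BeginTransaction("Redirected page");
        yield return new WebTestRequest("http://localhost/Redirected.aspx");
        this.EndTransaction("Redirected page");
    }
}
```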

Now we can run our load test with 25 simulated users and still distinguish how much time it took to get an HTTP redirect code and the actual contents. See Figure 21.


Figure 21: Improved redirect load test results

Load Testing Web Services

The Visual Studio load testing feature is not solely for testing web site performance. You can just as easily test Web Services with it. To start testing Web Services, create a new web test and add a new Web Service request as shown in the image below:


Figure 22: Create a Web Service request

You should be able to set the request body, headers, and other attributes using the Web Service request properties window, allowing you to craft even more advanced requests.
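As an illustration, a coded version of such a Web Service request might look like this. The service URL, SOAPAction header, and envelope body are placeholders of my own, not from the article; `StringHttpBody` is the class the properties window fills in behind the scenes.

```csharp
// Sketch of a SOAP request in a coded web test; the URL, SOAPAction,
// and envelope contents are placeholder assumptions.
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class ServiceWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest soapRequest = new WebTestRequest("http://localhost/Service.asmx");
        soapRequest.Method = "POST";
        soapRequest.Headers.Add(
            new WebTestRequestHeader("SOAPAction", "\"http://tempuri.org/HelloWorld\""));

        // The request body carries the SOAP envelope.
        StringHttpBody body = new StringHttpBody();
        body.ContentType = "text/xml; charset=utf-8";
        body.BodyString =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body><HelloWorld xmlns=\"http://tempuri.org/\" /></soap:Body>" +
            "</soap:Envelope>";
        soapRequest.Body = body;

        yield return soapRequest;
    }
}
```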

Analyzing Load Test Results

Most of the time, load test results are clear and straight to the point, but a few things will sooner or later make you wonder.

The first thing is gaps in the test result graphs. Most of the time, they can be explained by the displayed counter not being hit during the viewed period. For example, if you have a simulated user going through many transactions (like “Open Page”, “Fill Questions”, “Validate Answers”), then the “Validate Answers” transaction data will be present only when the simulated user is validating answers; in other intervals, this transaction’s average will be missing.

The next thing is that averages can be a little bigger than they actually are. This is because Visual Studio requires some time to process the response, run all sorts of validators and extraction rules, increment counters, and so on. If the average of the counter you are analyzing is small, this time can make up a noticeable portion of it. However, on bigger averages (like 1s or 2s), it shouldn’t be a problem at all.

Another thing you may come across is comparing two load test results. Be sure that the machines you performed the tests on (and from) were under a similar load in both runs. Performing the same test on two differently loaded machines can give you different results. For example, be sure that you are not watching YouTube while performing a load test, because your browser may make your machine respond more slowly.

Also, user load is not always a correct measure. Above a certain number of concurrent users, the system stops serving additional requests because it cannot handle them. Although the number of simulated users is an easily explainable and presentable measure, always compare it to the number of requests per second to make sure they change proportionally. In the following graph, you can see a test which performs two requests. You can see that, at first, the number of requests per second grows faster than the number of simulated users, but at some point, the system just can’t handle more requests, and users are forced to wait for a response in a queue, resulting in the number of requests per second staying at the same level.


Figure 23: Choking up the system with too many concurrent users

Opening Load Test Results on a Different Machine

There may be situations when you want to view the results of a load test somebody performed on another machine. You will soon realize that it is not enough to just copy the results file, because most of the results are stored in a local database on the machine where the tests were performed. The following is the full routine for opening a load test performed on another machine:

  1. First, check that you have Visual Studio 2008 Team Suite or Visual Studio Tester Edition installed. Load tests aren’t supported by other VS editions!
  2. Ask for a copy of the *.trx file and the folder with the same name from the $(SolutionDir)\TestResults folder. *.trx files are generated by VS and normally contain test results, but in the case of a load test, they contain the encoded database connection string pointing to where the actual results are. The *.trx name looks like this: “username_MACHINE YYYY-MM-DD HH-mm-ss.trx”, therefore you must know when the test was started to find the correct file.
  3. Ask for a copy of the load test results database and logs. The correct path depends on the database used to store the results, but usually it is:
    • C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\LoadTest.mdf
    • C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\LoadTest_log.LDF
  4. Replace your local copy of LoadTest.mdf and LoadTest_log.LDF with the received files.
  5. Create and run any load test on your machine.
  6. Copy the “resultsRepositoryConnectString” attribute value from the *.trx file you just created by running the load test into the *.trx file you received.
  7. Double click on the received *.trx file, or open it from Visual Studio.
  8. You won’t see the graphs right away. For them to appear, you need to double-click on a test in the Test Results window.


Figure 24: Opening test graphs for tests made on another machine


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Sergej Andrejev
Software Developer Nexum IT
Lithuania Lithuania
