Tuesday, 14 September 2010

Web Performance Testing with Visual Studio 2010

Performance testing and debugging is a major focus of Visual Studio 2010 Ultimate. Web testing and load testing have been supported in Visual Studio Team System since the 2005 release, but Visual Studio 2010 offers major improvements.

In Web performance tests, the addition of loops and conditions enables developers to write more complex and intelligent tests against their applications. For load tests, the addition of 64-bit agents and controllers allows you to more effectively use the available hardware resources to generate load. Additionally, changes to the licensing of the load test agents and controllers give you greater flexibility, making it easier -- and potentially cheaper -- to configure your load test rigs.

Web tests allow you to simulate a user performing a set of operations – typically a defined use case – on your ASP.NET Web application, and validate the responses to see if the application is working as expected. Once you have your Web tests defined, you can knit them together to create a load test to see how well your application performs under stress.

Create Some Web Tests:
Web tests are typically created using the Internet Explorer Web Testing toolbar, which records a human user clicking through a Web application. In the sample code for this article, our Visual Studio Test Project is called Com.Benday.WebTesting.WebTests. You can create a new Web test by right-clicking the project and choosing New Web Test (in Visual Studio 2010, Web Performance Test) from the Add menu, as shown below.


When you add a new Web Test to your project, Visual Studio will open up Internet Explorer and you should see the Web Test Recorder panel in the browser. You can now go to a Web site and start using it just like you normally would and, as you go, you'll see your actions being recorded by the Web Test Recorder.

Figure 1. The Web Test Recorder toolbar in Internet Explorer


When you've finished performing the actions that you wanted to record, click Stop on the Web Test Recorder toolbar. You'll be taken back to Visual Studio, where you'll see your actions in a *.webtest file (Figure 2).


Figure 2. A new Web Test in Visual Studio

A Layered Framework for Visual Studio 2010 Automation Testing

Base your layered architecture on three layers: script layer, business layer, and framework layer. This pattern presents an overview of the responsibilities of each layer and the components that compose each layer.


The above framework has 3 layers:
  • Framework Layer
  • Business Layer
  • Scripting Layer
Framework Layer: This layer is generic and common across all projects within the organization. It consists of generic components, and it is good practice to declare each component as an interface within this layer. The components that go into this layer are:
  1. Utilities
  2. Data management
  3. Configuration management
  4. Logging
  5. Reporting
  6. Exception handling
  7. Recovery scenarios
  8. Screen recording
  9. Data access layer
  10. Object wrapper functions
  11. Data-driven driver classes
Business Layer: This layer is specific to a project; the reusable libraries are developed here. Each screen has one class for handling its testing, and common functions go into a separate library. Every project has its own implementation of the interfaces that are declared in the Framework layer.


Scripting Layer: This layer is responsible for the scripting. Each script covers one screen in the application and calls the business layer components, passing in the test data.
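The interface-and-implementation split described above can be sketched briefly. This is an illustrative example only (written in Java, with hypothetical names; the same pattern applies equally in C# or any other language):

```java
// Framework layer: a generic contract, declared once for the whole organization.
interface TestLogger {
    void log(String message);
}

// Business layer: a project-specific implementation of the framework interface.
class ConsoleTestLogger implements TestLogger {
    @Override
    public void log(String message) {
        System.out.println("[LOG] " + message);
    }
}

public class LayeredFrameworkSketch {
    public static void main(String[] args) {
        // Scripting layer: scripts depend only on the interface, so each
        // project can swap in its own implementation without changing scripts.
        TestLogger logger = new ConsoleTestLogger();
        logger.log("Login screen test started");
    }
}
```

The key design point is that the scripting layer never names a concrete class directly; it works against the framework-layer interface.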

Friday, 6 August 2010

Functional Testing REST Web Services With SoapUI

General observations about SoapUI.

  • It is an amazingly high quality open source test tool. The use of groovy scripts, xPath, and dynamic property expansion created a very robust development environment. Since we are not Java programmers, it was initially a bit confusing to sort out where Groovy stopped and Java began. A good book, Groovy in Action, solved that problem.
  • We found test case development a bit more tedious than expected, with test cases consisting of a dozen steps and within each step perhaps a dozen assertions. It was just a lot of detail, with lots of mouse clicking to drill into things. We wrote some helper routines to consolidate a lot of the assertion checks, which helped to some extent.
  • We wound up buying the Pro version of SoapUI as we needed some technical support and this turned out to be a great decision. The support was excellent, we were more productive using the Pro version, and the code remained compatible with the open source version of SoapUI as long as we stayed away from certain Pro features.
  • The SoapUI documentation was quite well written, but initially it was a challenge to sort out which things applied specifically to REST based testing.

Solution 
  1. We created a SoapUI project and a REST service within the project, defined each of the 100 resource paths within the service, and created an empty test suite.
  2. We used test suite properties as a set of global variables. Some properties were used to store dynamic data that was stored and retrieved by various test steps, while other properties contained static content that represented various possible values for customer records. Referencing these properties in a variety of ways throughout the test suite was simple. A few examples:
    • Simple Property Expansion: ${#TestSuite#password}
    • Dynamic Property Expansion, where multiple comma-separated values are stored in one property: ${=context.testCase.testSuite.properties["accountNumber"].value.split(",")[1]}
  3. We made extensive use of random numbers for generating values that had to be unique. Yes, there was a remote possibility that we might get a test case failure as a result of a random number collision, but from a practical perspective, this was a non-issue.
  4. We structured our test suite to run sequentially, with the following initialization routines:
    • Generate unique “random” values and store in properties
    • Use REST calls to populate the database with sufficient framework data to register a customer. For example, define the various valid address types such as home, business, other, etc.
    • Use REST calls to register a set of customers using characteristics pulled from test suite properties. These customers will not be modified during test execution and will be used exclusively for “get” and “search” customer related testing.
    • Define a series of test cases that register customers for subsequent record updating. These can be called from other test cases as needed. Note that we encountered problems exporting and importing test cases that used the Goto Test step (they reference a GUID not the name of the test case), so we chose to call the test cases from a groovy script…
    • def testCase = testRunner.testCase.testSuite.testCases["Another TestCase"]
      def properties = new com.eviware.soapui.support.types.StringToObjectMap()
      def async = false
      testCase.run( properties, async )
  5. Once the basic plumbing was in place we needed to grind out hundreds of test cases that when completed represented thousands of REST API calls and many thousands of test assertions.
  6. Perhaps a note on assertions is warranted. There are a multitude of ways to validate that you get the expected values in a response using SoapUI. We utilized the following methods:
    • Contains assertion – On the surface this is a simple text comparison, but we also used regular expressions to accept multiple values such as (?s).*ALREADY.*|.*SUCCESS.* as well as property expansions.
    • Script Assertion – This was our most common assertion tool; it allowed us to use the getXmlHolder object provided by SoapUI to easily query and assert conditions within a groovy script, such as:
      def groovyUtils = new com.eviware.soapui.support.GroovyUtils( context )
      def holder = groovyUtils.getXmlHolder( messageExchange.responseContent )
      def xPath = "//statusLabel[../SystemLabel='" +
          context.testCase.testSuite.properties["SystemLabel"].value.split(",")[1] +
          "' and ../statusLabel='SUCCESS']"
      assert holder[xPath] == "SUCCESS"
    • xPath Match – This was a nice tool in that I could use an xPath statement to grab a parent node, then compare expected results using both wild cards and property expansions in child elements. The only gotcha I found was that if I wasn’t careful with my xPath statement and multiple parent blocks were returned, xPath Match would compare the first instance, which wasn’t always what I wanted. But in simpler scenarios, this was definitely a very quick and dirty way to check a bunch of response values.
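To make the regular-expression behavior above concrete, here is a small standalone sketch. It is plain Java rather than a SoapUI assertion (SoapUI's Contains assertion with "use regex" applies the same java.util.regex semantics), and the response strings are made up:

```java
import java.util.regex.Pattern;

public class ContainsAssertionSketch {
    // The alternation used above: (?s) turns on DOTALL so '.' also matches
    // newlines, letting the pattern span a multi-line response body.
    static final Pattern ACCEPTED =
            Pattern.compile("(?s).*ALREADY.*|.*SUCCESS.*");

    static boolean acceptable(String responseBody) {
        return ACCEPTED.matcher(responseBody).matches();
    }

    public static void main(String[] args) {
        System.out.println(acceptable("<status>SUCCESS</status>"));  // true
        System.out.println(acceptable("<status>FAILED</status>"));   // false
    }
}
```

Either alternative matching is enough to pass, which is what let one assertion accept both a fresh registration and an "already exists" response.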
As most of our assertion testing involved comparing large sets of known values stored in test suite properties to API responses, we wrote some rather elaborate helper routines that simplified the assertion process. We did run into one obstacle in that we couldn’t find a clean way to “include” these routines in the assertions. The Pro version has a common library capability, but we didn’t want to lock ourselves into that. The open source version does allow a Jar file to be placed in the soapUI\bin\ext folder, which would have addressed the issue, but we wanted the test suite to run on a generic install of SoapUI.

So, the utility routines were pasted into assertions throughout the suite (yuck). Just as the test suite was being finalized we discovered that there was a bug in the helper routines (not zero stuffing MD5 hash values) and were faced with editing 147 instances of the helper routines in the test suite (nightmare!). Fortunately, we were able to pull the entire SoapUI project into XMLSpy and globally edit the routines. This notion of editing the project from outside SoapUI might open up some interesting productivity opportunities.
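For reference, the zero-stuffing bug mentioned above is a classic one on the JVM: converting an MD5 digest to hex via BigInteger.toString(16) silently drops leading zeros. The original helper routines aren't reproduced here, but a corrected digest-to-hex helper (an illustrative Java sketch, not the code from the suite) looks like:

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class Md5Helper {
    // Return the MD5 digest as a 32-character hex string. "%032x" left-pads
    // with zeros, so digests whose leading bytes are zero keep their full width.
    static String md5Hex(String input) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
        return String.format("%032x", new BigInteger(1, digest));
    }

    public static void main(String[] args) throws Exception {
        // RFC 1321 test vector for "abc"
        System.out.println(md5Hex("abc")); // 900150983cd24fb0d6963f7d28e17f72
    }
}
```

About one digest in 16 starts with a zero nibble, which is why the bug surfaced only intermittently across 147 pasted copies.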

We are now moving on to using SoapUI to load test this implementation and I will share our experiences with this effort in a week or so.

Wednesday, 4 August 2010

SAP CATT - Computer Aided Test Tool



Just sharing my experiences with CATT


Simply, anyone in SAP can use this tool as long as you have access to the tcode. I used CATT in at least 3 different SAP modules. Typically, in SAP projects, CATT execution is a favorite task for "technical" associates. When the technical resources are either not available immediately or their plates are full with other important tasks, functional associates can simply jump into this activity, since there is no coding or programming involved. This tool has been a great gift from SAP, since I am a functional person and cannot do coding! I can remember at least 30 or 40 occasions where I used CATT to get this and that done.


Below are a few examples:
1. Created multiple sales orders (in excess of 200) in a matter of minutes for the purpose of end user training
2. Created multiple purchase orders (in excess of 200) in a matter of minutes for the purpose of end user training
3. Created deliveries for the sales orders in a matter of minutes for the purpose of end user training
4. Created config. entries when the volume of records was large. I remember once I entered 900 records.
5. Extensively used in preparing the transactional data for the purpose of archiving. It worked impeccably.
6. Loading of master data (example: material groups)


Note: Upon execution of CATT, it is very possible that some records will fail, and these have to be addressed manually. SAP really needs to further enhance this area of CATT, because there is no easy way of identifying the failed records. One workaround is simply to download the result into Excel and, using Excel's sort feature, identify the failed ones and deal with them manually.


CATT stands for Computer Aided Test Tool.


Although CATT is meant as a testing tool, many SAP users now use it frequently to upload vendor master data and to make changes to other master records.


SAP consultants and ABAPers tend to use it for creating test data.


With CATT, you don't have to create any ABAP upload programs, and this saves development time. However, you still have to spend time on mapping the data into the spreadsheet format.


The transactions run without user interaction. You can check system messages and test database changes. All tests are logged.


What CATT does is record you performing the actual transaction once.


You then identify the fields that you wish to change in that view.


Then export this data to a spreadsheet to populate with the data required.


This is uploaded and executed saving you keying in the data manually.


To perform CATT, it has to be enabled in your production environment (your systems administrator should be able to do this - SCC4).


You will also need access to your development system to create the CATT script.


User Guide for Data Upload


Although CATT is primarily a testing tool, it can be used for the bulk upload of data. CATT works like a real user actually inputting on the SAP screen: you prepare a set of data that is required to be input into the system, execute what is called a test case, and CATT does the boring task of keying for you.


Over-all procedure


The over-all procedure to upload data using CATT is as follows:
· Creation of the CATT test case & recording the sample data input.
· Download of the source file template.
· Modification of the source file.
· Upload of the data from the source file.


Details of each step are provided in the following paragraphs.


Detailed Procedure


Creation of the CATT test case:


Creation of the test case is completed as follows:
· Execute Transaction SCAT
· Name the test case. Test case names must start with “Z”. It is also good practice to include the transaction code in the test case name (e.g. Z_IE01_UPLOAD for the upload of equipment)
· Click the “Record” button.
· Enter the transaction code (e.g. IE01)
· Continue recording the transaction. Ensure data is entered into every field that is to be used during the upload.
· Save the test case.


Download the source file template


Download of source file template is conducted in two stages as follows:
· Creation of import parameters:
· Within transaction SCAT, Double Click on the TCD line in the “Maintain Functions” screen.
· Click the Field List button (Field list is displayed).
· For every field that you wish to upload data, double click in the Column New field contents (This creates an
import parameter).
· In the Maintain Import Parameter Pop-Up:
· Delete the default value if not required.
· Press Enter
· The New field contents column now contains the character & followed by the field name (e.g. &EQART). This
is the name of the import parameter.
· Repeat this for every field (in every screen) to be uploaded.
· Back out and save the CATT test case
· Download of source file template:
· Use the path GOTO -> Variants -> Export Default
· Select path and file name (e.g. C:\TEMP\Z_IE01_UPLOAD.TXT)
· Click Transfer


Modify the source file


The downloaded source file template is now populated with the data that is to be uploaded. This is completed as follows:
· Using Excel, open the tab-delimited text file.
· Do not change any of the entries that already exist.
1st row contains the field names.
2nd row contains the field descriptions.
3rd row displays the default values which are set in the test case.
4th row contains a warning that changing the default values in the spreadsheet has no effect on the actual default values.
· The data to be uploaded can be entered in the spreadsheet from row 4 onwards (delete the 4th row warning &
replace with data for upload).
· Save the file as a Text file (Tab delimited).
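As a purely illustrative sketch of the four-row layout described above (the fields other than EQART are hypothetical; your template will contain whatever fields you recorded, and columns are tab-separated):

```
&EQART	&EQKTX
Equipment type	Description
P	Default description
P	Pump 001
P	Pump 002
```

Rows 1 to 3 come from the exported template and are left untouched; the warning in row 4 is replaced by the first data record, with further records below it.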


Upload data from the source file


Uploading the data is completed as follows:
· Execute the CATT test case
· In the Execute screen:
· Set processing mode to Errors or Background (your choice).
· Set variants to External from file.
· Click the Choose button and select the file to be uploaded.
· If uploading to another client, click the Remote execution button and select the RFC connection to the required client.
· If uploading to the current client, click the execute button.

Thursday, 25 February 2010

Visual Studio Team System 2010 - Connecting Test Cases to User Stories

This walkthrough guides you through the process for connecting existing test cases to user stories. In this scenario, Lucerne Publishing wants to be able to view which test cases will test which user stories for the DinnerNow.net payment system. This lets Lucerne Publishing know the level of test coverage for each user story.

Suppose that you are a test lead at Lucerne Publishing. You want to link the existing manual test to the user story to check that the reviews are displayed for a restaurant because the test also covers this functionality.

In this walkthrough, you complete the following task:

Link Existing Test Case to a User Story

To link existing test cases to a user story

1. To display the Planning Activities Center, click Planning.
2. In the Requirements pane, double-click the requirement Customer Finds Reviews for
Restaurant that you want to link to your test case.

The requirement is displayed.



3. Select the Test Cases tab.
4. To find the test cases, click Add.

The Add Link to User Story dialog box is displayed.



5. To find the test cases that you want to link to the requirement, click Browse to
select a query to use.

The Choose linked work items dialog box is displayed.



6. Select the following query: Team Queries/Open Test Cases, and then click Find.
The test cases in the query are displayed.



7. Select the check box for the row that contains Complete Purchase Scenario.



8. To link these test cases to this requirement, click OK.

The Choose linked work items dialog box is no longer displayed.



9. To add the links to this requirement, click OK.

The links are now displayed.

10. To save the requirement, click Save and Close in the toolbar.


Wednesday, 24 February 2010

Visual Studio Team System 2010 - Creating Manual Test Cases and Adding to a Test Suite

I am posting Test Features of Microsoft Visual Studio Team System 2010.

This walkthrough will focus on how you can plan, design, develop, and test an application by using Microsoft Visual Studio Team System 2010.

Creating Manual Test Cases and Adding to a Test Suite:
This walkthrough guides you through the process for creating manual test cases and adding a test case to a test suite. In this scenario, Lucerne Publishing wants to add another test to the DinnerNow.net payment system. You have to add an end to end test for the purchasing scenario.



Suppose that you work on the Lucerne team and have the task of testing the payment system for Iteration 2. First, you will create a manual test case for this. You will add the steps to this test case and the expected results. Then you will add it to the existing test suite for the integrated payment system.

In this walkthrough, you complete the following tasks:
  • Add a test case to an existing test suite.
  • Add test steps to a manual test case.
Overall, performing these tasks will help you create and organize the tests for testing your application.

Before you begin this walkthrough, complete the following prerequisites:
To fulfill the walkthrough prerequisites


  1. Follow these steps to open the new tool for generalist testers, Microsoft "Code Name Camano" for Visual Studio Team Test 2010 CTP:
  2. To display the Microsoft "Code Name Camano" window, click Start, and then point to All Programs.
  3. Point to Microsoft Visual Studio 2010 and then click Microsoft Code Name Camano.
  4. To connect to a team project, point to Home, click the down arrow, and then click Connect to a Team Project.
  5. To add a new Team Foundation Server, click Add.
  6. The Add Team Foundation Server dialog box is displayed.
  7. Type localhost, and then click Add.
  8. To select the team project, click the name of the team project, DinnerNow, in the list. Then click Connect.
  9. If you successfully connect to this team project, the name of the team project is displayed in the Microsoft "Code Name Camano" window following Team Project.


Add Test Case to an Existing Test Suite
You can manage your test cases by organizing tests into test suites. You use a test suite to group test cases together. For example, you can group all test cases for a specific feature together. By grouping them in a test suite, you can now easily run all the tests in the test suite together. Test suites also help in planning your testing effort by working with this logical grouping of test cases.

To add a test case to an existing test suite
  • To display the Planning Activities Center, click Planning.
  • To display all the test suites, click Test Suite Manager in the Planning Activities sidebar.
  • To select the test suite, click End to End Test.
The tests that have already been added to the test suite are displayed in the test suite details panel.

There is a prebuilt test case, Prebuilt: Complete Purchase Scenario, that you can use instead of creating one yourself.


  • To add a test, click New test case in the toolbar for the test suite details panel.
    The new test case is displayed in the main editing pane of Microsoft "Code Name Camano" titled New Test Case 1:
  • To name the test case, type Complete Purchase Scenario in Title. You use this title to identify the test case and search for it if you have to.
  • Check Area shows DinnerNow.
  • Click Owner to select Ellen Adams for the test case.
  • Click Priority to select 1 for the importance of the test case.
  • To save the test case, click the save icon in the toolbar.

Note After the test case has been saved, the test case identifier is shown in the title of the editing pane.





Add Test Steps to a Manual Test Case
A test step might be an action only, or it might include validation.
To add test steps to a manual test case

  1. Click Steps.
  2. Click in the Steps list, type Verify that DinnerNow Web application is started as the Action for the first test step, and press Enter.
  3. Type Enter 98101 in the Your Zip field as the Action for the next step.
  4. Type Select American as the food type in the combo box as the Action for the next step.



  5. Type Select Dinner as the meal of choice as the Action for the next step.
  6. Type Select 1 hour as the time frame as the Action for the next step.
  7. Type Click Find as the Action for the next step.
  8. In the Expected Result column, type Check 3 restaurants are displayed: TailSpin BBQ, SouthRidge Subs, Northwind Bar and Grill. Note: If you type a value in the Expected Result column, the step is automatically set to a validation step, as indicated by the check mark icon.
  9. Type Select Northwind Bar and Grill as the Action for the next step, and then type 12 items should be displayed as the Expected Result.

  10. Type Select Classic Burger and Ice cream from the menu list as the Action for the next step, and then type Your menu today has a cost of $27 as the Expected Result.
  11. Type Click Testimonials at the top of the screen as the Action for the next step, and then type Positive reviews are displayed for Northwind Bar and Grill as the Expected Result.
  12. Type Click Checkout as the Action for the next step.
  13. Type Select sign in using InfoCard as the Action for the next step.
  14. Type Select Brad Sutton and click Send as the Action for the next step.
  15. Type Click the radio button (Work) for Choose Delivery Address as the Action for the next step.
  16. Type Click the radio button (Visa) for payment option as the Action for the next step.
  17. Type Confirm Order as the Action for the next step, and then type Check Total is $27, Delivery address is 1 Microsoft Way, and Payment Option is Visa as the Expected Result.
  18. Type Click Bring my Meal as the final test step.
    a. Note You can mark any test step as a validation test step when you click Toggle Step Validation in the Steps Toolbar. When you run the test case, you must individually mark a validation test step as either passed or failed.
  19. To save and close this test case, click Save and Close.
  20. Important If you do not close the test case and continue to a different activity, the test case is shown in Work in Progress in the sidebar. Click this test case to open it again.


Friday, 19 February 2010

Automating Flex Applications with QTP

Flex application Automation Using QTP

There are two approaches to enable Flex application automation using QTP.
  1. Runtime Approach.
  2. Compilation Approach.

Runtime Approach:
General Implementation Concept:

You use the run-time testing files rather than building your applications with testing libraries. This lets you test SWF files that are compiled without automated testing support. To do this, you use a SWF file that does include the automated testing libraries. In that SWF file, the SWFLoader class loads your application's SWF file, which does not include the testing libraries. The result is that you can test the target SWF file in a testing tool such as QTP, even though the application SWF file was not compiled with automated testing support.

Flex Builder includes a wrapper SWF file and an HTML template that supports run-time loading. The source MXML file for the wrapper SWF file is also included. The following files are located in the Flex_builder_install_dir\sdks\3.2.0\templates\automation-runtimeloading-files directory:

  • RunTimeLoading.html -- The HTML template that loads the run-time loader SWF file. This template includes code that converts the automationswfurl query string parameter to a flashVars variable that it passes to the application. You use this query string parameter to specify the name of the application you want to test.
  • runtimeloading.mxml -- The source code for the runtimeloading.swf file that you compile. The SWF file acts as a wrapper for your application. This SWF file includes the testing libraries so that you do not have to compile them into your application SWF file.
  • Compile the runtimeloading.swf file: You can use the batch file in the Flex_builder_install_dir\sdks\3.2.0\templates\automation-runtimeloading-files directory. Execute this batch file from the sdks\3.2.0\frameworks directory. This batch file ensures that your runtimeloading.swf file includes the automation.swc, automation_agent.swc, automation_dmv.swc, automation_flashflexkit.swc, and qtp.swc libraries.
  • Deploy the runtimeloading.swf, RunTimeLoading.html, and your application's SWF file to a web server. Request the RunTimeLoading.html file and pass the name of your SWF file as the value to the automationswfurl query string parameter.

For example: http://localhost/RunTimeLoading.html?automationswfurl=MyApp.swf
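As a rough illustration of the wrapper idea (this is a hypothetical MXML sketch, not the actual runtimeloading.mxml shipped with Flex Builder), the wrapper application essentially hands the target SWF name from flashVars to an SWFLoader:

```xml
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
                creationComplete="init()">
    <mx:Script>
        <![CDATA[
            private function init():void {
                // The "automationswfurl" flashVars value is populated by the
                // HTML template from the query string parameter of the same name.
                loader.source = parameters.automationswfurl;
            }
        ]]>
    </mx:Script>
    <mx:SWFLoader id="loader" width="100%" height="100%"/>
</mx:Application>
```

Because this wrapper is the SWF compiled with the automation libraries, the application it loads needs no testing code of its own.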

Implementation:
The following steps were followed to implement the runtime approach.
Steps

  1. Installed Flex Builder 3.2 Professional.
  2. Navigated to the Flex_builder_install_dir\sdks\3.2.0\templates\automation-runtimeloading-files directory.
  3. Observed that the directory contains the wrapper HTML file RunTimeLoading.html, the RunTimeLoading.mxml source file, and a build.bat file.
  4. We can use the batch file to compile RunTimeLoading.mxml with automation libraries.
  5. After compilation it will generate runtimeloading.swf file in the same directory.
  6. Request the RunTimeLoading.html file and pass the name of your SWF file as the value of the automationswfurl query string parameter, e.g. automationswfurl=application.swf.
  7. Copy the runtimeloading.swf, RunTimeLoading.html, and your application's SWF file to a web server.
  8. Start the services and access the URL as below
    http://hostname:11090/application/ux/RunTimeLoading.html

Compilation Approach:

If you do not use run-time testing, you must compile applications that you plan to test with the testing libraries. The functional testing classes are embedded in the application at compile time, and the application typically has no external dependencies for automated testing at run time.

Implementation of Compilation Approach:
Steps:
Copy the qtp.swc, automation_agent.swc, automation.swc and automation_dmv.swc files from the flex_automation_installer/frameworks/libs directory to flex_sdk_dir/frameworks/libs.

Also copy the following resource bundle files from the flex_automation_installer/frameworks/libs directory to flex_sdk_dir\frameworks\libs\locale (e.g. C:\Devtools\flex_sdk_3.3.0.4852\frameworks\libs\locale): airframework_rb.swc, automation_agent_rb.swc, automation_rb.swc, datavisualization_rb.swc, framework_rb.swc, and rpc_rb.swc.

Implementation:
We added the tag described above to the mxml target in ux/ant/compile.xml and compiled application.mxml. Please refer to compile.xml, sections 49 to 75.
The modified ant target will be as follows.
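The original post does not reproduce the target itself. As a rough, hypothetical sketch (assuming the standard Flex Ant tasks, with FLEX_HOME and the property names adjusted to your environment), an mxml target that links the automation libraries into the application might look like:

```xml
<target name="compile-application">
    <mxmlc file="${src.dir}/application.mxml"
           output="${deploy.dir}/application.swf">
        <!-- Link the automation libraries into the application SWF -->
        <compiler.include-libraries dir="${FLEX_HOME}/frameworks/libs" append="true">
            <include name="automation.swc"/>
            <include name="automation_agent.swc"/>
            <include name="automation_dmv.swc"/>
            <include name="qtp.swc"/>
        </compiler.include-libraries>
    </mxmlc>
</target>
```

The effect is the same as the compilation approach described above: the testing classes are embedded at compile time, so no run-time wrapper is needed.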