Agile Performance Testing – How we integrated it into our process

agile, blog, testing

I would like to share my experience with agile performance testing. It is not about knowledge of a tool like LoadRunner itself; it is about the process of setting up a performance testing environment and embedding it in your agile process. I gave a session about this topic during the Agile Testing Days 2010 in Berlin.

This blog is a summary of that presentation. It is the second blog about how to integrate performance testing into your agile environment; in the first blog I described how to set up an agile performance test environment.

With performance testing we would like to do two things:

  1. Teams should be able to deliver features that are done, including performance testing;
  2. Detect regression.

If all development teams needed time to do performance testing just before we release the software, we would need several extra weeks at the end of the release. Sounds like a waterfall approach, doesn't it? We would like all user stories/product backlog items/features to be done at the end of the sprint. Done includes testing, documentation, etc., and testing includes performance testing.
As soon as we had the LoadRunner framework in place, we trained all the teams. All testers attended the training, along with everyone else who was interested; several programmers joined as well. In the training they learned how to build scripts and how to execute them in the performance test environment.

We have identified two types of scripts: sprint scripts and weekend scripts. The sprint scripts are built by the teams to verify new features, and the weekend scripts are there to detect regression.

Sprint Scripts

The goal of a sprint script is to test a feature. We have the general requirement that the client should always respond within four seconds; in some cases the product owner has extra performance requirements that need to be tested. During sprint planning the whole team discusses for each feature whether a performance test is required, for example because of a change in the framework. A minimal sketch of such a script is shown below.
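
To give an idea, here is a minimal sketch of such a sprint script in LoadRunner's C-based VuGen syntax. The transaction name and URL are hypothetical examples; the four-second check reflects our general response time requirement.

```c
// Minimal sketch of a sprint script (hypothetical transaction and URL).
Action()
{
    lr_start_transaction("open_order_overview");

    web_url("order_overview",
            "URL=http://testserver/orders/overview",  // hypothetical endpoint
            "Resource=0",
            LAST);

    // Enforce the general four-second response requirement explicitly
    if (lr_get_transaction_duration("open_order_overview") > 4.0)
        lr_end_transaction("open_order_overview", LR_FAIL);
    else
        lr_end_transaction("open_order_overview", LR_PASS);

    return 0;
}
```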

If it is decided that a performance test is required, the testers build the script. If necessary they use integration tests to generate data, or they request a customer database. The testers verify the scripts on their local machines and, as soon as they are ready, the scripts are executed in the performance test environment. This environment is available to the teams during weekdays. The new scripts are added to the CVS repository.

The scripts have a lifespan of one sprint and are not maintained once the testing of the feature is done. If the new feature is important (as determined by the team, the product owner and the lead architect), the scripts can be promoted to weekend scripts, or the weekend scripts can be expanded to cover the new feature.

Weekend scripts

The second goal of the performance test environment is to detect regression. It is possible that a team develops a new feature that impacts the global performance of the application; they did not think about writing a sprint script because they did not expect any impact on performance.

For these cases we have developed weekend scripts. Weekend scripts detect regression and generate trend data; as a result, we are able to see whether performance increases or decreases over the course of a six-month release. The trend data itself can be collected very simply, as sketched below.
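
A minimal sketch of such trend collection, assuming a hypothetical shared CSV file that each weekend run appends a data point to (in practice the LoadRunner Analysis results can serve the same purpose):

```c
// Append one data point per weekend run to a CSV file, so response
// times can be plotted over a release. Path and arguments are
// hypothetical examples.
#include <stdio.h>
#include <time.h>

int append_trend_point(const char *transaction, double seconds)
{
    FILE *f = fopen("C:\\perf\\trend.csv", "a");  /* hypothetical path */
    char stamp[32];
    time_t now = time(NULL);

    if (f == NULL)
        return 1;

    strftime(stamp, sizeof(stamp), "%Y-%m-%d", localtime(&now));
    fprintf(f, "%s,%s,%.2f\n", stamp, transaction, seconds);
    fclose(f);
    return 0;
}
```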

The weekend scripts cover the most important areas of our software. These areas are chosen together with the architects, the testers, the product owner and the lead architect. The goal of the weekend scripts is not to test everything; instead, we try to cover the 20% of the code that generates 80% of the generic performance problems.

The technical test engineer (toolsmith) is responsible for writing and maintaining the weekend scripts, supported by the development teams.

The weekend scripts are automatically executed during the weekend; a sketch of such an unattended run is shown below. The reporting process is explained in the next section.
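
Conceptually the automation is no more than an unattended Controller run started by a scheduled task on Saturday night. A sketch, assuming a hypothetical scenario path and LoadRunner's wlrun.exe command-line launcher (check the flags against your LoadRunner version):

```c
// Wrapper a scheduled task can run to start the weekend scenario
// unattended. Scenario and result paths are hypothetical examples.
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int rc = system("wlrun.exe -TestPath C:\\perf\\weekend.lrs"
                    " -Run -ResultName C:\\perf\\results\\weekend");
    if (rc != 0)
        fprintf(stderr, "Weekend scenario did not start cleanly (rc=%d)\n", rc);
    return rc;
}
```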

Weekend scripts have a lifespan of multiple releases and need to be maintained: when the software changes, the weekend scripts need to change with it (only when the change actually impacts the scripts, of course).

Report Integration

In our organisation we follow a lean principle: when there is an error in the production line, first fix the error, then continue the line. The teams first fix the failing automated tests, then discuss the open problems found during development, and only then start working on new features. When an automated test fails or a problem is reported, it is automatically picked up in this flow.

When a weekend script fails, which can be because of an error or because the response time is too long, a call is created in our bug tracking system and assigned to the team responsible for the module. When the team comes in on Monday and discusses the open bugs, the bug reported by the performance test is automatically part of that discussion and is assigned to a team member. During the week the team fixes the bug, and the next weekend the test either runs green or fails again; in the second case there are two bugs assigned to the team. The reporting glue is sketched below.
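
Our actual integration is specific to our bug tracking system, so everything in the sketch below (endpoint, payload fields, the helper name) is hypothetical: a small helper using libcurl that files a call when a weekend transaction has exceeded its limit.

```c
// Hedged sketch of the reporting glue: file a call in the bug tracking
// system for a failed weekend transaction. Endpoint and payload are
// hypothetical; our real tracker integration differs.
#include <stdio.h>
#include <curl/curl.h>

int file_performance_call(const char *module, const char *transaction,
                          double seconds)
{
    char payload[512];
    struct curl_slist *headers = NULL;
    CURLcode res;
    CURL *curl = curl_easy_init();

    if (curl == NULL)
        return 1;

    snprintf(payload, sizeof(payload),
             "{\"module\":\"%s\",\"summary\":\"Weekend script failed: "
             "%s took %.1fs (limit 4s)\"}",
             module, transaction, seconds);

    headers = curl_slist_append(headers, "Content-Type: application/json");
    curl_easy_setopt(curl, CURLOPT_URL,
                     "http://bugtracker.example/api/calls");  /* hypothetical */
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, payload);

    res = curl_easy_perform(curl);

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return res == CURLE_OK ? 0 : 1;
}
```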

The problems found during the execution of sprint scripts are not reported automatically; these tests are executed by the testers during the sprint, and any issues are handled during the test execution.

Conclusions

The advantages of this approach:

  1. We don't need any extra testing when we deliver a new release. All the testing is already done during the sprint, and the regression tests are executed every weekend;
  2. Testing has become more attractive for (some) testers. Our testers also execute performance tests, which has expanded their work experience.

What did we learn:
  1. Teams still need the support of the technical test engineer or more experienced testers, especially when testers are doing performance tests for the first time: LoadRunner generates a lot of data, and you need to know where to look for possible problems;
  2. It is expensive to set up performance tests, mainly in time but also in license costs, depending of course on which tool you use. Be wise, and determine whether you are going to set up your own performance tests or use external capacity.
We made the right choice by implementing performance tests in our project. I spoke to a tester from a test company that had done some performance tests on our software for a customer. They didn't find any issues… that made me happy.

If you have any questions after reading this blog, please leave a comment or send me an email.