Agile Performance Testing

agile, blog, testing

I would like to share my experience with agile performance testing. This is not about the knowledge of a tool like LoadRunner itself; it is about the process of setting up a performance testing environment and how to embed it in your agile process. I gave a session on this topic during the Agile Testing Days 2010 in Berlin, and this blog is a summary of that presentation. This first post covers how to set up an agile performance testing environment; in the next post I will describe how you could embed agile performance testing in your process.

For now, I will assume:
  1. you have implemented Scrum or a comparable agile project management method, and the implementation is successful;
  2. quality is important in your organisation: where possible, you fix quality issues before you start developing new software.

How to set up Agile Performance Testing

One day you go to the office and decide to implement agile performance testing. Why? There could be different reasons, for example:

  1. a customer found serious problems related to performance;
  2. you would like to take no risks before you deliver software and want to be certain there are no performance issues;
  3. you need to spend your yearly budget on software and decide to buy an expensive performance tool;
  4. you start developing a complete new architecture and would like to monitor the performance from the start;
Where to start…
The first step should of course be to write down your requirements for the test tooling. There are several things to consider when selecting a tool, for example:
  1. what kind of technology are you using, or planning to use in the near future;
  2. who is going to perform the performance tests: experienced developers, or functionally oriented testers with little programming knowledge;
  3. is it possible to monitor the things you would like to monitor: CPU, memory, hits per second, etc.;
  4. does it need to integrate with an existing tool, for example to automatically create incidents in your incident system or link to requirements in the requirements management system?
Write down the requirements and do your research by surfing the internet, reading magazines or visiting conferences: the "normal" procedure for selecting a new tool or buying software. If possible, arrange a demo of the performance test software in combination with your own software. On paper everything is possible; always prefer to see the software live, with the features that are important to you actually working.
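To make the "hits per second" requirement from the list above concrete, here is a minimal sketch of a rolling hits-per-second counter, the kind of metric a performance tool should expose alongside CPU and memory. The class and its names are purely illustrative assumptions of mine, not part of any particular tool:

```python
import time
from collections import deque

class HitsPerSecond:
    """Rolling counter: how many hits landed in the last `window` seconds."""

    def __init__(self, window=1.0):
        self.window = window  # length of the measurement window in seconds
        self.hits = deque()   # timestamps of hits still inside the window

    def record(self, now=None):
        # Register one hit; `now` can be injected for deterministic tests.
        now = time.monotonic() if now is None else now
        self.hits.append(now)
        self._trim(now)

    def rate(self, now=None):
        # Hits per second over the current window.
        now = time.monotonic() if now is None else now
        self._trim(now)
        return len(self.hits) / self.window

    def _trim(self, now):
        # Drop timestamps that have fallen out of the window.
        while self.hits and self.hits[0] < now - self.window:
            self.hits.popleft()
```

In a load test, `record()` would be called from each virtual user's request hook, and `rate()` sampled periodically by the monitoring side.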
As soon as you have selected a tool, the next step is training. Quality should not only be the responsibility of the testers; the whole team should be responsible for it. Therefore not only the testers should attend the training, but also the developers and the technical test engineers. Everyone should know what the possibilities of the performance testing tool are. Try to use your own software during the training: if you encounter problems, you can discuss them directly with the trainer/consultant.
If everything went well, you have a tool selected and trained the people. Time for the next step…
In my opinion, all team members should be able to build performance tests. Record-and-play functionality is nice, but most of the time it does not work or is hard to maintain, because you are using the application structure directly in your test scripts. Therefore you should, if possible, build a framework for creating performance tests. The framework should be a layer that hides the application structure from the person who creates the performance tests. It could be developed by some of the developers and testers in the team; in general, developers tend to have more knowledge about setting up a good framework than testers.

Almost there: we have a tool, the team members are trained and a framework is in place. In my next post you will read how you could embed agile performance testing in your process.
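The framework-as-a-layer idea can be sketched as follows. This is my own illustrative example, not a real tool's API: every name (the `AppClient` class, the paths, the stub transport) is a hypothetical stand-in. The point is that test scenarios call domain-level actions and never touch URLs or payload formats directly:

```python
class AppClient:
    """Framework layer: hides the application's structure (paths,
    parameters) from the people writing performance test scenarios."""

    def __init__(self, transport):
        # transport: callable(method, path, data) -> response dict.
        # Injected so the same scenario can run against real HTTP,
        # a recording proxy, or a stub.
        self.transport = transport

    def login(self, user, password):
        return self.transport("POST", "/api/login",
                              {"user": user, "password": password})

    def search(self, term):
        return self.transport("GET", "/api/search", {"q": term})

def search_scenario(client):
    # A scenario reads like the business flow, not like HTTP plumbing.
    client.login("alice", "secret")
    return client.search("books")

# Stub transport for demonstration; a real one would issue HTTP requests.
def stub_transport(method, path, data):
    return {"method": method, "path": path, "data": data}

print(search_scenario(AppClient(stub_transport))["path"])  # prints /api/search
```

When the application's URLs or parameters change, only the framework layer is updated; every scenario built on top of it keeps working unchanged, which is exactly what raw record-and-play scripts cannot offer.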
