Testing and Scrum

Here you can find a link to a presentation I gave at the Scrum Gathering in London in November 2007, or you can just watch the presentation below.

Below is a short description of the presentation.

Scrum is a project management framework; it doesn't contain any developer or test practices. In most companies Scrum is used in combination with XP: Scrum for project management, and XP practices to guide development.

If you're a traditional tester and you join a project that uses (or is going to use) Scrum, you will have a hard time finding out what you have to do. Scrum doesn't say anything about testing, and while XP does say something about testing, it is not a guidebook for a tester.

I want to share our test experiences with the audience. We have been using Scrum for two years, and I think we have a good test process that is interesting for everyone involved in building software.

In this presentation I will share our experience at Planon with testing and Scrum. I want to answer the following questions:

  • Do you need specialized testers in Scrum?
  • How did we introduce testers in the development teams?
  • What is the role of the tester in our team?
  • How do you recruit testers for a Scrum team?
  • How do you organize the testers across the teams?
  • Do you write a (master) test plan?
  • What about formal testing techniques in Scrum?
  • What kind of test automation harness do we use?
  • How do we report on testing?

The audience will get a good overview of the test activities we do at Planon and tips on how to improve the testing process in their own projects.

Our First Why-Did-We-Miss-That Session

Today we had our first Why-Did-We-Miss-That session. I read about this kind of session on Elisabeth Hendrickson's blog.

We had already discussed last year that we should analyse our customer problems, but we didn't know how to do it in a structured way, and because of all the other important things we never found the time to sit down and analyse them.

So that's why I just planned the session. Now you are of course curious about our experience…

I planned the session from 13:30 till 15:30, two hours: one hour for the first part and the rest for the second part.

I prepared the meeting: I ordered index cards, reserved a room, created a filter in our bug registration system, sent every tester the link to the blog post, and explained the purpose of the meeting.

We started analyzing and discussing the problems. We got through approximately 40 of the 140 problems before we ran out of time, so we stopped and moved on to the next step. But first, a coffee and smoke break.

One person read the cards aloud and the rest of the team grouped them. For some cards it was very clear to which group they belonged, but we also ended up with a lot of one-card groups. Letting everyone group cards at the same time didn't work; we lost the overview. During the grouping it also became clear that not everyone had written actionable cards, so some explanation was needed along the way.

After the grouping, we defined actions for every group: we turned each stack of cards upside down and wrote the actions on the back. We came up with 15 actions of all kinds, for example: extend our default ET Charter, do more installation testing, and educate the help desk about the differences between the old and the new product.

As a final check we selected 15 random problems and analyzed whether they fitted in one of the defined groups. 11 problems fitted in a group; only 4 would not have been discovered. A good score, in my opinion.

Of course we did a retrospective at the end of the meeting. The following improvements were mentioned:

  1. It is an intensive meeting, so plan it early in the day and not together with another meeting on the same day;
  2. All calls should be written in the same language;
  3. We should make a pre-selection of the problems, although this carries the risk that you will not find all improvements;
  4. The next meeting will be a few months after a customer release (not an internal release), or when there are, for example, 50 new problems;
  5. We didn't find any surprises… I think this is a good thing: it proves that our Test Retrospectives work well and that we already continuously try to improve ourselves.

So what is the conclusion for us? It is a great tool to periodically analyse our test process.

Still wondering if this is something for you? Just try it and improve on it; don't be afraid of trying something new.

InfoQ: Does “Done” Mean “Shippable”?

A very interesting article about what is "done" and what is "shippable"; my opinion is in the comments on the article.