Through the looking-glass, and what testers found there – EWT05

Today it was weekend testing time again, and things got pretty crowded with a pleiad of international testers: facilitators Anna Baik and Markus Gärtner, !ndra from Hyderabad, Shiva Shankari, Krishnaveni K, Nagashree Manjunath, Jeroen Rosink, Ajay Balamurugadas, Jassi and myself.

The target today was Virtual Magnifying Glass, an open source screen magnifier for Windows, Linux, FreeBSD and Mac OS X.

The mission was basically: “Test this!”. Although a mission like that is way too vague to start from, it *is* a mission that we all encounter once in a while. It is our job to ask questions to make sure we understand what is needed. So I wondered… do they just need bugs? Or advice? Or information about how the application works? Ajay also went into questioning mode: “Test: create test ideas, hunt for bugs, compare with a different product, learn the product, so many things? What exactly is required?”. With a little help from Ajay, the mission was rephrased to “Find quality-related, valuable information about the product”.

So far so good. I wanted to try pair testing over Skype, and Ajay was willing to team up with me. We set up a call and were soon discussing which approach to take. Although we ran into connection problems later on, it was a good experience that bears repeating.

Lessons learned:

  • It’s easy to feel frustrated because time is too short. After a round of introductions, some explanation and a discussion about the mission, there’s less than 30 minutes left to explore the software. Even for seasoned Rapid Testers this would be an uncomfortably short timebox. Avoid frustration by setting realistic goals for yourself.
  • When pairing, you don’t start testing right away: there’s some discussion first, and some setting up to do as well. Pairing is probably more beneficial when there is more time to test, so there’s plenty of room to talk things through. This time I felt a bit rushed because the clock was ticking.
  • Read-me files are a good place to start when exploring an unknown piece of software. In this case, they provided me with a good model of the software, a good basis to start exploring from.
  • Logging bugs takes time – time not being spent testing. This issue is also addressed in Michael Bolton’s “Why is testing taking so long?” blog posts (part 1 and part 2).

Some nice, thought-worthy quotes were dropped during the debriefing too:

  • “Clearing traps is a skill. Recognizing traps is a bigger skill”
    Ajay Balamurugadas (c) 2010
  • “Minds are shaped when guided under pressure in a certain direction trying to maintain vision and control” 
    Jeroen Rosink (c) 2010
  • “Watch out where the huskies go, and don’t you eat that yellow snow” 
    Frank Zappa (c) 1974

Well… I guess all that snow is finally getting to me. Frank never made it to this European Weekend Testing session. He wasn’t much of a tester either. But he sure knew how to play that guitar. And I think he would have been a great explorer, having fun all the way.


The ongoing weekend testing adventure – EWT03

A write-up of European Weekend Testing session 3 – EWT03

I had to skip the previous weekend testing session involving Bing maps (which was a pity – I absolutely adore map applications), but today I was able to participate in EWT03. It was again an interesting experience. Not too many participants this time, which made for a more cozy atmosphere. Today’s testers were Marlena Compton, Anna Baik, Markus Gärtner and myself.

The Mission: You work in a small-medium company, and your manager has been asked to evaluate switching the company over to using Google calendar. He needs a quick assessment from you before his conference call this afternoon. Use the FCC CUTS VIDS touring heuristic to guide you.

We started firing off a bunch of questions at the boss, but he was in a hurry and kind of unresponsive – something about an important golf game he had to attend. He just wanted a quick assessment of the tool; he apparently wasn’t into answering questions today. So much for questioning – we were on our own pretty soon. I wonder if there’s any way to keep an imaginary boss from leaving a virtual meeting. Still not sure about that.

We decided to divide the coverage since we started late and there was less than half an hour of testing time left. We ended up cutting the FCC CUTS VIDS into – surprise! – FCC, CUTS and VIDS (yes, sometimes you just go for the obvious). I settled for the VIDS:

  • Variability tour: look for things you can change in the application – and then try to change them.
  • Interoperability tour: what does this application interact with?
  • Data tour: identify the major data elements of the application.
  • Structure tour: find everything you can about what comprises the physical product (code, interfaces, hardware, files, etc.)

In the meantime the boss had magically reappeared to give us a quarter of an hour extra – he probably forgot his bag of golf clubs. But I quickly realised that time would still be way too short to look at all aspects, so I concentrated on variability and interoperability – and on a list of existing bugs as well, something I remembered from the first session. In my quest for things that can be changed in the application, I settled for the system date – not really a part of the application, but still something that can change while you’re using it. Some interesting bugs ensued, including a very severe one in Skype group chat – not really part of the mission. I thought I had lost my connection, but it turned out that the order of the chat messages was totally messed up. It took a while before I realised what had happened, and I lost a good deal of time because of that. Later on, I switched to interoperability, but I also lost my internet connection for a while, which was annoying but in retrospect also pointed me to a synchronisation bug.

The debriefing, moderated by Anna, revealed some good points from the other participants. Marlena covered the FCC (feature, complexity, claims) part and did a feature tour first, which proved very helpful. She explored all functionalities to get more familiar with them before she went off to investigate other areas. She wondered how the rest of us could start testing without doing such a feature tour/exploration first. But in fact, I think we all started with a bit of general exploration. I know I did. It feels kind of natural, probing around the surface for a while before diving into the abyss of detail. Splitting up the heuristic among several testers turned out to be a bit distracting, confusing and counter-productive: there is quite a bit of overlap between the different areas, and sticking to only some of them might mean that you’re missing out on otherwise obvious problems. This led us to the use of sapience in testing – being able to pick the heuristics that are useful for the given features.

We didn’t get to brief the boss in the end, so you could say that our mission didn’t really succeed. But hey, some say that failure breeds success. Soon, things will be looking so bright we’ll have to wear shades.

Weekend Testing hits Europe – EWT01

The first European Weekend Testing session (January 16, 2010)

Last Saturday, weekend testing hit European ground. The weekend testing phenomenon has been the talk of testing town for a couple of months now. The movement originally started out as the Bangalore Weekend Testers, and they are gradually taking their brilliant mix of cybergathering / testing / learning / mentoring / communication / pairing to a higher level. I became intrigued when I first heard about them through a column by Fiona Charles and some namedropping at Eurostar shortly after. I liked the idea of personal and professional development, while serving the open-source community as well. My first thoughts were “what a shame we don’t have this over here” and “wouldn’t European testers like this too?”.

They do, apparently.

Markus Gärtner and Anna Baik – whom I met online in the Software Testing Club – set up the first European chapter and planned the first session with the help of Ajay Balamurugadas, a weekend testing veteran from Bangalore. On Saturday, January 16, some great testers – and me – gathered online for a unique experience. Among the participants: Markus, Anna and Ajay of course, Phil Kirkham, Jeroen De Cock, Anne-Marie Charrett, Thomas Ponnet, Maik Nogens and Jassi.

After some initial Babylonian confusion in a Skype group call, we decided to settle for a group chat. There was a round of introductions first, during which everyone stated their name, their experience and what they wanted to take away from this session. After that, Ajay (aka the great facilitator) revealed the mission and the target to us, while stating enigmatically that “there were traps”. Right then I knew that this wasn’t going to be just another round of testing. The target turned out to be a piece of open-source image-editing software by the name of Splashup. From that moment on, there was a lot of questioning going on – one of the most important testing skills. People explored, asked questions, gave feedback. Some started off on the wrong foot, but quickly recovered and learned from it. We made a lot of assumptions as well, and not always good ones. In the beginning I was a bit distracted by all the chat messages flying around while testing, but once I developed that second pair of eyes, it worked out just fine. The first hour flew by and everyone sent in their bug reports on time (yes, there *was* a deadline). Peer pressure, in a good way.

Next up was an interesting group discussion. Everyone shared their impressions, their approaches, the things they learned and the traps they fell into. Ajay did a great job moderating this and made sure all testers were involved. Here are a couple of things I took away from this first session:

  • When confronted with testing within a really short timespan, it is good to quickly decide on your personal mission and focus, and stick to that. Try to stay focused on the areas you chose and don’t go wandering off – unless, of course, you accidentally open Pandora’s box and bugs are thrown at you. I stuck to the file opening and saving functions – and there was plenty to report about those.
  • I think the way we start testing largely depends on our domain knowledge as well. When confronted with a completely new product, you probably start exploring, aiming first and foremost to learn about the application. Knowing the domain makes you less of an explorer – and more of a bug hunter, which may be dangerous: make sure you don’t block out the learning. I had worked with Splashup before, which gave me a (misplaced?) sense of ‘expertise’, so I dove in and went on an immediate bug hunt.
  • Always question the mission you get. Question your assumptions about the mission. The main goal might not always be a mere bughunt.
  • There are some great exploratory testing tools out there. I used Session Tester, which allowed me to quickly generate an HTML report at the end. It is great for note-taking too. I really like the ‘Prime Me’ function in there – great for getting new testing ideas when you’re stuck. I didn’t have to use it this time – ideas galore.
  • When working with image editing programs, I immediately tend to go into ‘comparable product’ mode and start comparing them with Photoshop. While that may be OK for getting some idea of the expected behaviour of filters and effects, it might not be ideal for an open source application: it makes you expect too much. Free tools often cater for other needs – their users have different expectations.
  • Afterwards we all agreed that it was strange that we didn’t pair up while testing, or that we didn’t do a short ‘who will do what’ briefing before we started. Divide and conquer, we didn’t. I think that was mainly because it was all pretty new to us. This will be easier once we get to know each other better. Another lesson learned in software testing.

As some scientist already pointed out (and actually proved too), time is relative. The planned two hours quickly became three. I have the impression that everyone was thrilled about this fresh kind of collaboration. Interacting with passionate fellow testers, talking to them, learning from them – it all made for a great experience. Sure, it is probably not always easy to attend the sessions – you actually have to plan for it and make sure the kids are entertained elsewhere. But the reward is significant.

People might argue “weekends, isn’t that supposed to be quality time away from work?” – I admit that this thought crossed my mind too. I now realise that this *is* quality time, and far away from work as well. Quality learning time. And fun fun fun – did I mention it’s lots of fun, too?