Rapid Software Testing – skilled software testing unleashed

Up to 11

From the outside, software testing looks like a steadily maturing profession. After all, there are certification schemes like ISTQB, CAT, IREB and QAMP (the one to rule them all), standards (ISO 29119) and companies reaching TMM (test maturity model) levels that – just like a Spinal Tap guitar amplifier – might one day even go up to 11. The number of employees that companies send off to get certified in a mere three days is soaring, and new certification programs are being created as we speak. Quick and easy. Multiple choice exams for the win!

The reality, however, is that the field of software testing is torn between different “schools” of testing. You could see these schools as determined and persistent patterns of belief, speech and behaviour. This means that different people – all calling themselves “test professionals” – have vastly different ideas of what testing is all about. Even something as elementary as the definition of testing varies from “demonstration of fitness for purpose” to “questioning a product in order to evaluate it”, depending on who you talk to (for more info on the schools of software testing, I heartily recommend Brett Pettichord’s presentation on the subject).

And so it happens that different people think differently about “good” or “mature” software testing. I, for one, don’t believe in tester certification programs, at least not in the format they are in now and the way they are being used in the testing profession. The current business model is mainly designed to get as many people as possible certified within the shortest timeframe. Its prime focus is on certifiability, not on tester skill, and certainly not on the advancement of the craft. Advancement comes from sharing, rather than shielding.

Rapid Software Testing (RST)

So what are the options for a tester on a quest for knowledge and self-improvement? What is a budding tester to do?

I think there are valuable alternatives for people who are serious about becoming a world-class tester. One of these is Rapid Software Testing (RST), a 3-day hands-on course designed by James Bach and Michael Bolton.

Actually, calling this “a course” doesn’t do it justice. RST is at the same time a methodology, a mind-set and a skill set about how to do excellent software testing in a way that is very fast, inexpensive, credible and accountable. It is a highly experiential workshop with lessons that stick.

How is RST different?

During RST you spend much of the time actually testing, working on exercises, puzzles, thought experiments and scenarios—some computer-based, some not. The goal of the course is to teach you how to test anything expertly, under extreme time pressure and conditions of uncertainty, in a way that will stand up to scrutiny.

The philosophy presented in this class is not like traditional approaches to testing, which ignore the thinking part of testing and instead focus on narrow definitions for testing terms while advocating never-ending paperwork. Products have become too complex for that, time is too short, and testers are too expensive. Rapid testing uses a cyclic approach and heuristic methods to constantly re-optimize testing to fit the needs of your clients.

What’s in it for you?

  • The ability to test something rapidly and skilfully is critical. There is a growing need to test quickly, effectively, and with little information available. Testers are expected to provide quick feedback. These short feedback loops make for more efficient and higher quality development
  • Exploratory testing is at the heart of RST. It combines test design, test execution, test result interpretation, and learning into a seamless process that finds a lot of problems quickly. Experienced testers will find out how to articulate those intellectual processes of testing that they already practice intuitively, while new testers will find lots of hands-on testing exercises that help them gain critical experience
  • RST teaches you how to think critically and ask important questions. The art of questioning is key in testing, and a very important skill for any consultant
  • RST will provide you with tools to do excellent testing
  • RST will heighten your awareness

Bold claim bottom-line:

RST will make you a better tester.

RST comes to Belgium

Co-learning and Z-sharp are proud to announce that from 30 September to 2 October, Michael Bolton will visit Belgium to deliver the first ever RST course on Belgian soil, giving you the opportunity to experience this unique course in person. More info can be found here, or feel free to contact us.

Brace yourself for a mind-opening experience that will energize and boost your mind.

All the way up to 11.

(for even more information and testimonials about RST, see Michael Bolton’s RST page)

 


Exploratory bug diagnosis

The prologue

At the Let’s Test conference last week, I attended a half-day tutorial on bug diagnosis by James Lyndsay, in which we tried to analyze what testers actually do when pinpointing bugs, by identifying our own actions during some bug-diagnosing exercises. My learnings kept lingering in the back of my mind throughout the conference (which was excellent, by the way). When I noticed on the way back home that something was wrong with the songs on my iPhone play-list, I decided to test my newly learned diagnosing-fu by describing my learnings in an exploratory essay while trying to find out what the problem really was.

At the time of writing, I don’t yet know what the cause of that problem is. I will document my knowledge as it evolves (hopefully). So cover me, I’m going in…

The problem

On the way from Stockholm to Brussels, at a cruising altitude of 10.000 metres, the hostess tells us it is safe to turn all electronic devices back on. I whip out my phone and start flipping through the albums that were uploaded via a newly created play-list. I select an album and hit play. But something’s amiss – that familiar album sounds less familiar this time around. It takes a while before I realize that the album didn’t start with its opening song. Did I hit shuffle unknowingly, by any chance? That has happened before… Nope, shuffle is off. I go to the details of the album and notice that the first song is not there.

First probing

Strange, that. I initially dismiss it as a one-off, but on the next album I listen to, the same phenomenon occurs. All songs are there but the opening one. I check the other albums and it turns out that half of the albums uploaded to my play-list lack their first song. If a bug is something that bugs a user, this must be a capital B one. Having to listen to incomplete albums seriously bugs me; what’s more: it takes away my desire to listen.

[I notice how I react quite emotionally to the strange behavior. Emotions are a powerful oracle. Although I have only limited knowledge of the problem now, I declare it officially a bug]

Defocusing & narrowing down

I feel frustrated because further bug investigation possibilities on my phone seem limited. I put it away and decide to defocus. A little in-flight snack and some reading manage to temporarily distract me, but an hour later the bug creeps up on me again. I follow my energy, which leads me to start a static analysis. Although the symptoms are all on display here, I suspect the cause is not located in my phone but rather within my PC, iTunes or the synching between iTunes and the phone.

[I just narrowed down the scope of the investigation with a first broad hypothesis]

From possible to plausible

I haven’t got iTunes at my disposal right now, but while I am at it, I refine my previous hypothesis into a couple of more specific ones that I will be able to confirm or refute when I get home:

  1. The original source mp3-folders contain incomplete albums
  2. The albums were uploaded wrongly in my iTunes library
  3. The albums were copied wrongly into the play-list
  4. The albums were synchronized wrongly from the play-list to the phone

[This list of 4 contains possible causes]

I can narrow these down further because hypotheses 1 and 2 are highly unlikely. The very same songs, from the very same source folders were recently used in other play-lists without a problem, and I haven’t noticed songs missing from albums in my music library.

[This makes hypotheses 3 and 4 the most plausible ones – better concentrate on these]

Checking hypotheses

Back home, I am reunited with the family, and with my iTunes library that resides on an external storage drive (which also happens to contain the original mp3-files). I quickly check options 1 and 2, because I am painfully aware of biases in my memory and my thinking. [Although I think these two options are less likely, you never know. I’m a tester, and we know things can be different, right? Well, in this case, not so much]. The suspicious albums in the source folders are complete, as they are in the music library.

[I notice that, rather than checking all albums, I tend to focus on one sample album to check my assumptions. Comparing the same sample throughout the hypotheses increases consistency and diminishes possible distortions. A possible risk is of course that this could turn out to be a not-so-representative example]

This leaves me with 3 and 4, the plausible ones.

Were the albums copied wrongly into the suspected play-list? I know they have been correctly copied to other play-lists before, so I am curious to see if this can really be the case.

A-ha! Now we’re getting somewhere. Song number one, “Get Miles” got lost in the mists of the copy from library to play-list.

[I now come to realize that this was to be expected, since the synch process was designed to synch exactly what is in this play-list. Oh well, better safe than sorry. This causes me to drop hypothesis 4, because the synch did exactly what it was supposed to do]

Reflecting & diving deeper

So, time-out for a second. What is happening? The contents of some albums were corrupted somewhere during the transfer to the play-list. First thing that strikes me: why only half of the albums? Why not all of them? They were all dragged to the play-list in the same session. Is there something I did differently for some albums? I recall that I started with importing individual albums into the library, but that I then resorted to a bulk import of the remaining albums. Maybe the “bulk-imported” albums are causing this? Then again, they are correctly loaded in the library; it is when they were transferred to the play-list that things went awry.

[While diving deeper within hypothesis 3, I develop a sub-hypothesis]

3.   The albums were copied wrongly into the play-list
3.1.   The problem with the play-list has something to do with bulk imports

I check hypothesis 3.1: I do a bulk import of several albums in one folder, and then transfer those to a newly created play-list. To no avail. I drag the suspicious album to a new play-list, but all 12 songs are there. I drag a couple of similar ones in there separately. Nothing wrong with them.

[This is not really working for me, and it starts to get boring. Let’s drag all of my available albums in, at the same time]

Bingo! Many albums there with the first songs missing. That was easy. Triumphantly, I clean the play-list and repeat the same action, to confirm.

[Repeating experiments can decrease uncertainty, but can also free us from the illusion of control]

Nothing. All back to normal again. Huh? What did I do exactly, that first time? I launch several attempts to reproduce what I had first seen, including starting from a new play-list from scratch, all of them unsuccessful. It takes a while before I realize that I just copied the contents from the faulty play-list into the new one.

[So I start making mistakes. Back to square one. I abort hypothesis 3.1 and decide to catch some sleep]

New perspectives

Another day, a fresh perspective. What else is striking about this bug? It occurs to me that the solution might lie in the fact that every single one of those missing songs is the first song on the album. What does the missing “number one” tell me?

  • Order of play?
  • Something that started wrong and then went well?
  • Switching between albums?
  • Corrupting the first song and switching albums?
  • Switching from a bad to a good state?

[I am now focusing on the “why” of the first songs, whereas in the previous hypothesis I was focusing on the “why” of only half the albums]

Was there something I did that made the first songs go into a special state? Suddenly, I remember… I keep forgetting that I moved the iTunes library to an external drive; it used to be on my laptop until a month ago. That means that iTunes does not recognize the songs in my library as long as the external drive is not connected to my laptop. That is no problem, as long as I don’t perform any actions on the songs, like dragging them or playing them. Otherwise the songs get a lovely exclamation mark in front of them. I disconnect the external drive and try to play the first song of the album in the library. The trusted exclamation mark appears.

I find myself investigating a new sub-hypothesis of hypothesis no. 3:

3.   The albums were copied wrongly into the play-list
3.1.   The problem with the play-list has something to do with bulk imports
3.2.   The problem has something to do with a disconnected library

Usually, the moment I notice I forgot to hook up the external drive, I quickly connect it and no harm is done. I wonder what would happen if I now connect the external drive again and drag the album to a new playlist in this state? [I have the feeling I’m nearly there. Could it really be…?]

I feel I am finally making some progress and refine 3.2 into 3.2.1.:

3.   The albums were copied wrongly into the play-list
3.1.   The problem with the play-list has something to do with bulk imports
3.2.   The problem has something to do with a disconnected library
3.2.1. Actions performed on songs while the library is disconnected cause those songs to be skipped when copying albums to play-lists, even when the library is connected again at the time of copying

I do the experiment, and it confirms my hypothesis. I repeat the same procedure with another album and this time, the same behavior occurs.

[I was kind of hoping and expecting that this would happen, which is normal behavior, but which can also be a danger during testing. We tend to focus more on things we really *want* to see]

The Cause & the Trigger

This last discovery leaves me with some mixed emotions. I feel happy to know what caused the missing songs, but I am also puzzled as to why so many songs have received the exclamation mark without me noticing. I can perfectly reproduce the problem, and I’m pretty sure it won’t happen to me again, since I will now be aware of the little exclamation marks while making play-lists. I have found the cause of the strange behavior that kept me busy for quite a while, but still… I am not sure how it got triggered in the first place.

I do have a trigger hypothesis (for now): flipping through albums using the cover flow view and hitting enter or trying to play them while the library is not there only marks the first songs with an exclamation mark. I recall that I tried to listen to some albums, but not to all of those that turned out to have missing songs. So there is still a decent amount of mystery involved.

Epilogue – is it a bug, really?

When I was first confronted with the problem, I proclaimed this a bug with a capital B, because it annoyed me – the user – and it made me stop using the product. Has my opinion changed now that I have lived with the thing for a couple of days? I would argue that it has.

The behavior only seems to occur in very specific situations, and although the impact was quite big for me, it is unlikely that it will happen to me again. Is there a possibility that others will stumble upon this? Well, I stumbled upon it, so chances are that others will too. And I certainly think there are other people like me who keep their libraries on media that are not connected to their computers by default. So yes, I think it IS a bug, although not as severe as I initially thought it was. This goes to show that we adapt severity and priority to our gradually evolving knowledge about the bug, and to the changing context (something Rob Sabourin neatly pointed out as well in his brilliant Let’s Test keynote).

What I got to know about the problem so far leads me to believe that the product (iTunes) can be improved in a couple of ways (actually, there are plenty of other ways of improving it, but I digress). How about the following ones, for starters:

  • Doing a re-check of previously failed songs in case connectivity has been restored?
  • Removing obsolete exclamation marks when an external library is re-connected?
  • Adding a notification when trying to copy “songs not found” to playlists?
  • Making it more conspicuous to the user when the music library is not connected?

This concludes my adventure that started on the way back from Let’s Test. I wrote this post in several stages as I was trying to get a grip on that devious bug. It didn’t turn out to be the “clean” or “clear” bug I had hoped it would be. Perhaps the iTunes product managers will even say it’s cosmetic or trivial. After all, they make the call. Oh well. I learned valuable stuff in the process. I learned that wording/noting your thoughts as you go helps you to see where your line of reasoning is heading, and what the (sometimes hidden) hypotheses are. It was all about the journey[1] of course, and not so much about the eventual outcome (which I felt was only a partial success).


[1] Although it was a personal journey, it was inspired by James Lyndsay, who encouraged me to share my thoughts on diagnosing bugs

On being context-driven

This is the transcript/elaboration of a lightning talk I did at the Context-Driven theme night organised by TestNet in cooperation with our Dutch Exploratory Workshop on Testing (DEWT).

The hardest part

There comes a moment in the career of a context-driven tester where he is bound to have a sobering epiphany: for every situation where he knows the right approach, for every situation where he knows the perfect tools for the job, he comes to realize that there are numerous contexts where that approach isn’t the most appropriate one, where his ‘best practices’ are not usable. Or maybe they *are* usable, but they lead to suboptimal results.

Maybe I shouldn’t generalize. This is how it happened to me at least, and that was the moment when I became aware of the main principle of Context-Driven Testing: how you approach things is driven by the context of your project, not by your process. That is also the main difference from the common methodologies that try to replicate the same process across multiple contexts.

I admit that was a source of frustration for me. Going context-driven is certainly not taking the easy road – it would be much easier to implement the same processes everywhere I go. But none of that. Luckily, it happens to be the most exciting and challenging road I know.

Knowledge and awareness

What can we do to arm ourselves against all these different, changing contexts we find ourselves in? Gather knowledge, get experience, learn. Make your tester toolkit – with all your techniques and tips and tricks in there – as big as possible. That way you’ll be able to pick the right approach at the right moment.

Talk to fellow testers as much as possible. Grab every opportunity to network with them. Twitter is a blessing for these things – it has rocked my world and continues to do so. Conferences are hotspots for fascinating people, national and international alike. Events like this TestNet theme night are golden, they really are. Learn and read continuously. About testing, for sure, but also about other disciplines. I think there are many lessons to be learned from e.g. psychology, sociology, philosophy and science. Some of the sharpest testers I know are physicists and philosophers!

Try out new ideas, new stuff, new approaches that you haven’t tried before. Apart from it being more fun, Marie Pasinski (staff neurologist at the Massachusetts General Hospital) recently wrote that studies have shown that engaging in novel, stimulating activities promotes the growth of new neurons in the hippocampus (1). Putting this to work in your everyday life can be as simple as trying out a new recipe, taking a different route to work, reading up on the newest technology trends, or meeting new people.

Widen your awareness. Keep eyes and ears open, at all times. Absorb everything, as if you were a sponge. Are you familiar with the phenomenon where you happen upon some obscure piece of information – often an unfamiliar word or name – and soon afterwards encounter the same subject again, often repeatedly? It sure has happened to me, and it has a name as well: the Baader-Meinhof phenomenon. However strange it may seem, it is not that illogical. It just proves that our brains are fantastic pattern recognition engines. This is a characteristic that is highly useful for learning and I think we can use this to our advantage. The more we are aware of things surrounding us, the more we absorb knowledge, the higher the chance that it will keep lingering in our subconscious, and the likelier that a piece of knowledge will surface – and will stick in memory – when we need it.

The importance of knowledge in the context-driven community cannot be overestimated. That is the reason why there are so many initiatives for sharing that knowledge: free coaching by experts, peer workshops (DEWT, anyone?), tester meet-ups, weekend testing, Pair / Learn / Present, blogs, lots and lots of course materials available online…

The context-driven community is a very open community that focuses on sharing. If you are curious and want to know more, do get in touch. We’re more than willing to help where we can. 

(1) Marie Pasinski – Beautiful Brain Beautiful You, 2011

Home Swede Home – Øredev 2011

Last week I attended (and presented at) the Øredev conference in Malmö. Sigge Birgisson invited me to be part of a fully context-driven test track, which I gratefully accepted. It turned out to be quite a memorable experience. Øredev was the first ever *developer* conference (albeit with a testing twist) I attended, which gave the event a totally different vibe for me. Cosy, laid back and open-minded. Geeky too, in a good way: they provided a cool conference app with a puzzle that could only be solved by obtaining other people’s codes. The side effect of that was that random people started addressing me with “Hi. Can I have your code?” moments before bolting off in their own space-time continuum. Speed dating for techies.

Another thing that really stood out was the graphic live-recording by Heather and Nora from Imagethink. These talented ladies recorded every keynote live on stage, and made the beautiful-looking artworks available as handouts later on. A brilliant idea.

As for the proceedings of the conference – here are some personal highlights:

Day 1

Day 1 had no real testing track, but there was enough fun to be had in other areas of the development spectrum. As the conference was centered around “the user” (Enter Userverse), it kicked off with “Only your mom wants to use your website”, an entertaining keynote by Alexis Ohanian, of Reddit and Hipmunk fame. Hey, the guy even spoke at TED about a whale called Mister Splashy Pants – top that! This time he told a compelling story about how the secret behind successful websites is caring for your users. He told us that generally, the bar on websites is set so low that it is really easy to stand out if you’re able to delight your users.

In “Collaboration by better understanding yourself”, Pat Kua stated that people have lots of built-in reactions that hold us back from collaborating more effectively: power distance, physical distance, titles, even clothes. What could help us? Awareness, feedback, breaking the cycle, XP practices, courage. A good talk with good content, and some good book recommendations as well.

Johanna Rothman managed to keep me engaged for her whole talk about “Managing for collaboration”. She talked about how to manage the entire system for success, and how we should optimize and collaborate at the highest level, solving problems for the entire organization, not the project. I had the privilege of getting to know Johanna in her May 2011 PSL (Problem Solving Leadership) class, which she organizes together with Esther Derby and Jerry Weinberg. I knew she was a great storyteller, and she did not let us down: one gem was how she upset management by donating her entire bonus to her team and letting them decide who got what.

Neal Ford closed off the day with “Abstraction distractions”, in which he dissected abstractions that have become so common that we started mistaking them for the real thing. An abstraction is a simplification of something much more complicated that is going on under the covers. As it turns out, a lot of computer programming consists of building abstractions. A file system, for instance, is a way to pretend that a hard drive isn’t really a bunch of spinning magnetic platters that can store bits at certain locations, but rather a hierarchical system of folders. And what’s that icon on a save button again? A floppy what? He also argued that we shouldn’t name things in ways that expose the underlying details: users really don’t want save buttons, they just want their stuff to be saved. Finally, he quoted Joel Spolsky’s Law of Leaky Abstractions: all non-trivial abstractions, to some degree, are leaky.

The day ended with drinks, dinner and some live jazz. I ended up talking testing (among other things) with Pradeep Soundararajan over dinner, when suddenly a late-night session was announced: Copenhagen Suborbitals. At that moment, it reeked of a mediocre techno act from the late nineties and I didn’t really feel like joining in. But Pradeep was curious enough and I decided to tag along.

Flash forward one hour. Pradeep and I were blown away by a passionate tale of two Danes with a dream to build and launch their own manned rocket into space. Peter Madsen told a compelling and inspiring story about dreams, constraints, possibilities, enthusiasm, courage and rocket fuel.

Day 2

Day two was kicked off by Dan North, who talked about “embracing uncertainty”. Fear – he said – leads to risk, risk leads to process, process leads to hate… and suffering and Gantt charts. Dan stressed that people would rather be wrong than uncertain, and that adding more process in times of uncertainty is wasteful and counter-productive. He also contrasted the original intentions of the agile manifesto in 2001 with what has become of them now. He stated that our ability to survive is directly related to handling the unexpected. We should embrace uncertainty, expect the unexpected and anticipate ignorance.

I decided to set up my base camp in the “Test” room today, since this was context-driven testing day: six testing sessions covering a wide variety of topics. The only drawback was that the room looked like it was designed by an architect on acid: unfinished, an enigmatic door way up high in a wall, bare cables and sockets, and a very short and high stage that forced you either to stand in front of the projection screen or to stay cemented in the same spot the whole time. Sound isolation was kind of peculiar too, although that only seemed to be a problem when Americans were presenting next door. But I’m nitpicking here: the whole Slagthuset venue was nice, and the organization and technical team were super helpful the whole day.

Pradeep Soundararajan’s talk was titled “How I wish users knew how I help them through context driven testing”. Pradeep started by pointing out that he had the shortest abstract and the longest bio in the conference booklet. True. He seems to like long titles for his talks, too. In combination with his name, this probably makes him a nightmare to introduce at conferences. But in contrast with the title, his talk was short, crisp and funny. He was brave enough to do some live demoing of his twitter-driven exploratory testing approach: looking for user feedback by searching for tweets that combine negative emoticons and profanities with the product or website name. I hadn’t read his blog post on this before now, and it made me laugh out loud. I love the smell of profanities in the morning. Brilliant idea, that.
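To make that idea concrete, here is a minimal sketch of what such a search could look like. This is my own illustration, not Pradeep’s actual tooling: the list of negative terms, the function names and the search-URL format are all assumptions.

    # Sketch of twitter-driven exploratory testing: pair a product name with
    # negative emoticons/words to surface angry user feedback in search results.
    from urllib.parse import quote_plus

    NEGATIVE_TERMS = [":(", "sucks", "broken", "fail"]  # illustrative examples only

    def build_queries(product, terms=NEGATIVE_TERMS):
        """Combine the product name with each negative term into a search query."""
        return ['"{}" {}'.format(product, term) for term in terms]

    def to_search_urls(queries):
        """Turn each query into a Twitter search URL (URL shape assumed)."""
        return ["https://twitter.com/search?q=" + quote_plus(q) for q in queries]

    if __name__ == "__main__":
        for url in to_search_urls(build_queries("iTunes")):
            print(url)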

Next up was Shmuel Gershon, who shared an experience report on a 100% exploratory testing project, “Case Study on Team Leadership with Context-Driven Exploratory Tests”. He came well prepared, all set to win our hearts with charisma, handouts and chocolates. He told us about how he took his team on a journey towards more context-driven testing and how he dealt with his own changing role along the way. He told us a story about test management, session-based testing, even recruiting. He urged us to let people tell their stories rather than immediately asking why, which leaves them feeling that they have to justify themselves.

The ubiquitous Gojko Adzic (I suspect there are several clones making the rounds of conferences worldwide. Where /doesn’t/ he speak?) was his energetic self in his graveyard-shift session called “Sleeping with the enemy”. Independent testing, he said, should be a thing of the past. Testers should engage with developers and business users, in order to create opportunities to accomplish things they cannot do otherwise. I like Gojko’s style, always direct and uncompromising, but always thoughtful. After Gojko’s presentation, a heated hallway discussion ensued in the so-called chalk-talk area. This embodies what conferences are all about: conferring.

With “Diversity in team composition”, Henrik Andersson took the small stage trying to convince us that when assembling good teams, diversity rocks and uniformity, well, not so much. With some simple examples (“can I please ask everyone wearing black clothes to stand up. You are now a team”), he showed us that there’s much more to it than randomly throwing some people together.

Then it was Selena Delesie’s turn to shine in the beamer lights. In “Focusing Testing on Business Needs”, she explained how to focus the testing effort on customer needs. She asked some pertinent questions: Are you valued in your team? How do you know?

The last presentation slot of the day in the testing track was for yours truly. In Artful Testing, I talked about how I think testing can benefit from the arts. From looking at art thoughtfully, to develop our thinking. From critical theory and the tools used by art critics, to become software critics. From artists, and how they look at the world (through artist personas). I also touched on the importance of context in evaluating art and software. I received some great reactions and feedback afterwards, and some good tips from Pradeep and Rikard as well.

After that, there was dinner, drinks and Øredev Open, where Pradeep was invited to present “The next generation software tester”. In theory. But you know how these things go. In theory, theory and practice are the same; in practice they are not: dinner took a bit longer than expected, drinks were abundant and so it happened that Pradeep took the stage for some Beer-Driven Exploratory Presenting. It was a great stand-up routine.

Ola Hylten joined in and Shmuel decided to whip out his box with tester games and puzzles. Time for some serious thinking, mixed with laughs. When the Øredev Open closed, we took ourselves and our silly games to the hotel bar where innocent passers-by quickened their pace.

Day 3

Day 3 in the test track started with “Agile testing: advanced topics” by Janet Gregory, who highlighted five topics that have emerged since the release of “Agile Testing” by Lisa Crispin and herself. She mentioned feature acceptance (when you’re not able to deliver everything, focus on the features that matter), collaborative automation, large organizations, distributed teams and continuous learning.

Next up was my favorite Swedish philosopher (granted, I only know one), Rikard Edgren, who delivered a spot-on and thought-provoking session called “Curing Our Binary Disease”. He stated that software testing is suffering from a binary disease: pass/fail addiction, coverage obsession, a metrics tumor and sick test design techniques (sick as in “ill”, not as in “wicked” – my interpretation). Couldn’t agree more. He also mentioned his infamous “software potato”, which made for the following legendary phrase: “A tester might not even know that he’s in the potato”.

All this binary goodness got me thinking: Stay/Go? Focus/Defocus? Defocus it was. I chose to do a final round of the expo and pay Copenhagen a quick visit to get some fresh air while it was still light out.

That concluded Øredev 2011. It was great to finally meet Selena, Sigge and Pradeep. And Robert Bergqvist as well. It was great catching up with others (Johanna, Shmuel, Henrik, Janet, David, Ola, Rikard,…). Next up: Eurostar in Manchester next week. A full-blooded tester conference that will rock as well. Let’s meet there.

Finding Porcini

A couple of weeks ago I found myself in southwestern France, a region which – at the time – was being struck by an unseen spell of global wetting. Summer had arrived three months early, people said. April and May had been exceptionally dry and warm; but in July, autumn was knocking on the door of our vacation home. Early autumn, I was told by our gentle host Philippe, goes hand in hand with a peculiar phenomenon: fungi frenzy/mushroom madness. All the locals get this strange misty-eyed look and head for the woods to hunt for the precious Boletus edulis, more commonly known as fungo porcino, porcine mushroom or cep, and lovingly referred to as “the brown plague”, as it tends to halt the local economy.

When Philippe invited me to join him on an early morning quest for porcini, I gladly accepted. It sounded like a treasure hunt, as fresh porcini are sold for outrageous prices to local restaurants. And who doesn’t enjoy a good treasure hunt after breakfast? We left for the woods, armed only with wooden baskets, a knife and a sturdy 4×4.

Mushroom hunting, I found out quickly, is an art in its own right. Slowly and carefully, like an old sensei, Philippe unveiled his mushroom hunting mysteries. And as the mysteries disappeared, a neat set of categorized heuristics came trickling through:

The Mission

  • “Ready Zeger? So, we’re looking for cèpes, têtes de nègre and chanterelles. Leave the others be. Some are poisonous, others just don’t taste nice”
  • “When we stop? When we’re finding more mushrooms than we can carry. Or when we’re not finding anything anymore.”

Classification

  • “Wait, you don’t know what to look for, do you? Come over here for a sec. This is a vintage cèpe. A fungo porcino. Like all other boletes, it has tubes extending downward from the underside of the cap, rather than gills. The pore surface of the fruit body is whitish when young, but ages to greenish-yellow as it gets older.”
  • “Be careful there. Some mushrooms look very much like porcini. You should look under the hood. If it’s yellow, don’t touch ’em.”
  • “When you’re not really sure, scratch the bottom of the hood. When the scratch turns purple, don’t touch.”

Timing

  • “Why we’re heading out all of a sudden? Porcini tend to appear after summer peaks, depending on the weather. Usually they pop up about a week after a wet spell.”
  • “Oh no, that’s not true, Zeger. The fact that we picked loads of mushrooms here doesn’t mean I’m not coming back here tomorrow. These things grow fast. They often push overnight, you know.”

Location

  • “Location is everything. You should look at open spots in the woods, where the sun can actually reach the ground. Look, that should be a good place over there. Do you see the sunbeams peeking through the leaves? Let’s head over there.”
  • “When you find one, mind your step. Where there is one, there are many. You might crush some perfectly good ‘shrooms hidden under some leaves or grass.”
  • “Spotting porcini takes a trained and experienced eye. Here’s a pro-tip: look for where the leaves bunch up – perhaps they are being pushed up by a growing mushroom.”
  • “Don’t spend your time looking near ferns, man. Ferns grow on intensely acid soil, porcini don’t”
  • “Hey Zeger! You see this black beauty here? This is a tête de nègre, a particularly tasty and expensive kind of cep. Look for them near oak trees.”
  • “This here’s a chanterelle. If you spot two of them, follow the line that connects them; they always grow on a straight line.”

Picking technique

  • “Use a knife. Never ever pull mushrooms out of the ground. Cut them. If you damage their mycelium, they won’t grow again next year.”
  • “Remember: be gentle, cut them near the bottom of the stem in a straight line. Don’t break the hood.”
  • “Wait! Don’t cut the really small ones – they’ll be worth much more later on.”

I was soaked in sweat after a couple of hours of intense scouting. In a short timespan, Philippe had managed to transform me into a die-hard mushroom hunter. A novice still, but I felt I was learning quickly. Philippe’s heuristics (not best practices, mind you) helped me discern the good from the bad, find porcini hidden under leaves and spot chanterelles in a neat straight line. I even developed my own heuristics as I went along: I started looking alongside paths through the woods – plenty of chances for the sun to peep through the canopy of leaves, and easier to spot since the vegetation is less dense.

Testing porcini

As I was wandering through the woods with eagle eyes and at a snail’s pace, it all felt strangely familiar. When Philippe said “Where there is one, there are many”, it struck my tester chord. Here I was, a tester, looking for mushrooms, which didn’t seem to be all that different from looking for bugs. No wonder I liked it so much. I also realized that when I’m looking for bugs, I use these kinds of heuristics all the time, but all too often I’m not very aware of them. Which is a pity, because used consciously, these heuristics (“a fallible method for solving a problem”) can be a really powerful tool to boost your exploratory testing efforts.

  • Start with a mission – make sure you – and your team – know what to look for, since our conception limits our perception. Michael Bolton often quotes Marshall McLuhan on this: “I wouldn’t have seen it if I hadn’t believed it”
  • Make sure you’ve got your classification right. If you’re only interested in a specific kind of bug, maybe you shouldn’t waste time reporting others. You could consider parking them somewhere, or keeping the reporting rather lightweight by MIPping (mention in passing) them. But try to stick to your main focus for the session. And if you find a nice-looking bug, is it really? Scratch it, it might turn purple
  • Timing – as in mushroom picking – is also a factor to be considered in bug hunting. Are there typical times at which the application is less stable? When is an ideal testing time, really? Again, this is largely dependent on context
  • Location, location, location. Personally, I use many heuristics to guide me where to test. Which areas are more vulnerable? When you find one, there tend to be many others, indeed. As leaves bunching up *might* indicate a pushing mushroom, seemingly insignificant facts might be a tell-tale sign of bugs nearby: the code that developers write after a wild night of partying might not be all that good, for example. Or they can just be having a bad day. I was once told by an old Native American medicine man that developers are human too
  • As some mushrooms are picked and not cut, our bag-o-techniques should enable us to deal with any situation. As Lee Copeland points out in A Practitioner’s Guide to Software Test Design: a tester should carry his techniques with him at all times, just like a handyman’s toolbox follows him around everywhere he goes. Apply a specific technique, use a particular approach when the situation calls for it.

For the record: I’m not a mushroom master, yet. I lack practice, experience and domain knowledge to attain mastery. I’m not a testing master either, as I’m in constant learning mode. For every good practice I know, in context, I am aware that there’s always another context that I need to get myself familiar with. That prospect may seem humbling and daunting to many, but I wouldn’t want it any other way. That’s Context-Driven Testing for ya.

(For more info, see The Seven Basic Principles of the Context-Driven School as a starting place. There’s a lot more where that came from).