Through understanding our thought processes and those of people around us, we can become better testers. We can achieve more, get our way more often, and make fewer unreasonable mistakes. Whilst we recognise the potential “dark side” of these abilities, let’s remember we are using them to achieve a positive result for our organisations and our teams. We want high quality solutions. High quality software. High quality interactions. But testers and the people we work with are not machines. They have their own foibles, opinions, biases and concerns. What can we learn to make our lives (and the lives of our teams) easier? How can we twist things in our favour?
Both Maaike’s and Alex and Huib’s talks had some ideas for this, and also some things to be mindful of… the dark side expressing itself through us, perhaps?
Jedi Mind Tricks for Testers – Huib Schoots and Alex Schladebeck
Mapping Biases to Testing – Maaike Brinkhof
First up, let’s talk about Alex and Huib’s talk. In it, they espoused a number of approaches to mind tricks, including tricking ourselves. Their complete list:
1 Invite yourself everywhere
2 Use Biscuit Driven Development
3 Don’t tell them – show them
4 Ask to learn
5 Ask good questions
6 Ask three times
7 Tell stories
8 Don’t ask, just do
9 Stay human
10 Look after yourself
11 Be a productivity ninja
12 Solve problems
13 Be brave
14 Reflect on your work
There was particular mirth at Biscuit Driven Development – the idea that someone is more likely to want to talk to you if you’re bringing baked goods! Of course, the lesson here is more subtle – people want to help you if you’re a nice person who does nice stuff for them.
There are certain trends here in the way we interact with others and treat ourselves designed to combat the anti-patterns testers so often come up against. There’s a clear vision of the know-it-all/know-nothing tester, pestering the devs for their time, someone to be left off of meeting invites and in need of permission to do their job well. As Alex and Huib noted, these are all daft notions and we can prove that by working through them with our colleagues. By asking good questions, by asking to learn and showing we’re willing to take on new information, by showing devs odd behaviour rather than slapping a bug ticket on their foreheads, we learn to work more closely within our teams.
The other trend is of individuals working themselves ragged trying to do a good job! By taking time to rest, accepting we can’t keep doing the creative work of our job endlessly like a machine, and finding ways to make what we do the most valuable use of our time, we avoid burnout, get braver and learn about ourselves. Instead of just accepting the things we find within our own behaviour that we’d like to change, we change them, and thereby become better testers (and better people!).
This brings me neatly on to Maaike’s talk, around cognitive biases in testing. The talk made several references to the book Thinking, Fast and Slow by Daniel Kahneman, most crucially the distinction between System 1 and System 2 thinking:
System 1 thinking: Automatic, instinctual. You don’t even realise you’re thinking.
System 2 thinking: Deliberate, slow. You realise you’re “having to think about it”.
After this brief intro, Maaike impressed on us the huge number of biases already coined, then drilled into four which are particularly relevant to testers:
Confirmation bias
(aka “The Mother of All Biases”!)
You are more likely to see (or see positively) things which conform to your own opinion or experience. An example might be preferring a solution during design phase because you like the person who came up with it. You can combat this by changing your approach, trying to come at things from new angles and perceive things differently.
The Halo effect
An opinion held in one area can affect an opinion held in another area, even when there’s obviously no correlation. An approach to handle this is consistent standards applied in all relevant cases – a good example is not testing someone’s software less thoroughly because you think they’re knowledgeable about code.
The availability heuristic
The more often you’ve used a tool or heuristic, the more likely you are to reach for it. The thing which comes to mind first will be used, and that means you’re likely to fall into patterns or ruts in your approach. An example might be always testing input fields in the same way. Learn your patterns and actively work to break them!
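To make the pattern-breaking concrete, here’s a minimal sketch (the function name and the values are my own illustrative choices, not from the talk) of deliberately generating unusual variants for a text field, instead of reaching for the same habitual happy-path value every time:

```python
# Counter the availability heuristic: alongside the "usual" value you
# always type, deliberately include input shapes you rarely think of.
# All values below are illustrative, not an exhaustive checklist.

def input_variants(typical: str) -> list[str]:
    """Return the habitual value plus deliberately unusual alternatives."""
    return [
        typical,                          # the habitual happy-path value
        "",                               # empty input
        "   ",                            # whitespace only
        typical * 500,                    # very long input
        "Ωmega-тест-試験",                 # mixed-script unicode
        "Robert'); DROP TABLE users;--",  # injection-shaped string
        "\t\n\r",                         # control characters
    ]

variants = input_variants("Alice")
print(len(variants))  # 7 candidate inputs instead of the usual one
```

A list like this works as a prompt against your own ruts: each run through a form, pick the variant you reached for least recently rather than the one that comes to mind first.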
Anchoring
The first information you get is likely to be treated as the most significant. An example of this would be someone estimating a story as 3 points of effort. You might have thought it was a 13, and you might be right, but you’re more likely to move towards an 8 or 5 – getting more and more wrong as you’re pulled toward the anchor. Be wary of the first opinion, and try to give all ideas a fair hearing before changing your own.
Maaike’s talk was an excellent example of some of the unconscious patterns we fall into as testers, and our teams fall into as well. It gave a huge amount of food for thought, and taken with Huib and Alex’s earlier talk, described something it’s obviously critical for testers to understand if we want to succeed.
So this post is a mishmash of a couple of great talks, but in my head they seemed to be driving in something like the same direction, albeit in different ways. Your brain is not a computer. It has needs, it has biases which it can be completely blind to, and it has to be in a certain state to do good testing work. It has pathways which work better than others, as do the people you work with. Pathways of thought. The phrase which inspired this post’s title was said by Huib (I think?):
“Testing is a Thinking Craft.”
This was alongside a slide advising testers to get proper rest and forgive themselves for not being machines. It’s precisely because we are fallible, biased human beings that we make excellent testers! In the section of the industry I work in, machines aren’t the primary consumers of our software; humans are. And our users also have biases, they also get tired and bored and frustrated. They are irrational, they make mistakes, they can be careless or flat out incompetent at using our software.
Or, they can be experts. They can understand how our code works on a PHP developer level. They can know more about our software than I ever will. I can think of examples of both from our actual user base, and that tells me the fact I reached for the “these guys don’t know shit about our software” analogy first, before catching myself, is one of my (doubtless many) biases. That’s useful information (even if it is a bit of a bummer!).
As testers, I believe we are fundamentally great systems thinkers. We are used to big-picture approaches to complex (or chaotic) ecologies of services, functions and interactions. To do that, we doubtless reach for instinctual ways of doing things. Whether that’s how we communicate, how we perceive software problems, or how we treat ourselves, we have all built up ways of doing things which are almost certainly imperfect, and potentially actively harmful to the work we are attempting to do.
What are your patterns, and which of those are not helping you? What are your go-to heuristics, and which of them could do with retiring (if only for a while)? How will you ensure your first impression doesn’t colour your whole experience, or ensure your interactions run smoothly?
By considering how you think, how others think, and making some adjustments. Testing is a thinking craft – our thoughts enable our work, and our value in teams, and we are wise to be mindful of how we go about that thinking.
More info on cognitive biases here.
I welcome your thoughts and observations in the comments section below. What are your patterns, and anti-patterns? What’s your cognitive bias?