Episode 124: Talking Testing with Anne-Marie Charrett

The Agile Revolution Podcast

Craig is at YOW! Conference and catches up with Anne-Marie Charrett, who is well known in the testing community as a trainer, coach and consultant, but also for her support of the community:

  • Don Reinertsen talk “Thriving in a Stochastic World”
  • Context-Driven Testing
  • Testing is a verb – it’s a doing thing and not an output, but the challenge is you cannot see doing
  • Anne-Marie’s class in Exploratory Testing
  • Where there is risk and failure, there is a job for testing
  • Exploratory testing – the key is feedback and using the learning to feedback into the next test
  • Agile testing – don’t try and test everything and don’t try and automate everything either, rather adopt a risk based approach
  • Unit testing – the usefulness depends on the programmer and the context and figuring out what you are trying to achieve
  • Sydney Testers Meetup
  • Speak Easy



Lisa Crispin and Janet Gregory on (More) Agile Testing, Learning and New Approaches

InfoQ: Lisa Crispin and Janet Gregory talk about how they came to collaborate on the “Agile Testing” books, the testing skillset and approaches to learning, and new and interesting approaches to testing.

Source: Lisa Crispin and Janet Gregory on (More) Agile Testing, Learning and New Approaches

Rapid Software Testing

A couple of years ago I received an awesome opportunity to watch James Bach deliver his Rapid Software Testing course in Adelaide. At the time I was working with Sharon Robson from Software Education to help re-develop the Agile Testing course for the Agile Academy, and she thought it might be good for us to sit in the back. The two-day course was awesome (one of the best courses I have ever attended), although the animated debate between James and Sharon over breakfast about ISTQB is one I will never forget either.

One of the great things about the course is that the notes are freely available from the Satisfice site (slides and appendices), although it is the insight and passion from James that make the course extremely worthwhile. Unfortunately I did not earn my “testing stars” from James on this course, but I did learn a lot. I recently dug out my notes from the course and here they are below.

  • the secret – “watch people test” – then follow the patterns
  • traditionally testers muddled through, as you got more experienced you just muddled better
  • there are lots of practices yet to be written about
  • James is “walking through an orchard ripe with apples”
  • “nobody expects a tester to be right about anything” – we are in the evidence and inference business
  • tester tip – did you do “booja booja” testing? Your answer should be “not by that name”
  • method of concomitant variation – vary x and observe the effect on y (eg. dimmer switches) (John Stuart Mill – A System of Logic)
  • you test under uncertainty and time pressure – if not, you are about to be laid off! Organisations keep testers at a minimum number
  • heuristics – essential to rapid testing, eg. walking into a foreign building – “I’ll know it when I see it”
  • “creep and leap” – leap is the most outrageous test you can do, creep is to gently shatter the pattern in your mind – creep and leap may fail because you don’t leap far enough or you don’t creep enough
  • minimum number of cases has no meaning – infinite – no light flashes when you have finished testing / understand the pattern
  • pattern in the test cases is just the pattern in the test cases, not the program
  • need to leap beyond imagination
  • rapid testing is not about techniques – a way of thinking, a set of skills
  • what do testers do? – they are the “headlights of a project”, don’t need testers in the daylight (no risks)
  • testers don’t ensure quality of a product, they report the quality of the product
  • key definitions: quality is value to some person (who matters), a bug is anything about the product that threatens its value
  • testers represent the people whose opinion matters
  • defect is a bad word legally; not sure it is a defect when you find it, assumes more than you know (emotional word: bug, issue, incident)
  • testing and questioning are the same thing
  • there is a motivating question behind each test (if not, a zombie walk)
  • first principle – know your mission – allows you to test what matters, gets you more focussed
  • we are chasing risk
  • quality criteria – what is important, who are users
  • curse of expertise – people who know a lot, don’t always see a lot (why you need developers and testers)
  • need an oracle / result – otherwise you are just touring (an oracle is a principle or mechanism by which you find a problem)
  • rapid test teams should be a team of superheroes – what is your super power? Seek test teams that have variety
  • critical thinking – “huh”, “really”, “so” – say these words and you are on the road to critical thinking, you have to make assumptions to get work done
  • “huh” = what exactly does that mean?
  • “really” = what are the facts, how do we know it is true?
  • “so” = does any of this really matter, who cares?
  • safety language – this desk “appears” brown, have “not yet seen” a number 127 work, when you see this language your brain keeps thinking about the problem (interim conclusion only)
  • if you have stopped questioning you have stopped testing (and turned yourself into a test tool)
  • videotape your tests – take notes with timestamps, good for audit when you need that
  • The Amazing Colour Changing Card Trick – look from a different angle, view things more than once
  • ask a question without asking a question – make a statement / fact and wait for a reaction
  • model it differently – look at it in a different way
  • need to have the ability to slow down your thinking and go step-by-step and explain/examine your steps and inferences
  • exploratory testing is about trying to de-focus – seeing things in a different way
  • there is no instruction you can write down that won’t require some judgement from a human
  • irresponsible to answer a question without knowing some context – allows you to establish a risk landscape
  • James remembers his testing approach as a heuristic – CIDTESTDSFDPDTCRUSSPICSTMPLFDSFSCURA (his notes go on to explain this one!)
  • when you hear “high level”, substitute “not really”
  • HICCUPPS(F) heuristic – a set of consistency patterns all testers seem to use to justify why something might be a problem: History (something has changed), Image (OK, but something makes us look stupid), Comparable products (like another system), Claims (said in a meeting, hallway), User’s expectations (do you understand users), Product (consistency), Purpose (why and what is it trying to accomplish), Statutes (something legal), Familiarity (a familiar feeling)
  • Oracles – calculator (ON, 2 + 2 = 4; the answer won’t be 5, it won’t burst into flames, the number won’t disappear), Word saving files (came up with 37 alternatives), Notepad (this application can break, Microsoft suggested it was not a bug)
  • ask for testability – give me controllability (a command line version) and visibility (a text version of the display); when developers say no, send an email so you have documented evidence of why testing didn’t happen or why it takes so long
  • ask “is there a reason I have been brought in to test this?”
  • ad-hoc / exploratory does not equal sloppy
  • testing is not the mechanical act but the questioning process; the only people whose goal is 100% automated testing are people who hate to test – you don’t hear about “automated programming” (what about compiling?)
  • everybody does exploratory testing – creating scripts, when a script breaks, learning after a script runs, doing a script in a different way
  • exploratory testing acts on itself
  • “HP Mercury is in the business of avoiding blame”
  • script – to get the most out of an extremely expensive test cycle, for interactive calculations, auditable processes
  • mix scripting and exploration – what can we do in advance and what can we do as we go, James always starts at exploratory and moves back towards scripting
  • use a testing dashboard – break it down by the key components in the system; all management cares about is a schedule threat, so get to the point; count the number of test sessions (an uninterrupted block of testing time, about 90 minutes) as management understands this (session-based test management); the key is simplicity – what does management usually ask for / need (usually a different measure); counts give the wrong impression, numbers out of context and the number of test cases are useless; use coverage (0 = nothing, 1 = assessed, 2 = minimum only, 3 = level we are happy to ship) and status (green = no suspected problems, yellow = testers suspect a problem, red = everybody nervous)
  • equivalence partitioning – you treat differences as if they are the same; models of the technology allow us to understand risk (eg. dead pixels on a button); a critical tester skill is to slow your thinking down (is that a button?)
  • galumphing – doing something in an intentional, over-exuberant way (eg. skipping down the street); some inexpensive galumphing can be beneficial, taking advantage of accidents to help you test better
  • An Introduction to General Systems Thinking (Gerry Weinberg, 1974) – basic text of software testing
  • many people are hired to fake testing – not to find bugs but to point fingers (“we hired testers”)
  • good testers build credibility
  • testers question beliefs (we are not in the belief business) – you cannot simply believe what the developers tell you
  • lots of people can test – like surgery in the 14th century
  • reality steamroller method – maximise the value testers can provide despite the circumstances – record decisions, do your best to help out, let go of the result, write emails to keep your hands clean (helpful, timestamped, documented)
  • get all of the documentation and create a testing playbook – diagrams, tables, test strategy
  • The Art of Software Testing (Glenford Myers) – introduced the triangle exercise
  • calendar exercise – visualise your test coverage whenever you can, plot times on a grid, bar chart, wheel
  • choose a number between 1 and 20 – 17, 7 and 3 are popular, 20 is the least popular (people avoid numbers that look less random) – what about pi or floating points?
  • bugs with data types (eg. a string in JavaScript) and bugs in tables and labels are not found by boundary tests – this is when you need to run inexpensive random testing
  • anti-random testing – heuristic – like every molecule trying to get away from every other molecule, every test is trying to do something different
  • Crazy Ivan Testing Manoeuvre – a defocussing approach, looking for things you weren’t looking for (The Hunt for Red October)
  • finding bugs – testing exhaustively, focus on the right risk, indulge curiosity, use a defocussing strategy
  • curiosity – urge to learn something you don’t need to know
  • good usability checklist (medical devices) – ISO 60601-1-4
  • base testing on activities (what a user does) rather than on test cases
  • playbook – table – goal, key, idea, motivation, coverage, etc… – is just a list of ideas
  • you can’t check an “always” – but you can test aggressively for confidence
  • stopping heuristic – piñata heuristic (when you have enough candy), cost vs value (when cost exceeds value), convention (what is expected of you), loss of mission, ship
  • basic boundary testing is not just one over / one under –> that is fairy tale boundary testing
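The low-tech dashboard note above (per-component coverage 0–3 plus a traffic-light status, reported in test sessions) is easy to sketch in code. This is only an illustration of the scheme as I noted it; the component names and numbers are made up:

```python
# A minimal sketch of a session-based testing dashboard.
# Coverage levels and status colours follow the scheme in the notes;
# the components, levels and session counts below are invented.

COVERAGE = {0: "nothing", 1: "assessed", 2: "minimum only", 3: "happy to ship"}
STATUS = {"green": "no suspected problems",
          "yellow": "testers suspect a problem",
          "red": "everybody nervous"}

dashboard = [
    # (component, coverage level, status, test sessions spent)
    ("login",   3, "green",  4),
    ("reports", 2, "yellow", 2),
    ("billing", 1, "red",    1),
]

for component, coverage, status, sessions in dashboard:
    print(f"{component:<8} coverage {coverage} ({COVERAGE[coverage]}), "
          f"status {status} ({STATUS[status]}), sessions {sessions}")
```

The point of the scheme is that a manager can scan it in seconds: red or yellow rows are the schedule threats, and low coverage numbers say where sessions should go next.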
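The note about inexpensive random testing catching data-type bugs that boundary tests miss might look like this in practice. The `parse_quantity` parser and the input alphabet are invented for illustration; the oracle is deliberately weak (the parser must never raise and never return a negative quantity):

```python
import random
import string

def parse_quantity(text):
    """Hypothetical function under test: parse a quantity field,
    returning a non-negative int or None for invalid input."""
    try:
        n = int(text)
    except (TypeError, ValueError):
        return None
    return n if n >= 0 else None

def random_inputs(count, seed=42):
    """Generate cheap random strings: digits, punctuation, whitespace."""
    rng = random.Random(seed)  # seeded so any failure is reproducible
    alphabet = string.digits + string.punctuation + string.whitespace
    for _ in range(count):
        length = rng.randint(0, 12)
        yield "".join(rng.choice(alphabet) for _ in range(length))

# Fire a few hundred random inputs at the parser with a weak oracle:
# it must never crash, only return a non-negative int or None.
for candidate in random_inputs(500):
    result = parse_quantity(candidate)
    assert result is None or (isinstance(result, int) and result >= 0)
```

Such a run is cheap to write and cheap to execute, which is exactly why it earns its keep against the type-level bugs the notes describe.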
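The last note, on going beyond one over / one under boundary values, can also be sketched. This is a minimal illustration, not from the course: `validate_age` and its 0–120 spec are hypothetical, standing in for any bounded input:

```python
# Hypothetical function under test: accept integer ages 0-120 inclusive.
def validate_age(age):
    # Exclude bool explicitly: in Python, True is an instance of int.
    return isinstance(age, int) and not isinstance(age, bool) and 0 <= age <= 120

# "Creep": the classic one over / one under values at the stated boundaries.
classic = [-1, 0, 1, 119, 120, 121]

# "Leap": values the specification never mentions - extremes and other types.
leaps = [10**18, -10**18, 0.5, "17", None, True]

for value in classic + leaps:
    print(repr(value), "->", validate_age(value))
```

Creep along the stated boundaries first, then leap to values the specification never mentions; the leap cases are where data-type bugs tend to hide.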