Once again I was extremely lucky to have two talks accepted at Agile 2009 (with Paul King), and to have the support from Suncorp to send me along to speak. Whilst it has been quite a number of weeks since the conference, I wanted to ensure that I posted my notes and comments. This year, being my second attendance, I found the hallway discussions all the more valuable and had many awesome conversations with friends made last year as well as new friends just met. Added to this, Chicago exceeded my expectations as the host city.
Once again, the number of simultaneous sessions made deciding what to attend extremely difficult.
The sessions I attended on day 1 were as follows:
Using the Agile Testing Quadrants to Plan Your Testing Efforts
This session on the testing stage was delivered by Janet Gregory, one of the authors of Agile Testing. The slides are available on the Agile 2009 site.
Testers should be part of release planning and think about:
- scope
- test infrastructure and test tools / automation
- how much documentation, is it too much, can I extract it from somewhere
Iteration planning:
- plan for done, acceptance tests
- priorities of stories, which stories to do first, connect with developers
- budget for defects unless you are a high performing team
Need to acceptance test the feature, not just the story.
We then did a collaboration tools exercise, and some of the tools used by the audience were:
- desk check / show me – when a developer thinks they have finished coding, get together and take a look
- wikis, conference calls, GreenHopper, etc
- daily standup – share when things are done, if you find them ineffective
- project cards – used for story management and documenting conditions for acceptance
- sticky notes and pens for a co-located team
- demonstration every week or end of every iteration
- FIT tool, used for demos
- walking and talking
- pairing
- generated artefacts from the CI server
- instant messaging
- puzzle / chocolates on desk to encourage talk, “free to developers if they come and ask a question”
- rolling desks on wheels, so they can switch configuration
- rolling whiteboards
- JIT (Just In Time) meetings as required
- mind mapping software that hooks up to Jira
- retrospectives
- team review story and write tests together
- nobody said “email” – no email!
- recorded chat room, so conversation is recorded
Waterfall test pyramid, upside down, very unstable – Functional Tests –> API Tests –> Unit Tests (heavy functional tests based on GUI, very few unit tests).
Automated test pyramid (Mike Cohn) – unit tests / component tests are the base layer, require testable code that we can hook into below the GUI at API layer, GUI tests are most brittle because UI changes so do as few of these as possible, right at the top you might need a handful of manual tests.
Agile testing quadrants change the way you think about testing – use them to classify tests by purpose (why are we writing these tests), tests will cross boundaries.
Agile testing quadrant – can be used as a collaboration tool (developers will understand how they can help), emphasizes the whole team approach (no “pass this to the QA team”, whole team is responsible for testing), use to define doneness (use for planning, what needs to be done, has the estimate allowed for the amount of testing we wish to complete).
Quadrant 1 – technology facing tests that support the team, TDD supports the design of the team, tester has feeling of comfort
- unit tests test the developer intent, individual tests on a method, small chunks of code, fast feedback mechanism, code is doing what it should do
- TDD tests internal code quality, if developers test correctly it flows all the way through and makes easier to test functionally
- base for regression suite, if you are going to spend any time on automation, “put it here”, return on investment is better the lower you go in the pyramid
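As a sketch of the kind of fast, focused quadrant-1 test that forms this base layer of the pyramid – the discount function and its behaviour are my own invention, not from the talk:

```python
# A fast, focused unit test of the kind Janet places at the base of the
# pyramid: it exercises a small chunk of code, states developer intent,
# and gives feedback in milliseconds. The pricing example is hypothetical.

def apply_discount(price, percent):
    """Return the price after a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

def test_apply_discount():
    # intent: a 10% discount on 20.00 yields 18.00
    assert apply_discount(20.00, 10) == 18.00
    # intent: a 0% discount leaves the price unchanged
    assert apply_discount(20.00, 0) == 20.00

test_apply_discount()
```

Because tests like this run below the GUI and need no environment, they are where the return on automation investment is highest.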
Quadrant 2 – where the acceptance tests live, supporting the team in natural language, helping the team deliver better software, use paper prototypes to talk to customers rather than a big GUI, acceptance tests upfront help define the story, use examples to elicit requirements (the easiest way to get clarification from the customer – always ask “not sure what you mean” or “give me an example”), pair testing (ask for feedback as soon as possible)
- the examples can become your tests, write upfront and ensure that developer makes them pass when they develop code, use tools such as Fit / Fitnesse, Cucumber, Ruby / Watir
- examples help customer achieve advance clarity, focus on external quality (facing the business), want the tests to spark a conversation with the developers
- BDD use of given (preconditions), when, then as opposed to tabular formats in Fitnesse, useful for workflows
- Janet polled the room and only about a dozen people in the room give their acceptance tests to the developers prior to the story being developed
- if no automation tool, write up a manual sheet, give it to the developers and have a conversation before the card starts
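The given/when/then structure Janet contrasts with Fitnesse’s tabular format could be sketched in plain Python – in Cucumber the steps would live in a feature file; the account-transfer domain here is invented for illustration:

```python
# A given/when/then acceptance-test sketch. Each phase is a labelled
# block; a BDD tool would map these phases to natural-language steps.
# The bank-account domain is hypothetical.

class Account:
    def __init__(self, balance):
        self.balance = balance

    def transfer(self, other, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        other.balance += amount

def test_transfer_between_accounts():
    # Given two accounts with known balances
    savings = Account(balance=100)
    checking = Account(balance=20)
    # When 30 is transferred from savings to checking
    savings.transfer(checking, 30)
    # Then both balances reflect the transfer
    assert savings.balance == 70
    assert checking.balance == 50

test_transfer_between_accounts()
```

Written before the code, an example like this both clarifies the story and becomes the test the developer makes pass.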
Quadrant 3 – user acceptance testing, critiquing the product, getting the customer to look at the system
- exploratory testing – time box these sessions to reassess how far you wish to go, following instincts and smells with a purpose, touring (eg. the money tour) as defined by James Whittaker and James Bach (in the book Exploratory Software Testing), this is where you find the majority of bugs so testers should spend the majority of their time here (which is why you need a good base of automated tests)
- collaboration testing – forge a relationship with the developers so you know what they are developing
- remember your context to determine how much testing is enough (eg. mission critical software vs an internal application)
- attack stories using different personas – Brian Marick likes to create evil personas (eg “pathological evil millionaire”) or use impatient internet user vs grandma who clicks every link on the internet
Quadrant 4 – non functional tests, which should be part of every story (eg. is there a security or performance aspect): “-ility” testing, security testing, recovery, data migration, infrastructure testing; do as much as possible upfront, although sometimes you will need environments that will not be available until the end
- non functional requirements may be a higher priority than functional ones (eg an Air Canada seat sale might need critical performance)
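A minimal quadrant-4 style check might assert a response-time budget. This sketch uses a stubbed operation and an arbitrary 0.5 s budget of my own choosing – a real test would hit the system under test:

```python
import time

# A minimal non-functional (performance) check: time an operation and
# assert it stays within a budget. The operation and the 0.5 s budget
# are stand-ins for illustration only.

def lookup_seat_availability():
    # stub for the operation under test
    return sum(range(10_000))

def test_lookup_meets_response_budget():
    start = time.perf_counter()
    lookup_seat_availability()
    elapsed = time.perf_counter() - start
    assert elapsed < 0.5, f"took {elapsed:.3f}s, budget is 0.5s"

test_lookup_meets_response_budget()
```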
Test plan matrix – big picture of testing against functions for release, usually on a big whiteboard, use colours (stickies) to show progress, benefit is in the planning in what we need to do testing wise but also appeases management because they like to see progress, gives idea of where you are going
Can use a lightweight plan, put risks on a white page, 35 of the 37 pages of the IEEE test plan are static, so put that information somewhere else
Test coverage – think about it so the team knows when the testing is done, a burn down chart will be enough if you test story by story, when thinking about risk ensure you include the customer (they may have a different opinion of risk)
Summary:
- think big picture – developer following a GPS only needs to know next 2 weeks, but tester is a navigator and needs the map
- include the whole team in planning and test planning
- use the quadrants as a checklist (put them on the wall)
- consider the simplest thing, especially in relation to documentation
- think about metrics – one man team might be good enough to just know they passed
- visible, simple, valuable
Janet also mentioned the following throughout the session:
- Kevin Lawrence – Growing Your Test Harness Naturally (2005) – good article
- Gojko Adzic – Bridging the Communication Gap
I also stumbled across a related blog post on this session at: http://agile2009.blogspot.com/2009/08/agile-testing-quadrants.html
What Does an Agile Coach Do?
This session was delivered by Liz Sedley & Rachel Davies, authors of the new book Agile Coaching. The slides are available on the Agile 2009 site.
This was a hands-on workshop and involved some good discussions on how to deal with different coaching scenarios.
Zen & the Art of Software Quality
This session was delivered by the legendary Jim Highsmith. The slides are available on the Agile 2009 site.
- “There Is No More Normal” – John Chambers, Cisco CEO, Business Week, 2009
- business strategy needs to be more adapting to change than performing to plans
- mixed messages – be flexible but conform to a plan – dilemma faced by many agile teams
- “Artful Making” – Rob Austin – describes $125 million software failure
- in 1994, 82% of software projects were failures; 68% in 2009 (success defined as on time, on budget, all specified features) – Standish is measuring the wrong thing, not a good measure
- cancellation of a project should not be a failure, it is a good thing
- current environment – schedule is more important than value
- “Beyond Budgeting” – Hope/Fraser – not a good book, but good ideas
- “Measuring & Managing Performance in Organisations” – Austin – all measurements are dysfunctional, get a different outcome than you expected
- if the budget is 100 and you achieve 100, a performance management system rewards that over a budget of 120 where you achieve 110 – even though the latter is the better achievement
- beyond budgeting – make people accountable for customer outcomes, create high performance climate based on relative success amongst others
- trust, honesty and intentions are better than measurements
- performance tends to improve while people figure out the system, but under pressure people focus on measurement goals rather than outcomes
- earned value (time + cost) has nothing to do with value, does not have anything to do with what is delivered to the customer
- we need to move from scope/cost/quality to value/quality/constraints (scope/cost/schedule)
- core benefit from agile has been value and quality
- everybody comes to work to do good quality, but never well defined
- “Zen & The Art of Motorcycle Maintenance” – Pirsig – quality ideas
- is quality objective or in the eye of a beholder, people have different ideas
- need extrinsic quality (value) and intrinsic quality (so you can deliver quality tomorrow)
- “Applied Software Measurement” – Capers Jones – 95% defect removal rate the sweet point for quality
- experience is doubling staff quadruples the number of defects – BMC were able to kick this trend using agile
- difficult errors take time to find – the worse the quality of the code, the longer they take
- first year of product release the quality might be OK, but then adding new features more important than fixing software debt, over time the cost of change increases and accumulated technical debt harder to fix, but the more debt the higher the pressure to deliver
- strategies – do nothing, replace (high cost/risk), incremental refactoring, commitment to innovate – the best way but hard to sell politically – turning a vicious cycle into a virtuous cycle (55% said it was easier to support agile developed products)
- there is productivity in the features you don’t do – 64% of software features are never used, so what if we put 25% of that money into refactoring or learning agile
- agile value curve – if doing the high value work first, we can ask the question: do we have enough to release the product?
- need to reduce the marginal value of our stories
- if you don’t have time to estimate value, you don’t have time to estimate cost
- philosophy – value is an allocation not a calculation (cost is a calculation), so use value points and allocate from the top down – value points need more thought than ranking – additional information when you look at 25 story point card worth only 2 value points, also demonstrates that value is important, should be able to do this fairly quickly
- value and priority are different – a low value card high on priority might be a guide, pick a cap for the value
- value points, like story points, are relative
- story point is calculation of cost, value point is allocation of revenue
- Intel has 17 standard measures of value, help to determine as a guide
- value in Chinese means smart/fast
- value – is product releasable – always ask the business owner or product manager that question – example that a product could be released when it was 20% complete
- parking lot diagram – emphasizes capabilities we are delivering to the customer in their own language, shows progress and value delivered by number of stories done / done
- Gantt chart shows task complete to a schedule
- questions – can we release, what is value-cost ratio (do we need to continue or do something else that is higher value), what is product quality, are we within acceptable constraints
- how do you determine if you are in a technical debt hole – using qualitative measures in your code
- ask the question – do you know why it takes 3 months to make a change? – explain the technical debt curve, start to show people that quality matters (eg. automated testing becomes a time accelerator)
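Highsmith’s point that value is an allocation (top-down, from a fixed pool) while cost is a calculation (bottom-up, per story) might be sketched like this – the stories, point totals, and ratios are all invented for illustration:

```python
# Value points are allocated top-down from a fixed pool, while story
# points are estimated bottom-up per story. Comparing the two exposes
# the 25-story-point card worth only 2 value points. All numbers are
# hypothetical.

VALUE_POOL = 100  # total value points to allocate across the release

stories = {              # story: (allocated value points, story points)
    "checkout":     (40, 8),
    "search":       (35, 13),
    "audit report": (2, 25),   # high cost, low value – worth discussing
    "profile page": (23, 5),
}

# allocation must use up exactly the pool – value is not estimated per card
assert sum(v for v, _ in stories.values()) == VALUE_POOL

# the value/cost ratio guides "can we release / should we continue"
for name, (value, cost) in sorted(
        stories.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{name:12s} value/cost = {value / cost:.2f}")
```

Ranking by this ratio is one way to drive the value curve: do the high value-per-cost work first and keep asking whether the product is releasable.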
Ice Breaker & Freshers Fair
The Fresher’s Fair at the Ice Breaker had a number of great groups including Kanban, Usability and CITCON. I stumbled across the following poster that was a long way from home…