The STANZ (Software Testing Australia New Zealand) 2011 conference was held in Wellington and Melbourne in the last week of August (running into September). I was lucky enough to be invited to speak at the Melbourne event by my good friends at Software Education, who promoted the event. I rolled up on the back of a flight from Los Angeles to Brisbane (and then Brisbane to Melbourne) a little jet lagged, but got heaps from the event.
|From STANZ 2011|
Here are my notes from day one of the conference.
Am I Creating Value With My Testing?
- ask questions of the CEO about the vision and what the product is supposed to do, listen to customer support calls, talk to marketing, talk to the developers about what bugs they value
- what are the top 10 things people love and hate about your software?
- look for efficiency – use checklists instead of test cases, forget about regression testing and use the computer to be more efficient
- testing is about creating value for the people who matter most, your customers
- people need an emotional attachment to your product – the share market is an example of a product driven by emotion
- we need to create value for our customers, but just as importantly for ourselves
- we can’t just focus on business value – it’s a big stick that will erode morale
- talk to your customers – what do they need, what do they like, dislike, what is missing?
- talk to team – what do they like about your work, how can you be better?
- self evaluation – what is new in the field, am I enjoying work, what do other team members focus on, and what do they find that I miss?
- avoid blame – excuses rather than finding and solving real problems – “we wouldn’t have this problem if we were doing agile”, “management don’t get testing”, etc… – feels good to say but is not constructive
- don’t expect tools or processes to rescue you – look out for your own best interests, know the problem you are solving and use the tools/process to solve it and ensure you have a way to measure it
- the key to creating value is alignment – people in different jobs or teams often have different goals
- leaders – clearly articulate the vision and goals to the testing team and how they align with the goals for the product and company; leadership comes from everyone in the team; leaders need to manage the politics (any organisation with more than one person will have politics)
- need to continually inject change and keep people interested
- people have skills, they are not resources – find your talents and invest in them
- understand your context – every team will be different
- tangible quality can be measured by understanding if the stakeholders’ needs are met and if you are meeting ROI; intangible quality is important and not often taken seriously – would you be afraid if your mother used this, would you like your name on the splash screen?
- impress the most important stakeholder – you!
- most people don’t know what great testing is – you can be shocked and appalled by what most people think is good, strive to be better
- tangibly getting better – learn about planning and strategy and exploit the opportunities, write good bug reports as developers really value this, be good at communicating what needs to be done and where we are going, take more responsibility and display competence in basic technical skills
- intangibly getting better – be in demand for your testing service, have good problem solving ability
- use external communities to develop your testing skills
- work as though your favourite person in testing was coming to visit
- need to be able to justify your work – is your testing defensible?
- use repeatable or intermittent bugs as a clue to something bigger – don’t ignore the anomalies
- testing is like journalism – need to do crazy things to get the story, move towards the issues, people need the news today not tomorrow
- need to have a technical curiosity about what is going on in the community – what is coming down the pipe, what are the people that have the ability to change things doing?
Overall this was a refreshing session, showing a passion for testing and improving skills, with some excellent sound bites along the way.
What Does A CEO Want From Testing?
Mark Feldman from IV&V Australia delivered this presentation.
- the CEO is accountable for delivery and for protecting his reputation
- the CEO is not going to check test cases unless you look like a risk (i.e. front page of the newspaper)
- governance is a CEO buzzword that covers a bunch of things
- need to provide more than alignment – creativity and innovation
- looking for thought leaders and competitiveness enhancement not 80/20 maintenance work
- CEO wants creative disruption along with well run divisions
- testing needs to be proactive rather than reactive
- have some answers about the cloud – how it affects the team
- CEOs like ERP because they believe there is less risk
Working With Remote & Distributed Teams
Karen Johnson delivered this session.
- we are not alone – through Twitter and Skype you can connect with great people
- understand time zones and calculate meeting times for each person’s time zone, and put the local times in the meeting request
- rotate inconvenient team calls – when people are in very inconvenient time zones such as India
- recalculate time differences again when people are travelling
- important to have a usable workable space – particularly when working from home
- some people have trust issues, so ask yourself what you have done for them to have doubts
- you just can’t work from Starbucks – could you invite your boss into your home workspace?
- on calls, be aware of the people who are not in the room when handing out documents or drawing on the whiteboard
- get to know your remote people and get to meet them in person when you can
- observe with your ears – look for clues to mood and listen for tone
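The time-zone advice above can be sketched with a small helper that prints a proposed meeting time in each attendee's local zone, ready to paste into the meeting request. This is my own illustration, assuming Python's standard `zoneinfo` module; the zones and meeting time are made-up examples:

```python
# Minimal sketch: convert one UTC meeting time into each attendee's
# local time so the numbers can go straight into the meeting request.
# Zone names and the meeting time below are illustrative assumptions.
from datetime import datetime
from zoneinfo import ZoneInfo

def local_times(meeting_utc, zones):
    """Return {zone: 'HH:MM Day'} for a timezone-aware meeting time."""
    return {z: meeting_utc.astimezone(ZoneInfo(z)).strftime("%H:%M %a")
            for z in zones}

meeting = datetime(2011, 9, 1, 22, 0, tzinfo=ZoneInfo("UTC"))
for zone, local in local_times(meeting, ["Australia/Melbourne",
                                         "Pacific/Auckland",
                                         "Asia/Kolkata"]).items():
    print(f"{zone}: {local}")
```

Using named IANA zones (rather than fixed UTC offsets) also covers the "recalculate when people are travelling" point, since daylight-saving shifts are handled automatically.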
From Jaded To Jubilant: Invigorating Your Test Team
Anne-Marie Charrett delivered this presentation.
- wanted a team that had long term motivation, so could not motivate with carrots
- Outliers by Malcolm Gladwell – mostly people are successful because they are in the right place at the right time
- before you can motivate a team you need to ask yourself how motivated you are – what gets you up in the morning about testing
- know your testers – give your testers a testing challenge to understand how they test, also understand what they want to get out of testing
- important that your test team knows that you believe in them and that they are being listened to, important that they get excited about testing again
- testers are paid to think – test scenarios often go against that
- for every test, think about how it is adding value to the company
- testers need to take responsibility – make and defend decisions
- you sometimes need to let go of your own goals – the team need to feel empowered
- exploratory testing – the tester needs to decide when it is good enough; this is the way testing is, and it is hard to estimate – session based test management (SBTM) and Rapid Reporter (enter your charter/objective – it time stamps and records test sessions)
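The SBTM note above is essentially a charter plus timestamped observations. Here is a minimal sketch of such a session record; the class, field names, and note tags are my own assumptions for illustration, not Rapid Reporter's actual format:

```python
# Minimal sketch of a session-based test management record:
# a charter, a start time, and timestamped notes tagged by kind.
from datetime import datetime, timezone

class TestSession:
    def __init__(self, charter):
        self.charter = charter
        self.started = datetime.now(timezone.utc)
        self.notes = []          # list of (timestamp, kind, text)

    def note(self, kind, text):
        """kind is a free-form tag, e.g. 'test', 'bug', 'question'."""
        self.notes.append((datetime.now(timezone.utc), kind, text))

    def report(self):
        lines = [f"CHARTER: {self.charter}",
                 f"STARTED: {self.started:%Y-%m-%d %H:%M}Z"]
        lines += [f"{ts:%H:%M:%S} [{kind}] {text}"
                  for ts, kind, text in self.notes]
        return "\n".join(lines)

s = TestSession("Explore login error handling for invalid passwords")
s.note("test", "Empty password rejected with a clear message")
s.note("bug", "Long password silently truncated at 16 characters")
print(s.report())
```

The point of keeping the record this lightweight is the one made in the session: the tester decides when it is good enough, and the log makes that decision defensible afterwards.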
I really enjoyed this session, although it reminded me how many organisations still have large separate test teams.
Test Planning for Mobile Application Projects
- implications – power, display size, portability, connectivity, radios, large number of devices
- less power than a PC – multitasking can freeze memory, interactions with O/S can have a big impact, kinetic input (tapping, touching, pinching) can have strange behaviours, needed to test using physical movement to replicate locking
- connectivity – strange things happen when moving between WiFi, 3G and 4G, driving also causes issues
- distribution – you do not have control of distribution in app stores, read the guidelines and understand the timelines early
- mobile project issues – time pressures due to market competition, smaller applications, constant change in environments, handsets and software, very programmer-centric environments where planning, testing, etc are viewed as a boat anchor, lots of competition, high risk if your application does not work as expected
- testers need to prove their worth as rigid approaches will leave you behind
- key is to focus on test execution rather than planning, because everything is going to change anyway
- need a strategy on how you are going to test, what devices you are going to buy, how are you going to manage the devices/cables because they go missing easily (had to chain cables to a hubcap!)
- find out the platforms and devices you are targeting early so you can procure equipment
- emulators are useful for basic testing, but it is better to use a real device of the target platform – the developers will have used the emulator anyway
- supporting iOS 3 to iOS 4.1 resulted in 104 combinations across multiple devices, etc – classification trees are good for explaining permutations and combinations
- automation is still in its infancy – not as mature as for web applications at this point
- devices are being exploited to do combined activities so need to exploit this in testing
- we use these devices in environments where we do not use a PC – they are addictive and are part of our lives
- testing will involve leaving the office and moving around to mimic what the users are doing – determine high value because everybody will want to do this testing!
- tricky to get devices that you are targeting – standing in line for the iPhone!
- may need to target different carriers and plans as technologies can be different
- think about logistics of storage, charging, etc…
- ergonomics are an issue when testing mobile devices – shorter work days, it can be painful on fingers, and people are 25% less productive on these devices than on PCs
- health is an issue because devices are shared and illness spreads fast – hand sanitizers, wiping devices after use, washing hands frequently
- need to factor in training as there are lots of ways to use devices
- taking screen shots is a lot more painful than web applications
- usability testing – no standards unfortunately, look for user emotions, perceived lack of performance, one of the most important things on these devices
- performance testing – no real tools; you can jailbreak iOS, some emulators have rudimentary tools (which can affect the performance of the device), use stopwatches, spoof the headers, or emulate on a machine using small memory footprints and look for speed
- security is often a trade-off with performance
- can automate using emulator in a browser, tools are rudimentary, vendors are clamouring in the space, Opera has a mobile mode
- influenced by James Bach’s Satisfice Test Plan Evaluation Model and Test Planning Guide
- planning needs to be a parallel activity; do just enough in regulated environments – video can be good to replace test cases; meet the stakeholders’ intent and needs, but rather than giving them exactly what they ask for, give them something better
- research your customers for your scenario tests – how they will use the app, are they locals or visitors, is it easy to understand outside context (eg. train schedules)
- trick – search “[your product] sucks” to find and exploit common problems
- allow time to keep up-to-date with platform changes
- remember to test technology like GPS, graphics, camera, video, sound, messaging, data
- Smashing Magazine – good resource for usability
- modeling state allows you to get understanding quickly
- risk vs reward testing – focus on what is value – if you need to demo to get funding, test that the demo will work and not crash
- quality attributes – HP’s FURPS
- may need to set time aside for guidance documentation
- put structure and timebox around exploratory testing so that everybody knows what mission and goal is – look at application from different perspectives
- express completeness as how have we done and how much we have to go from different perspectives
- Session Tester – video is good for bringing new testers in, easier to digest than written-down test cases
- estimating – use uncertainty models (Software Estimation by Steve McConnell) and the Galton Estimation tool – like to use P90 to give a range, with an S-curve mapping confidence to dates
- regulators are worried about repeatability – they like formal session based testing
- adapted James Bach’s testing dashboard
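The classification-tree note above (104 combinations from iOS 3 to 4.1 across devices) comes down to enumerating the cross product of independent test dimensions. A minimal sketch, where the dimensions and values are illustrative stand-ins, not the actual matrix from the talk:

```python
# Enumerate test configurations from independent dimensions -- the raw
# combinations a classification tree helps you visualise and prune.
# Dimension names and values below are illustrative assumptions.
from itertools import product

dimensions = {
    "device":  ["iPhone 3GS", "iPhone 4", "iPad 1"],
    "os":      ["iOS 3.1", "iOS 4.0", "iOS 4.1"],
    "network": ["WiFi", "3G", "offline"],
}

# One dict per full combination, e.g. {'device': ..., 'os': ..., ...}
configs = [dict(zip(dimensions, combo))
           for combo in product(*dimensions.values())]

print(len(configs))   # 3 * 3 * 3 = 27 full combinations
print(configs[0])
```

Even three small dimensions give 27 configurations, which is why the talk stressed risk-based pruning rather than testing everything everywhere.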
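On the estimation bullet (give a P90 range rather than a single number): one common way to get there is a three-point estimate with the PERT beta approximation. This is my own sketch of the general technique, not necessarily the exact model McConnell or the Galton tool uses:

```python
# Rough sketch: turn optimistic / likely / pessimistic estimates into
# an expected value and an approximate P90, using the PERT formulas
# and a normal approximation for the upper percentile.
def pert_range(optimistic, likely, pessimistic):
    mean = (optimistic + 4 * likely + pessimistic) / 6
    stdev = (pessimistic - optimistic) / 6
    p90 = mean + 1.282 * stdev   # z ~ 1.282 at the 90th percentile
    return mean, p90

mean, p90 = pert_range(5, 8, 20)   # days; illustrative numbers
print(f"expected ~{mean:.1f} days, quote P90 ~{p90:.1f} days")
```

Quoting the pair ("expected 9.5 days, 90% confident within 12.7") is the "give a range" habit the session recommended, instead of committing to a single date.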