
1 Innovative Test Practices Assurance with Intelligence Paul Gerrard Gerrard Consulting 1 Old Forge Close Maidenhead Berkshire SL6 2RD UK e: w: t: 01628 639173

2 Paul Gerrard
Paul is the founder and Principal of Gerrard Consulting, a services company focused on increasing the success rate of IT-based projects for its clients. He has conducted assignments in all aspects of software testing and quality assurance. Previously, he worked as a developer, designer, project manager and consultant on small and large developments using all major technologies, and is the webmaster of several websites. Paul has degrees from the Universities of Oxford and London, is Web Secretary for the BCS Specialist Interest Group in Software Testing (SIGIST), Founding Chair of the ISEB Tester Qualification Board, and the host/organiser of the UK Test Management Forum conferences. He is a regular speaker at seminars and conferences in the UK, continental Europe and the USA, and was recently awarded the "Best Presentation of the Year" prize by the BCS SIGIST. Paul has written many papers and articles, most of which are on the Gerrard Consulting website. With Neil Thompson, Paul wrote "Risk-Based E-Business Testing", the standard text for risk-based testing.
Assurance with Intelligence – Slide 2

3 What's Innovative?

4 What's innovative?
New – just invented – new to everyone
New to me – never heard of it
Boss's big idea (airline magazine syndrome)
My big idea – I want to shape the future
Innovative – I can see why no one did it before, but now I can see why it might work!

5 Innovative? But... it could be old hat:
– It's something re-badged/renamed to look new
– We tried it and do it all the time now!
– We tried it and it didn't work for us
One person's 'innovative' is another person's 'old hat'
Should we take a sceptical view? YES! But we must always be open to new ideas.

6 Can innovations in testing be thought of in the same way?

7 Not all innovations make it across the "chasm"

8 The Hype Cycle

9 Emerging practices
Let's take a look at emerging practices that:
– Look promising
– Are being hyped and might (or might not) work
– Are destined to become mainstream because they make it over the 'chasm'
Are they:
– Too good to be true?
– Just hype?
– A REAL opportunity to get better/faster/cheaper?

10 What's on your list? Here's mine:
Virtualisation and the cloud
Behaviour-Driven Development
Crowdsourced Testing

11 Virtualisation
Huge pressure on IT to rationalise environments:
– To reduce energy footprint
– To migrate from legacy hardware
– To simplify environment management and support
Tools to manage the transition from legacy to virtual are here.

12 Opportunities
Creating environments will become routine and automated
Base test environments can be created as virtual appliances
New test environments copied and the SUT installed

13 Consequences
A new emphasis on build and installation processes
Test the deployment processes early – so environments become 'easy' to deploy
Test environments as reusable appliances?
The cloud becomes a reality
The cloud is a solution looking for a problem.

14 The cloud (Wikipedia)
"Cloud computing is Internet- ('cloud-') based development and use of computer technology ('computing'). In concept, it is a paradigm shift whereby details are abstracted from the users, who no longer need knowledge of, expertise in, or control over the technology infrastructure 'in the cloud' that supports them. Cloud computing describes a new supplement, consumption and delivery model for IT services based on the Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet."
"The term cloud is used as a metaphor for the Internet, based on the cloud drawing used to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents. Typical cloud computing providers deliver common business applications online which are accessed from a web browser, while the software and data are stored on servers."

15 What the cloud gives us
On-demand access to:
– Someone else's spare capacity (infrastructure)
– Your own spare capacity (you mean we have some?)
And the good news?
– Use it, abuse it – you don't need to worry about supporting it
– It's a relatively cheap commodity if you are an occasional user.

16 And how can testers benefit?
Client or server environments:
– Build, baseline, copy as many times as you like
– Use once and throw away
– Don't attach a screenshot to an incident – attach your whole environment
Environments can be disposable!
Automation process:
– Copy a prepared environment
– Run automated tests
– Store results
– Throw the environment away or archive it.
What else can you think of?
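The automation process above can be sketched in a few lines. This is a minimal illustration only: directories stand in for virtual machine images, and the function name and placeholder "test run" are invented for the example. In practice the copy/destroy steps would call your virtualisation tooling; the disposable-environment control flow is the point.

```python
# Sketch of the disposable-environment cycle: copy a prepared
# environment, run tests, store results, throw the environment away.
# Directories model VM images here; all names are illustrative.
import shutil
import tempfile
from pathlib import Path

def run_disposable_test_cycle(baseline: Path, results_store: Path) -> Path:
    """Copy a baseline environment, 'run' tests, archive results, discard."""
    workdir = Path(tempfile.mkdtemp(prefix="sut-env-"))  # fresh, throwaway area
    env = workdir / "env"
    shutil.copytree(baseline, env)                # copy the prepared environment
    # ... install the SUT and run the automated tests here ...
    results = env / "results.log"
    results.write_text("all tests passed\n")      # placeholder for real output
    results_store.mkdir(parents=True, exist_ok=True)
    archived = results_store / "results.log"
    shutil.copy(results, archived)                # store the results
    shutil.rmtree(workdir)                        # throw the environment away
    return archived
```

Because the environment is rebuilt from the baseline every cycle, each test run starts from a known-clean state, which is exactly what makes "use once and throw away" attractive.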

17 Behaviour-Driven Development (and testing)

18 What's BDD?
Dan North coined the term and defines it thus:
"BDD is a second-generation, outside-in, pull-based, multiple-stakeholder, multiple-scale, high-automation, agile methodology. It describes a cycle of interactions with well-defined outputs, resulting in the delivery of working, tested software that matters."

19 Domain-specific languages
A domain-specific language (DSL) is a programming language or specification language dedicated to a particular problem domain, e.g. banking, retail, defence...
COBOL was an early attempt... uh-oh...
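To make the idea concrete, here is a toy DSL for the banking domain, invented for this slide: statements written in domain vocabulary, interpreted by a small parser. It is a sketch of the concept, not any real product.

```python
# A toy banking DSL: domain-vocabulary statements such as
# "TRANSFER 50 FROM savings TO current", interpreted line by line.
def run_banking_dsl(script: str, accounts: dict) -> dict:
    """Interpret a small transfer script against a dict of account balances."""
    for line in script.strip().splitlines():
        op, amount, _, src, _, dst = line.split()  # e.g. TRANSFER 50 FROM a TO b
        if op == "TRANSFER":
            accounts[src] -= int(amount)
            accounts[dst] += int(amount)
        else:
            raise ValueError(f"unknown statement: {op}")
    return accounts
```

The point of a DSL is that the script reads like the domain, so a banker can review "TRANSFER 50 FROM savings TO current" without reading the interpreter behind it.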

20 The practices of BDD 1
Establishing the goals of the different stakeholders required for a vision to be implemented
Drawing out features which will achieve those goals using feature injection
Involving stakeholders in the implementation process through outside-in software development
Using examples to describe the behaviour of the application, or of units of code

21 The practices of BDD 2
Automating those examples to provide quick feedback and regression testing
Using 'should' when describing the behaviour of software, to help clarify responsibility and allow the software's functionality to be questioned
Using 'ensure' when describing responsibilities of software, to differentiate outcomes in the scope of the code in question from side-effects of other elements of code
Using mocks to stand in for collaborating modules of code which have not yet been written
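Two of these practices can be shown in a few lines: a 'should'-style name that states the expected behaviour, and a mock standing in for a collaborator that has not yet been written. The `checkout` function and `gateway` collaborator are invented for the example, using Python's standard `unittest.mock`.

```python
# Illustrative sketch of two BDD practices: behaviour-describing
# 'should' names, and a mock replacing an unwritten collaborator.
from unittest.mock import Mock

def checkout(basket_total, gateway):
    """Charge the basket total via a payment-gateway collaborator."""
    gateway.charge(basket_total)
    return "paid"

def test_checkout_should_charge_the_gateway_for_the_basket_total():
    gateway = Mock()  # the real payment gateway does not exist yet
    assert checkout(42, gateway) == "paid"
    gateway.charge.assert_called_once_with(42)  # the behaviour we specified
```

The test name reads as a sentence about behaviour, so if the test fails, the question it raises is "should checkout really do this?" rather than "which assertion broke?".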

22 A new focus on requirements elicitation
Where do testers and test managers fit in Agile projects?
With BDD the door is open for us to take on some (all?) of the analyst role:
– Focus on stakeholder goals (great!)
– We learn of features as they are thought of
– Act as a conduit/facilitator between stakeholders and developers
A new title: Analyst/Tester?

23 Be involved early, involved often
Haven't we been asking for this for years? (forever?)
A new focus on "test early, test often" at a system level:
– Example-driven development (cf. test-case-driven)
– Automated testing from day 1 (cf. TDD)
– Know which code exists and which doesn't.

24 Things to look out for
BDD and Cucumber (an open-source tool to support BDD):
– Still too developer-focused for my liking
– But it's early enough to influence its development
– Maybe a variant product for the Analyst/Tester role?
Keep a close watch on the Agile movement
I sense there's a renewed momentum, realism, pragmatism and enthusiasm
Tester as mentor/coach is "transitional"

25 Crowdsourced Testing

26 Crowdsourcing Testing – Can it Work?
Quite a lot of hype surrounds distributed and remote resourcing:
– Clouds and...
– Crowds
Remote, removed, responsible, cheap, disposable, unaccountable, irresponsible...
What's the real agenda with crowds?

27 What is a crowd?
Any group of people who find themselves in a shared situation
Where does the "wisdom" come from?
– Markets are crowds – able to determine the value of a stock, product or restaurant, for example
– Crowds don't need super-intelligent people
– Crowds usually out-perform 'experts', but:
"Anyone taken as an individual is tolerably sensible and reasonable – as a member of a crowd, he at once becomes a blockhead" – Bernard Baruch
What is going on here?

28 Problems that crowds can solve
Cognition problems – problems with definite solutions:
– "Who will win the Premier League?"
– "How many widgets will we sell this year?"
Coordination problems:
– "Which restaurant should we eat at tonight?"
– "How do buyers and sellers make contact?"
Co-operation problems:
– Self-interested, distrustful people working together, even though self-interest seems to be a driver
– Paying taxes, solving the energy crisis, pay negotiations.

29 Conditions required for a crowd to be 'wise'
Diversity:
– Differences trigger debate, ideas, and independent thinking and behaviour
– The best ideas come from disagreement and contest
Independence:
– Not isolation, but freedom from the influence of others
– Army ants and the "circular mill"
– Compare: "information cascades" – plank roads, client/server, 4GLs, DotCom, OO, Agile, cloud and crowd
Decentralisation:
– Centralisation makes crowds homogeneous, less diverse, more likely to imitate
– Crowds need to be left to their own devices to operate wisely.

30 Crowds versus experts
Crowds are more intelligent than experts
Markets usually outperform the experts' predictions:
– Few people put their money where their mouth is
– They'd be Warren Buffett otherwise
Chess players:
– Given a position from a game, Grand Masters can analyse/reproduce the play perfectly
– Give them a random distribution of pieces on a board and they are lost (no better than laymen)
– Expertise is often very limited in scope – "spectacularly narrow"

31 Crowdsourcing
The white paper version: Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.
The soundbite version: The application of Open Source principles to fields outside of software.

32 Crowdsourcing is a buzzword – can testing benefit from it?
Not many companies are yet using a 'crowd' to test
The foremost promoters of this approach are in the US, but it hasn't penetrated the UK much
Five main options for crowdsourced testing?

33 1. Crowded Scripting and Execution
If the deliverable from testers is information, how exactly does one reward one's crowd of testers?
Do we pay for documented tests and then a log of the tests being performed?
– Probably not
– We would need to manage the scope of documented tests – otherwise everyone would go for the low-hanging fruit and we'd end up with a load of useless, duplicate tests
– We'd still get some useful test activity, but the chaotic nature of these tests (with duplicated tests and untested areas of the application in the mix) would make it impossible to manage, and the rewards to testers would be hard to distribute fairly.

34 2. Scripted Tests, Crowded Execution
Maybe we could distribute scripted tests to the crowd and say "get on with it"
In this case, we may still need to manage execution and assure ourselves that the tests, as documented, have actually been executed
We could trust the crowd to execute tests remotely and log the tests and incidents as appropriate
But we still need to manage their activity, to avoid people wasting their time running tests that we know will fail or logging defects we already know about
We could retain a coordinating role – that's not too hard to do either
But if we're just hiring people to execute tests, and we want people we can trust, it's probably more effective (and possibly cheaper) to hire some junior testers to run our tests reliably under our supervision
Crowded execution of scripted tests doesn't leap out to me as a great opportunity.

35 3. Crowded Exploratory Testing
We could publish an initial set of exploratory test charters and manage those across the 'team'
Coordination and management of the feedback from testers could be done, but clearly we lose the benefits of close collaboration and communication
How do we reward exploratory test activity? To be both prudent and fair to the testers involved, we need to monitor and manage their useful activity. This is relatively easy to do with in-house resources, but with remote exploratory testers it's hard – unless we trust them
One or two exploratory testing support tools are emerging, and maybe they could be used to collect feedback efficiently
But forcing the testers to use a logging tool might undermine their effectiveness:
– Tester A might thrash away and run hundreds of tests – achieving nothing
– Tester B might spend time thinking through an anomaly and decide, after due consideration, that the software works fine – or perhaps log a defect instead
The notes of these two testers could be dramatically different in scale and value. How could we reward that activity fairly without micro-managing them?

36 4. Crowded Informal Load/Performance Testing
If we hire the crowd to perform basic transactions in volume, the workload generated could be useful as a test in itself, or as background load for other tests we might run internally
The crowd's contribution could easily be monitored, and testers could be paid by the hour or per valid transaction executed, as we could monitor this on our test environment
This could be an economic and practical proposition. But...
– If the transactions we ask the crowd to execute are simple enough that we don't need to monitor them too closely, would automation, using a cheap or free tool, do the job just as well?
– Tools don't need global coordination across timezones; they are reliable, don't complain or get tired, and can be worked 24 hours a day
The economics may not work in the crowd's favour after all.

37 5. Crowded Configuration Testing
The crowd provides us with a ready-made, limitless collection of widely varying technology configurations
Potentially, the hardware out there provides the most obvious opportunity
Whether scripted or exploratory, incidents raised by the crowd have some validity
An agent installed on the testers' machines could provide reliable inventories of the hardware and software in use by the testers, and this could be posted back for scrutiny
This has real possibilities.
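The 'agent' idea can be sketched with the standard library alone: collect a machine snapshot the crowd tester could post back alongside an incident report. The function name and the particular fields chosen are assumptions for illustration; a real agent would gather far more detail.

```python
# Sketch of a configuration-inventory agent: a JSON snapshot of the
# tester's machine that could accompany an incident report.
# Uses only the Python standard library; field choice is illustrative.
import json
import platform
import sys

def configuration_inventory() -> str:
    """Return a JSON description of this machine's basic configuration."""
    return json.dumps({
        "os": platform.system(),            # e.g. "Linux", "Windows", "Darwin"
        "os_version": platform.version(),   # kernel / build details
        "machine": platform.machine(),      # e.g. "x86_64", "arm64"
        "python": sys.version.split()[0],   # runtime version on the machine
    }, indent=2)
```

Posting a structured snapshot like this with every incident is what would turn a chaotic crowd of machines into a searchable matrix of tested configurations.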

38 Pay-per-Bug
We could reward testers for raising valid and distinct defect reports
– Testers will tend to focus on trivial defects to earn as much as possible, and the same defects may be logged repeatedly by different testers
Perhaps the reward for defects could be weighted by severity, with testers paid more for critical defects?
– But the temptation would be to underplay defect severity so as to pay testers less. Would testers ever believe they were being treated fairly?
The other 'problem' is that of software quality:
– If the crowd are testing high-quality software, there won't be many defects to detect and report
– A full day's testing might go unrewarded – it's just too hard to make a buck. However, if the quality is poor, then the number of defect reports being logged and paid for might bankrupt the client!
Is there a middle ground where the fairness and economics even out? It's hard to see how "reward by defect" will work.

39 Other Practical Considerations
Remote testing of large, commercially sensitive systems across insecure connections is out of the question
Only 'simple' systems, or systems installable on home computers or handsets, are viable
Multi-user environments require complex test databases to be established and controlled for testing – a problem
Remote, autonomous testers will need very close coordination and supervision, or their own dedicated test environments, or both
This severely limits the types of application for which crowdsourced test execution is viable.

40 Crowdsourced testing – summary
The difficulties are significant
Crowdsourced configuration testing – maybe
Outsourcing all testing to the crowd – no way
It can work for tiny, trivial, toy systems
– Perhaps the best use of the crowd is in a "mopping-up" strategy?
Alternatively, perhaps we could use the crowd to give us feedback on an unfinished or evolving system
But how different is this from a traditional beta test?
I can't see this being significant until certain questions are answered

41 Questions to be answered
What test objectives (and associated techniques) are well served by a crowd?
What test activities (plan, spec, script, run, explore, report, decide, etc.) best suit this model?
Where do automation and tools fit? (Or must a crowd always be manual?)
What commercial models work for companies offering crowdsourced testers?
Can a software company or user company recruit its own crowd?
How can we identify and recruit the best testers in a crowd?
How can we reward good testers in a crowd?
How do we control crowded testing to optimise the outcome?
How do we recruit a trusted crowd and maintain confidentiality?
How do we measure the effectiveness of crowds? (And compare it with other approaches?)
How do we assure the work done by a crowd?
How do we report the output of crowds?

42 Innovative Test Practices Assurance with Intelligence Paul Gerrard Gerrard Consulting 1 Old Forge Close Maidenhead Berkshire SL6 2RD UK e: w: t: 01628 639173
