
1 Software Testing & Test-Driven Development JAMS Workshop Makerere University September 2010

2 Agenda
– Intro to Software Testing
– Test-Driven Development
– Writing a Test Plan
– Test Frameworks
  o JUnit
  o Visual Studio

3 Software Testing
Software testing is the process of verifying that a software program works as expected.
Without testing, there's no proof that the software does what it is intended to do
– Testing == Quality
Testing should be incorporated into the development process from the beginning
– The need to test your software will impact how you develop it
– The earlier you find a problem, the easier it is to fix

Cost of fixing a defect, depending on the stage it was found (row = stage introduced, column = stage detected):

                    Requirements   Design   Implementation   QA     Post-Release
  Requirements      1x             3x       5-10x            10x    10-100x
  Design            -              1x       10x              15x    25-100x
  Implementation    -              -        1x               10x    10-25x

4 Who is Involved with Testing?
Business Stakeholder
– Participates in internal alpha testing before the software is released
Program Manager
– Writes the functional spec that enables the tester to write a test plan
– Reviews the test plan
Developer
– Writes the implementation spec that enables the tester to complete a test plan
– Reviews the test plan
– Implements unit tests
Test Engineer
– Writes the test plan
– Writes and executes test cases, files bugs
– Signs off on the final product
Customer
– The person who is ultimately affected if the code was not well tested
– May participate in beta testing or perform user-acceptance testing

5 Testing Methods
The box analogy – describes the point of view the engineer takes when testing
White Box
– Tester has access to internal data structures and algorithms
– API testing
– Code coverage
– Fault injection
Black Box
– Tester treats the code as a black box – no knowledge of implementation
– Boundary-value analysis
– Alpha/beta testing, dogfooding
Grey Box
– Tester has knowledge of internal data structures and algorithms, but tests at the user/black-box level
– Integration testing (between two modules of code at the interface level)
– Reverse engineering
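As an illustration of black-box boundary-value analysis, the sketch below tests a hypothetical Grade.isPassing(int) rule purely through its public API, probing values on either side of an assumed pass mark of 50. The class, method, and pass mark are all assumptions made for the example, not part of the workshop material; a trivial stand-in implementation is included only so the sketch runs.

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

// Hypothetical black-box test: only the public API is exercised,
// with inputs chosen around the boundary of the valid/invalid regions.
public class GradeBoundaryTest {

    @Test
    public void scoresAroundThePassMarkAreClassifiedCorrectly() {
        assertFalse(Grade.isPassing(49));  // just below the boundary
        assertTrue(Grade.isPassing(50));   // exactly on the boundary
        assertTrue(Grade.isPassing(51));   // just above the boundary
    }

    @Test
    public void extremeValuesAreHandled() {
        assertFalse(Grade.isPassing(0));
        assertTrue(Grade.isPassing(100));
    }
}

// Minimal stand-in for the class under test, included only so the sketch compiles.
class Grade {
    static boolean isPassing(int score) {
        return score >= 50;   // assumed pass mark for illustration
    }
}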

6 Many Levels of Testing
Unit Testing
– Tests that verify a specific section of code, e.g. classes or methods
– Usually white-box tests written by developers
Integration Testing
– Verifying interfaces between components
– Iteratively add elements until the whole system is verified
System Testing
– Testing a completely integrated system to verify that it meets requirements
System Integration Testing
– Testing that a system integrates correctly with an external or 3rd-party system
– Each system has already passed system testing
Regression Testing
– Tests that serve to prevent old bugs from coming back
– Usually written as part of the bug-fix process
Acceptance Testing
– Also known as User Acceptance Testing
– Testing performed by the customer, in their own lab environment
Alpha Testing
– Testing by bleeding-edge internal users or external customers
– Could also refer to internal dogfood efforts
Beta Testing
– A broader test deployment, usually happens after alpha testing
– Usually involves an external audience
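For instance, a regression test is often just a unit test named after the bug it guards against. The sketch below is hypothetical (the bug, the WordSplitter class, and its behaviour are invented for illustration): it locks in a fix for an imagined crash on empty input so the defect cannot silently return.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical regression test: suppose a bug report said that splitting an
// empty string threw an exception. This test pins the fixed behaviour in place.
public class EmptyInputRegressionTest {

    @Test
    public void emptyInputYieldsNoTokensInsteadOfThrowing() {
        assertEquals(0, WordSplitter.split("").size());
    }
}

// Minimal stand-in for the class under test, included only so the sketch compiles.
class WordSplitter {
    static java.util.List<String> split(String text) {
        if (text == null || text.isEmpty()) {
            return java.util.Collections.emptyList();   // the behaviour the fix introduced
        }
        return java.util.Arrays.asList(text.split("\\s+"));
    }
}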

7 Test-Driven Development
A development technique that relies on the concept of writing the test cases before the product code
– Validates that the spec and requirements are well understood
– Test cases will initially fail
– Developer writes code to make them pass
The test is the proof that the code works
– The developer must clearly understand user requirements in order to write tests
– There should be no functionality in product code that isn't tested
Encourages simple designs and inspires confidence
– Clean code that works
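A minimal sketch of "test first": the JUnit test below is written before any product code exists, for a hypothetical Account.deposit feature (all names are invented for illustration). It cannot even compile yet, which is exactly the failing starting point TDD expects; a matching minimal implementation appears after the next slide.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Written first, before any product code exists: this test is the
// executable specification for the deposit feature.
public class AccountTest {

    @Test
    public void depositIncreasesTheBalance() {
        Account account = new Account();   // Account does not exist yet, so this will not compile
        account.deposit(100);
        assertEquals(100, account.getBalance());
    }
}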

8 TDD Workflow
1. (Re)write a test
2. Run all tests & see if the new one fails
3. Write some code
4. Run the automated tests & see them pass
5. Refactor code
Repeat

Step 1: (Re)write a test
– Always start by writing a test
– The test must fail because the feature isn't implemented (if it succeeds, the need for the feature is obviated)
– To write a test, you must fully understand the feature's specification and requirements
– Write minimal product code to make the test compile and run
Step 2: Run all tests & see if the new one fails
– Validates that the test harness works
– Tests the test itself: make sure it doesn't pass without new code
– The test should fail for the expected reason – ensure that it is testing the right thing
Step 3: Write some code to make the test pass
– May be inelegant or suboptimal; we will improve it later
– Make the test pass only, no extra untested functionality
Step 4: Run the automated tests & see them pass
– If all test cases pass, all tested requirements are met
– If they fail, keep iterating…
– Assuming the tests are comprehensive, move on to the final stage
Step 5: Refactor code
– Clean up the code as necessary to achieve production quality
– Focus on removing duplication, including duplication between test & product code
– Re-run the test cases to ensure that refactoring isn't breaking any functionality
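Continuing the hypothetical Account example from the previous slide, the smallest product code that makes the earlier test compile and pass might look like the sketch below; the refactor step would then clean it up without changing the test.

// Minimal product code, written in step 3 only to make AccountTest pass.
// Good enough to go green; refactoring improves it later without touching the test.
public class Account {

    private int balance = 0;

    public void deposit(int amount) {
        this.balance += amount;
    }

    public int getBalance() {
        return this.balance;
    }
}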

9 Writing a Test Plan
Like a functional specification for your test code
Test Plan Template
– Test Plan Objectives
– Scope
  o Features to be tested
  o Features NOT to be tested
– Test Strategy
– Test Cases
– Open Issues

10 Test Plan Objectives & Scope
Similar to a functional spec
– What are we trying to accomplish with our test cases?
– What is in and out of scope for testing?

Test Plan Objectives
– [P1] Test functional correctness of the game's underlying methods using white-box unit testing
– [P1] Test that only valid moves are allowed
– [P1] Validate that the game can correctly switch between players and declare the end of the game
– [P2] Test advanced features, like the statistics table, play timer, and computer player, if implemented
– [P3] Performance testing

Scope
Features to be tested
– Basic game play
  o The game alternates between players
  o Players can place moves
  o Only legal moves are allowed
– Ending the game
  o The game can declare a winner
  o The game can declare a tie
– Player statistics table (if implemented)
– Timer (if implemented)
– Machine player (if implemented)
Features NOT to be tested
– User interface rendering
– Networked play

11 Test Strategy
Describe the testing methodology you plan to use
– E.g. white box, black box, grey box, or a combination
Explain which testing frameworks you'll use, if any
– E.g. JUnit, Visual Studio
Will your tests require any sophisticated infrastructure, setup, or tools?
– E.g. mock objects, load simulation, test bridges

Test Strategy
We plan to employ mostly white-box unit tests in order to test the application, using Visual Studio's built-in unit-testing framework. Each of the following major classes will have at least one unit test:
– Game
– GameBoard
– Player
– GameStatistics

12 Test Case Detail
Feature 2: Validate Basic Moves / Allow Legal Moves Only

Test Case 2.1 – Test basic moves
  Steps: 1. Place an X at TopLeft  2. Assert that the board records an X at TopLeft  3. Place an O at Center  4. Assert that the board records an O at Center
  Expected Result: Markers are present at expected positions (assertions pass)

Test Case 2.2 – Disallow player from placing his marker on a square occupied by his opponent
  Steps: 1. Place an X at TopLeft  2. Place an O at TopLeft
  Expected Result: Move 2 fails

Test Case 2.3 – Disallow player from playing on his own occupied square
  Steps: 1. Place an X at TopRight  2. Place an O at Center  3. Place an X at TopRight
  Expected Result: Move 3 fails
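As an illustration, test case 2.2 could be implemented as the unit test sketched below. The GameBoard API shown here (placeMove returning false for an illegal move, Player and Position enums) is an assumption made for the example, not the workshop's actual game code; a minimal stand-in board is included so the sketch runs.

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

// Test case 2.2: a player may not place a marker on a square
// already occupied by the opponent.
public class GameBoardTest {

    @Test
    public void cannotPlaceMarkerOnSquareOccupiedByOpponent() {
        GameBoard board = new GameBoard();
        assertTrue(board.placeMove(Player.X, Position.TOP_LEFT));   // step 1: legal move succeeds
        assertFalse(board.placeMove(Player.O, Position.TOP_LEFT));  // step 2: move is rejected
    }
}

enum Player { X, O }
enum Position { TOP_LEFT, TOP_RIGHT, CENTER }

// Minimal stand-in board, included only so the sketch compiles and runs.
class GameBoard {
    private final java.util.Map<Position, Player> squares =
            new java.util.EnumMap<>(Position.class);

    boolean placeMove(Player player, Position position) {
        if (squares.containsKey(position)) {
            return false;               // square already occupied: illegal move
        }
        squares.put(position, player);
        return true;
    }
}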

13 Testing Frameworks
JUnit is a unit-testing framework for Java
– Developed by the same people who pioneered TDD
– Uses source code annotations to decorate special methods to be run by the test harness
– Integrated with Java IDEs like Eclipse and JCreator
Visual Studio has a test framework for any .NET language
– Based on the same ideas as JUnit
– Supports unit tests, database unit tests, generic tests, manual tests, load tests, web tests

14 Test Methods
JUnit
– Annotate test case methods with @Test
– Use methods from org.junit.Assert to check your test conditions or fail the test case
Visual Studio
– Annotate test case methods with <TestMethod()>
– Use methods from the Assert class to check your test conditions or fail the test case

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class AdditionTest {
    private int x = 1;
    private int y = 1;

    @Test
    public void testAdd() {
        int z = this.x + this.y;
        assertEquals(2, z);
    }
}

Imports Microsoft.VisualStudio.TestTools.UnitTesting

<TestClass()> Public Class AdditionTest
    Private x As Integer = 1
    Private y As Integer = 1

    <TestMethod()> Public Sub TestAdd()
        Dim z As Integer = Me.x + Me.y
        Assert.AreEqual(2, z)
    End Sub
End Class

15 Initialization & Cleanup Methods
Common code that runs before/after each test case … also known as fixture methods
– Add a field for each part of the fixture
– Annotate a method with @Before / <TestInitialize()> and initialize the variables in that method
– Annotate a method with @After / <TestCleanup()> to clean up before the next test case

@Before
public void setUp() {
    this.x = 1;
    this.y = 1;
}

@After
public void tearDown() {
    this.x = 0;
    this.y = 0;
}

<TestInitialize()> Public Sub SetUp()
    Me.x = 1
    Me.y = 1
End Sub

<TestCleanup()> Public Sub TearDown()
    Me.x = 0
    Me.y = 0
End Sub

16 Running Tests in the IDE
Visual Studio & Eclipse both enable running tests inside the IDE
– Select specific tests to run or run the whole suite
– The IDE reports which cases passed/failed
– Run selected tests again
Easily switch between test code and product code
– Change each as needed and rerun tests
– Easy to debug a test case; set breakpoints in test or product code

17 Further Reading
TDD
– Test-Driven Development: By Example (Google Books)
– testdriven.com
JUnit
– JUnit Cookbook
– JUnit Javadoc
– An early look at JUnit 4

18 APPENDIX

19 Suite Initialization Methods
Similar to setUp and tearDown, but they run before & after the entire test suite (@BeforeClass / @AfterClass)
Useful for expensive config operations that don't need to be run for each unit test
– E.g. setting up a DB or network connection, redirecting System.err when testing 3rd-party libraries
– Be careful that your unit tests don't make changes to static state that will impact other unit tests later in the suite

// This class tests a lot of error conditions, which Xalan annoyingly logs
// to System.err. This hides System.err before the run, restores it after.
private static PrintStream systemErr;

@BeforeClass
public static void redirectStderr() {
    systemErr = System.err;   // hold on to the original value
    System.setErr(new PrintStream(new ByteArrayOutputStream()));
}

@AfterClass
public static void tearDown() {
    System.setErr(systemErr); // restore the original value
}

20 Testing Exceptions
It is easy to test for expected exceptions
– Annotate your test with the expected exception, e.g. @Test(expected = ArithmeticException.class)
– If the exception isn't thrown (or a different one is), the test will fail
Limitation: if you need to test the exception's message or other properties, use a try/catch block instead (second example below)

@Test(expected = ArithmeticException.class)
public void divideByZero() {
    int n = 2 / 0;
}

@Test
public void divideByZero() {
    try {
        int n = 2 / 0;
        fail("Divided by zero");
    } catch (ArithmeticException success) {
        assertNotNull(success.getMessage());
    }
}

21 Timed Tests
Add a timeout parameter (in milliseconds) to the @Test annotation; the test fails if it does not finish in time
Simple performance bench-marking:

@Test(timeout = 500)   // timeout value illustrative
public void retrieveAllElementsInDocument() {
    doc.query("//*");
}

Network:

@Test(timeout = 2000)  // timeout value illustrative
public void remoteBaseRelativeResolutionWithDirectory()
        throws IOException, ParsingException {
    builder.build("http://www.ibiblio.org/xml");
}

22 Creating JUnit Tests with Eclipse
Using JUnit with Eclipse is easy
Create new JUnit tests using File -> New -> JUnit
– Specify JUnit 4
– Select the location, the class you want to test, and the method stubs to create
– Then select which methods you want to test; Eclipse will generate test stubs
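The generated stubs typically look something like the sketch below (the exact template depends on the Eclipse and JUnit versions selected); each chosen method gets a failing placeholder that you then replace with real assertions.

import static org.junit.Assert.fail;
import org.junit.Test;

// Typical shape of a wizard-generated JUnit 4 test stub (illustrative; class
// and method names here are assumed, matching a hypothetical Game class).
public class GameTest {

    @Test
    public void testPlaceMove() {
        fail("Not yet implemented");   // placeholder: replace with real test code
    }
}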

