
1 Sakai Community Performance Testing: Working Proof of Concept
Sakai Performance Working Group
Linda Place, Chris Kretler, Alan Berg
Universities of Michigan & Amsterdam
1 July 2008

2 Performance Testing Overview
Part of the QA process
–May include stress, capacity, scalability, and reliability testing as well as performance (response time, throughput, bottleneck identification)
Black-box testing
–Running the system with projected use-case scenarios to check for acceptable throughput and response times
–How will users experience the application?

3 Performance Testing Overview
White-box testing
–Pushing the system to identify application, database, operating-system, or network problems
–Tune the environment to identify and address specific problems
–Tests watched by developers, DBAs, system and network administrators, and performance engineers
Resource-intensive nature of the process
–Infrastructure, personnel, tools, time

4 Community Performance Testing
–The Approach
–Common Environment/Data Creation
–Load Test Tool
–Proof of Concept
–The Practice

5 Organization vs. Community Testing
Organization approach
–Advantages
Focus on the exact infrastructure used in production
Define use cases according to real use in production
Test only the tools specific to the organization
–Limitations
Very resource intensive (expensive)
Hard to maintain
Hard to be agile without careful advance planning

6 Organization vs. Community Testing
Community approach
–Advantages
Pool of test scripts available for immediate use
May not need a full testing infrastructure to be confident in production
Increased confidence in adding new tools
Total cost of testing shared
–Limitations
Must clearly communicate results to the community
Collaboration seems harder than just doing it yourself

7 WG Project Objectives
Create a QA environment that enables performance testing by the community in a shared manner
–Plan available on Confluence: http://confluence.sakaiproject.org/confluence/display/PERF/Home
Have a working proof of concept by the Paris Sakai Conference

8 Infrastructure Issues
What do we learn from testing done on infrastructure that doesn't match our own?
–Advantages
A software/application focus forces code improvements
More efficient code is more likely to scale to meet size and performance needs
–Disadvantages
Hardware and network bottlenecks NOT found
Capacity limits of the environment NOT discovered

9 Community Performance Testing
–The Approach
–Common Environment/Data Creation
–Load Test Tool
–Proof of Concept
–The Practice

10 Provisioning
Alan's provisioning tool
–Perl scripts use web services to populate the DB with courses
–Configuration via property files
–Adds tools, instructors, and students to courses
–Creates the file structure and uploads resources to sites
Other efforts are underway
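The provisioning idea above (read shared property files, then drive site creation) can be sketched in a few lines. This is an illustrative Python sketch only, not the group's actual Perl tool; the property names (`course_count`, `site_prefix`) are invented for the example.

```python
# Hypothetical sketch of the provisioning flow: parse a Java-style
# property file shared by the community, then derive the list of
# course sites the web-service calls would create.

def parse_properties(text):
    """Parse simple key=value lines, skipping comments and blanks."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "!")):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

def plan_courses(props):
    """Turn the shared configuration into concrete course-site names."""
    count = int(props["course_count"])
    prefix = props.get("site_prefix", "PERF")
    return ["%s-%04d" % (prefix, i) for i in range(1, count + 1)]

config = """
# shared test configuration (invented keys, for illustration)
course_count = 5
site_prefix = LOAD
"""
print(plan_courses(parse_properties(config)))
```

Because everyone runs from the same property file, two institutions that share it end up with identically named sites, which is what makes the data environment "common".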

11 Provisioning, Benefits
Easy, fast creation of a test environment
Common data environment
–Common environment by sharing properties files
–Common data files for tests

12 Provisioning, Some Modifications
ORIGINAL
1. provision.pl
2. make_media_list.pl
3. make_local_resources.pl
MODIFIED
1. provision.pl
2. make_media_list.pl
3. create_templates.pl
4. load_resources.pl

13 Data Environment Design
Based on the UM Winter 2008 semester
–Number of courses (2,000) and students (20,000)
–Tool mix (+ projection)
Courses broken into 5 × 3 categories:
–5 categories of roster size
–3 categories of tool distribution
Site name based upon category
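The 5 × 3 category scheme above maps naturally to a naming convention. A minimal sketch, assuming hypothetical category labels (the slide only says the site name is "based upon category"; the actual labels may differ):

```python
# Sketch of the 5 x 3 category scheme: each course site name encodes
# its roster-size band and tool-distribution category, so a test
# script can pick sites of a known shape by name alone.
from itertools import product

ROSTER_SIZES = ["XS", "S", "M", "L", "XL"]     # 5 roster-size bands (hypothetical labels)
TOOL_MIXES = ["light", "typical", "heavy"]     # 3 tool distributions (hypothetical labels)

def site_name(roster, mix, index):
    """Compose a category-encoding site name."""
    return "course-%s-%s-%03d" % (roster, mix, index)

categories = list(product(ROSTER_SIZES, TOOL_MIXES))
print(len(categories))                 # 15
print(site_name("M", "typical", 42))   # course-M-typical-042
```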

14 Community Performance Testing
–The Approach
–Common Environment/Data Creation
–Load Test Tool
–Proof of Concept
–The Practice

15 Evaluation & Tool Selection
Goal
–Enable a near-enterprise level of tool quality using open-source tools
–Hewlett-Packard LoadRunner used for comparison

16 Evaluated Test Software
Grinder, JMeter, CLIF, Web Application Load Simulator, SlamD (Sun), Funkload, Selenium, Pylot, WebLOAD Open Source, TestMaker, OpenSTA, Hammerhead, httperf, Seagull, Deluge


18 Grinder: Plus
Easy record mechanism
–Can record HTTPS on multiple platforms
Like the scripting language (Jython)
–Java extensible
–Allows for reusable libraries
–Flexible reporting, data handling
Run-time UI displays multiple test scripts
–We use 22 for our standard scenario!

19 Grinder: Plus
Distributed load generation
–Multi-platform
Active user/development community
–Open source
–Separate project for developing reports: "Grinder Analyzer"
–Jython Eclipse plug-in (GrinderStone)
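Distributed load generation in The Grinder is configured through `grinder.properties`, which each agent reads before connecting to the console. A minimal sketch (host name is a placeholder; the slide does not show the group's actual configuration):

```properties
# Minimal grinder.properties sketch for distributed load generation.
# Agents on multiple machines read this and report to one console.

# The Jython test script each worker runs (placeholder name)
grinder.script = sakai_scenario.py

# Worker processes per agent, and simulated users per process
grinder.processes = 2
grinder.threads = 25

# 0 = keep running until the console stops the test
grinder.runs = 0

# Where the Grinder console is listening (placeholder host)
grinder.consoleHost = console.example.edu
```

Starting the same agent on several machines against one console is what gives the multi-platform, distributed load the slide describes.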

20 Grinder: Minus
Default recorded script is complex
–Verbose results data
–A Perl script cleans the default recorded script
(J)ython not ubiquitous
–To-do list shows support for Java scripts
Reporting project doesn't consolidate data easily
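The slide mentions a Perl script that cleans The Grinder's verbose recorded output. A rough Python equivalent of that idea, for illustration only (the filtering rules here are invented, not the group's actual ones):

```python
# Illustrative cleanup of a verbose recorded script: drop recorder
# comment noise and collapse runs of blank lines. The real tool was
# a Perl script; this Python sketch only shows the general approach.
import re

def clean_recorded_script(source):
    """Drop comment-only lines and collapse consecutive blank lines."""
    out = []
    for line in source.splitlines():
        if re.match(r"\s*#", line):
            continue  # recorder-generated comment noise
        if not line.strip() and out and not out[-1].strip():
            continue  # collapse blank-line runs
        out.append(line)
    return "\n".join(out)

recorded = "# generated by TCPProxy\n\n\nrequest1 = ...\n# header dump\nrequest2 = ..."
print(clean_recorded_script(recorded))
```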

21 Community Performance Testing
–The Approach
–Common Environment/Data Creation
–Load Test Tool
–Proof of Concept
–The Practice

22 Proof of Concept
Read-only scripts used
–Download file
–Viewing a grade
–"Idle" user
Mix/frequency of users corresponds to a peak usage period
–Events table used to determine usage
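The step of turning events-table counts into a user mix can be sketched directly: event frequencies from a peak period become per-scenario weights for the virtual-user population. The event names and counts below are made up for illustration; only the method (frequency → weight → user allocation) comes from the slide.

```python
# Sketch: derive a virtual-user mix from event counts observed
# during a peak usage period (counts and names are invented).
event_counts = {
    "content.read": 6000,    # download file
    "gradebook.read": 3000,  # view a grade
    "presence": 1000,        # "idle" user
}

total = sum(event_counts.values())
mix = {name: count / total for name, count in event_counts.items()}

def users_per_scenario(total_users, mix):
    """Allocate a virtual-user population according to the mix."""
    return {name: round(total_users * share) for name, share in mix.items()}

print(mix)
print(users_per_scenario(200, mix))
```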

23 Event Graphs

24 LoadRunner Script

25 Grinder Script

26 LoadRunner Results

27 Grinder Results

28 Database Results

29 Grinder: Some Ideas
Consolidated & expanded reporting
Business-process testing
–Makes test script development more flexible
–Potentially expands the pool of testers

30 Community Performance Testing
–The Approach
–Common Environment/Data Creation
–Load Test Tool
–Proof of Concept
–The Practice

31 Establishing Baseline(s)
Identify "core" scripts for a Sakai baseline
Identify related script clusters to baseline tool clusters
Maximum time threshold for page loads
Minimum user load for tools and kernel
Database activity reports

32 Establishing Baseline(s)
Identify "core" scripts for a Sakai baseline
–All tools included in a release?
–A more limited subset?
Identify related script clusters to baseline tool clusters
–Tools with dependencies?
–Tools with similar actions?
–Usage scenarios associated with specific campuses?
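The "maximum time threshold for page loads" baseline can be expressed as a simple check that runs after each test: compare measured page times against per-page thresholds and flag the pages that regress. Page names and numbers below are illustrative only.

```python
# Sketch of a page-load baseline check: pages whose measured mean
# response time exceeds its threshold fail the baseline.
# (Hypothetical pages and thresholds, for illustration.)
THRESHOLDS_MS = {"portal": 800, "gradebook": 1500, "resources": 2000}

def check_baseline(measured_ms):
    """Return the sorted list of pages exceeding their threshold."""
    return sorted(
        page for page, ms in measured_ms.items()
        if ms > THRESHOLDS_MS.get(page, float("inf"))
    )

run = {"portal": 640, "gradebook": 1720, "resources": 1980}
print(check_baseline(run))  # ['gradebook']
```

A check like this is what could feed the QA "score card" mentioned later: a shared threshold file plus shared scripts gives every institution the same pass/fail criteria.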

33 Community Sharing
Scripts
–Set up a performance test branch in Sakai svn?
–Associate test scripts with code in svn?
–Expand the Confluence Performance WG site?
–Establish communication mechanisms during QA release cycles?

34 Community Sharing
Scripts
–Sakai performance test branch in svn?
–Packaged along with tools in svn?
Results
–Expand the Confluence site?
–Contribute to a QA "score card"
Documentation
–On Confluence and/or in svn?

35 Contributing to Sakai QA
Recommend performance "standards" to be part of the formal QA release process?
–Score cards
Volunteer organizations to test specific tools based on usage needs
–Use shared scripts and contribute scripts/results back to the community
Combined test results yield greater confidence in Sakai code

36 Moving from Concept to Community
Performance Working Group BOF
–Wednesday, 11–12, BOF Room 1
–Check the BOF schedule for the exact location
Looking to identify contributors
Goal: establish a new working plan for WG contribution to the next conference

37 Michigan's Contribution
Leadership & initial design
All test scripts used at Michigan
All results from Michigan tests
All documentation needed to conduct testing as currently implemented at Michigan
All relevant materials for future tests
Kernel testing for Next Generation Sakai

