Performance in the New Millennium Speaker: Mark Tomlinson April 2012 Twin Cities Quality Assurance Association.


1 Performance in the New Millennium Speaker: Mark Tomlinson April 2012 Twin Cities Quality Assurance Association

2 First, a canine metaphor… Old Dogs vs. New Dogs

3 What can we learn? Old dogs: take their time; are consistent and predictable; know themselves; are trusting, loyal, and obedient. New dogs: are anxious and busy; ferocious; learning, eating, and growing; making mistakes; breaking the rules; unpredictable.

4 Reactions to Novelty: Neophobic vs. Neophilic

5 …stop, look around. There are new things happening faster around us: increase in global interactivity; pervasive mobile use, dependency, and addiction; the era of the “connected worker”; power and control of information; accelerated cycles of novelty. “The world is shifting quite literally under our feet.” – Roi Carmel

6 SO, WHAT’S NEW?

7 Web Performance Optimization Why you should add #WPO to your approach: You can start earlier in the lifecycle. You don’t need load – optimize with single-session tests and data. The tools are already there; automate them (GUI Virtual User?). Influence performance closer to the end-user. It’s really popular – with meetup groups and social media hangouts.

8 Web Performance Optimization 80-90% of the end-user response time is spent on the frontend. Start there.
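The 80-90% figure is easy to check against your own pages. A minimal sketch with hypothetical timing numbers of the kind browser dev tools or WebPageTest report – the category names and values below are illustrative, not from the talk:

```python
# Hypothetical single-page timing breakdown (milliseconds):
timings = {
    "dns": 20, "connect": 40, "backend_ttfb": 180,   # server side
    "html_download": 60, "js_css_images": 1400,      # frontend
    "render_and_script": 700,
}
frontend = (timings["html_download"]
            + timings["js_css_images"]
            + timings["render_and_script"])
total = sum(timings.values())
print(f"frontend share: {frontend / total:.0%}")  # prints "frontend share: 90%"
```

With numbers like these, only a tenth of the wait is on the server – exactly why the slide says to start optimizing at the frontend.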

9 Web Performance Optimization On Twitter follow @souders or search for #WPO. stevesouders.com Head Performance Engineer at Google. Steve spurred the creation of numerous performance tools and services, including YSlow!, HTTP Archive, Cuzillion, Jdrop, ControlJS, and Browserscope. Author of several books…

10 Cloud Performance Tips on cloud-based performance testing: Cloud-savvy Ops teams are already open to this. Consider your cloud provider carefully.
From Cloud, To Cloud: self-service and on-demand; use a different cloud from the SUT; repeatable but not consistent.
From Cloud, To On-prem: security and firewall issues; data privacy issues; traffic routing issues.
From On-prem, To Cloud: put LGs outside the firewall; test results stay on-prem; use secure channels to LGs.
From On-prem, To On-prem: can’t find bugs outside the FW; emulate real-world noise; match production scale %.

11 Mobile Performance “The mobile insanity is here to stay… good apps are being punished by bad performance every day.” Consider new techniques for mobile performance: Injecting mobile latency, switching, and loss into tests. Inspecting mobile device performance (@JAHarrison). Analyzing mobile user behavior. Testing for traffic segmentation. Testing for application tolerances. Know your devices & users.
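Injecting mobile latency and loss can start as something very small inside the test harness itself, before reaching for OS-level shaping tools such as tc/netem or hardware emulators. A hedged sketch – the function name, default values, and wrapper approach (rather than real packet shaping) are all illustrative assumptions:

```python
import random
import time

def with_mobile_network(call, latency_s=0.120, jitter_s=0.080,
                        loss_pct=2.0, rng=None):
    """Delay (and occasionally drop) a request to mimic a 3G/4G link."""
    rng = rng or random.Random()
    if rng.uniform(0, 100) < loss_pct:
        # emulate packet loss as a timeout the test client must tolerate
        raise TimeoutError("emulated packet loss")
    time.sleep(latency_s + rng.uniform(0, jitter_s))
    return call()

# Wrap any client call; a seeded RNG keeps the test repeatable.
rng = random.Random(42)
result = with_mobile_network(lambda: "OK", latency_s=0.01, jitter_s=0.0,
                             loss_pct=2.0, rng=rng)
print(result)
```

The same wrapper doubles as a tolerance test: raise loss_pct and verify the application-side retry logic copes.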

12 Early Performance Testing Old Performance Tester: “As usual, performance testing is always left until the end of the project.” New Performance Tester: “Why not learn what we can as soon as we can? Shift left.” Characteristics of early performance testing efforts: – Automation must “work around” unstable components – Service Virtualization and “stubbing” are a necessity – Results are always “preliminary” and not conclusive – Limited availability of test lab resources – shared infrastructure – Business visibility is much higher (agile, pairing, scrum, reviews) – Embraces the core principles of Exploratory Testing
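“Stubbing” in this sense can be as small as a canned HTTP endpoint that stands in for an unstable downstream dependency. A minimal sketch using Python’s standard library – the path, payload, and handler name are hypothetical:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class StubHandler(BaseHTTPRequestHandler):
    """Answers every GET with a fixed JSON body, like a virtualized service."""
    def do_GET(self):
        body = json.dumps({"campaign": "demo", "status": "active"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/campaigns/1"
reply = json.load(urlopen(url))
server.shutdown()
print(reply["status"])  # prints "active"
```

Because the stub’s latency is near zero and its payload fixed, early results against it are, as the slide says, always “preliminary”.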

13 Performance Testing in Production Tuning in production – that’s old. Ops knows it. Testing in production – that’s new. Ops is scared. Real-world Performance Testing: bringing the real-world workload, noise, data, configuration, and monitoring into the lab. Enhancing accuracy. Testing in Production: bringing the test into the production environment. Simply easier and more cost-effective than building a lab.

14 Testing in Production is becoming popular with new companies who never had a lab or never learned to test properly the “old” way. SOASTA has a great slide showing the list of things you can find with traditional, in-lab performance testing – compared, on a continuum, with issues that can be found if you enhance your approach to include performance testing in production.

15 PE vs. PT Considering the titles: “…which has a more defined career path.” “…influences the software design.” “…requires developer skills.” “…drives the business objectives.” “…can save money for the company.” “…optimizes the end-user experience.” “…is exchangeable with the other title.” The title “Performance Engineer” won 6 of 7

16 New Performance Testers Developer PE: application development; application profiling; method/query tuning; architectural review; component performance; pre-release load testing; building performance tools; international micro-brews. Operational PE: capacity planning; APM and RUM; tuning and root-cause; triage and escalation; infrastructure; workload analytics; scalability; single-malt Scotch.

17 Software Developers in Performance Once upon a time in the dev/test team… Software Development Engineer in Performance: likes automation, but only to keep learning how to code; a stepping stone to becoming a real developer; knows the automation code – still learning the app & users; not generally interested in testing; accountable for running tests.

18 Network Performance Optimization Consider that load testing without emulating the cache settings on the client can now result in total inaccuracy. If your systems in production leverage network accelerators or optimizers – consider this: Your response time measures are wildly pessimistic The bugs you seek are not in the application layers You have a new customer on the Network team

19 Profiling and Diagnostics “If you aren’t monitoring the system under test while you are running a load test – you should just give up and go home. You’re causing more harm than good.” Combine monitoring and profiling in your load tests: dig into the root cause and recommend the solution; correlate infrastructure usage to the application code; dissect the system under test – follow the slow code.
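The monitor-while-you-load idea can be sketched in a few lines: one thread drives a stand-in workload while another samples process CPU time, producing data points you can later correlate with what the code was doing. All names and numbers here are illustrative:

```python
import threading
import time

samples = []            # (wall_clock, cpu_seconds) pairs
stop = threading.Event()

def monitor(interval=0.01):
    """Sample process CPU time until told to stop."""
    while not stop.is_set():
        samples.append((time.perf_counter(), time.process_time()))
        time.sleep(interval)

def workload(iterations=200_000):
    """Stand-in for the system under test: burn some CPU."""
    total = 0
    for i in range(iterations):
        total += i * i
    return total

mon = threading.Thread(target=monitor, daemon=True)
mon.start()
for _ in range(5):
    workload()
stop.set()
mon.join()

cpu_used = samples[-1][1] - samples[0][1] if len(samples) > 1 else 0.0
print(f"{len(samples)} samples, {cpu_used:.3f}s CPU consumed during load")
```

In a real test the monitor would sample the server, not the load generator, and the samples would be lined up against transaction timestamps to “follow the slow code”.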

20 Agile Performance Stories Functional User Story: So that I can report to my management, as a campaign manager, I want to see the results of my active campaign on a single web page.

21 Agile Performance Stories Performance User Story: So that campaign managers can report accurately and on time to their managers, as an Operations Manager, I want the data on the active-campaign page to render in less than 2 secs when 10,000 users are logged in to the website.
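A “2 seconds with 10,000 users” criterion only becomes checkable once a percentile is attached to it (teams often pin p90 or p95). A minimal sketch with a nearest-rank percentile and made-up sample data – the 90th-percentile default is an assumption, not part of the story:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times."""
    ranked = sorted(samples)
    k = max(0, math.ceil(pct / 100 * len(ranked)) - 1)
    return ranked[k]

def meets_sla(samples, limit_secs=2.0, pct=90):
    """True if the pct-th percentile response time is under limit_secs."""
    return percentile(samples, pct) < limit_secs

# hypothetical page-render times (seconds) captured during the load test
times = [1.1, 1.3, 0.9, 1.8, 1.6, 1.2, 1.4, 1.0, 1.7, 2.4]
print(meets_sla(times))  # prints "True": p90 is 1.8 s, under the 2 s limit
```

Writing the check this way makes the acceptance criterion executable, which is exactly what a performance user story in an agile backlog needs.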

22 Listening for Performance Active listening is a communication technique that requires the listener to feed back what they hear to the speaker – re-stating or paraphrasing it in their own words – to confirm what was heard and, moreover, the understanding of both parties. Nothing is assumed or taken for granted. It reduces misunderstanding and conflict. It strengthens cooperation. It is proactive, accountable, and professional.

23 What's this DevOps stuff? DevOps: a self-proclaimed cultural movement (is it a fad?). Embraces patterns for collaboration and leads next-generation engineering principles. Not grounded in top-down methods or mgmt. The “ops” part is performance savvy; the “dev” part… not so much. Continuous App Delivery. Continuous App Optimization. Performance binds Dev & Ops: DEV – PERF – OPS.

24 Application Performance Index A new way to measure the end-user experience – a score for tolerance: Measures response times with a calculation on a scale between 0 and 1. Infers a level of customer happiness or satisfaction from the calculation. Depends on two settings: T is the threshold between satisfied and tolerating; F is the threshold between tolerating and frustrated. Check out: apdex.org

25 Application Performance Index The default sets T to 6 seconds and F to 26 seconds. It is not transaction-specific. You still need to know your customer and set SLAs.

26 Application Performance Index APDEX is still just a number – the product of a calculation. To use APDEX correctly you must: 1) determine the settings for T and F; 2) explain the meaning of APDEX to the biz; 3) relate the APDEX score to revenue.
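The calculation itself is short. A sketch of the published formula, (satisfied + tolerating/2) / total – the sample times are made up, and note that the apdex.org specification derives F as 4T by default rather than setting it independently:

```python
def apdex(samples_secs, T, F=None):
    """Apdex score: (satisfied + tolerating/2) / total.

    samples_secs: response times in seconds.
    T: satisfied/tolerating threshold.
    F: tolerating/frustrated threshold (spec default is 4 * T).
    """
    if F is None:
        F = 4 * T
    total = len(samples_secs)
    satisfied = sum(1 for t in samples_secs if t <= T)
    tolerating = sum(1 for t in samples_secs if T < t <= F)
    return (satisfied + tolerating / 2) / total

# 10 hypothetical samples: 6 satisfied, 2 tolerating, 2 frustrated
times = [0.5, 0.8, 1.0, 1.2, 1.5, 2.0, 5.0, 7.0, 20.0, 30.0]
score = apdex(times, T=4.0)  # F defaults to 16 s
print(round(score, 2))  # prints 0.7
```

This is why the slide insists on relating the score to the business: 0.7 means nothing until T, F, and the revenue impact of a “frustrated” user are agreed.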

27 Exploratory Performance Testing Performance testing is inherently exploratory: Has over-arching goals for performance. Includes iteratively adapted tactics. Fueled by self-direction and curiosity. Requires dynamic test configurations. Places a priority on creativity and improvement.

28 Exploratory Performance Mapping [Diagram: a quadrant map with axes Distributed–Centralized and Physical–Logical]

29 Exploratory Performance Mapping [Diagram: the same Distributed–Centralized / Physical–Logical quadrants, populated with tiers CLIENT, WEB, APP, DATA, API and components BROWSER, HTTPd, FX, RDBMS, OS]

30 Lifecycle Virtualization “Using virtualization technologies along the path of the entire application lifecycle is now an essential part of continuous, automated software development and release.” Platform virtualization is the foundation – the virtual OS Service virtualization eliminates application dependencies Device and network virtualization can emulate the real world Virtual Lab Management (VLM) allows agility in the test lab

31 New Load Testing Tools Established: HP LoadRunner and PC, IBM/Rational RPT, JMeter, Microfocus/Performer, Webload, Compuware, Microsoft VS, Oracle/Empirix. Newcomers: AgileLoad, Web Performance LT, Neotys NeoLoad, SOASTA CloudTest Lite, Telerik Load Complete.

32 New Cloud Perf Tools Established: SOASTA CloudTest, Keynote, Compuware/Gomez, HP LoadRunner Cloud, HP PC SaaS. Newcomers: Blazemeter, LoadZen, LoadStorm, Blitz.io, Apica, Cloud Assault.

33 New Client Diagnostics Established: Fiddler, YSlow!, Firefox Firebug, Shunra, Google PageSpeed, Akamai Mobitest, WebSiteOptimization. Newcomers: Charles Proxy, WebPageTest.org, Google PageSpeed, WebWait.com, GTMetrix.com, BenchJS.

34 New App Diagnostics Profiling Established: JProfiler, CA Wily, HP Diagnostics, Dell/Quest PerformaSure, Intel V-Tune, Microfocus DevPartner, Microsoft VS Intellitrace. Newcomers: Yourkit, NewRelic, dynaTrace, Google Perf Tools.

35 New Monitoring Tools Established: IBM Tivoli TPV, HP APM & BAC, HP SiteScope, Microsoft SCOM, CA APM & Wily, Dell/Quest Foglight, Compuware Vantage. Newcomers: AppDynamics, Nagios/Cacti, Zyrion Traverse, NewRelic, Confio Ignite, Gomez 360, Splunk, Blue Triangle.

36 CONCLUSION

37 Our reactions to novelty are shaped as follows: “The likeliness for an individual to be included in one of those categories depends both on what it has learned from its socialization with its own species and from experiences from exploring its environment (as range expansion often brings animals into contact with novelty), especially the maternal influence of the individuals’ experiences, and the individual’s genetics influencing its likeliness to explore or focus on remaining within a safe and familiar space.” Source: “We All Like New Things, Or Do We? A comparison of human and non-human primate reactions to novelty.” (Wiley)

38 Mark Tomlinson West Evergreen Consulting, LLC mtomlins@westevergreen.com +1-773-775-8780 mtomlins.blogspot.com @mtomlins & @perfbytes


