
1 User Experience Analysis in Application Performance Monitoring: An Overview Francesco Molinari, mail@francescomolinari.it Summer School “Open Innovation & User eXperience Design & Assessment”

2 Acknowledgments (I) Polymedia SpA – located in Milan, Italy – is 100% owned by KIT Digital Inc. [NASDAQ: KITD], a New York- and Prague-based premium provider of end-to-end video asset management software and related services for multi-screen and socially-enabled video delivery, serving over 2,300 clients across media and entertainment, network operators and non-media enterprises.

3 Acknowledgments (II) This research is being carried out in the context of the EU-funded FP7 ICT project ELLIOT (“Experiential Living Lab for the Internet Of Things”), coordinated by Polymedia SpA (www.elliot-project.eu). However, the opinions expressed here are solely those of the author and do not involve or engage any European institution or partner organisation.

4 This presentation
Definitions: User Experience (UX); Application Performance Monitoring & Management (APM&M)
UX Analysis in the framework of APM&M: Gartner’s Magic Quadrant; Why this comparison; Detailed results; Overview of competition
Further steps ahead: Behavioural Change; Some examples
IOT Service Environment: User Perspective; Living Lab Approach
UX Measurement in ELLIOT: Proposed Workflow; Application Examples
Implementation of Gartner’s concept: Discussion, Group Work & Conclusions

5 UX The four elements of User Experience (Rubinoff, 2004); Facets of the User Experience (Morville, 2004). User Experience (ISO 9241-210): “A person's perceptions and responses that result from the use or anticipated use of a product, system or service.” UX “is subjective and focuses on the use” and “includes all the users' emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours and accomplishments that occur before, during and after use”.

6 APM&M (Application Performance Monitoring and Management) Wikipedia definition: the discipline within systems management that focuses on monitoring and managing the performance and availability of software applications, so as to translate IT metrics into business meaning (value). This discipline looks at the workflow and related IT tools deployed to detect, diagnose and report on application performance issues, to ensure that performance meets or exceeds the expectations of end users and the business.
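To make the idea of translating an IT metric into business meaning concrete, here is a minimal, illustrative Python sketch of an Apdex-style rollup; the 0.5 s threshold and the sample timings are assumptions for illustration, not figures from the slide.

    # Minimal Apdex-style rollup: turns raw response times (an IT metric)
    # into a 0..1 satisfaction score (a business-facing value).
    # The 0.5 s threshold and the sample data are illustrative assumptions.

    def apdex(response_times_s, threshold_s=0.5):
        satisfied = sum(1 for t in response_times_s if t <= threshold_s)
        tolerating = sum(1 for t in response_times_s if threshold_s < t <= 4 * threshold_s)
        return (satisfied + tolerating / 2) / len(response_times_s)

    samples = [0.21, 0.35, 0.48, 0.90, 1.70, 2.50, 0.40]
    print(f"Apdex score: {apdex(samples):.2f}")  # ~0.71, i.e. a "fair" experience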

7 UX Analysis within APM&M Wikipedia definition (adapted): APM&M is related to end-user experience management and real user monitoring in that measuring the experience of real users of an application in production is considered the most valid method of assessing its performance.

8 8 UX within APM&M (example) Source: www.nastel.com

9 9 UX within APM&M (example) Source: www.compuware.com

10 UX proxies in APM&M (example) Source: www.aternity.com

11 UX proxies in APM&M (example) Source: www.aternity.com
Real & Virtual Desktop Performance: boot profiling, CPU and memory utilization per process, error messages, non-responding processes, crashed applications, Blue Screen of Death (BSOD), etc.
Application Performance: latency, response time and “key-to-glass” transaction time for user workflows across the widest array of applications and environments, including HTTP(S), RIA/AJAX, client-server, Java, .NET, XenApp/ICA, Terminal Server/RDP, VDI/RDP, etc.
User Productivity: application, module and function usage statistics, usage trail, and execution time/time spent, e.g. trades executed, calls closed, emails sent, invoices created, etc.
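As a hedged illustration of how such proxy measurements are typically aggregated, the Python sketch below rolls raw “key-to-glass” transaction times up into per-workflow medians and 95th percentiles; the workflow names and timings are invented for the example and are not Aternity data.

    # Illustrative rollup of raw "key-to-glass" timings into per-workflow
    # statistics (median and 95th percentile). Workflow names and values
    # are hypothetical; real tools collect these from instrumented clients.
    from collections import defaultdict
    from statistics import median, quantiles

    raw_timings_ms = [
        ("create_invoice", 420), ("create_invoice", 510), ("create_invoice", 1900),
        ("close_call", 230), ("close_call", 260), ("close_call", 310),
    ]

    by_workflow = defaultdict(list)
    for workflow, ms in raw_timings_ms:
        by_workflow[workflow].append(ms)

    for workflow, values in by_workflow.items():
        p95 = quantiles(values, n=20)[-1]  # last cut point ~ 95th percentile
        print(f"{workflow}: median={median(values):.0f} ms, p95={p95:.0f} ms")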

12 Problem 12

13 The Gartner framework In a recent report entitled “Magic Quadrant for Application Performance Monitoring”, authors Jonah Kowall and Will Cappelli outlined a five-dimensional model for APM&M and used it to identify the technologies and services that best comply with it. The five dimensions are:
End-user experience (EUE) monitoring
Runtime application architecture discovery, modeling and display
User-defined transaction profiling
Component deep-dive monitoring in an application context
Provision of business analytics
In particular, EUE is defined as the capture of data about how end-to-end application availability, latency, execution correctness and quality appeared to the end user.
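To make the EUE dimension concrete, the following sketch models one captured end-user sample along the four attributes named above (availability, latency, execution correctness, quality) and rolls a small batch up into summary figures; all field names and values are illustrative assumptions, not part of Gartner’s report.

    # Hypothetical shape of a single end-user experience (EUE) sample and a
    # trivial rollup. The four fields mirror the attributes named in the text:
    # availability, latency, execution correctness and (perceived) quality.
    from dataclasses import dataclass

    @dataclass
    class EueSample:
        available: bool        # did the application respond at all?
        latency_ms: float      # end-to-end response time as seen by the user
        correct: bool          # did the transaction produce the expected result?
        quality_score: float   # e.g. a 1-5 rating or a rendering-quality proxy

    def summarise(samples):
        n = len(samples)
        return {
            "availability": sum(s.available for s in samples) / n,
            "error_rate": sum(not s.correct for s in samples) / n,
            "mean_latency_ms": sum(s.latency_ms for s in samples) / n,
            "mean_quality": sum(s.quality_score for s in samples) / n,
        }

    print(summarise([EueSample(True, 320.0, True, 4.5),
                     EueSample(True, 780.0, False, 3.0),
                     EueSample(False, 5000.0, False, 1.0)]))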

14 14 The Magic Quadrant

15 15 The Magic Quadrant

16 Utility of this comparison While a considerable number and variety of tools exist that perform end UX analysis in the context of APM&M, ELLIOT is quite unique in supporting the evaluation of co-creation performance in a Living Lab environment. However, in order to assess its true business potential, a benchmarking exercise needs to be carried out against the only comparable market at hand – one that indeed exhibits fast growth and rapid evolution. Following A. Ben Shohan, we focus only on tools that track the response times real users experience when using live applications on site – not robots that synthetically ping the site.

17 Detailed results (1/5) 17

18 Detailed results (2/5) 18

19 Detailed results (3/5) 19

20 Detailed results (4/5) 20

21 Detailed results (5/5) 21

22 Overview of competition In this family of tools: no reference is ever made to an IoT-rich environment; the focus is on service usage, never on the creation or co-creation of a service. Two main business scenarios are represented: CRM and customer satisfaction, and business process intelligence. Many big players are already there (Oracle, IBM, HP, Microsoft…). Some software tools are free of charge. Solutions are available both stand-alone and in SaaS mode.

23 Further steps ahead It would be interesting to adopt Gartner’s framework to evaluate ELLIOT’s capacity: to help monitor and analyse end UX, by capturing and delivering a potentially huge amount of workflow-related data and analytics; and to stick to another workflow definition, more closely related to service innovation management.
 | Technology Driven (TD) | User Centric (UC) | User Driven (UD)
Open (O) | OTD | OUC | OUD
Closed (C) | CTD | CUC | CUD
Source: Leminen and Fred (2009)

24 Further steps ahead (2) –To leverage Behavioural Change as a permanent source for innovation in services 24 Adapted from Peedu (2011)

25 Behavioural Change 25 “H-H-I”

26 Behavioural Change (foll.) 26 “H-C-I”

27 Behavioural Change (foll.) 27 “H-C-T”

28 Example 1 28 http://loop1.aiga.org/images/edition003/sapientucd/saptimage1.gif This model shows how one type of user evolves from viewing a cellular phone as a single-function appliance to experiencing it as an essential life tool.

29 Example 2 29 http://www.meshplanning.com/content/wp-content/uploads/2011/04/Experiencing-Schweppes-Proving-the-Power-of-data.pdf

30 Example 3 30 SAVE ENERGY Project, http://www.ict4saveenergy.eu/

31 Example 4 “The Vending Machine of the Future” (http://www.eservices4life.org), equipped with: speakers, camera, microphone, 32’’ touch-screen display, smart card / RFID / NFC reader, ergonomic dispenser, CPU and connectivity (Ethernet, wireless, BT, USB).

32 IOT Service Environment 32

33 User Perspective 33

34 Living Lab Living (adj.): active, functioning, exhibiting the life or motion of nature (Merriam-Webster). Laboratory (n.): a place providing opportunity for experimentation, observation, or practice in a field of study (Merriam-Webster). Living Labs are open innovation environments in real-life settings, where user-driven design and experimentation are integrated within a co-creation process of new services, products and societal infrastructures. (Santoro & Conte 2009)

35 Problem 35

36 Proposed Solution 36

37 Example 37

38 Designer Perspective 38

39 Technical Workflow
Modelling: 1. Set-up of the Scenario and Goals; 2. IOT LL Configuration; 3. Model the Experience Evaluation.
Runtime: a. Raw Data Collection; b. Analysis of Results.
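Read as a pipeline, the two phases might look roughly like the Python skeleton below; every function body and data shape here is an invented placeholder, since the actual configuration formats and data stores are defined by the ELLIOT platform.

    # Illustrative skeleton of the modelling/runtime workflow from the slide.
    # Every function body is a placeholder, not the real ELLIOT implementation.

    def set_up_scenario_and_goals():
        return {"scenario": "green services pilot", "goals": ["reduce waiting time"]}

    def configure_iot_living_lab(scenario):
        # e.g. register the sensors, displays and data feeds used in the pilot
        return {"scenario": scenario, "sensors": ["air_quality", "footfall"]}

    def model_experience_evaluation(configuration):
        # declare which UX indicators will be computed from the raw data
        return {"config": configuration, "indicators": ["latency", "satisfaction"]}

    def collect_raw_data(model):
        # runtime phase (a): gather observations from the configured IoT devices
        return [{"indicator": "latency", "value": 420},
                {"indicator": "satisfaction", "value": 4}]

    def analyse_results(raw_data):
        # runtime phase (b): summarise observations against the modelled indicators
        return {rec["indicator"]: rec["value"] for rec in raw_data}

    model = model_experience_evaluation(configure_iot_living_lab(set_up_scenario_and_goals()))
    print(analyse_results(collect_raw_data(model)))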

40 40 Platform Architecture

41 Applications I. Logistics (BIBA, DE) II. Health and Well-Being (HSR, IT) III. Green Services (INRIA, FR) IV. Healthcare (VirTech, BG) V. Retail (SafePay, HU) VI. Energy (Intersoft, SK)

42 Applications (I) 42

43 Applications (II) iTV for Paediatrics; Nextgen Vending Machine; Vai In Bici (Go Bike!); Public Transport e-Services. Design Drivers™: Function, Emotion, Relation.

44 Applications (II-foll.) 44

45 Applications (III) 45

46 Applications (IV) 46 www.temeo.org

47 Applications (V) 47

48 Applications (VI) 48 Energy Efficient Pilot: Managing energy for buildings and offices

49 Proposal for Group Work

Application workflow | APM&M objectives
1. The end user initiates a request, which triggers the execution of the software and hardware components used to respond to the request. | 1. Tracking, in real time, the execution of the software algorithms that constitute an application.
2. Some of the steps in the execution are defined and sequenced by business logic, as opposed to computer system logic. | 2. Measuring and reporting on the finite hardware and software resources that are allocated to be consumed as the algorithms execute.
3. The software algorithms work with one another as they execute. The results are compiled and assembled into a resultant set of data. | 3. Determining whether the application executes successfully according to the application owner.
4. The resultant data is delivered by using hardware and software to the user in a well-defined computer interface. | 4. Recording the latencies associated with some of the execution step sequences.
5. If the algorithms complete their execution successfully, then they achieve well-defined goals that meet the established requirements of some end users or end-user communities. | 5. Determining why an application fails to execute successfully, or why resource consumption and latency levels depart from expectations.
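As a toy illustration of objectives 1 and 4 in the right-hand column (tracking execution in real time and recording step latencies), the hedged Python sketch below wraps each step of a hypothetical workflow and logs its duration; the step names and simulated work are invented for the example.

    # Toy illustration of APM&M objectives 1 and 4: track which steps of an
    # application execute and record the latency of each. Step names and the
    # simulated work are hypothetical.
    import time
    from functools import wraps

    trace = []  # (step name, latency in ms), appended in execution order

    def monitored(step_name):
        def decorator(fn):
            @wraps(fn)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    return fn(*args, **kwargs)
                finally:
                    trace.append((step_name, (time.perf_counter() - start) * 1000))
            return wrapper
        return decorator

    @monitored("apply business logic")
    def apply_business_logic(request):
        time.sleep(0.01)  # stand-in for real work
        return {"request": request, "approved": True}

    @monitored("render response")
    def render_response(result):
        time.sleep(0.005)
        return f"approved={result['approved']}"

    render_response(apply_business_logic({"user": "alice"}))
    for step, ms in trace:
        print(f"{step}: {ms:.1f} ms")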

50 Proposal for Group Work (2)

APM&M objectives | Gartner’s key dimensions of APM&M
1. Tracking, in real time, the execution of the software algorithms that constitute an application. | 1. End-user experience monitoring — the capture of data about how end-to-end application availability, latency, execution correctness and quality appeared to the end user.
2. Measuring and reporting on the finite hardware and software resources that are allocated to be consumed as the algorithms execute. | 2. Runtime application architecture discovery, modeling and display — the discovery of the various software and hardware components involved in application execution, and the array of possible paths across which those components could communicate that, together, enable that involvement.
3. Determining whether the application executes successfully according to the application owner. | 3. User-defined transaction profiling — the tracing of events as they occur among the components or objects as they move across the paths discovered in the second dimension, generated in response to a user's attempt to cause the application to execute what the user regards as a logical unit of work.
4. Recording the latencies associated with some of the execution step sequences. | 4. Component deep-dive monitoring in an application context — the fine-grained monitoring of resources consumed by and events occurring within the components discovered in the second dimension.
5. Determining why an application fails to execute successfully, or why resource consumption and latency levels depart from expectations. | 5. Analytics — the marshalling of a variety of techniques (including behavior learning engines, complex-event processing (CEP) platforms, log analysis and multidimensional database analysis) to discover meaningful and actionable patterns in the typically large datasets generated by the first four dimensions of APM.

51 Proposal for Group Work (3)

Application workflow | IoT service creation workflow
1. The end user initiates a request, which triggers the execution of the software and hardware components used to respond to the request. | 1. ???
2. Some of the steps in the execution are defined and sequenced by business logic, as opposed to computer system logic. | 2. ???
3. The software algorithms work with one another as they execute. The results are compiled and assembled into a resultant set of data. | 3. ???
4. The resultant data is delivered by using hardware and software to the user in a well-defined computer interface. | 4. ???
5. If the algorithms complete their execution successfully, then they achieve well-defined goals that meet the established requirements of some end users or end-user communities. | 5. ???

52 Proposal for Group Work (4)

IoT service creation workflow | ELLIOT objectives
1. ???
2. ???
3. ???
4. ???
5. ???

53 Proposal for Group Work (5)

ELLIOT objectives | Gartner’s (adapted?) key dimensions of APM&M
1. ??? | 1. End-user experience monitoring — the capture of data about how end-to-end application availability, latency, execution correctness and quality appeared to the end user.
2. ??? | 2. Runtime application architecture discovery, modeling and display — the discovery of the various software and hardware components involved in application execution, and the array of possible paths across which those components could communicate that, together, enable that involvement.
3. ??? | 3. User-defined transaction profiling — the tracing of events as they occur among the components or objects as they move across the paths discovered in the second dimension, generated in response to a user's attempt to cause the application to execute what the user regards as a logical unit of work.
4. ??? | 4. Component deep-dive monitoring in an application context — the fine-grained monitoring of resources consumed by and events occurring within the components discovered in the second dimension.
5. ??? | 5. Analytics — the marshalling of a variety of techniques (including behavior learning engines, complex-event processing (CEP) platforms, log analysis and multidimensional database analysis) to discover meaningful and actionable patterns in the typically large datasets generated by the first four dimensions of APM.

54 Conclusions 54

55 Thank You Experiential Living Lab for the Internet Of Things Francesco Molinari, mail@francescomolinari.it

56 References
Gartner (2012), Magic Quadrant for Application Performance Monitoring. http://www.gartner.com/technology/reprints.do?id=1-1BRHACN&ct=120817&st=sb
Seppo Leminen and Minna Fred (2009), in Leminen S. (Ed.), State of the Art of UDOI Usage within Companies Business Processes, Laurea University, Espoo.
Francesco Molinari (2012), User Experience Analysis in Service Co-Creation: A Living Lab Approach. In: Proceedings of the SERVDES12 Conference.
Peter Morville (2004), User Experience Design. http://semanticstudios.com/publications/semantics/000029.php
Geroli Peedu (2011), Enhancing Public Service User Experience in Information Society, Master's Thesis, University of Tallinn.
Robert Rubinoff (2004), How to Quantify the User Experience. http://www.sitepoint.com/quantify-user-experience/
Roberto Santoro & Marco Conte (2009), Living Labs in Open Innovation Functional Regions. In: Proceedings of the ICE09 Conference.
Alon Ben Shohan, A List of End User Experience Monitoring Tools. http://www.real-user-monitoring.com/category/blog/
The ELLIOT project (2010-2012). http://www.elliot-project.eu/

