User Experience Analysis in Application Performance Monitoring: An Overview
Francesco Molinari, Summer School "Open Innovation & User eXperience Design & Assessment"

Acknowledgments (I)
Polymedia SpA, located in Milan, Italy, is 100% owned by KIT Digital Inc. [NASDAQ: KITD], a New York and Prague based premium provider of end-to-end video asset management software and related services for multi-screen and socially-enabled video delivery to over 2,300 media and entertainment companies, network operators and non-media enterprise clients.

Acknowledgments (II)
This research is being carried out in the context of the EU-funded FP7 ICT project ELLIOT ("Experiential Living Lab for the Internet Of Things"), coordinated by Polymedia SpA. However, the opinions expressed here are solely those of the author and do not involve or engage any European institution or partner organisation.

This presentation
Definitions
– User Experience (UX)
– Application Performance Monitoring & Management (APM&M)
UX Analysis in the framework of APM&M
– Gartner's Magic Quadrant
– Why this comparison
– Detailed results
– Overview of competition
Further steps ahead
– Behavioural Change
– Some examples
IoT Service Environment
– User Perspective
– Living Lab Approach
UX Measurement in ELLIOT
– Proposed Workflow
– Application Examples
Implementation of Gartner's concept
– Discussion, Group Work & Conclusions

UX
The four elements of User Experience (Rubinoff, 2004)
Facets of the User Experience (Morville, 2004)
User Experience (ISO 9241-210): "A person's perceptions and responses that result from the use or anticipated use of a product, system or service". It "is subjective and focuses on the use" and "includes all the users' emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours and accomplishments that occur before, during and after use".

APM&M: Application Performance Monitoring and Management
Wikipedia definition: the discipline within systems management that focuses on monitoring and managing the performance and availability of software applications, so as to translate IT metrics into business meaning (value). This discipline looks at the workflow and related IT tools deployed to detect, diagnose and report on application performance issues, to ensure that performance meets or exceeds the expectations of end users and the business.
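To make the definition concrete, here is a minimal sketch of the core APM&M loop: capture a raw IT metric (per-request latency) and translate it into a business-facing signal (compliance with a service-level objective). All names (timed_request, slo_compliance, SLO_MS) are hypothetical, not drawn from any real tool.

```python
# Minimal sketch: capture an IT metric (per-request latency) and
# translate it into business meaning (an SLO verdict).
import time
from contextlib import contextmanager

SLO_MS = 500  # assumed business target: requests should finish under 500 ms

latencies_ms: list[float] = []

@contextmanager
def timed_request():
    """Record the wall-clock latency of one application request."""
    start = time.perf_counter()
    try:
        yield
    finally:
        latencies_ms.append((time.perf_counter() - start) * 1000)

def slo_compliance() -> float:
    """Fraction of observed requests that met the latency target."""
    if not latencies_ms:
        return 1.0
    return sum(1 for ms in latencies_ms if ms <= SLO_MS) / len(latencies_ms)

# Wrap any unit of application work, then report the business view.
with timed_request():
    time.sleep(0.1)  # stand-in for real application work
print(f"SLO compliance: {slo_compliance():.0%}")
```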

UX Analysis within APM&M
Wikipedia definition (adapted): APM&M is related to end-user experience management and real user monitoring in that measuring the experience of real users of an application in production is considered the most valid method of assessing its performance.
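As an illustration of real user monitoring, the sketch below aggregates timing beacons as real user sessions might report them. The beacon tuple format and page names are assumptions for illustration, not any product's schema.

```python
# Illustrative RUM aggregation: timings come from real user sessions,
# not from synthetic probes.
from collections import defaultdict
from statistics import median

# Each beacon: (page, load_time_ms), as a real browser session might report it.
beacons = [
    ("/checkout", 420.0),
    ("/checkout", 1310.0),
    ("/home", 180.0),
]

def median_load_by_page(samples):
    """Summarise real-user load times per page."""
    by_page = defaultdict(list)
    for page, ms in samples:
        by_page[page].append(ms)
    return {page: median(ms_list) for page, ms_list in by_page.items()}

print(median_load_by_page(beacons))
# {'/checkout': 865.0, '/home': 180.0}
```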

UX within APM&M (example). Source:

UX within APM&M (example). Source:

UX proxies in APM&M (example). Source:

UX proxies in APM&M (example). Source:
– Real & Virtual Desktop Performance: boot profiling, CPU and memory utilization per process, error messages, non-responding processes, crashed applications, Blue Screen of Death (BSOD), etc.
– Application Performance: latency, response time and "key-to-glass" transaction time for user workflows across the widest array of applications and environments, including HTTP(S), RIA/AJAX, client-server, Java, .NET, XenApp/ICA, Terminal Server/RDP, VDI/RDP, etc.
– User Productivity: application, module and function usage statistics, usage trail, and execution time/time spent, e.g. trades executed, calls closed, e-mails sent, invoices created, etc.
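The proxies listed above can be computed from raw usage events. A sketch follows, under assumed event fields (user, action, key_to_glass_ms) chosen purely for illustration.

```python
# Sketch: derive UX proxy metrics from hypothetical raw event records.
from collections import Counter

events = [
    {"user": "u1", "action": "create_invoice", "key_to_glass_ms": 640},
    {"user": "u1", "action": "send_email",     "key_to_glass_ms": 210},
    {"user": "u2", "action": "create_invoice", "key_to_glass_ms": 980},
]

def percentile(values, p):
    """Nearest-rank percentile; enough for a sketch."""
    ordered = sorted(values)
    k = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[k]

# Application performance proxy: p95 key-to-glass time.
p95 = percentile([e["key_to_glass_ms"] for e in events], 95)

# User productivity proxy: function usage statistics.
usage = Counter(e["action"] for e in events)

print(f"p95 key-to-glass: {p95} ms; usage: {dict(usage)}")
```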

Problem

The Gartner framework
In a recent report entitled "Magic Quadrant for Application Performance Monitoring," authors Jonah Kowall and Will Cappelli outlined a five-dimensional model for APM&M, used to identify the technologies and services that best comply with it. The five dimensions are:
– End-user experience (EUE) monitoring
– Runtime application architecture discovery, modeling and display
– User-defined transaction profiling
– Component deep-dive monitoring in an application context
– Provision of business analytics
In particular, EUE is defined as the capture of data about how end-to-end application availability, latency, execution correctness and quality appeared to the end user.
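For orientation, one way to picture what a single EUE observation might capture, following the definition just quoted, is sketched below. The EUESample class, its fields and its roll-up rule are assumptions for illustration, not Gartner's or any vendor's schema.

```python
# Sketch of one end-user experience (EUE) sample covering the four
# aspects in the quoted definition: availability, latency,
# execution correctness and quality as seen by the end user.
from dataclasses import dataclass

@dataclass
class EUESample:
    transaction: str       # the user-visible unit of work
    available: bool        # did the application respond at all?
    latency_ms: float      # end-to-end delay as perceived by the user
    correct: bool          # execution correctness of the result
    quality_score: float   # e.g. rendering/content quality in [0, 1]

    def acceptable(self, max_latency_ms: float = 2000.0) -> bool:
        """One possible roll-up of the four EUE aspects into a verdict."""
        return (self.available and self.correct
                and self.latency_ms <= max_latency_ms
                and self.quality_score >= 0.8)

sample = EUESample("search", available=True, latency_ms=850.0,
                   correct=True, quality_score=0.95)
print(sample.acceptable())  # True
```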

The Magic Quadrant

The Magic Quadrant (foll.)

Utility of this comparison
While a considerable number and variety of tools exist that perform end-UX analysis in the context of APM&M, ELLIOT is quite unique in supporting the evaluation of co-creation performance in a Living Lab environment. However, in order to assess its true business potential, a benchmarking exercise needs to be carried out against the only comparable market at hand, one which indeed exhibits fast growth and rapid evolution. Following A. Ben Shohan, we focus only on tools able to track the response times that real users experience when using on-site applications, not robots that synthetically ping the site.

Detailed results (1/5)

Detailed results (2/5)

Detailed results (3/5)

Detailed results (4/5)

Detailed results (5/5)

Overview of competition
In this family of tools:
– No reference is ever made to an IoT-rich environment
– Focus is on service usage, never on the creation or co-creation of a service
Two main business scenarios are represented:
– CRM and customer satisfaction
– Business process intelligence
Many big players are already there (Oracle, IBM, HP, Microsoft…). Some software tools are free of charge. Solutions are available both stand-alone and in SaaS mode.

Further steps ahead
It would be interesting to adopt Gartner's framework to evaluate ELLIOT's capacity:
– To help monitor and analyse end UX, by capturing and delivering a potentially huge amount of workflow-related data and analytics, and
– To stick to another workflow definition, more closely related to service innovation management.

Source: Leminen and Fred (2009)
             Technology Driven (TD)   User Centric (UC)   User Driven (UD)
Open (O)     OTD                      OUC                 OUD
Closed (C)   CTD                      CUC                 CUD

Further steps ahead (2)
– To leverage Behavioural Change as a permanent source of innovation in services
Adapted from Peedu (2011)

Behavioural Change
"H-H-I"

Behavioural Change (foll.)
"H-C-I"

Behavioural Change (foll.)
"H-C-T"

Example
This model shows how one type of user evolves from viewing a cellular phone as a single-function appliance to experiencing it as an essential life tool.

Example

Example 3
SAVE ENERGY Project

Example 4: "The Vending Machine of the Future"
Equipped with:
– Speakers
– Camera
– Microphone
– 32'' Touch Screen Display
– Smart Card, RFiD, NFC Reader
– Ergonomic Dispenser
– CPU and Connectivity (Ethernet, Wireless, BT, USB)

IoT Service Environment

User Perspective

Living Lab
Living (adj.): active, functioning, exhibiting the life or motion of nature (Merriam-Webster)
Laboratory (n.): a place providing opportunity for experimentation, observation, or practice in a field of study (Merriam-Webster)
Living Labs are open innovation environments in real-life settings, where user-driven design and experimentation are integrated within a co-creation process of new services, products and societal infrastructures. (Santoro & Conte 2009)

Problem

Proposed Solution

Example

Designer Perspective

Technical Workflow
Modelling:
1. Set-up of the Scenario and Goals
2. IoT LL Configuration
3. Model the Experience
Evaluation Runtime:
a. Raw Data Collection
b. Analysis of Results
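A minimal sketch of this two-phase workflow follows: a modelling phase that declares the scenario and the experience indicators to observe, and a runtime phase that collects raw data and analyses it against the model. All class and method names are illustrative; they do not reflect the actual ELLIOT platform API.

```python
# Sketch of the modelling / evaluation-runtime split described above.
from statistics import mean

class ExperienceModel:
    """Modelling phase: scenario, goals and the indicators to observe."""
    def __init__(self, scenario: str, indicators: list[str]):
        self.scenario = scenario
        self.indicators = indicators

class EvaluationRun:
    """Runtime phase: (a) raw data collection, (b) analysis of results."""
    def __init__(self, model: ExperienceModel):
        self.model = model
        self.samples: dict[str, list[float]] = {i: [] for i in model.indicators}

    def collect(self, indicator: str, value: float) -> None:
        self.samples[indicator].append(value)   # step (a)

    def analyse(self) -> dict[str, float]:
        return {i: mean(vs) for i, vs in self.samples.items() if vs}  # step (b)

model = ExperienceModel("smart-office energy pilot", ["comfort", "kwh_saved"])
run = EvaluationRun(model)
run.collect("comfort", 4.0)
run.collect("kwh_saved", 1.8)
print(run.analyse())  # {'comfort': 4.0, 'kwh_saved': 1.8}
```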

Platform Architecture

Applications
I. Logistics (BIBA, DE)
II. Health and Well-Being (HSR, IT)
III. Green Services (INRIA, FR)
IV. Healthcare (VirTech, BG)
V. Retail (SafePay, HU)
VI. Energy (Intersoft, SK)

Applications (I)

Applications (II)
Design Drivers™: Function, Emotion, Relation
– iTV for Paediatrics
– Nextgen Vending Machine
– Vai In Bici (Go Bike!)
– Public Transport e-Services

Applications (II-foll.)

Applications (III)

Applications (IV)

Applications (V)

Applications (VI)
Energy Efficient Pilot: managing energy for buildings and offices

Proposal for Group Work

Application workflow vs. APM&M objectives:

1. Application workflow: The end user initiates a request, which triggers the execution of the software and hardware components used to respond to the request.
   APM&M objective: Tracking, in real time, the execution of the software algorithms that constitute an application.

2. Application workflow: Some of the steps in the execution are defined and sequenced by business logic, as opposed to computer system logic.
   APM&M objective: Measuring and reporting on the finite hardware and software resources that are allocated to be consumed as the algorithms execute.

3. Application workflow: The software algorithms work with one another as they execute. The results are compiled and assembled into a resultant set of data.
   APM&M objective: Determining whether the application executes successfully according to the application owner.

4. Application workflow: The resultant data is delivered by using hardware and software to the user in a well-defined computer interface.
   APM&M objective: Recording the latencies associated with some of the execution step sequences.

5. Application workflow: If the algorithms complete their execution successfully, then they achieve well-defined goals that meet the established requirements of some end users or end-user communities.
   APM&M objective: Determining why an application fails to execute successfully, or why resource consumption and latency levels depart from expectations.
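To see how the two columns line up in practice, the sketch below instruments a toy request pipeline: a decorator tracks each step's execution in real time (objective 1), records its latency (objective 4) and reports success or the reason for failure (objectives 3 and 5). The tracer is illustrative, not a real APM agent; resource accounting (objective 2) is omitted for brevity.

```python
# Sketch: pair the application workflow steps with APM&M objectives
# using a hypothetical tracing decorator.
import time
import traceback

def traced(step_name):
    """Track execution, record latency, report success or failure cause."""
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = fn(*args, **kwargs)
                status = "ok"
                return result
            except Exception:
                status = "failed: " + traceback.format_exc(limit=1).strip()
                raise
            finally:
                ms = (time.perf_counter() - start) * 1000
                print(f"[{step_name}] {ms:.1f} ms, {status}")
        return inner
    return wrap

@traced("business-logic step")
def compute_result(request: str) -> str:
    return request.upper()  # stand-in for the real business logic

@traced("delivery step")
def deliver(result: str) -> None:
    print(f"-> user sees: {result}")

deliver(compute_result("order #42"))
```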

Proposal for Group Work (2)

APM&M objectives vs. Gartner's key dimensions of APM&M:

1. APM&M objective: Tracking, in real time, the execution of the software algorithms that constitute an application.
   Gartner dimension: End-user experience monitoring: the capture of data about how end-to-end application availability, latency, execution correctness and quality appeared to the end user.

2. APM&M objective: Measuring and reporting on the finite hardware and software resources that are allocated to be consumed as the algorithms execute.
   Gartner dimension: Runtime application architecture discovery, modeling and display: the discovery of the various software and hardware components involved in application execution, and the array of possible paths across which those components could communicate that, together, enable that involvement.

3. APM&M objective: Determining whether the application executes successfully according to the application owner.
   Gartner dimension: User-defined transaction profiling: the tracing of events as they occur among the components or objects as they move across the paths discovered in the second dimension, generated in response to a user's attempt to cause the application to execute what the user regards as a logical unit of work.

4. APM&M objective: Recording the latencies associated with some of the execution step sequences.
   Gartner dimension: Component deep-dive monitoring in an application context: the fine-grained monitoring of resources consumed by and events occurring within the components discovered in the second dimension.

5. APM&M objective: Determining why an application fails to execute successfully, or why resource consumption and latency levels depart from expectations.
   Gartner dimension: Analytics: the marshalling of a variety of techniques (including behavior learning engines, complex-event processing (CEP) platforms, log analysis and multidimensional database analysis) to discover meaningful and actionable patterns in the typically large datasets generated by the first four dimensions of APM.

Proposal for Group Work (3)

Application workflow vs. IoT service creation workflow (right column to be filled in):

1. Application workflow: The end user initiates a request, which triggers the execution of the software and hardware components used to respond to the request.
   IoT service creation workflow: ???

2. Application workflow: Some of the steps in the execution are defined and sequenced by business logic, as opposed to computer system logic.
   IoT service creation workflow: ???

3. Application workflow: The software algorithms work with one another as they execute. The results are compiled and assembled into a resultant set of data.
   IoT service creation workflow: ???

4. Application workflow: The resultant data is delivered by using hardware and software to the user in a well-defined computer interface.
   IoT service creation workflow: ???

5. Application workflow: If the algorithms complete their execution successfully, then they achieve well-defined goals that meet the established requirements of some end users or end-user communities.
   IoT service creation workflow: ???

Proposal for Group Work (4)

IoT service creation workflow vs. ELLIOT objectives (both columns to be filled in):

1. IoT service creation workflow: ??? ELLIOT objective: ???
2. IoT service creation workflow: ??? ELLIOT objective: ???
3. IoT service creation workflow: ??? ELLIOT objective: ???
4. IoT service creation workflow: ??? ELLIOT objective: ???
5. IoT service creation workflow: ??? ELLIOT objective: ???

Proposal for Group Work (5)

ELLIOT objectives vs. Gartner's (adapted?) key dimensions of APM&M (left column to be filled in):

1. ELLIOT objective: ???
   Gartner dimension: End-user experience monitoring: the capture of data about how end-to-end application availability, latency, execution correctness and quality appeared to the end user.

2. ELLIOT objective: ???
   Gartner dimension: Runtime application architecture discovery, modeling and display: the discovery of the various software and hardware components involved in application execution, and the array of possible paths across which those components could communicate that, together, enable that involvement.

3. ELLIOT objective: ???
   Gartner dimension: User-defined transaction profiling: the tracing of events as they occur among the components or objects as they move across the paths discovered in the second dimension, generated in response to a user's attempt to cause the application to execute what the user regards as a logical unit of work.

4. ELLIOT objective: ???
   Gartner dimension: Component deep-dive monitoring in an application context: the fine-grained monitoring of resources consumed by and events occurring within the components discovered in the second dimension.

5. ELLIOT objective: ???
   Gartner dimension: Analytics: the marshalling of a variety of techniques (including behavior learning engines, complex-event processing (CEP) platforms, log analysis and multidimensional database analysis) to discover meaningful and actionable patterns in the typically large datasets generated by the first four dimensions of APM.

Conclusions

Thank You
Experiential Living Lab for the Internet Of Things
Francesco Molinari

References
Gartner (2012), The Magic Quadrant for Application Performance Monitoring.
Seppo Leminen and Minna Fred (2009), in Leminen S. (Ed.), State of the Art of UDOI Usage within Companies' Business Processes, Laurea University, Espoo.
Francesco Molinari (2012), User Experience Analysis in Service Co-Creation: A Living Lab Approach. In: Proceedings of the SERVDES12 Conference.
Peter Morville (2004), User Experience Design.
Geroli Peedu (2011), Enhancing Public Service User Experience in Information Society, Master's Thesis, University of Tallinn.
Robert Rubinoff (2004), How to Quantify the User Experience.
Roberto Santoro and Marco Conte (2009), Living Labs in Open Innovation Functional Regions. In: Proceedings of the ICE09 Conference.
Alon Ben Shohan, A List of End User Experience Monitoring Tools, user-monitoring.com/category/blog/.
The ELLIOT project.