Global Performance Analysis


1 Global Performance Analysis
Bernd F. Lober, SAP AG

2 Agenda
Introducing Global Performance Analysis
Understanding Global Performance Analysis
Conducting a Manual Analysis
Possibilities for Automatic Analysis
Summary

3 Introducing the Global Performance Analysis (1)
Motivation
As system landscapes become increasingly complex, it is necessary to monitor performance across system boundaries
Deliver performance results for all (possibly unknown) application/system components covered by a specific business process
Automatic performance analysis, for example for pre-defined scenarios and for regression tests
Global Performance Analysis (GPA)
Transaction code ST30
Available since release 4.6C of SAP Basis and release 6.10 of SAP Web Application Server

4 Introducing the Global Performance Analysis (2)
Provides an overview of complex system landscapes
For performance analysis across different system components
To analyze possible performance bottlenecks that may occur in a system landscape
To collect performance figures across system borders in one centrally located system; trace data and statistical records are saved to this central system
Collects performance figures
Trace data (compare transaction ST05)
Statistical records (compare transactions STAD, ST03, ST03N, ST03G, STATTRACE)
Only those figures that belong to the same logical entity are selected
Saves log data to the database
Automatic performance analysis uses remotely controlled test cases (CATTs) and test configurations (eCATT)

When you use the GPA, you work on a centrally located system; you do not need to log on to the systems you want to monitor. Only those figures that belong to the same logical entity are selected, that is, only data concerning single, possibly complex, business processes are considered. The log data on the database is ready to be compared and evaluated statistically at a later stage.
Test cases are configured in the order, and with the destinations, in which they have to run. They only need to exist on a central system because they are executed remotely in the destination systems; the same applies to test configurations (eCATT tool). The performance figures are collected and stored automatically on the central system's database.
The Runtime Analysis (SE30) is not included in the GPA for several reasons: the focus of ST30 is to provide a "global" and initial overview of the performance of a given scenario, so that you can get an impression of the process in its entirety. A detailed analysis using SE30 would be a second step, and with the current technology, including SE30 would dilute the results of ST30.
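The grouping idea on this slide (performance figures from several components, stored centrally and selected per logical entity via a log ID) can be sketched abstractly. The following Python model is purely illustrative; PerformanceRecord, CentralStore, and all field names are invented for this sketch and are not SAP APIs:

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceRecord:
    destination: str       # remote system component, e.g. "CRM" or "R3"
    kind: str              # "trace" (cf. ST05) or "stat" (cf. STAD)
    response_time_ms: float

@dataclass
class CentralStore:
    # log ID -> performance records of one logical business process
    runs: dict = field(default_factory=dict)

    def save(self, log_id: str, record: PerformanceRecord) -> None:
        self.runs.setdefault(log_id, []).append(record)

    def records_for(self, log_id: str) -> list:
        # only figures that belong to the same logical entity are selected
        return self.runs.get(log_id, [])

store = CentralStore()
store.save("ORDER_001", PerformanceRecord("CRM", "trace", 120.0))
store.save("ORDER_001", PerformanceRecord("R3", "stat", 80.0))
store.save("OTHER_RUN", PerformanceRecord("BW", "stat", 40.0))
print(len(store.records_for("ORDER_001")))  # → 2
```

The point of the sketch is only the keying discipline: figures from different systems land in one store, but evaluation always selects by log ID, so unrelated runs never mix.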

5 Agenda
Introducing Global Performance Analysis
Understanding Global Performance Analysis
Conducting a Manual Analysis
Possibilities for Automatic Analysis
Summary

6 Functions of the Global Performance Analysis
Central monitoring and test tool across system components
To analyze performance in a complex SAP system landscape
To collect performance figures across system boundaries in a central system
Trace data (compare ST05) and statistical records (compare STAD)
Collects performance data in a database for later comparison and statistical evaluation
System performance figures that belong together are tied together logically, that is: GPA only considers data concerning the business processes that are to be examined

7 More Functions: Global Performance Analysis
Automatic performance analysis in an SAP system landscape (transaction ST30)
Automatic performance analysis for "single" transactions and business processes in an SAP system landscape
Using remotely controlled eCATT test configurations
eCATT is a tool for cross-component testing in a system landscape
The eCATT functionality is available from Web Application Server (Web AS) 6.20 on, but tests can be run against systems down to release 4.6C

8 eCATT – Overview
eCATT
Is a test execution and comparison tool for functional testing
Is suited for unit tests, integration tests, system tests, acceptance tests, and verification tests
Compares test results with expected outcomes and logs the comparison
Designed as a central test system
Integrated into the Test Organizer
Tests local and remote systems
Parameterization of commands and test cases
Scripting language for controlling test execution flow
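The execute-and-compare idea attributed to eCATT on this slide can be illustrated with a minimal sketch. All names here (run_test_case, fake_dispatcher, the step dictionary) are invented for illustration and are not the real eCATT interface:

```python
def run_test_case(step, params, execute):
    """Execute one parameterized test step via a caller-supplied dispatcher."""
    actual = execute(step["destination"], step["command"], params)
    passed = actual == step["expected"]
    # the tool both compares the outcome and logs the result
    return {"step": step["command"], "passed": passed, "actual": actual}

def fake_dispatcher(destination, command, params):
    # stand-in for remote execution (e.g. an RFC to the destination system)
    return f"{command}@{destination}:{params['order_id']}"

step = {
    "destination": "CRM",
    "command": "CREATE_ORDER",
    "expected": "CREATE_ORDER@CRM:4711",
}
log = run_test_case(step, {"order_id": "4711"}, fake_dispatcher)
print(log["passed"])  # → True
```

Passing the dispatcher as a parameter mirrors the slide's point that test cases live centrally while execution happens in local or remote destinations.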

9 Features of the Global Performance Analysis
Within GPA, performance figures are
Collected automatically in the central system's database
Presented as average values resulting from multiple runs of the scenarios to be evaluated
Presented using evaluation patterns to get an individual overview, that is: performance data are filtered as you like (as of Web AS 6.20 SP23)
Compared automatically between different test runs
Evaluated automatically against the performance checklist conditions, with the result "conditions met" or not (including a description why)
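The three mechanics listed here (averaging over multiple runs, comparing test runs, evaluating a checklist condition with a reason) can be sketched together. This is an illustrative stand-in, not GPA code; the 2-second dialog-time threshold is taken from the checklist criterion mentioned later in the deck, and the function name evaluate is invented:

```python
from statistics import mean

def evaluate(run_times_ms, threshold_ms=2000.0):
    """Average several runs and check one checklist condition."""
    avg = mean(run_times_ms)
    if avg <= threshold_ms:
        return avg, "conditions met", ""
    # when the condition fails, a description of why is included
    return avg, "conditions not met", (
        f"average dialog time {avg:.0f} ms exceeds {threshold_ms:.0f} ms"
    )

old_avg, _, _ = evaluate([900.0, 1100.0])          # earlier test run
new_avg, verdict, reason = evaluate([2400.0, 2600.0])  # current test run
print(f"regression: {new_avg - old_avg:+.0f} ms, {verdict}")
# → regression: +1500 ms, conditions not met
```

The comparison between runs is just a difference of the stored averages, which is why persisting the figures centrally (previous slides) is a precondition for this step.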

10 Positioning the Global Performance Analysis
[Slide chart: tools SCI, SE30, ST05, and ST30 positioned along two axes, hands-on testing vs. automatic testing and single systems vs. multiple systems]

This chart shows how GPA (ST30) compares to other tools. When only a few systems are involved, the automatic checks of the Code Inspector (SCI) can be very helpful. With the ABAP Runtime Analysis (SE30) you can get a good overview within a system component, which is particularly useful for single programs or transactions; you cannot use it for automatic testing. The Performance Analysis (ST05) is appropriate for a detailed performance analysis in a single system; you cannot use it for automatic testing either. The Global Performance Analysis (ST30) is intended for use in a more complex system landscape and delivers automatic test results, but can also be used for a hands-on analysis.
Important: The GPA is not appropriate for stress testing; benchmarking tools are available for that purpose. The GPA tests single business cases in a system landscape.

11 Understanding the Global Performance Analysis
The GPA is a tool to help you evaluate the SAP product standard "Performance"

[Slide table: performance checklist criteria across the database, application, and frontend layers (two communication steps per dialog step, average dialog time below 2 seconds, no identical selects, complete WHERE clauses, SAP buffer enabled, parallel processing, linear dependency, appropriate indexes), mapped against the tools SCI, ST05, ST30, ST03N, and SE30]

The product standard "Performance" requires that specific scenarios fulfill certain performance requirements. To verify this, several different tools are available:
Code Inspector (transaction SCI)
Performance Analysis (transaction ST05)
Global Performance Analysis (transaction ST30)
Statistical Records (transactions ST03N etc.)
Runtime Analysis (transaction SE30)
The relevant performance figures are the number of database calls, the number of rows read, memory consumption, CPU load, and the check whether correct indexes are available for all database calls. GPA delivers these data except for the index check, which is done by the Code Inspector (transaction SCI).

12 Example of a Business Process Across Different Components and Systems
mySAP CRM Internet Sales Scenario (CRM ISA)
A sales order is created using SAP GUI for HTML (component SAP ITS)
It is processed in SAP Internet Sales (component ISA)
A financial document is created in SAP R/3
The selling information is sent to SAP BW and SAP APO for further analysis

[Slide diagram: User Request flowing through SAP CRM Internet Sales, Internet Pricing & Configurator, Internet Transaction Server, SAP R/3, Text Retrieval & Information, SAP BW, and SAP APO]

This scenario shows that many different components, on different servers or on one server, can be involved in a business scenario. Global Performance Analysis can help you detect where performance is being consumed.

13 Example: System Landscape Process
[Slide diagram: User Request flowing through Internet Transaction Server, SAP CRM Internet Sales, Internet Pricing & Configurator, Text Retrieval & Information, SAP R/3, SAP BW, and SAP APO]

The following slides show an example of how a transaction initiated by a user could run across the components of a system landscape. The central monitoring system contains transaction ST30, the database, and possible test cases and/or test configurations. The arrows between the system components represent the communication between them (e.g. RFC calls). The dark blue system component symbol stands for activity in that component, leading to a reaction and an answer (shown by arrows in both directions).

14 Example: System Landscape Process
[Slide diagram: Central Monitoring System (ST30, database, test cases, test configurations) driving Internet Transaction Server, SAP CRM Internet Sales, Internet Pricing & Configurator, Text Retrieval & Information, SAP R/3, SAP BW, and SAP APO]

The central monitoring system component, with the test configurations and ST30 (GPA), simulates user interaction by running test configurations while monitoring the resource consumption.

15 Agenda
Introducing Global Performance Analysis
Understanding Global Performance Analysis
Conducting a Manual Analysis
Possibilities for Automatic Analysis
Summary

16 Initial Screen for Manual Analysis
Enter system components manually Switch traces on and off for remote system components Go to other system components for detailed analysis

17 Manual Analysis: Maintaining Test Data Descriptions
Maintain performance data description. Request data from system components. Save data on database.

18 Agenda
Introducing Global Performance Analysis
Understanding Global Performance Analysis
Conducting a Manual Analysis
Possibilities for Automatic Analysis
Summary

19 Automatic Analysis With Test Cases
Do an automatic performance test by running test cases that are assigned to a log ID

20 Automatic Analysis With eCATT Test Configurations
Enter a log ID and a text Run the specified test configuration Performance figures are automatically collected and stored on the database

21 Automatic Analysis With eCATT Test Configurations
After starting the eCATT test run you can set its parameters on the following screen Accept the proposed values

22 Global Performance Analysis: Comparison
Result screen of a “Test and compare” analysis

23 Using Global Performance Analysis
Preconditions
A test scenario in the form of test configurations defined within eCATT; important for regression testing, as it makes life much easier
Profiles: S_TOOLS_EX, S_ADMI_FCD, and a profile to run ST05
Authorization to run the function groups ECATT_EXECUTE, SSQ0, and SAPWL_STAT, and to perform RFCs

24 Planned Functions
Trace follow-up across system borders over several steps is done automatically; since the data is retrieved automatically, you do not need to know which systems are actually involved
Integration of non-Web AS components (ITS, IPC, ...)
Integration of Code Inspector functionality to examine trace data
Example: according to the performance standard there should be appropriate indexes
Note: not only static SQL, which is what the Code Inspector analyzes, but also dynamic SQL can be analyzed, which the Code Inspector cannot do

"Automatic trace follow-up" means the following: a transaction on system component A calls a function on system component B via RFC, and that function in turn calls some functionality on system component C. The monitoring system does not always know in advance, or by looking at the starting system component, whether system component C is involved at all. Therefore, there must be integrated functionality to find this out automatically, by examining and interpreting the data exchanged between the system components (e.g. RFC traces).
The performance of non-SAP R/3 components such as the ITS, the IPC, or all kinds of Web Application Server (Web AS) components also has to be monitored, for example to find out why a business transaction takes longer than expected.
The Code Inspector functionality already mentioned, for example checking code for suitable indexes, will be added to GPA in the future because the Code Inspector cannot analyze dynamic code (dynamic SQL statements). These statements are contained in the performance trace data of ST30, and the Code Inspector checks will analyze this trace data as well.
After running a test configuration there is no need to examine the performance data manually: the GPA will recognize automatically whether the standard has been met and will indicate this with a simple O.K.
Thus, the GPA has to compare the current run to an older run, or to the previous run, to determine whether the objectives are fulfilled.
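The "automatic trace follow-up" described above is essentially a transitive discovery over the call relationships found in trace data: start from the known component A and follow every outgoing RFC destination the traces reveal. A minimal sketch, assuming the traces have already been reduced to a caller-to-callees mapping (the function name and input format are invented for illustration):

```python
def follow_up(start, rfc_calls_seen_in_trace):
    """Breadth-first discovery of every component a request touched."""
    involved, queue = {start}, [start]
    while queue:
        component = queue.pop(0)
        # each component's trace data reveals its outgoing RFC destinations
        for target in rfc_calls_seen_in_trace.get(component, []):
            if target not in involved:
                involved.add(target)
                queue.append(target)
    return involved

# A's trace shows an RFC to B; B's trace shows an RFC to C.
# C is discovered even though only A was known in advance.
trace = {"A": ["B"], "B": ["C"]}
print(sorted(follow_up("A", trace)))  # → ['A', 'B', 'C']
```

The returned set is exactly the list of systems from which trace data and statistical records would then need to be collected.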

25 Agenda
Introducing Global Performance Analysis
Understanding Global Performance Analysis
Conducting a Manual Analysis
Possibilities for Automatic Analysis
Summary

26 Global Performance Analysis - Summary
The performance-relevant data are
Collected automatically in the central system's database
Presented as average values resulting from multiple runs of the scenarios to be evaluated
Compared automatically between different performance tests
Evaluated automatically against possible performance checklist conditions, with the result "conditions met" or not, including a detailed description why

27 Questions and Answers

28 Copyright 2003 No part of this presentation may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice. Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors. Microsoft®, WINDOWS®, NT®, EXCEL®, Word® and SQL Server® are registered trademarks of Microsoft Corporation. IBM®, DB2®, OS/2®, DB2/6000®, Parallel Sysplex®, MVS/ESA®, RS/6000®, AIX®, S/390®, AS/400®, OS/390®, and OS/400® are registered trademarks of IBM Corporation. ORACLE® is a registered trademark of ORACLE Corporation, California, USA. INFORMIX®-OnLine for SAP is a registered trademark of Informix Software Incorporated. UNIX®, X/Open®, OSF/1®, and Motif® are registered trademarks of The Open Group. HTML, DHTML, XML, XHTML are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Laboratory for Computer Science NE43-358, Massachusetts Institute of Technology, 545 Technology Square, Cambridge, MA JAVA® is a registered trademark of Sun Microsystems, Inc. , 901 San Antonio Road, Palo Alto, CA USA. JAVASCRIPT® is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape. SAP, SAP Logo, mySAP.com, mySAP.com Marketplace, mySAP.com Workplace, mySAP.com Business Scenarios, mySAP.com Application Hosting, WebFlow, R/2, R/3, RIVA, ABAP, SAP Business Workflow, SAP EarlyWatch, SAP ArchiveLink, BAPI, SAPPHIRE, Management Cockpit, SEM, are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other products mentioned are trademarks or registered trademarks of their respective companies.

