SvPablo Evaluation Report
Hans Sherburne, Adam Leko
UPC Group
HCS Research Laboratory
University of Florida

Color encoding key: Blue: Information, Red: Negative note, Green: Positive note

2 Basic Information
Name: SvPablo
Developer: University of Illinois
Current versions:
- SvPablo 6.0
- SDDF component 5.5
- Trace Library component
Website:
Contact:
- Email addresses on the website/documentation are no good

3 Introduction
SvPablo
- Part of the Pablo Project at UIUC
  - Last website update: June 2004
  - Website has since moved to a new host
  - Project appears to be inactive at this point
- GUI for source-code correlation of performance data
- Instrumentation of source code
  - Automatic
  - Interactive
- Display of performance data
  - Color-coded indicators beside procedure listings and source code
  - Popup dialogs display more detailed information
  - Scalability graph

4 Instrumentation/Visualization Process in SvPablo
Figure 2: Instrumentation/visualization process in SvPablo (c/o [1])

5 Performance Data in SvPablo
What can be instrumented?
- Function calls
- Outer loops
What type of performance data is available?
- Procedure statistics: statistics describing all instrumented calls to a procedure
- Call statistics: statistics describing a specific procedure callsite
- Loop statistics: statistics describing an outer loop
Statistics provided
- Exclusive and inclusive duration
- Count
- Max, min, mean, std. dev.
- HW counters (if available)
Scalability
- Run the program with the same instrumentation configuration using a varying number of nodes
- SvPablo will calculate and graph parallel efficiency for each instrumented procedure and loop (a minimal sketch of the calculation follows below)
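The slides do not spell out how SvPablo computes parallel efficiency, so the following is a minimal C sketch assuming the textbook definition E(n) = t(1) / (n * t(n)); the function name and the timings are illustrative only, not taken from the tool.

    #include <stdio.h>

    /* Textbook parallel efficiency: E(n) = t(1) / (n * t(n)).
       An assumption about the metric, not SvPablo's own code. */
    static double parallel_efficiency(double t1, double tn, int n)
    {
        return t1 / ((double)n * tn);
    }

    int main(void)
    {
        /* Hypothetical durations for one instrumented routine:
           8.0 s on 1 node, 2.5 s on 4 nodes -> E(4) = 0.80 */
        printf("E(4) = %.2f\n", parallel_efficiency(8.0, 2.5, 4));
        return 0;
    }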

6 Main Window in SvPablo
Figure 1: Main window in SvPablo

7 Self-Defining Data Format
Self-Defining Data Format (SDDF) is the format used to store performance data in SvPablo
- Provides a flexible structure
  - Allows multi-language analysis support
  - Makes it possible to add new metrics easily
- Performance file entries
  - Configuration records: specify GUI display information for event and procedure statistics
  - Event statistics records: in SvPablo, events are routine calls, loops, or hardware counter values; these records define and give values for routine callsites and loops
  - Procedure statistics records: define and give values for procedures
  - Event locality records: link events and procedures
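For a rough feel of the format only: an ASCII SDDF file pairs record descriptors with the data records that reference them, along the lines of the sketch below. The exact grammar is defined by the SDDF documentation, and the record and field names here are hypothetical.

    #42:
    "procedure statistics" {
        int    "Event Identifier";
        double "Exclusive Seconds";
        double "Inclusive Seconds";
        int    "Count";
    };;

    "procedure statistics" { 7, 1.25, 3.40, 128 };;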

8 Data Capture Library
Offers a set of functions that can be used to instrument code (a hypothetical usage sketch follows below)
- Start/End Data Capture
- Function Entry/Exit
- Loop Start/End
- OpenMP applications
  - Threaded Function Entry/Exit
  - Threaded Loop Entry/Exit
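The slides list the event types but not the library's actual entry points, so the sketch below uses placeholder dc_* names and IDs; the real function names are given in the SvPablo User's Guide [1].

    /* Sketch only: the dc_* functions and numeric IDs are hypothetical
       stand-ins for the Data Capture Library's real entry points. */
    extern void dc_function_entry(int id);
    extern void dc_function_exit(int id);
    extern void dc_loop_start(int id);
    extern void dc_loop_end(int id);

    #define ID_COMPUTE   1
    #define ID_MAIN_LOOP 2

    void compute(double *a, int n)
    {
        int i;
        dc_function_entry(ID_COMPUTE);   /* function-entry event */
        dc_loop_start(ID_MAIN_LOOP);     /* outer-loop start event */
        for (i = 0; i < n; i++)
            a[i] *= 2.0;                 /* the work being measured */
        dc_loop_end(ID_MAIN_LOOP);       /* outer-loop end event */
        dc_function_exit(ID_COMPUTE);    /* function-exit event */
    }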

9 Errors Abound in SvPablo
Parser (see the snippet after this list)
- Does not understand C++-style comments
- Does not accept variables declared anywhere but the top of a function
- Once a source file that cannot be parsed by SvPablo is loaded, the application must be restarted to view correlated data for any source code!
GUI
- Some lines in CAMEL appear instrumentable, but are not; instrumenting these lines causes runtime errors
- Routines correctly correlated in the Main window are not correctly correlated in the "Procedure Statistics" window
Compiling instrumented code
- Problems with "void" return types in some code
- Some instrumented source code output by SvPablo could not be compiled
Executing instrumented code
- Errors are generated for functions ending with "exit(0)"
Scalability graph
- Have to change screen resolution to see the entire graph
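To make the two parser limitations concrete, here is a hypothetical pair of functions: the first uses the constructs the slides say SvPablo rejects, the second the C89-style form it expects.

    /* Rejected (per the slides): C++-style comment and a declaration
       appearing after the first statement. */
    void bad(void)
    {
        // C++-style comment: not understood by the parser
        int x = 0;
        x++;
        int y = x;  /* declaration after a statement */
        (void)y;
    }

    /* Accepted form: C-style comments, all declarations at the top. */
    void good(void)
    {
        /* C-style comment */
        int x = 0;
        int y;
        x++;
        y = x;
        (void)y;
    }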

10 SvPablo – Overhead
All programs executed correctly when instrumented
Benchmarks marked with a star had high variability in execution time
- Readings with stars are probably not accurate
Instrumenting a large number of loops creates high overhead!

11 Evaluation (1)
Available metrics: 2/5
- Can use PAPI and MIPS R10000 hardware counters (not evaluated)
- Other statistics based on loop and function-call count and duration are provided
- No statistics regarding communication are provided
Cost: 5/5
- SvPablo is freely available
Documentation quality: 3/5
- Documentation covers all available features; however, figure placement and arduous terminology impede quick understanding
- Documentation does describe how one might extend the tool
Extendibility: 3/5
- SvPablo source code is freely available
- May be more of a hassle than a help due to the large number of bugs
- Project appears to be inactive, so we would likely be on our own for support
Filtering and aggregation: 3/5
- Only hardware counter values and statistical data are recorded

12 Evaluation (2)
Hardware support: 4/5
- Sun Solaris, SGI IRIX, IBM SP, Compaq Alpha, NEC SX6, Linux workstations
Heterogeneity support: 0/5 (not supported)
Installation: 3.5/5
- Installation on the Linux platform required a number of manual configurations, but was not bad overall
Interoperability: 3/5
- SvPablo uses the SDDF file format
- The source code for the SDDF component is freely available and documented
- An SDDF-to-XML conversion component is also freely available
Learning curve: 3.5/5
- The interface is fairly intuitive, but takes some use to get comfortable with
- The terminology and project file hierarchy are a bit cumbersome
Manual overhead: 3/5
- It is fairly straightforward to instrument all loops and routines
- It is necessary to manually select only important loops in order to keep overhead low
Measurement accuracy: 3/5
- Tracing all loops increased overhead substantially in CAMEL

13 Evaluation (3)
Multiple analyses: 1/5
- The scalability analysis view is the only means of analysis provided
Multiple executions: 3.5/5
- SvPablo includes a view to select performance data from multiple runs and graph parallel efficiency for each instrumented routine
Multiple views: 2/5
- A limited number of views are available
- Only profile data (not trace data) is viewable
Performance bottleneck identification: 2.5/5
- Scalability view shows methods with poor parallel efficiency
- Routines can be sorted based on max exclusive duration
Profiling/tracing support: 1.5/5
- Only profiling is supported
- Profiling is done on routine and loop execution metrics
- Communication profiling is not available

14 Evaluation (4)
Response time: 2/5
- Data is not available in SvPablo until after execution completes and performance data is processed
Software support: 3/5
- The MPI profiling library allows linking against different MPI implementations
- C and Fortran are fully supported; PGI HPF is partially supported (no selective instrumentation)
Source code correlation: 4/5
- Source code correlation of profile data is the main view offered
System stability: 2.5/5
- SvPablo is very finicky about C syntax (once source code with syntax it cannot understand is loaded, the program must be restarted!)
- On occasion SvPablo segfaults for unknown reasons
Technical support: 0/5
- Email addresses listed in the documentation and on the webpage are bad

15 Bottleneck Identification: Performance Tool Test Suite: CAMEL, LU
Testing metric: what did the profile data tell us?
CAMEL: FAIL
- Not possible to profile a section of code that is not a loop or function call
- Not possible to represent actual dynamic behavior (no trace)
- Required a lot of effort to clean up syntax; can't have:
  - C++ "//"-style comments
  - Variable initialization after the beginning of a function
NAS LU: NOT TESTED
- Unable to successfully instrument code
  - Segmentation fault when opening init_comm.f
  - Instrumenting lu.f alone causes execution errors

16 Bottleneck Identification: Performance Tool Test Suite: PPerfMark
Big message: PASSED
- Profile showed a large amount of time spent in Send and Receive
Diffuse procedure: PASSED
- Profile showed a large amount of time spent in the bottleneck procedure, even though the time is diffused across processes
Hot procedure: PASSED
- Profile showed a large amount of time spent in the bottleneck procedure
Intensive server: TOSS-UP
- Profile showed a large amount of time spent in Receive and in waste_time()
- It would take a lot of reasoning to figure out that the two are related
Ping pong: TOSS-UP
- Profile indicates lots of time spent in Receive and in Send
- Does not show the communication pattern between the two processes
Random barrier: TOSS-UP
- Profile shows lots of time spent in Barrier and in waste_time()
- Profile does not show the communication pattern among processes
Small messages: TOSS-UP
- Profile shows lots of time spent in Send and Receive
- Very high standard deviation and difference between max/min
- Profile does not show the communication pattern among processes
System time: TOSS-UP
- Profile shows lots of time spent in kill() and getpid()
- No distinction is made between user and system calls
Wrong way: TOSS-UP
- Profile shows lots of time spent in Send and Receive
- Profile does not show the communication pattern among processes

17 Conclusions
The use of a GUI for interactive code instrumentation simplifies the process for the user
The source code is available and fairly well documented
Extension of this tool is a possibility
- Risky, due to lack of support
- Lots of errors experienced!

18 References
1. SvPablo User's Guide. ftp://