TAU Performance DataBase Framework (PerfDBF)
Allen D. Malony
Department of Computer and Information Science
Computational Science Institute, University of Oregon

APART Workshop, EuroPar 2002: The TAU Performance DataBase Framework

Outline
- Motivation for performance databases
- TAU performance system
- TAU Performance DataBase Framework
  - Architecture
  - XML profile data representation
  - Example
- Performance engineering in software engineering
  - X-PARE (eXPeriment Alerting and Reporting)
- Concluding remarks

Why Performance Databases?
- Focus on empirical performance optimization process
- Necessary for multi-results performance analysis
  - Multiple experiments (codes, versions, platforms, …)
  - Historical performance comparison
- Integral component of performance analysis framework
  - Improved performance analysis architecture design
  - More flexible and open tool interfaces
  - Supports extensibility and foreign tool interaction
- Performance analysis collaboration
  - Performance tool sharing
  - Performance data sharing and knowledge base

Empirical-Based Performance Optimization
[process diagram linking performance characterization, diagnosis (hypotheses, properties), experimentation (experiment schemas, experiment trials), observation (observability requirements), and tuning]

TAU Performance System Framework
- Tuning and Analysis Utilities (aka Tools Are Us)
- Performance system framework for scalable parallel and distributed high-performance computing
- Targets a general complex system computation model
  - nodes / contexts / threads
  - Multi-level: system / software / parallelism
- Measurement and analysis abstraction
- Integrated toolkit for performance instrumentation, measurement, analysis, and visualization
- Portable performance profiling/tracing facility
- Open software approach

TAU Performance System Architecture
[architecture diagram, including the EPILOG and Paraver trace formats]

TAU Performance Database Framework
[diagram: performance analysis programs and a performance analysis and query toolkit access PerfDB, an object-relational database (PostgreSQL) populated from raw performance data and performance data descriptions via PerfDML translators]
- profile data only
- XML representation
- project / experiment / trial

PerfDBF Components
- Performance Data Meta Language (PerfDML)
  - Common performance data representation
  - Performance meta-data description
  - PerfDML translators to the common data representation
- Performance DataBase (PerfDB)
  - Standard database technology (SQL)
  - Free, robust database software (PostgreSQL)
  - Commonly available APIs
- Performance DataBase Toolkit (PerfDBT)
  - Commonly used modules for query and analysis
  - Facilitate analysis tool development

Common and Extensible Profile Data Format
- Goals
  - Capture data from profile tools in a common representation
  - Implement the representation in a standard format
  - Allow extension of the format for new profile data objects
- Base on XML (obvious choice)
  - Leverage XML tools and APIs (XML parsers, Sun's Java SDK, …)
  - XML verification systems (DTDs and schemas)
  - Target for profile data translation tools: eXtensible Stylesheet Language Transformations (XSLT)
- Which performance profile data are of interest?
  - Focus on TAU and consider other profiling tools

Performance Profiling
- Performance data about program entities and behaviors
  - Code regions: functions, loops, basic blocks
  - Actions or states
- Statistics data
  - Execution time, number of calls, number of FLOPS, …
- Characterization data
- Parallel profiles
  - Captured per process and/or per thread
  - Program-level summaries
- Profiling tools: prof/gprof, ssrun, uprofile/dpci, cprof/vprof, …

TAU Parallel Performance Profiles

PerfDBF Example
- NAS Parallel Benchmark LU
- % configure -mpiinc=/usr/include -mpilib=/usr/lib64 -arch=sgi64 -fortran=sgi -SGITIMERS -useropt=-O2
[diagram: NPB profiled with TAU -> standard TAU output data -> TAU-to-XML converter -> TAU XML format -> database loader -> SQL database -> analysis tool]

Scalability Analysis Process
- Scalability study on LU
  - % suite.def  # of procs -> 1, 2, 4, and 8
  - % mpirun -np 1 lu.W1
  - % mpirun -np 2 lu.W2
  - % mpirun -np 4 lu.W4
  - % mpirun -np 8 lu.W8
- populateDatabase.sh
  - run Java translator to translate profiles into XML
  - run Java XML reader to write XML profiles to database
- Read times for routines and program from experiments
- Calculate scalability metrics
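The final step, calculating scalability metrics, reduces to speedup S(p) = T(1) / T(p) over the per-trial times read from the database. A minimal sketch in Java (class and method names are illustrative, not part of the actual PerfDBT modules):

```java
// Sketch only: compute speedup S(p) = T(1) / T(p) for one routine from
// its inclusive times across trials. Names here are hypothetical.
import java.util.LinkedHashMap;
import java.util.Map;

public class Scalability {
    // times: processor count -> time of a routine for that trial
    public static Map<Integer, Double> speedups(Map<Integer, Double> times) {
        double t1 = times.get(1);  // one-processor baseline trial
        Map<Integer, Double> s = new LinkedHashMap<>();
        for (Map.Entry<Integer, Double> e : times.entrySet()) {
            s.put(e.getKey(), t1 / e.getValue());
        }
        return s;
    }

    public static void main(String[] args) {
        Map<Integer, Double> t = new LinkedHashMap<>();
        t.put(1, 80.0); t.put(2, 42.0); t.put(4, 22.0); t.put(8, 12.0);
        System.out.println(speedups(t));  // speedups keyed by processor count
    }
}
```

In the PerfDBF setting the input map would be filled by a database query over the stored trials rather than hard-coded values.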

Raw TAU Profile Data
- Raw data output (numeric fields were lost from this transcript)
  - One processor: "applu …" GROUP="applu"
  - Four processors: "applu …" GROUP="applu", one line per process
- Fields: name, calls, subs, exclusive time, inclusive time, profile calls, group name
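A translator's first job is tokenizing lines of this form. A sketch under the assumption that the fields after the quoted routine name appear in the order calls, subroutine calls, exclusive time, inclusive time, profile calls, then a GROUP attribute (the exact order may vary across TAU versions):

```java
// Sketch of a raw-profile line tokenizer. The field order is an
// assumption taken from the slide's field labels, not a specification.
public class ProfileLine {
    public final String name, group;
    public final long calls, subs;
    public final double exclusive, inclusive;

    private ProfileLine(String name, long calls, long subs,
                        double excl, double incl, String group) {
        this.name = name; this.calls = calls; this.subs = subs;
        this.exclusive = excl; this.inclusive = incl; this.group = group;
    }

    public static ProfileLine parse(String line) {
        int close = line.indexOf('"', 1);            // routine name is quoted
        String name = line.substring(1, close).trim();
        String[] f = line.substring(close + 1).trim().split("\\s+");
        String group = f[f.length - 1]               // GROUP="..." attribute
                .replaceFirst("^GROUP=\"", "").replaceFirst("\"$", "");
        return new ProfileLine(name, Long.parseLong(f[0]),
                Long.parseLong(f[1]), Double.parseDouble(f[2]),
                Double.parseDouble(f[3]), group);
    }

    public static void main(String[] args) {
        ProfileLine p = parse("\"applu\" 1 1 8.9E4 9.1E4 0 GROUP=\"applu\"");
        System.out.println(p.name + " " + p.inclusive);
    }
}
```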

XML Profile Representation
- One-processor XML fragment for the 'applu' routine (element content lost from this transcript)
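The surviving tokens suggest per-routine elements carrying metric values in scientific notation. A hypothetical reconstruction for illustration only; the element and attribute names are invented, and the actual TAU XML format is not recoverable from this transcript:

```xml
<!-- Illustrative sketch only; not the actual PerfDML schema. -->
<trial project="NPB" experiment="LU" processors="1">
  <instrumentedunit name="applu" group="applu">
    <calls>1</calls>
    <exclusive units="usec">8.9E4</exclusive>
    <inclusive units="usec">9.1E4</inclusive>
  </instrumentedunit>
</trial>
```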

XML Representation
- Four-processor mean XML fragment for the 'applu' routine (element content lost from this transcript)

Contents of Performance Database

Scalability Analysis Results
- Scalability of LU performance experiments
- Four trial runs

funname | processors | meanspeedup
--------+------------+------------
applu   | 2          | …
applu   | 4          | …
applu   | 8          | …
exact   | 2          | …
exact   | 4          | …
exact   | 8          | …

Current Status and Future
- PerfDBF prototype
  - TAU profile to XML translator
  - XML to PerfDB populator
  - PostgreSQL database
  - Java-based PostgreSQL query module
- Use as a layer to support performance analysis tools
  - Make accessing the performance database quicker
- Continue development
- XML parallel profile representation
  - Basic specification
  - Opportunity for APART to define a common format

Performance Tracking and Reporting
- Integrated performance measurement allows performance analysis throughout the development lifetime
- Apply performance engineering in the software design and development (software engineering) process
- Create a "performance portfolio" from regular performance experimentation (coupled with software testing)
- Use performance knowledge in making key software design decisions, prior to major development stages
- Use performance benchmarking and regression testing to identify irregularities
- Support automatic reporting of "performance bugs"
- Enable cross-platform (cross-generation) evaluation

XPARE - eXPeriment Alerting and REporting
- Experiment launcher automates measurement / analysis
  - Configuration and compilation of performance tools
  - Instrumentation control for the Uintah experiment type
  - Execution of multiple performance experiments
  - Performance data collection, analysis, and storage
  - Integrated in the Uintah software testing harness
- Reporting system conducts performance regression tests
  - Applies performance difference thresholds (alert ruleset)
  - Alerts users via e-mail if thresholds have been exceeded
  - Web alerting setup and full performance data reporting
  - Historical performance data analysis
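At its core, applying a performance difference threshold means comparing each routine's measured time against a stored baseline with a tolerance. A minimal sketch; the actual XPARE ruleset format is not shown in the slides, and all names here are hypothetical:

```java
// Sketch of a performance-regression threshold check: flag routines
// whose time grew beyond baseline * (1 + threshold). Illustrative only.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class RegressionCheck {
    /** Returns names of routines whose current time exceeds the bound. */
    public static List<String> alerts(Map<String, Double> baseline,
                                      Map<String, Double> current,
                                      double threshold) {
        List<String> out = new ArrayList<>();
        for (Map.Entry<String, Double> e : current.entrySet()) {
            Double base = baseline.get(e.getKey());
            if (base != null && e.getValue() > base * (1.0 + threshold)) {
                out.add(e.getKey());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // 30% slowdown against a 20% threshold triggers an alert
        System.out.println(alerts(Map.of("applu", 100.0),
                                  Map.of("applu", 130.0), 0.2));
    }
}
```

A real ruleset would likely allow per-routine thresholds and feed the resulting alert list to the mail and web reporting components.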

XPARE System Architecture
[diagram with components: Experiment Launch, Performance Database, Comparison Tool, Regression Analyzer, Alerting Setup, Performance Reporter, Mail server, Web server]

Experiment Results Viewing Selection

Web-Based Experiment Reporting

Web-Based Experiment Reporting (continued)

Alerting Setup

Other Performance Database Projects
- HPM Toolkit (DeRose; IBM)
- PPerfDB (Karavanic; Portland State University)
- HPCView (Mellor-Crummey, Fowler; Rice)
- SCALEA (Fahringer, Truong; University of Vienna)
- EXPERT (Mohr, Wolf; Research Center Juelich)

Acknowledgements
- University of Oregon: Li Li, Robert Bell, Sameer Shende
- University of Utah: Alan Morris, Steve Parker, Dav St. Germain
- Department of Energy (DOE), ASCI Academic Strategic Alliances Program (ASAP)
  - Computational Science Institute, ASCI/ASAP Level 3 projects with LLNL / LANL, University of Oregon
  - Center for the Simulation of Accidental Fires and Explosions (C-SAFE), ASCI/ASAP Level 1 center, University of Utah