TeraGrid Gateway User Concept – Supporting Users V. E. Lynch, M. L. Chen, J. W. Cobb, J. A. Kohl, S. D. Miller, S. S. Vazhkudai Oak Ridge National Laboratory.

Presentation transcript:

TeraGrid Gateway User Concept – Supporting Users V. E. Lynch, M. L. Chen, J. W. Cobb, J. A. Kohl, S. D. Miller, S. S. Vazhkudai Oak Ridge National Laboratory In collaboration with many teams: NSTG, SNS Scientific Computing, McStas group, Open Science Grid, Tech-X Corp, and the TeraGrid Partners teams.

What is a Science Gateway?
A Science Gateway:
- Enables a scientific community of users with a common scientific goal
- Uses high-performance computing
- Has a common interface
- Leverages community investment
Three common forms:
- Web-based portals
- Application programs running on users' machines but accessing services on the TeraGrid
- Coordinated access points enabling users to move seamlessly between the TeraGrid and other grids

How can a Gateway help?
Make science more productive:
- Researchers use the same tools
- Complex workflows
- Common data formats
- Data sharing
Bring TeraGrid capabilities to the broad science community:
- Large amounts of disk space
- Large compute resources
- Powerful analysis capabilities
- A convenient interface to information

What is the TeraGrid?
- Resource Providers (RPs): SDSC, TACC, UC/ANL, NCSA, ORNL, PU, IU, PSC, NCAR, Caltech, USC/ISI, UNC/RENCI, UW, LONI, NICS
- Grid Infrastructure Group (GIG) at UChicago; software integration partners
- 20 computers at 11 facilities
- 10 Gbps network
- Over a petaflop of computing power; 136,470 CPU cores
- 60 petabytes of long-term storage
- Growing

Neutron Science TeraGrid Gateway
- Focus is neutron science
- Connects facilities with cyberinfrastructure
- Bridges cyberinfrastructures
- Combines TeraGrid computational resources with neutron datasets
- Data movement across the TeraGrid
- Outreach to the neutron science community

Community Certificate and Account
- Gateways with community accounts scale to thousands of facility users
- "Jimmy Neutron" community accounts exist on 14 TeraGrid computers
- The Jimmy Neutron community certificate is used from the SNS community account
- End-user identification is recorded for auditing and for returning results (see the sketch below)
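The auditing step can be illustrated with a small sketch. This is not the NSTG implementation; submit_fn, the log path, and the record fields are hypothetical placeholders showing how a gateway running under a shared community account might record which facility user actually requested each job.

```python
# Minimal sketch (hypothetical helpers): submit work under a shared community
# account while logging the real facility user for auditing and result return.
import json, time, uuid

AUDIT_LOG = "gateway_audit.jsonl"   # placeholder log location

def submit_as_community(end_user, facility, job_spec, submit_fn):
    """Submit job_spec via the community credential and log who asked for it."""
    job_id = submit_fn(job_spec)     # e.g. a portal/Globus submission call
    record = {
        "audit_id": str(uuid.uuid4()),
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "end_user": end_user,        # facility login, not a TeraGrid account
        "facility": facility,
        "community_account": "jimmy_neutron",
        "job_id": job_id,
    }
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(record) + "\n")
    return job_id
```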

TeraGrid Before Gateways
Large facilities:
- Recorded histogram data from experiments
Users:
- Took their data home on a floppy disk in their pocket
- Saved a permanent copy on a hard disk
- Did not have event data with which to rebin the histogram
- Translated data into the format needed for analysis
- Wrote their own code to read and analyze data
- Installed discipline-focused analysis software on their PC
- Installed plotting programs and libraries to plot the analysis output

TeraGrid After Gateways
Large facilities:
- Record data from multiple facilities: SNS, HFIR, LENS, IPNS, LUJAN, ...
- Save a permanent copy of the raw data
- Bin event data into histograms
- Translate data into the standard NeXus format
- Make analysis and simulation programs available from the portal
- Use remote TeraGrid cyberinfrastructure for computations
- Provide visualization capability in the portal
Users:
- Use the web portal for all data, analysis, and visualization
(A simplified sketch of the binning and NeXus translation steps follows.)
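A simplified sketch, assuming only a minimal NeXus-style HDF5 layout (the real SNS translation service writes far richer metadata): bin raw neutron event times of flight into a histogram and store it with h5py. The group names and bin width here are illustrative.

```python
# Simplified sketch of the facility-side steps: bin events into a
# time-of-flight histogram and save it in a NeXus-style HDF5 layout.
import numpy as np
import h5py

def events_to_nexus(event_tof_us, out_path, bin_width_us=10.0):
    edges = np.arange(0.0, event_tof_us.max() + bin_width_us, bin_width_us)
    counts, edges = np.histogram(event_tof_us, bins=edges)

    with h5py.File(out_path, "w") as f:
        entry = f.create_group("entry")
        entry.attrs["NX_class"] = "NXentry"
        data = entry.create_group("data")
        data.attrs["NX_class"] = "NXdata"
        data.create_dataset("time_of_flight", data=0.5 * (edges[:-1] + edges[1:]))
        data.create_dataset("counts", data=counts)

# Example (synthetic events): events_to_nexus(np.random.exponential(5000.0, 100_000), "run_0001.nxs")
```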

TeraGrid Gateway Savings
- Users do not duplicate effort
- Facilities do not duplicate effort
- Data is not lost
- Data is easily shared
- A natural way to host community-contributed, instrument-specific software and make it widely available to facility users
- Analysis is done quickly on high-performance computers

TeraGrid Portal
[Screenshot: submitting a job from the portal to the TeraGrid]

TeraGrid Simulation Service
- Neutron instrument simulation is available in the portal via McStas
- Simulations agree with experimental results
- Linear scaling to 1024 cores
- Output is NeXus
Use cases:
- Instrument design and construction
- Experiment planning and analysis
(A hedged sketch of launching such a simulation appears below.)
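A hedged sketch of what launching such a simulation might look like from a service script, assuming a locally compiled McStas instrument binary and mpirun on the path. The executable name, parameter names, and flags are placeholders, not the exact NSTG setup, which handled staging and scheduling through TeraGrid.

```python
# Hedged sketch: run a compiled McStas instrument under MPI from a gateway
# service. Paths, instrument name, and parameters are placeholders.
import subprocess

def run_mcstas(instrument_exe, params, ncount=1_000_000, cores=64, outdir="sim_out"):
    cmd = ["mpirun", "-np", str(cores), instrument_exe,
           "-n", str(ncount),            # number of neutron rays (assumed flag)
           "-d", outdir]                 # output directory (assumed flag)
    cmd += [f"{k}={v}" for k, v in params.items()]   # instrument parameters
    subprocess.run(cmd, check=True)
    return outdir

# Hypothetical usage: run_mcstas("./BASIS.out", {"lambda": 6.4, "dlambda": 0.5}, cores=1024)
```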

Fitting Service
- Fits theoretical models to the NeXus data files from the experiments
- Adaptive nonlinear least-squares algorithm implemented in parallel
- Linear speedup to 32 cores
- Service to run on the TeraGrid
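For illustration, a serial sketch of the fitting step using scipy.optimize.least_squares. The production service ran a parallel adaptive nonlinear least-squares solver; the Gaussian-plus-background model and the file layout assumed here are placeholders, not the service's actual models.

```python
# Serial sketch: fit a Gaussian-plus-background model to a histogram read
# from a NeXus-style file (layout as in the earlier sketch).
import h5py
import numpy as np
from scipy.optimize import least_squares

def gaussian_bg(p, x):
    amp, center, sigma, bg = p
    return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2) + bg

def fit_nexus(path, p0=(1000.0, 5000.0, 200.0, 10.0)):
    with h5py.File(path, "r") as f:
        x = f["entry/data/time_of_flight"][...]
        y = f["entry/data/counts"][...].astype(float)
    res = least_squares(lambda p: gaussian_bg(p, x) - y, p0)
    return res.x   # best-fit amplitude, center, sigma, background
```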

Reduction Service
- Reduction software is available for backscattering and reflectometry through the portal
- Calculations will be sent to a local cluster and to the TeraGrid
- We attempted to parallelize this calculation by distributing time-of-flight regions across processors: each processor reads only its region of the NeXus input data file, writes a new file containing only that region, and performs the data reduction on its file; the results are merged at the end of the calculation (see the sketch below)
[Figures: Backscattering Data Reduction; Reflectometry Data Reduction]
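The attempted parallelization can be sketched with mpi4py and h5py: each rank reads only its time-of-flight region of the input file, reduces it (a placeholder rebin stands in for the real reduction), and rank 0 merges the pieces. This mirrors the strategy described above but is not the production code.

```python
# Sketch of the region-based parallel reduction: one time-of-flight slice per rank.
import h5py
import numpy as np
from mpi4py import MPI

def reduce_region(counts, rebin=4):
    # Placeholder reduction: sum groups of `rebin` adjacent bins.
    n = (len(counts) // rebin) * rebin
    return counts[:n].reshape(-1, rebin).sum(axis=1)

def parallel_reduce(path):
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    with h5py.File(path, "r") as f:
        n = f["entry/data/counts"].shape[0]
        lo, hi = rank * n // size, (rank + 1) * n // size
        piece = reduce_region(f["entry/data/counts"][lo:hi])  # read only this region
    pieces = comm.gather(piece, root=0)
    return np.concatenate(pieces) if rank == 0 else None

# Run with e.g.:  mpirun -np 8 python reduce.py   (calling parallel_reduce("run_0001.nxs"))
```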

Job Information Service
- The portal Job Information Service reports where a job is running, when it started, and its status
- Daily tests submit five simultaneous remote portal jobs
- Success rate is > 82%
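A minimal sketch of such a daily reliability test; submit_job and poll_status are hypothetical stand-ins for whatever the portal and job information service actually expose.

```python
# Illustrative daily test: submit five simultaneous portal jobs, wait for
# completion, and report the success percentage.
import concurrent.futures as cf

def run_daily_test(submit_job, poll_status, n_jobs=5, timeout_s=3600):
    def one(i):
        job_id = submit_job(f"daily-test-{i}")
        return poll_status(job_id, timeout=timeout_s) == "DONE"
    with cf.ThreadPoolExecutor(max_workers=n_jobs) as pool:
        results = list(pool.map(one, range(n_jobs)))
    return 100.0 * sum(results) / n_jobs   # logged per site per day
```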

Tests of Remote Job Submission
It is difficult to diagnose a problem from the Globus output alone:
- Check the status of the computer
- Look at the output files
Some problems diagnosed:
- Updated software on a computer that required relinking executables
- Globus software setting the wrong time limit
- A batch prologue script that killed jobs running on the same core
- Long queue waits
- A newly installed firewall

Conclusions
- Gateways help facilities scale to a large number of users
- Gateways give facilities access to high-performance computing such as the TeraGrid
- Gateways enable a scientific community to use community software through a common interface
- Researchers are more productive when they use the same tools, use a common data format, and share data easily