TeraGrid Science Gateways Nancy Wilkins-Diehr TeraGrid 10, August 2-5, 2010 South Tenth Street Bridge, Pittsburgh.

TeraGrid resources today include:
Tightly coupled distributed-memory systems (2 systems in the top 10 at top500.org)
–Kraken (NICS): Cray XT5, 99,072 cores, 1.03 Pflop
–Ranger (TACC): Sun Constellation, 62,976 cores, 579 Tflop, 123 TB RAM
Shared-memory systems
–Cobalt (NCSA): Altix, 8 Tflop, 3 TB shared memory
–Pople (PSC): Altix, 5 Tflop, 1.5 TB shared memory
Clusters with InfiniBand
–Abe (NCSA): 90 Tflop
–Lonestar (TACC): 61 Tflop
–QueenBee (LONI): 51 Tflop
Condor pool (loosely coupled)
–Purdue: up to 22,000 CPUs
Gateway hosting
–Quarry (IU): virtual machine support
Visualization resources
–TeraDRE (Purdue): 48 nodes with NVIDIA GPUs
–Spur (TACC): 32 NVIDIA GPUs
Storage resources
–GPFS-WAN (SDSC)
–Lustre-WAN (IU)
–Various archival resources

But change is constant. New systems:
Data analysis and visualization systems
–Longhorn (TACC): Dell/NVIDIA, CPU and GPU
–Nautilus (NICS): SGI UltraViolet, 1,024 cores, 4 TB global shared memory
Data-intensive computing
–Dash (SDSC): Intel Nehalem, 544 processors, 4 TB flash memory
FutureGrid
–Experimental computing grid and cloud test-bed to tackle research challenges in computer science
Keeneland
–Experimental high-performance computing system with NVIDIA Tesla accelerators

Source: Dan Katz, U Chicago

So how do gateways fit into this? Gateways are a natural result of the impact of the internet on worldwide communication and information retrieval. The implications for the conduct of science are still evolving:
–1980s: early gateways; the National Center for Biotechnology Information BLAST server, with search results sent by e-mail, is still a working portal today
–1989: World Wide Web developed at CERN
–1992: Mosaic web browser developed
–1995: “International Protein Data Bank Enhanced by Computer Browser”
–2004: TeraGrid project director Rick Stevens recognized the growth in scientific portal development and proposed the Science Gateway Program
–Today: Web 3.0 and programmatic exchange of data between web pages

Simultaneous explosion of digital information:
–Growing analysis needs in many, many scientific areas
–Sensors, telescopes, satellites, digital images, video, genome sequencers
–The #1 machine on today’s Top500 list is over 1,000x more powerful than all entries on the first list (1993) combined

Only 18 years since the release of Mosaic!

vt100 in the 1980s and a login window on Ranger today

Why are gateways worth the effort? An increasing range of expertise is needed to tackle the most challenging scientific problems. How many details do you want each individual scientist to need to know?
–PBS, RSL, Condor
–Coupling multi-scale codes
–Assembling data from multiple sources
–Collaboration frameworks

The same MCell job expressed three ways. As a PBS batch script:

    #!/bin/sh
    #PBS -q dque
    #PBS -l nodes=1:ppn=2
    #PBS -l walltime=00:02:00
    #PBS -o pbs.out
    #PBS -e pbs.err
    #PBS -V
    cd /users/wilkinsn/tutorial/exercise_3
    ../bin/mcell nmj_recon.main.mdl

As Globus RSL:

    +( &(resourceManagerContact="tg-login1.sdsc.teragrid.org/jobmanager-pbs")
       (executable="/users/birnbaum/tutorial/bin/mcell")
       (arguments=nmj_recon.main.mdl)
       (count=128)
       (hostCount=10)
       (maxtime=2)
       (directory="/users/birnbaum/tutorial/exercise_3")
       (stdout="/users/birnbaum/tutorial/exercise_3/globus.out")
       (stderr="/users/birnbaum/tutorial/exercise_3/globus.err")
    )

As a Condor-G submit file:

    # Full path to executable
    executable = /users/wilkinsn/tutorial/bin/mcell
    # Working directory, where Condor-G will write
    # its output and error files on the local machine.
    initialdir = /users/wilkinsn/tutorial/exercise_3
    # To set the working directory of the remote job, we
    # specify it in this Globus RSL, which will be appended
    # to the RSL that Condor-G generates.
    globusrsl = (directory='/users/wilkinsn/tutorial/exercise_3')
    # Arguments to pass to the executable.
    arguments = nmj_recon.main.mdl
    # Condor-G can stage the executable.
    transfer_executable = false
    # Specify the Globus resource to execute the job.
    globusscheduler = tg-login1.sdsc.teragrid.org/jobmanager-pbs
    # Condor has multiple universes, but Condor-G always uses globus.
    universe = globus
    # Files to receive stdout and stderr.
    output = condor.out
    error = condor.err
    # Specify the number of copies of the job to submit to the condor queue.
    queue 1
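A gateway’s job is to hide exactly this boilerplate behind a web form. A minimal sketch of the idea (all function and parameter names here are hypothetical, not code from any actual TeraGrid gateway): the portal collects a few user-facing fields and renders the batch script itself.

```python
# Toy sketch: turning a handful of web-form fields into the PBS
# boilerplate shown above, so the scientist never writes it by hand.
# All names are hypothetical illustrations.

def make_pbs_script(queue, nodes, ppn, walltime, workdir, command):
    """Render a PBS submit script from user-facing parameters."""
    lines = [
        "#!/bin/sh",
        f"#PBS -q {queue}",
        f"#PBS -l nodes={nodes}:ppn={ppn}",
        f"#PBS -l walltime={walltime}",
        "#PBS -o pbs.out",
        "#PBS -e pbs.err",
        "#PBS -V",          # export the user's environment to the job
        f"cd {workdir}",
        command,
    ]
    return "\n".join(lines)

script = make_pbs_script("dque", 1, 2, "00:02:00",
                         "/users/wilkinsn/tutorial/exercise_3",
                         "../bin/mcell nmj_recon.main.mdl")
print(script)
```

The same form data could just as easily be rendered as RSL or a Condor-G submit file, which is the point: the scientist picks parameters, and the gateway speaks whichever submission language the resource requires.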

Gateways democratize access to high-end resources. Almost anyone can investigate scientific questions using high-end resources:
–Not just those in the research groups of those who request allocations
–Gateways allow anyone with a web browser to explore
–Opportunities can be uncovered via Google: my then 11-year-old son discovered nanoHUB.org when his science class was studying buckyballs
–Foster new ideas and cross-disciplinary approaches; encourage students to experiment

But gateways are used in production too:
–A significant number of papers have resulted from gateways, including GridChem and nanoHUB
–Scientists can focus on challenging science problems rather than challenging infrastructure problems

Today, there are approximately 35 gateways using the TeraGrid.

Not just ease of use: what can scientists do that they couldn’t do previously?
–Linked Environments for Atmospheric Discovery (LEAD): radar data coupled with on-demand computing
–National Virtual Observatory (NVO): access to sky surveys
–Ocean Observing Initiative (OOI): access to sensor data
–PolarGrid: access to polar ice sheet data
–SIDGrid: expensive datasets, analysis tools
–GridChem: coupling multi-scale codes

How would this have been done before gateways?

Three steps to connect a gateway to TeraGrid:
1. Request an allocation: only a one-paragraph abstract is required for up to 200k CPU-hours
2. Register your gateway: visibility on the public TeraGrid page
3. Request a community account: run jobs for others via your portal

Staff support is available!

Tremendous opportunities using the largest shared resources, but challenges too! What’s different when the resource doesn’t belong just to me?
–Resource discovery
–Accounting
–Security
–Proposal-based requests for resources (peer-reviewed access): code scaling and performance numbers, justification of resources, gateway citations

Tremendous benefits at the high end, but even more work for the developers. The potential impact on science is huge:
–A small number of developers can impact thousands of scientists
–But we need a way to train and fund those developers

When is a gateway appropriate?
–Researchers using defined sets of tools in different ways: same executables, different input (GridChem, CHARMM); creating multi-scale workflows
–Datasets with common data formats: National Virtual Observatory, Earth System Grid. Some groups have invested significant effort here: caBIG (extensive discussions to develop common terminology and formats), BIRN (extensive data-sharing agreements)
–Difficult-to-access data and advanced workflows: sensor/radar input (LEAD, GEON)

How to get started? Conduct a needs assessment:
–Should I build a gateway?
–Can I use an existing gateway?
–What problems am I trying to solve? Not all gateways need high-end computing

Decide on a software approach
–Recommended software at

A targeted effort by a few can benefit many: could a pool of developers design gateways for different domain areas? Yes! TeraGrid staff assistance is available.

Linked Environments for Atmospheric Discovery (LEAD): providing the tools needed to make accurate predictions of tornadoes and hurricanes
–Meteorological data
–Forecast models
–Analysis and visualization tools
–Data exploration and Grid workflow

Highlights: LEAD inspires students. Advanced capabilities regardless of location; a student gets excited about what he was able to do with LEAD:

“Dr. Sikora: Attached is a display of 2-m T and wind depicting the WRF’s interpretation of the coastal front on 14 February. It’s interesting that I found an example using IDV that parallels our discussion of mesoscale boundaries in class. It illustrates very nicely the transition to a coastal low and the strong baroclinic zone with a location very similar to Markowski’s depiction. I created this image in IDV after running a 5-km WRF run (initialized with NAM output) via the LEAD Portal. This simple 1-level plot is just a precursor of the many capabilities IDV will eventually offer to visualize high-res WRF output. Enjoy! Eric” (e-mail, March 2007)

Community Climate System Model (CCSM): makes a world-leading, fully coupled climate model easier to use and available to a wide audience
–Compose, configure, and submit CCSM simulations to the TeraGrid
–Used in Purdue’s POL 520/EAS 591, “Models in Climate Change Science and Policy”: semester-long projects run 100-year CCSM simulations and generate policy recommendations based on scientific, economic, and political models of climate change impacts

Analytical ultracentrifugation: an emerging computational tool for the study of proteins. The Center for Analytical Ultracentrifugation of Macromolecular Assemblies, UT Health Sciences:
–Major advances in the characterization of proteins and protein complexes as a result of new instrumentation and powerful software
–Monitoring the sedimentation of macromolecules in real time in the centrifugal field allows their hydrodynamic and thermodynamic characterization in solution
–Observations are electronically digitized and stored for further mathematical analysis

Source: “Modern analytical ultracentrifugation in protein science: A tutorial review”; Wikipedia

UltraScan provides a comprehensive data-analysis environment:
–Management of analytical ultracentrifugation data for single users or entire facilities
–Support for storage, editing, sharing, and analysis of data
–HPC facilities used for 2-D spectrum analysis and genetic algorithm analysis: TeraGrid (~2M CPU-hours used), Technical University of Munich, Jülich Supercomputing Centre
–Portable graphical user interface
–MySQL database backend for data management
–Over 30 active institutions
–TeraGrid advanced support: fault tolerance, workflows, use of multiple TG resources, community account implementation

Southern California Earthquake Center (SCEC): gateway used to produce realistic hazard maps. A Probabilistic Seismic Hazard Analysis (PSHA) map for California:
–Created from Earthquake Rupture Forecasts (ERFs): ~7,000 ruptures can have 415,000 variations
–Warm colors indicate regions with a high probability of experiencing strong ground motion in the next 50 years
–Ground motion calculated using full 3-D waveform modeling for improved accuracy, which results in significant CPU use
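The probabilistic combination behind such a map can be sketched in miniature. In standard PSHA, each rupture contributes its annual rate times the probability that it produces ground motion above a threshold at the site; the 50-year exceedance probability then follows from the Poisson assumption. The numbers below are made up for illustration; in CyberShake the conditional probabilities come from the full 3-D waveform simulations, not from a toy table.

```python
import math

# Simplified PSHA combination (illustrative numbers only).
# Each rupture: (annual occurrence rate, probability that ground
# motion at the site exceeds the threshold given that rupture).
ruptures = [(0.01, 0.30), (0.002, 0.80), (0.05, 0.05)]

# Total annual rate of exceedance at the site.
lam = sum(rate * p_exceed for rate, p_exceed in ruptures)

# Poisson probability of at least one exceedance in 50 years:
# P = 1 - exp(-lambda * T)
p_50yr = 1.0 - math.exp(-lam * 50.0)
print(f"{p_50yr:.3f}")  # -> 0.299
```

Doing this for every site on the map, with hundreds of thousands of rupture variations feeding each site’s sum, is what drives the CPU use the slide mentions.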

SCEC: why a gateway? Calculations need to be done for each of the hundreds of thousands of rupture variations. SCEC has developed the “CyberShake computational platform”: the hardware, software, and people that combine to produce a useful scientific result.
–For each site of interest: two large-scale MPI calculations and hundreds of thousands of independent post-processing jobs with significant data generation
–Jobs are aggregated to appear as a single job to the TeraGrid
–Workflow throughput optimizations and use of SCEC’s gateway “platform” reduced time to solution by a factor of three
–Computationally intensive tasks, plus the need for reduced time to solution, make TeraGrid a good fit

Source: S. Callaghan et al., “Reducing Time-to-Solution Using Distributed High-Throughput Mega-Workflows – Experiences from SCEC CyberShake”.
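The aggregation idea, bundling many small independent tasks so the scheduler sees one job, can be sketched as follows. This is illustrative only: the actual CyberShake platform uses workflow tools rather than this snippet, and every name here is hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy sketch of task aggregation: many independent post-processing
# tasks run inside a single allocation, farmed out to a worker pool,
# instead of each task being submitted to the scheduler separately.

def post_process(variation_id):
    """Stand-in for per-rupture-variation post-processing."""
    return variation_id * variation_id

def run_bundle(task_ids, workers=8):
    """Run many independent tasks within one job's allocation."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(post_process, task_ids))

results = run_bundle(range(1000))
print(len(results))  # -> 1000
```

The design point is queue overhead: submitting hundreds of thousands of scheduler jobs would swamp the batch system, while one aggregated job with internal task dispatch keeps the throughput inside the allocation.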

Social Informatics Data Grid: collaborative access to large, complex datasets. SIDGrid is unique among social science data archive projects:
–Streaming data that change over time: voice, video, images (e.g., fMRI), text, numerical (e.g., heart rate, eye movement)
–Investigate multiple datasets, collected at different time scales, simultaneously
–Large data requirements
–Sophisticated analysis tools

Viewing multimodal data like a symphony conductor: a “music-score” display and synchronized playback of video and audio files
–Pitch tracks, text, head nods, pauses, gesture references
–Central archive of multi-modal data, annotations, and analyses: distributed annotation efforts by multiple researchers working on a common data set, with a history of updates
–Computational tools: distributed acoustic analysis using Praat, statistical analysis using R, matrix computations using Matlab and Octave

Source: “Studying Discourse and Dialog with SIDGrid”, Levow, 2008

Geographic Information Systems and HPC: GISolve
–Data-intensive, large- and multi-scale spatial analysis and modeling are increasingly important for scientific discovery and decision-making in ecology, environmental sciences, geosciences, public health, and the social sciences
–Short demo

Future technical areas: web technologies change fast, so gateways must be able to adapt quickly
–Gateways and gadgets: gateway components incorporated into any social networking page; 75% of 18- to 24-year-olds have social networking websites
–iPhone apps?
–Web 3.0: beyond social networking and sharing content; standards and querying interfaces to programmatically share data across sites, such as the Resource Description Framework (RDF) and SPARQL
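The RDF/SPARQL idea, data as subject-predicate-object triples queried by pattern matching, can be illustrated with a toy in-memory triple store. The data below is hypothetical; a real deployment would use an RDF library and a SPARQL endpoint rather than this sketch.

```python
# Toy illustration of the triple model behind RDF, and the kind of
# pattern query SPARQL expresses. All data here is made up.

triples = [
    ("nanoHUB",  "type",         "ScienceGateway"),
    ("nanoHUB",  "usesResource", "TeraGrid"),
    ("GridChem", "type",         "ScienceGateway"),
    ("GridChem", "usesResource", "TeraGrid"),
]

def query(pattern):
    """Match a (s, p, o) pattern; None acts like a SPARQL variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "SELECT ?g WHERE { ?g type ScienceGateway }" in miniature:
gateways = [t[0] for t in query((None, "type", "ScienceGateway"))]
print(gateways)  # -> ['nanoHUB', 'GridChem']
```

Because the query interface is standardized, any site exposing its data this way can be queried programmatically by any other, which is the "share data across sites" point above.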

Gateways can further investments in other projects:
–Increase access to instruments and expensive data collections
–Increase capabilities to analyze data
–Improve workforce development: can prepare students to function in today’s cross-disciplinary world
–Increase outreach
–Increase public awareness: the public sees value in investments in large facilities; a 2006 Pew study indicates that half of all internet users have been to a site specializing in science, and those who seek out science information on the internet are more likely to believe that scientific pursuits have a positive impact on society

But sustained funding is a problem. Gateways can be used for the most challenging problems, but scientists won’t rely on something they are not confident will be around for the duration. We see this with software, but even more so with gateway infrastructure. A sustained gateway program can:
–Reduce duplication of effort (versus sporadic development across many small programs)
–Increase the diversity of end users
–Increase the skill-set diversity of developers
–Bring together teams to address the toughest problems

Gateway sustainability: a small, non-TeraGrid EAGER grant to look at the characteristics of successful gateways and the domain areas where a gateway could have a big impact. Characteristics of short gateway funding cycles:
–Build exciting prototypes with input from scientists
–Work with early adopters to extend capabilities
–Tools are publicized; more scientists become interested
–Funding ends
–Scientists who invested their time to use the new tools are disillusioned, and less likely to try something new again
–Start again on a new short-term project

We need to break this cycle.

Science Advisory Board, July 19-20, 2010