
Slide 1: The Application-Infrastructure Gap
[Diagram: dynamic and/or distributed applications (A, B, ...) running across a shared distributed infrastructure]

Slide 2: Bridging the Gap: Grid Technology
- Service-oriented applications
  - Wrap applications as services (see the sketch below)
  - Compose applications into workflows
- Service-oriented infrastructure
  - Provision physical resources to support application workloads
[Diagram: users invoke application services and workflows; composition, invocation, and provisioning bridge applications to the infrastructure]
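
As a concrete illustration of "wrap applications as services," here is a minimal sketch using only the Python standard library. It is a generic illustration of the idea, not GT's actual service tooling; the wrapped command, port, and endpoint are arbitrary choices.

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal sketch: expose a command-line application as a network service.
# Real grid services layer WSDL interfaces, security, and lifecycle
# management on top of this basic idea.

COMMAND = ["date", "-u"]  # the "application" being wrapped (arbitrary choice)

class AppService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Each request runs the wrapped application and returns its output.
        result = subprocess.run(COMMAND, capture_output=True, text=True)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(result.stdout.encode())

if __name__ == "__main__":
    # A workflow engine can now compose this service with others.
    HTTPServer(("localhost", 8080), AppService).serve_forever()
```

Once several applications are wrapped this way, "composing applications into workflows" reduces to orchestrating calls among their endpoints.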

Slide 3: Grid Technology: Service-Oriented Infrastructure
Uniform interfaces, security mechanisms, Web service transport, and monitoring.
[Diagram: user applications access computers via GRAM, storage via GridFTP and Reliable File Transfer, databases via DAIS tools, and specialized resources through hosting environments and user services, with MyProxy for credential management and MDS-Index for monitoring and discovery]

Slide 4: Globus Open Source Grid Software
GT2 components are the non-WS components; GT3 and GT4 added Web services components.

| Capability           | GT2 (non-WS)                                | GT3 (WS)                                                                  | GT4 (WS)                                     |
|----------------------|---------------------------------------------|---------------------------------------------------------------------------|----------------------------------------------|
| Security             | Pre-WS Authentication & Authorization       | WS Authentication & Authorization; Community Authorization Service        | Credential Management; Delegation Service    |
| Data Management      | GridFTP                                     | Reliable File Transfer; OGSA-DAI [Tech Preview]; Replica Location Service |                                              |
| Execution Management | Grid Resource Allocation Mgmt (Pre-WS GRAM) | Grid Resource Allocation Mgmt (WS GRAM)                                   | Community Scheduler Framework [contribution] |
| Information Services | Monitoring & Discovery System (MDS2)        | Monitoring & Discovery System (MDS4)                                      |                                              |
| Common Runtime       | C Common Libraries                          | Java WS Core; XIO                                                         | Python WS Core [contribution]; C WS Core     |

Slide 5: GT4 Components
[Diagram: interoperable, WS-I-compliant SOAP messaging connects Java, C, and Python clients to services. Server side: Java services in Apache Axis plus GT libraries and handlers (RFT, GRAM, Delegation, Index, Trigger, Archiver, CAS, OGSA-DAI); C services using GT libraries and handlers (RLS, Pre-WS MDS, Pre-WS GRAM, SimpleCA, MyProxy, GTCP, GridFTP); Python hosting via pyGlobus WS Core. X.509 credentials provide common authentication throughout]

Slide 6: Web Services: Standards, Tools, Interoperability
[Diagram: the GT4 container hosts custom Web services, custom WSRF Web services, and GT4 WSRF Web services, plus registry and administration; user applications reach them via WSDL, SOAP, and WS-Security, with WS-Addressing, WSRF, and WS-Notification supporting stateful resources]
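
On the wire, the WSDL/SOAP interoperability shown on slides 5 and 6 comes down to exchanging XML envelopes over HTTP. Below is a hedged sketch of a client posting a SOAP 1.1 request with the Python standard library; the host, path, operation name, and namespace are hypothetical placeholders, not a real GT4 service interface.

```python
import http.client

# A SOAP 1.1 request built by hand, to show what "WS-I-compliant SOAP
# messaging" actually exchanges. The <GetStatus> operation and the
# http://example.org namespace are made up for illustration; real WSRF
# services also carry WS-Addressing headers naming the target resource.

ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetStatus xmlns="http://example.org/myservice"/>
  </soap:Body>
</soap:Envelope>"""

conn = http.client.HTTPConnection("localhost", 8080)
conn.request("POST", "/wsrf/services/MyService", body=ENVELOPE,
             headers={"Content-Type": "text/xml; charset=utf-8",
                      "SOAPAction": '""'})
print(conn.getresponse().read().decode())
```

Because every toolkit exchanges the same envelope format, a Java client generated from WSDL and this hand-rolled Python client are interchangeable from the service's point of view.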

Slide 7: NEES: Network for Earthquake Engineering Simulation
Links instruments, data, computers, and people.

Slide 8: Scaling: Grid2003 Workflows
Genome sequence analysis, physics data analysis, Sloan Digital Sky Survey.

Slide 9: Application Examples
- Earth System Grid: O(100 TB) online data
- STAR: 5 TB transfer (SRM, GridFTP)
- NASA/NVO: mosaics from multiple sources
- Fusion Grid: 1000s of jobs

Slide 10: LIGO Scientific Collaboration
- Continuous gravitational waves are expected to be produced by a variety of celestial objects.
- Only a small fraction of potential sources are known.
- Blind searches are therefore needed, scanning the regions of the sky where there is no a priori information about the presence of a source.
  - Wide-area, wide-frequency searches
- A search is performed for potential sources of continuous periodic waves near the Galactic Center and the galactic core.
- A search is also performed for binary inspirals collapsing into black holes.
- The search is very compute- and data-intensive (see the matched-filter sketch below).
Credits: P. Brady, S. Koranda, D. Brown, S. Fairhurst (UW-Milwaukee, USA); S. Anderson, K. Blackburn, A. Lazzarini, H. Pulapaka, T. Creighton (Caltech, USA); G. Gonzalez (Louisiana State University); and many others involved in the testbed.
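
The core operation in these searches is correlating detector data against waveform templates at every possible arrival time, which is why they are so compute- and data-intensive. The toy sketch below shows that idea with NumPy; it is a generic matched filter on synthetic data, not LIGO's actual pipeline (which whitens the data and scans large template banks across detectors).

```python
import numpy as np

def matched_filter(data, template):
    """Cross-correlate data with a template at every offset via FFT."""
    padded = np.zeros(len(data))
    padded[:len(template)] = template
    # Circular cross-correlation: ifft(fft(d) * conj(fft(t)))
    corr = np.fft.ifft(np.fft.fft(data) * np.conj(np.fft.fft(padded)))
    return np.abs(corr) / np.linalg.norm(template)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.125, 512)
template = np.sin(2 * np.pi * (50 + 400 * t) * t)  # toy "chirp" waveform
data = rng.normal(size=4096)                       # stand-in for detector noise
data[1000:1512] += 0.5 * template                  # weak injected signal
snr = matched_filter(data, template)
print("loudest offset:", int(snr.argmax()))        # recovers ~1000
```

Scaling this single correlation over wide sky regions, wide frequency bands, and long observation times is what drives these searches onto the grid.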

Slide 11: Montage
- Montage (NASA and NVO)
  - Delivers science-grade custom mosaics on demand
  - Produces mosaics from a wide range of data sources (possibly in different spectra)
  - User-specified parameters of projection, coordinates, size, rotation, and spatial sampling (see the workflow sketch below)
Credits: B. Berriman, J. Good, A. Laity (Caltech/IPAC); J. C. Jacob, D. S. Katz (JPL). http://montage.ipac.caltech.edu/
[Image: mosaic of the M101 galaxy, created by a Pegasus-based Montage run on the TeraGrid]
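
Structurally, a mosaic run is a workflow: re-project every input image to a common frame, then co-add the results. The sketch below builds that DAG shape in plain Python; the tool names reproject_tool and coadd_tool are hypothetical placeholders rather than Montage's actual executables, and a planner such as Pegasus would map the resulting graph onto grid resources.

```python
# Hedged sketch of the mosaic workflow shape. Tool names are placeholders.

def build_mosaic_dag(inputs, header, output):
    """Return (job_id, command, dependencies) tuples for a mosaic run."""
    dag, projected = [], []
    for i, image in enumerate(inputs):
        out = f"proj_{i}.fits"
        dag.append((f"project_{i}", ["reproject_tool", image, out, header], []))
        projected.append(out)
    # The final co-add can start only after every projection finishes.
    dag.append(("coadd", ["coadd_tool", *projected, output],
                [f"project_{i}" for i in range(len(inputs))]))
    return dag

for job_id, cmd, deps in build_mosaic_dag(["a.fits", "b.fits"],
                                          "region.hdr", "mosaic.fits"):
    print(f"{job_id} (after {deps or 'nothing'}): {' '.join(cmd)}")
```

With thousands of input images, this same two-stage shape grows into graphs like the ~1200-node workflow on the next slide.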

Slide 12: Small Montage Workflow
[Figure: workflow graph of ~1200 nodes]

Slide 13: Other Applications: Southern California Earthquake Center
The Southern California Earthquake Center (SCEC), in collaboration with the USC Information Sciences Institute, the San Diego Supercomputer Center, the Incorporated Research Institutions for Seismology, and the U.S. Geological Survey, is developing the SCEC Community Modeling Environment (SCEC/CME). The goal is to create fully three-dimensional (3D) simulations of fault-system dynamics. Such physics-based simulations can potentially provide enormous practical benefits for assessing and mitigating earthquake risks through Seismic Hazard Analysis (SHA). The SCEC/CME system is an integrated geophysical simulation modeling framework that automates the process of selecting, configuring, and executing models of earthquake systems.
Acknowledgments: Philip Maechling and Vipin Gupta, University of Southern California

Slide 14: Biology Applications: Tomography (NIH-funded project)
- Derivation of 3D structure from a series of 2D electron-microscopic projection images
- Reconstruction and detailed structural analysis of:
  - complex structures like synapses
  - large structures like dendritic spines
- Acquisition and generation of huge amounts of data
- Large amounts of state-of-the-art image processing are required to segment structures from the extraneous background (see the back-projection sketch below)
[Image: dendrite structure to be rendered by tomography]
Work performed by Mei-Hui Su with Mark Ellisman, Steve Peltier, Abel Lin, Thomas Molina (SDSC)
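
The reconstruction step ("3D structure from 2D projections") rests on back-projection. The toy below demonstrates the idea in 2D with NumPy and SciPy: it forward-projects a phantom at several angles, then smears each projection back across the plane. This is a deliberately minimal, unfiltered sketch; real electron-tomography pipelines use filtered or iterative methods on far larger 3D volumes.

```python
import numpy as np
from scipy.ndimage import rotate

def backproject(projections, angles_deg, size):
    """Smear each 1D projection across the plane at its acquisition angle."""
    recon = np.zeros((size, size))
    for proj, angle in zip(projections, angles_deg):
        smear = np.tile(proj, (size, 1))   # same value down each column
        recon += rotate(smear, angle, reshape=False, order=1)
    return recon / len(angles_deg)

size = 64
phantom = np.zeros((size, size))
phantom[24:40, 24:40] = 1.0                # toy "structure"

# Forward model: rotate the phantom, then sum along one axis.
angles = np.linspace(0.0, 180.0, 36, endpoint=False)
projections = [rotate(phantom, -a, reshape=False, order=1).sum(axis=0)
               for a in angles]

recon = backproject(projections, angles, size)
peak = np.unravel_index(recon.argmax(), recon.shape)
print("reconstruction peaks at", peak)     # near the square's center
```

Even this toy makes the slide's point about scale plausible: each projection touches every pixel, so cost grows with volume size times the number of tilt angles.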

Slide 15: BLAST
BLAST: a set of sequence-comparison algorithms used to search sequence databases for optimal local alignments to a query. Led by Veronika Nefedova (ANL) as part of the PACI Data Quest Expedition program.
Two major runs were performed using Chimera and Pegasus:
1) 60 genomes (4,000 sequences each), processed in 24 hours
   - Genomes selected from DOE-sponsored sequencing projects
   - 67 CPU-days of processing time delivered
   - ~10,000 grid jobs
   - >200,000 BLAST executions
   - 50 GB of data generated
2) 450 genomes processed
Speedups of 5-20x were achieved because the compute nodes were used efficiently, by keeping job submission to the compute cluster constant (a throttling sketch follows).
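
The stated source of the 5-20x speedup is keeping a constant stream of jobs submitted to the cluster, so compute nodes never sit idle. The sketch below shows that pattern generically with a bounded queue and a fixed pool of submitters; submit_job is a stand-in that just sleeps, where a real setup would hand the job to the batch system.

```python
import queue
import threading
import time

MAX_IN_FLIGHT = 8            # constant submission level to maintain

def submit_job(job):
    """Stand-in for handing one BLAST execution to the cluster."""
    time.sleep(0.01)

def submitter(jobs):
    while True:
        job = jobs.get()
        if job is None:      # sentinel: no more work
            return
        submit_job(job)

# A bounded queue plus a fixed pool keeps exactly MAX_IN_FLIGHT jobs
# pending at all times, instead of dumping the whole workload at once.
jobs = queue.Queue(maxsize=MAX_IN_FLIGHT)
pool = [threading.Thread(target=submitter, args=(jobs,))
        for _ in range(MAX_IN_FLIGHT)]
for t in pool:
    t.start()
for job in range(100):       # stand-in for >200,000 BLAST executions
    jobs.put(job)            # blocks while MAX_IN_FLIGHT jobs are pending
for _ in pool:
    jobs.put(None)
for t in pool:
    t.join()
print("all jobs processed")
```

The design choice is the bounded queue: the producer is throttled by put() blocking, so the submission rate automatically matches the rate at which the cluster drains work.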

Slide 16: Functional MRI Analysis

Slide 17: Global Community

Slide 18: Scaling Up: Service-Oriented Science
[Diagram: a spectrum from domain-independent to domain-dependent, layered into content, services, and resources. Resources: experimental apparatus; servers, storage, networks. Services: metadata catalog, data archive, simulation server, certificate authority, portal server, telepresence monitor, electronic notebook. Content: simulation code, experiment design, experiment output]

Slide 19: For More Information
- Globus Alliance: www.globus.org
- Globus Consortium: www.globusconsortium.com
- Global Grid Forum: www.ggf.org
- Open Science Grid: www.opensciencegrid.org
[Book cover: 2nd Edition, www.mkp.com/grid2]

