Presentation transcript:

1 LIGO Gravitational Wave Observatory: Replicate 1 TB/day of data to 10+ international sites (Birmingham, Cardiff, AEI/Golm, and others). Uses GridFTP, RFT, RLS, DRS.
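Replication at this scale is typically scripted around globus-url-copy, the GridFTP command-line client from the Globus Toolkit. A minimal sketch, assuming hypothetical hostnames and paths, of how a replication script might assemble a parallel third-party transfer command:

```python
# Sketch: building a GridFTP transfer command for one file, as a replication
# script might do. Hostnames and paths below are hypothetical examples;
# globus-url-copy is the standard GridFTP client.

def gridftp_command(src_host, dst_host, path, streams=4):
    """Return the argv list for a parallel GridFTP transfer of one file."""
    src = f"gsiftp://{src_host}{path}"
    dst = f"gsiftp://{dst_host}{path}"
    # -p sets the number of parallel TCP streams; -fast reuses data channels.
    return ["globus-url-copy", "-p", str(streams), "-fast", src, dst]

if __name__ == "__main__":
    cmd = gridftp_command("ligo.example.edu", "gw.example.ac.uk",
                          "/data/frames/H-R-0001.gwf")
    print(" ".join(cmd))
```

A production replication service (as RFT and DRS do on top of GridFTP) would add retry, checkpointing, and catalog updates around transfers like this one.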

2 caBIG: cancer Biomedical Informatics Grid. Integrate software and services across more than 50 sites. Uses Introduce, RAVI, MDS, GSI, GAARDS, Core.

3 DOE Earth System Grid: Enable sharing & analysis of high-volume data from advanced earth system models. Uses RLS, MDS, GridFTP, GSI.

4 NSF's TeraGrid
- TeraGrid DEEP: Integrating NSF's most powerful computers (60+ TF)
- TeraGrid WIDE Science Gateways: Engaging scientific communities
- Base TeraGrid Cyberinfrastructure: Persistent, reliable, national
Enable user access and gateways across NSF domains. Uses GRAM4, GridFTP, MDS, GSI, OGSA-DAI, and others.

5 Open Science Grid: Enable common infrastructure for DOE applications. Uses GRAM2, GridFTP, GSI, and others.

6 Fusion Grid: TRANSP Service. Represent fusion applications as services with an application-specific portal. Uses GRAM2.
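Jobs submitted through GRAM2 are described in the Globus Resource Specification Language (RSL). A minimal, hypothetical job description (the executable path and arguments are illustrative, not from the TRANSP service itself) looks like:

```
& (executable = /usr/local/transp/bin/run_transp)
  (arguments  = case_12345)
  (count = 1)
  (jobType = single)
```

Such a description would be handed to a GRAM gatekeeper with the globusrun client, e.g. `globusrun -r <host>/jobmanager-pbs -f job.rsl`, which is how portals like the TRANSP one front-end job submission.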

7 NASA/NVO Mosaic. What do they do? Uses which pieces? Mosaic of M42 created on the TeraGrid.

8 Globus-Based CyberShake Platform: Conduct large-scale earthquake science simulations. Uses GRAM2, GSI, GridFTP, RLS.
- Simulates ground motions for potential fault ruptures within 200 km of each site (~12,700 potential ruptures in SoCal from the USGS 2002 ERF)
- Extends each rupture to multiple hypocenters and slip models for each source (~100,000 ground motion simulations for each site)
- Enables physics-based probabilistic seismic hazard analysis
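The RLS component in workflows like this maps logical file names for simulation outputs to their physical replicas across sites. A toy in-memory sketch of that mapping layer (hostnames and file names are hypothetical; the real service is queried via the globus-rls-cli client or the RLS client APIs):

```python
# Toy stand-in for an RLS Local Replica Catalog: a logical file name (LFN)
# maps to one or more physical file URLs (PFNs). All names below are
# hypothetical examples.

class ReplicaCatalog:
    def __init__(self):
        self._mappings = {}  # LFN -> set of physical URLs

    def add(self, lfn, pfn):
        """Register one physical replica for a logical name."""
        self._mappings.setdefault(lfn, set()).add(pfn)

    def lookup(self, lfn):
        """Return all physical replicas registered for a logical name."""
        return sorted(self._mappings.get(lfn, set()))

catalog = ReplicaCatalog()
catalog.add("sgt_USC_site.dat",
            "gsiftp://hpc.example.edu/scratch/cybershake/sgt_USC_site.dat")
catalog.add("sgt_USC_site.dat",
            "gsiftp://tg.example.org/work/cybershake/sgt_USC_site.dat")
```

A workflow planner resolves each logical input through a lookup like this, then picks the replica closest to where the job will run before staging it with GridFTP.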

9 Digital Human: Simulation of Human Arterial Tree. Supported by NSF (IMAG, CI-TEAM, and DDDAS). What do they do? Uses which pieces?