Open Science Grid
April 2009 OSG Grid School - RDU
John McGee – Renaissance Computing Institute, University of North Carolina, Chapel Hill

Presentation transcript:

April 2009 OSG Grid School - RDU 1 Open Science Grid
John McGee – Renaissance Computing Institute, University of North Carolina, Chapel Hill

April 2009 OSG Grid School - RDU 2 Open Science Grid
Computational Science is a critical component of discovery, and its importance is growing.

April 2009 OSG Grid School - RDU 3 Open Science Grid
HTC vs. HPC
High Throughput Computing (HTC)
– simple serial jobs, Monte Carlo, parameter sweeps
– complex workflows with data and/or job dependencies
– ensembles of small node-count MPI jobs
– must be able to express the work in a highly portable way (see the sketch below this slide)
– cannot care about exactly where the pieces are executed
– no SSH access to resources, which has many implications
– must respect the community rules of a shared infrastructure; “plays well with others”
– remarkable scaling opportunities
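
To make the portability and no-SSH bullets concrete, here is a minimal sketch (in Python) of what a single HTC work unit might look like; the script name, the parameter mapping, and the file names are illustrative assumptions, not anything prescribed by the slides. The task takes only an index on the command line, touches nothing outside its own working directory, and assumes nothing about the machine it lands on, so a scheduler such as Condor can queue hundreds of independent copies and run them wherever slots open up.

#!/usr/bin/env python
# sweep_task.py -- one unit of a hypothetical HTC parameter sweep (illustrative sketch).
# Only the task index differs between the queued copies; the code and any input
# files travel with the job, so it can run on any worker node without SSH access
# to the site and without assuming a shared filesystem.
import json
import math
import sys

def run_point(alpha):
    # Stand-in for the real computation at one parameter value.
    return sum(math.exp(-alpha * k) for k in range(1, 10001))

def main():
    task_id = int(sys.argv[1])      # e.g. the process number assigned by the scheduler
    alpha = 0.01 * (task_id + 1)    # map the index to a point in the parameter space
    result = run_point(alpha)
    # Write output into the local working directory (the job sandbox);
    # the scheduler transfers output files back when the job exits.
    with open("result_%04d.json" % task_id, "w") as out:
        json.dump({"task": task_id, "alpha": alpha, "value": result}, out)

if __name__ == "__main__":
    main()

Scaling the sweep up is then just a matter of queueing more copies with different index arguments; nothing inside the task has to change, which is what makes this kind of work such a good fit for opportunistic grid resources.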

April 2009 OSG Grid School - RDU 4 Open Science Grid
HTC vs. HPC
High Performance Computing (HPC)
– the top N computational scientists in the world
– many important problems simply cannot be solved any other way
– what does it take to successfully develop or run models at 256- or 512+-node parallelism? (see the sketch below this slide)
– what percentage of the researchers who could benefit from computational science can realistically do this today?
– how does grid/distributed computing fit with large-scale HPC applications?
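
For contrast with the HTC sketch above, here is a toy illustration (again in Python, using the mpi4py bindings) of the tight coupling that characterizes HPC work; the decomposition and the numbers are hypothetical stand-ins for a real model. Every rank must be running at the same time and exchange data with the others, which is why this style of computing needs dedicated machines, competitive allocations, and substantial development expertise.

# hpc_toy.py -- tightly coupled toy computation with mpi4py (illustrative sketch).
# Run with something like: mpiexec -n 256 python hpc_toy.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()          # e.g. 256 or 512+ ranks on an HPC system

# Each rank owns one slice of the problem and computes a partial result ...
local = sum(x * x for x in range(rank * 1000, (rank + 1) * 1000))

# ... and a global reduction couples every rank. This coupling is exactly
# what HTC jobs avoid and what HPC interconnects are built for.
total = comm.allreduce(local, op=MPI.SUM)

if rank == 0:
    print("global sum over", size, "ranks:", total)

Unlike the HTC task, this program cannot be broken into independent pieces and scattered across sites; the whole allocation succeeds or fails together, which shapes both the software and the way access to HPC systems is awarded.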

April 2009 OSG Grid School - RDU 5 Open Science Grid
Where do researchers go for services?
– PI owned and operated cluster
– Campus Condo Computing
– Departmental Cluster
– Campus Research Computing
– Campus Condor Pool
– Regional Grids (e.g., NWICG)
– Communities of Practice (NanoHub, GridChem, etc.)
– Open Science Grid: DOE/NSF, opportunistic access
– TeraGrid: NSF, competitively awarded allocations
– DOE ASCR: INCITE awards
– Commercial Cloud service providers
There must be more … what have I missed? Where is NIH?

April 2009 OSG Grid School - RDU 6 Open Science Grid
Where do researchers go for services?
Across all of the options on the previous slide, how many different:
– service interfaces
– software stacks
– policy frameworks
– identities per researcher
… where is The National Cyberinfrastructure?
Answer: wherever they can get it, with the least amount of pain

April 2009 OSG Grid School - RDU 7 Open Science Grid

April 2009 OSG Grid School - RDU 8 Open Science Grid
The Open Science Grid
OSG is a consortium of software, service, and resource providers and researchers, from universities, national laboratories, and computing centers across the U.S., who together build and operate the OSG project. The project is funded by the NSF and DOE, and provides staff for managing various aspects of the OSG.
– Brings petascale computing and storage resources into a uniform grid computing environment
– Integrates computing and storage resources from over 80 sites in the U.S. and beyond
– A framework for large-scale distributed resource sharing, addressing the technology, policy, and social requirements of sharing

April 2009 OSG Grid School - RDU 9 Open Science Grid
Context: Evolution of Projects
PPDG (DOE) + GriPhyN (NSF) + iVDGL (NSF) → Trillium → Grid3 → OSG (DOE+NSF)

April 2009 OSG Grid School - RDU 10 Open Science Grid
Using OSG Today
– Astrophysics
– Biochemistry
– Bioinformatics
– Earthquake Engineering
– Genetics
– Gravitational-wave physics
– Mathematics
– Nanotechnology
– Nuclear and particle physics
– Text mining
– And more…

April 2009 OSG Grid School - RDU 11 Open Science Grid
Why should my University facilitate (or drive) resource sharing?
– Enables new modalities of collaboration
– Enables new levels of scale
– Democratizes large-scale computing
– Sharing locally leads to sharing globally
– Better overall resource utilization
– Funding agencies care about this
“At the heart of the cyberinfrastructure vision is the development of a cultural community that supports peer-to-peer collaboration and new modes of education based upon broad and open access to leadership computing; data and information resources; online instruments and observatories; and visualization and collaboration services.” – Arden Bement, CI Vision for 21st Century, introduction

April 2009 OSG Grid School - RDU 12 Open Science Grid
OSG Engagement Mission
1. Help new user communities from diverse scientific domains adapt their computational systems to leverage OSG
2. Facilitate University Campus CI deployment, and interconnect it with the national community
3. Provide feedback and new requirements to the infrastructure providers

April 2009 OSG Grid School - RDU 13 Open Science Grid
[chart] Date range: … :00:00 GMT – … :59:59 GMT

April 2009 OSG Grid School - RDU 14 Open Science Grid
Engaged Users Activity

April 2009 OSG Grid School - RDU 15 Open Science Grid
Questions?
OSG Engagement VO

April 2009 OSG Grid School - RDU 16 Open Science Grid E N D