Presentation transcript:

Open Science Grid For CI-Days
Elizabeth City State University, January 2008
John McGee – OSG Engagement Manager; Manager, Cyberinfrastructure Development
Renaissance Computing Institute, University of North Carolina at Chapel Hill

Why should my University facilitate (or drive) resource sharing? Because it's the right thing to do:
– Enables new modalities of collaboration
– Enables new levels of scale
– Democratizes large-scale computing
– Sharing locally leads to sharing globally
– Better overall resource utilization
– Funding agencies

"At the heart of the cyberinfrastructure vision is the development of a cultural community that supports peer-to-peer collaboration and new modes of education based upon broad and open access to leadership computing; data and information resources; online instruments and observatories; and visualization and collaboration services." – Arden Bement, CI Vision for 21st Century, introduction

Clemson Campus Condor Pool
– Machines in 27 different locations on campus
– ~1,700 job slots
– >1.8M hours served in 6 months
– Users from Industrial and Chemical Engineering, and Economics
– Fast ramp-up of usage
– Accessible to the OSG through a gateway
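For a sense of what a campus pool user actually writes: a minimal Condor submit description file might look like the sketch below (the program and file names are hypothetical) and is handed to the pool with condor_submit.

    # analyze.sub -- minimal Condor submit file (hypothetical names)
    universe   = vanilla            # run on any matching pool machine
    executable = analyze            # researcher-supplied program
    arguments  = dataset.dat
    output     = analyze.out
    error      = analyze.err
    log        = analyze.log        # Condor's event log for this job
    should_transfer_files   = YES   # ship input/output to/from the execute node
    when_to_transfer_output = ON_EXIT
    queue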

6,400 CPUs available
– Campus Condor pool backfills idle nodes in PBS clusters: provided 5.5 million CPU-hours in 2006, all from idle nodes in clusters
– Use on TeraGrid: 2.4 million hours in 2006 spent building a database of hypothetical zeolite structures; 2007: 5.5 million hours allocated to TG
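The backfill pattern rests on Condor's start/suspend/preempt policy expressions. A condor_config sketch of such a cycle-scavenging policy follows; the thresholds are illustrative, not the site's actual settings.

    # condor_config sketch: claim a node only when non-Condor load is low
    NonCondorLoad = (LoadAvg - CondorLoadAvg)
    START    = ($(NonCondorLoad) < 0.3)
    SUSPEND  = ($(NonCondorLoad) > 0.5)   # local work returned; pause the job
    CONTINUE = ($(NonCondorLoad) < 0.3)
    PREEMPT  = (Activity == "Suspended") && ((CurrentTime - EnteredCurrentActivity) > 600)
    KILL     = FALSE                      # let preempted jobs vacate gracefully

The effect is that Condor jobs soak up idle cycles but yield quickly when the local scheduler reclaims the node.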

The Open Science Grid
OSG is a consortium of software, service and resource providers and researchers, from universities, national laboratories and computing centers across the U.S., who together build and operate the OSG project. The project is funded by the NSF and DOE, and provides staff for managing various aspects of the OSG.
– Brings petascale computing and storage resources into a uniform grid computing environment
– Integrates computing and storage resources from over 50 sites in the U.S. and beyond
– A framework for large-scale distributed resource sharing, addressing the technology, policy, and social requirements of sharing
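As a sketch of the uniform environment in practice: in the Condor-G style current at the time, the same submit language routes a job to a remote OSG site through the grid universe. The gatekeeper host below is hypothetical.

    # osg_job.sub -- Condor-G submission to an OSG site (hypothetical host)
    universe      = grid
    grid_resource = gt2 gatekeeper.example.edu/jobmanager-condor
    executable    = analyze
    output        = analyze.out
    error         = analyze.err
    log           = analyze.log
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue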

Virtual Organizations (VOs)
– The OSG infrastructure trades in groups, not individuals
– VO management services allow registration, administration and control of members of the group
– Facilities trust and authorize VOs
– Storage and compute services prioritize according to VO group
[Diagram: a campus grid's VO management service and applications connecting through OSG and the WAN to the set of available resources. Image courtesy: UNM]
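On the resource-provider side, "prioritize according to VO group" can be expressed with Condor's group accounting; a sketch with made-up group names and quotas (users would present VO membership via a voms-proxy-init-style credential):

    # condor_config sketch on a site negotiator (hypothetical VOs and quotas)
    GROUP_NAMES = group_physics, group_engage
    GROUP_QUOTA_group_physics = 400   # slots reserved for the physics VO
    GROUP_QUOTA_group_engage  = 100   # slots for the engagement VO
    GROUP_AUTOREGROUP = TRUE          # groups may use idle slots beyond quota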


Designing proteins in the Kuhlman Lab
"What impressed me most was how quickly we were able to access the grid and start using it. We learned about it at RENCI, and we were running jobs about two weeks later," says Kuhlman. "For each protein we design, we consume about 3,000 CPU hours across 10,000 jobs. Adding in the structure and atom design process, we've consumed about 100,000 CPU hours in total so far."
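The "10,000 jobs per protein" pattern maps naturally onto a single submit file that queues a whole cluster of jobs; a sketch with hypothetical names, varying the random seed per job:

    # design.sub -- one cluster of 10,000 design jobs (hypothetical names)
    universe   = vanilla
    executable = design_protein
    arguments  = --seed $(Process)          # $(Process) runs 0..9999
    output     = out/design_$(Process).out
    error      = out/design_$(Process).err
    log        = design.log                 # one event log for the whole cluster
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue 10000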

[Diagram: campus IT and enterprise systems, campus research computing, department IT, lab IT, researchers and students]
… so, what can we do together … to advance scientific research and education?

What can we do together?
– OSG is looking for a few partners to help deploy campus-wide grid infrastructure that integrates with local enterprise infrastructure and the national CI
– RENCI's OSG team is available to help scientists get their applications running on OSG
  – Low-impact starting point
  – Help your researchers gain significant compute cycles while exploring OSG as a framework for your own campus CI
mailto:

E N D