Open Science Grid: Project Statement & Vision

Presentation transcript:

1 Open Science Grid: Project Statement & Vision. Transform compute- and data-intensive science through a cross-domain, self-managed, national distributed cyberinfrastructure that brings together campus and community infrastructure, facilitating the research of Virtual Organizations at all scales. [Chart: VO jobs running on OSG in 2006.]

2 Why the effort is important:
- Sustained growth in the needs of traditional compute- and data-intensive science;
- The steady stream of scientific domains that add and expand the role of computing and data processing in their discovery process;
- Coupled with the administrative and physical distribution of compute and storage resources and the increase in the size, diversity and scope of scientific collaborations.
[Chart: Facility (preliminary commitments), CPU in million SpecInt2000s and cache disk in petabytes.]

3 Goals of the OSG:
- Support data storage, distribution & computation for High Energy, Nuclear & Astrophysics collaborations, in particular delivering on the schedule and with the capacity and capability needed for LHC and LIGO science.
- Engage and benefit other research & science at all scales by progressively supporting their applications.
- Educate & train students, administrators & educators.
- Operate & evolve a petascale Distributed Facility across the US providing guaranteed & opportunistic access to shared compute & storage resources.
- Interface & federate with campus, regional, and other national & international grids (including EGEE & TeraGrid).
- Provide an integrated, robust software stack for the Facility & applications, tested on a well-provisioned, at-scale validation facility.
- Evolve the capabilities offered by the Facility by deploying externally developed new services & technologies.
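One way to read "guaranteed & opportunistic access to shared compute & storage resources" is as a two-tier allocation policy: a site promises some of its slots to particular VOs, and any VO may scavenge whatever remains idle. The Python sketch below is a deliberately simplified, hypothetical model of that policy; the Site class, the quotas and the VO names are illustrative only, not the actual OSG facility, which realizes this through Condor matchmaking and the sites' own batch systems.

```python
from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    total_slots: int
    guaranteed: dict                               # VO name -> slots promised to that VO
    running: list = field(default_factory=list)    # (vo, kind) pairs for busy slots

    def used_by(self, vo, kind=None):
        return sum(1 for v, k in self.running if v == vo and (kind is None or k == kind))

    def idle_slots(self):
        return self.total_slots - len(self.running)

    def try_start(self, vo):
        # 1. Guaranteed access: the VO may use the slots the site owner promised it.
        if self.idle_slots() > 0 and self.used_by(vo, "guaranteed") < self.guaranteed.get(vo, 0):
            self.running.append((vo, "guaranteed"))
            return "guaranteed"
        # 2. Opportunistic access: any idle slot not still reserved for another VO's guarantee.
        reserved_for_others = sum(
            max(0, promise - self.used_by(other, "guaranteed"))
            for other, promise in self.guaranteed.items() if other != vo
        )
        if self.idle_slots() - reserved_for_others > 0:
            self.running.append((vo, "opportunistic"))
            return "opportunistic"
        return None


def schedule(jobs, sites):
    """Assign each job (identified only by its VO here) to a site, preferring guaranteed slots."""
    placements = []
    for vo in jobs:
        placed = None
        # Try sites with the largest guarantee for this VO first, then anything with idle slots.
        for site in sorted(sites, key=lambda s: -s.guaranteed.get(vo, 0)):
            kind = site.try_start(vo)
            if kind:
                placed = (vo, site.name, kind)
                break
        placements.append(placed or (vo, None, "queued"))
    return placements


if __name__ == "__main__":
    sites = [
        Site("FNAL_CMS", total_slots=4, guaranteed={"cms": 3}),
        Site("Campus_A", total_slots=2, guaranteed={}),
    ]
    for vo, site, kind in schedule(["cms", "cms", "ligo", "ligo", "cms", "cdf"], sites):
        print(f"{vo:>5} -> {site or 'waiting':<10} ({kind})")
```

Running the example shows the CMS jobs landing in their guaranteed slots while LIGO and CDF jobs backfill idle capacity opportunistically, which is the usage pattern the goal above describes.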

4 Challenges: Sociological and Technical
- Develop the organizational and management structure of an open consortium that drives such a CI.
- Develop the organizational and management structure for the project that builds, operates and evolves such a CI.
- Maintain and evolve a software stack capable of offering powerful and dependable capabilities to the NSF and DOE scientific communities.
- Operate and evolve a dependable facility.
Participating institutions: Boston University; Brookhaven National Laboratory; California Institute of Technology; Columbia University; Cornell University; Fermi National Accelerator Laboratory; Indiana University; Lawrence Berkeley National Laboratory; Renaissance Computing Institute; Stanford Linear Accelerator Center; University of California, San Diego; University of Chicago/Argonne National Laboratory; University of Florida; University of Iowa; University of Wisconsin, Madison.

5 [Diagram panels: Software Stack; Software Release Process; Grid of Grids: From Local to Global; Computational Science: Here, There and Everywhere; Global Research & Shared Resources.]

6 Timeline & Milestones (preliminary). [Timeline chart; the recoverable milestone labels are grouped below.]
- Project phases: Project start; End of Phase I; End of Phase II.
- LHC: Simulations; LHC Event Data Distribution and Analysis; Contribute to Worldwide LHC Computing Grid; Support 1000 Users; 20 PB Data Archive.
- LIGO: Contribute to LIGO Workflow and Data Analysis; LIGO data run SC5; LIGO Data Grid dependent on OSG; Advanced LIGO.
- Other experiments: CDF Simulation; CDF Simulation and Analysis; D0 Simulations; D0 Reprocessing; STAR Data Distribution and Jobs; STAR, CDF, D0, Astrophysics; 10K Jobs per Day.
- Additional Science Communities: +1 Community (repeated across the timeline).
- Facility Security: Risk Assessment, Audits, Incident Response, Management, Operations, Technical Controls; Plan V1, 1st Audit, then repeated Risk Assessment and Audit cycles.
- VDT and OSG Software Releases: Major release every 6 months, minor updates as needed; VDT 1.4.0, 1.4.1, 1.4.2, ...; VDT incremental updates; OSG 0.6.0, 0.8.0, 1.0, 2.0, 3.0, ...
- Facility Operations and Metrics: Increase robustness and scale; operational metrics defined and validated each year.
- Interoperability: Interoperate and federate with campus and regional grids; Common S/W distribution with TeraGrid; EGEE using VDT 1.4.X; Transparent data and job movement with TeraGrid; Transparent data management with EGEE; Federated monitoring and information services.
- Extended capabilities and increased scalability and performance for jobs and data to meet stakeholder needs: dCache with role-based authorization; SRM/dCache extensions; VDS with SRM; Accounting; Auditing; "Just in Time" Workload Management; VO Services Infrastructure; Improved Workflow and Resource Selection; Data Analysis (batch and interactive); Workflow; Integrated Network Management; Work with SciDAC-2 CEDS and Security with Open Science.
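One of the extended capabilities above, "Just in Time" Workload Management, refers to late binding of work to resources: placeholder (pilot) jobs claim and validate batch slots first, and real payloads are pulled only once a slot is known to be usable. The sketch below is a minimal, hypothetical Python illustration of that pattern; the site names, the validate_slot check and the in-memory queue are invented for illustration and are not the OSG implementation, which in this era was built around Condor glide-in based systems.

```python
import queue
import random
import threading
import time

# Central task queue owned by the VO: a task is bound to a resource only after
# a pilot has already claimed and validated a slot ("late binding").
task_queue = queue.Queue()

def validate_slot(site):
    """Hypothetical sanity check a pilot runs before asking for real work."""
    # A real pilot might check scratch space, outbound connectivity, certificates, ...
    return random.random() > 0.2          # pretend roughly 20% of slots are unusable

def pilot(site):
    """A placeholder job: it occupies a batch slot, checks it, then pulls work."""
    if not validate_slot(site):
        print(f"[{site}] slot failed validation; exiting without pulling work")
        return
    while True:
        try:
            task = task_queue.get_nowait()   # the task is bound to this slot just in time
        except queue.Empty:
            return
        print(f"[{site}] running {task}")
        time.sleep(0.01)                     # stand-in for the real payload

if __name__ == "__main__":
    for i in range(20):
        task_queue.put(f"cms_reco_{i:03d}")
    # Real pilots would be submitted through the grid middleware to many sites;
    # threads stand in for remote batch slots in this sketch.
    workers = [threading.Thread(target=pilot, args=(s,))
               for s in ("UCSD", "FNAL", "Purdue", "UFlorida")]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(f"{task_queue.qsize()} task(s) left unclaimed; resubmit pilots to drain the queue")
```

The design point the milestone captures is that a bad or slow slot costs only a pilot, not a user's workload, which is one reason late binding improved throughput and robustness at the scale of the facility.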