Ruth Pordes, Executive Director. OSG Consortium Meeting, 21st August, University of Washington, Seattle.

OSG Consortium, 2/21/06 2 Thank you to our hosts.

OSG Consortium, 2/21/06 3 The Context
OSG has few face-to-face meetings; the main ones are the semi-annual Consortium meetings. So we use this meeting to:
a) Review what has happened over the past six months.
b) Hear from our contributors and partners on some aspects of mutual interest.
c) Share and communicate our goals and plans.
d) Move the technical program of the OSG forward.
The goal is that we all listen, discuss, complain and construct, and that we make the OSG the "A" Grid.

OSG Consortium, 2/21/06 4 This week we will..
- Hear from our Contributors about the Science from current use of the OSG.
- Engage with new communities and partners.
- Hear from our partners about Campus and peer Grid Infrastructures.
- Get updates on the status of OSG, the Facility and the Education programs.
- Cover much about Security: Risk, Responsibility, and Tracking for OSG itself, Virtual Organizations, Sites and interfaces to peer grids.
- Hold sessions where people share their experiences and learn from each other: Site, Facility and VO Administrators, Users, etc.
- See a few Demonstrations tomorrow.
- Discuss and decide on aspects of the short-term Technical Program: make decisions for the OSG software release and associated VDT releases; plan the next steps in Information Services, Information Management, Data Management, Workload Management, Education, Inclusion and Communication.

OSG Consortium, 2/21/06 5 First the recap..

OSG Consortium, 2/21/06 6 What we have not gotten done since the last Consortium meeting
- Site space management, so that VOs and Sites have robust, managed shared storage and space services.
- Simple metrics of use and accounting are not deployed uniformly across sites.
- Fixing the severe lack of robustness of the Authz components (especially GUMS) and their simplicity of use.
- VOs knowing and negotiating with Sites to really support their applications.
- Changing the perception that OSG is "mainly physics". Why is this? GADU has had only one major run; a suite of math jobs from the Football Pool problem; Nanohub jobs undergoing troubleshooting.
- …

OSG Consortium, 2/21/06 7 What we have Accomplished
- Sustained and operated the OSG to the benefit of >15 user organizations (during ramp-down of funded Grid projects and uncertainty about future funding).
- Increased the robustness and scalability of the Compute Element head node by implementing a managed-fork queue and "NFS-less" shared disk areas.
- Made effective contributions to the WLCG (grid and application) Service Challenges as the US common infrastructure for ATLAS and CMS.
- Released 3 versions of the Virtual Data Toolkit, and released OSG.
- Increased the average number of jobs on OSG by 1/3 (2000 -> 3000), and the number of Sites, Compute and Storage, by a few %. Increased LIGO and STAR jobs by a few.
- Written a comprehensive Security Risk Assessment of the OSG and an initial security plan.
- Achieved federation of local-area Campus Grids bridging to wide-area cyber-infrastructures.
- Run the 4th successful Summer Grid Workshop, with >45 students who actually adapted their own applications to run across multiple remote sites.
- Released the OSG RA.

OSG Consortium, 2/21/06 8 Virtual Organizations - participating Groups
- 3 with >1000 jobs max. (all particle physics)
- 3 with max. (all outside physics)
- 5 with max. (particle, nuclear, and astro physics)

OSG Consortium, 2/21/06 9 GridCat continues to list the Resources

OSG Consortium, 2/21/06 10 More Documentation

OSG Consortium, 2/21/06 11 And..oh yes.. The OSG proposal was finished and submitted to DOE and NSF in February & March. In June the funding agencies asked for a revised budget with more than a 25% reduction, to $6M/year: ~33 FTE/year of effort. We are going ahead with our planning on the assumption that funding is coming in the next few months. The project will be accountable for its promises and deliverables to the funding sources; we expect to be reviewed on our project plan and milestones ~3 months after funding. We will have Expectations of the project, which will be documented, tracked and planned across the funded effort. For the next 5 years we must be serious about delivering the fundamental Production Quality Facility for our Users.

OSG Consortium, 2/21/06 12 the OSG Project and its part in the OSG Consortium

OSG Consortium, 2/21/06 13 The OSG Project will deliver
- Maintenance of an expanding Production Distributed Facility, which provides core capabilities in Operations, Security, Software Releases, the Virtual Data Toolkit and Troubleshooting, and supports the Engagement of new communities to use the OSG.
- Education, Training and Outreach, with a core focus on expanding the number, location and scope of the Grid Schools and web-based training.
- User Support, Service and System Extensions for increased scale, performance and capability of the common services for the stakeholder VOs.
- OSG Staff: executive director, project consultant, administrative help, and technical documentation for training system administrators, users and VO administrators.

OSG Consortium, 2/21/06 14 OSG Project Execution Plan (PEP) - FTEs
  Facility operations                   5.0
  Security and troubleshooting          4.5
  Software release and support          6.5
  Engagement                            2.0
  Education, outreach & training        2.0
  Facility management                   1.0
  Extensions in capability and scale    9.0
  Staff                                 3.0
  Total FTEs                           33
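As a quick sanity check on the PEP numbers (my own arithmetic, using the per-area FTE allocations from the slide above), the area-level allocations do add up to the 33-FTE total:

```python
# FTE allocation per area, as listed on the PEP slide.
fte = {
    "Facility operations": 5.0,
    "Security and troubleshooting": 4.5,
    "Software release and support": 6.5,
    "Engagement": 2.0,
    "Education, outreach & training": 2.0,
    "Facility management": 1.0,
    "Extensions in capability and scale": 9.0,
    "Staff": 3.0,
}

total = sum(fte.values())
print(total)  # 33.0, matching the "Total FTEs" line
```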

OSG Consortium, 2/21/06 15 OSG PEP - Organization

OSG Consortium, 2/21/06 16 Part of the OSG Consortium: Contributors, Project

OSG Consortium, 2/21/06 17 OSG PEP - High Level Milestones
2006Q3  Release OSG software stack version.
2006Q3  Project baseline review.
2006Q4  Sign off on OSG Security Plan.
2006Q4  Meet operational metrics for 2006.
2007Q1  Accounting reports available for users and resource owners.
2007Q2  Production use of OSG by one additional science community.
2007Q2  OSG-TeraGrid: software releases based on same NMI software base.
2007Q2  Release OSG software version 0.8.0: complete extensions for LHC data taking.
2007Q2  Support for ATLAS and CMS data taking.
2007Q3  1-year Project Review.
2007Q4  Meet 2007 deliverables as defined by science stakeholders.
2007Q4  Meet operational metrics for 2007.
2007Q4  Release OSG software version.
2008Q2  Production use of OSG by 2 additional science communities.
2008Q3  OSG-TeraGrid: production service interoperation.
2008Q3  2nd-year Project Review.
2008Q4  Meet 2008 deliverables as defined by science stakeholders.
2008Q4  Meet operational metrics for 2008.
2009Q2  Support for all STAR analysis (10,000 jobs/day).
2010Q1  Support for data taking with order-of-magnitude increase in LIGO sensitivity.

OSG Consortium, 2/21/06 18 OSG PEP - Security, Safety, Risk Management
The OSG Facility assesses, monitors and responds to security issues; the Security Officer coordinates these activities. Each site, user and administrator has responsibility for local security and for reporting incidents that may occur. The OSG will have a comprehensive security plan modeled on the NIST process. While Environment, Safety and Health (ES&H) remains the responsibility of the owners of the resources made accessible to the Open Science Grid, the project organization will pay attention to ES&H issues during travel, meetings and OSG activities.

OSG Consortium, 2/21/06 19 OSG Project Effort Distribution Year 1
- Each Institution will have a signed Statement of Work (MOU).
- Each individual will submit open monthly written reports.
- The Finance Board will review the accounts and deliverables.
- The Executive Board will review the plans and achievements.
- Activities will be covered by the Project Plan and WBS.
- Effort distribution will be reviewed and potentially modified each year.

OSG Consortium, 2/21/06 20 Must Support LHC and LIGO Scaling circa
- Data distribution must routinely exceed 1 GB/sec at ~10-20 sites.
- Workflows must support >10,000 batch jobs per client.
- Jobs/day must exceed 20,000 per VO, with a >99% success rate.
- Accessible storage >~10 PB.
- Facility availability/uptime must be >99.x%, with no single points of failure.
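To put the targets above in concrete terms, here is a back-of-envelope calculation (my own arithmetic, not from the slide; it assumes the 20,000 jobs/day figure is per VO as stated, and that the 1 GB/sec rate is sustained over a full day):

```python
# A >99% success rate on 20,000 jobs/day leaves a failure budget of
# at most 1% of submitted jobs per VO.
jobs_per_day = 20_000
failure_budget = jobs_per_day // 100
print(failure_budget)  # 200 failed jobs/day tolerated per VO

# 1 GB/sec sustained for a full day is a substantial daily volume.
seconds_per_day = 86_400
tb_per_day = 1 * seconds_per_day / 1000
print(tb_per_day)  # 86.4 TB/day per 1 GB/sec stream
```

At ~10-20 sites each moving at this rate, the facility-wide daily volume would approach a petabyte, which is why the ~10 PB accessible-storage target sits alongside the data-rate requirement.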

OSG Consortium, 2/21/06 21 Year 1 OSG WBS and the Plans
Bakul Banerjee, Project Consultant, is working with all coordinators to complete plans and schedules. Plan for initial review at Thursday's Council meeting. Ready for review by the Science Advisory Council and/or external reviewers in ~3 months.

OSG Consortium, 2/21/06 22 Operations, Security, Troubleshooting, Software
Expect to "Re-Plan": all plans allow for unanticipated opportunities. Make plans help, not hinder.

OSG Consortium, 2/21/06 23 Continued focus on OSG Core Competencies
- Integration: Software, Systems, Virtual Organizations.
- Operations: Common Support & Grid Services.
- Inter-Operation: Bridging Administrative & Technical Boundaries.
…with Integrated Security Operations and Management, and with Validation, Verification and Diagnosis at each step.

OSG Consortium, 2/21/06 24 Reminder of the S/W Stack and Deployment Life-Cycle
VDT increases the effectiveness of Condor and Globus by integrating them with the other services needed for fully functional Cyber-environments. OSG project funding for VDT will enable more storage and data management services to be included in the future.

OSG Consortium, 2/21/06 25
- OSG Facility Operation: Operations, Maintenance, Engagement, Support.
- OSG Facility Provisioning: VDT, Integration, Validation, System Integration Testbed (Ready -> Release).
- OSG Extensions: Requirements, development & testing on parochial grids.
- Resource Providers (Sites) and Applications.
- External Projects (Development & Research): Condor, Globus, EGEE-JRA1, dCache, SRM, US LHC S&C, LIGO PIF, Security for Open Science, Center for Distributed Science, etc.
The OSG Project does not do software development.

OSG Consortium, 2/21/06 26 Join OSG
1. VO registers with the Operations Center. User registers with a VO.
2. Sites register with the Operations Center.
3. VOs and Sites provide a Support Center contact and join Operations groups.
The OSG VO:
1. A VO for individual researchers and small groups.
2. Managed by the OSG itself.
3. Where one can learn how to use the Grid!
Core Operations and Common Support

OSG Consortium, 2/21/06 27 Grid of Grids - from local to global
- CS/IT Campus Grids
- Science Community Grids, e.g. LIGO, STAR, D0 …
- National & International Infrastructures for Science, e.g. TeraGrid, EGEE, NAREGI …
- Campus & Regional Infrastructures, e.g. CrimsonGrid, GLOW, NWICG …

OSG Consortium, 2/21/06 28 Grid of Grids - OSG is one grid among many
- CS/IT Campus Grids
- Science Community Grids, e.g. LIGO, STAR, D0 …
- National & International Infrastructures for Science, e.g. OSG, TeraGrid, EGEE, NAREGI …
- Campus & Regional Infrastructures, e.g. CrimsonGrid, GLOW, NWICG …
Users must be able to operate transparently across Grid boundaries. The OSG program of work focuses on Interoperability and Bridging of data and jobs across these boundaries.

OSG Consortium, 2/21/06 29 Continuing with this meeting

OSG Consortium, 2/21/06 30 Welcome to those from near and far
Most far:
- Bob Jones, Director of Enabling Grids for E-sciencE II, CERN.
- Simon Lin, Director, Computing Centre, Academia Sinica, Taiwan.
- Kazushige Saga, NAREGI, Tokyo Institute of Technology.
- Dave Kelsey, Chair of the Joint Security Working Group, Rutherford Appleton Laboratory, England.
- Sergio Andreozzi, gLite-JRA1, INFN.
Most near:
- University of Washington, Seattle and nearby: David Baker, Richard Coffey, James DeRoest, Tony Hey, Margaret Romine, Oren Sreenby, Gordon Watts.

OSG Consortium, 2/21/06 31 I look forward to a thoughtful and productive meeting and discussions with you all. OSG is For the Community, By the Community, Throughout the Community.