US CMS Software and Computing: Milestones and Funding Profiles. Matthias Kasemann, Fermilab. October 24, 2000.


Slide 1: US CMS Software and Computing Milestones and Funding Profiles
Matthias Kasemann, Fermilab
FNAL Oversight Panel, October 24, 2000

Slide 2: CMS planning at LHCC review, October 2000

Slide 3: US CMS S&C schedule adapted to LHC

- Constraints for the updated S&C schedule:
  - working CMS detector by mid-2005
  - LHC pilot run in Fall 2005, with a few weeks of luminosity running
  - complete detector and start of LHC luminosity running in Spring 2006
- Lessons learned from other experiments:
  - pilot running (Fall 2005): for the detector as well as for computing and analysis
  - finish facilities before physics data arrive (early FY06)
  - CMS software schedule not changed
- Adapted schedule: delay the start of facilities implementation by 12 months (FY03 -> FY04):
  - 9 months: delayed startup of LHC luminosity running
  - 3 months: US FY05 ends in 9/2005

Slide 4: US CMS S&C funding profile

- Received a draft funding profile from DOE on 5/30/2000.
- Assume the NSF funding profile after discussion.
- This is a BIG step for S&C projects:
  - guidance that allows detailed and solid planning
- We want to continue working with the funding agencies to make S&C for CMS a success:
  - so that science profits from the investment in the LHC program

Slide 5: CAS Project: Scope + Funding

- US CMS CAS personnel contributions to CMS core software:
  - determined by canonical scaling from the CMS top-level resource-loaded WBS for software
  - add ~25% for additional US-specific software support
- Proposal for CAS personnel contingency:
  - add a fixed percentage to the base cost as a management reserve, e.g.
    - 10% for FY 2001 and 2002
    - 25% for FY 2003 and beyond
- All of this is unchanged by the updated schedule.
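The contingency proposal above is simple percentage arithmetic; a minimal sketch makes the composition of the two factors explicit. The percentages (the ~25% US-specific support add-on, 10% reserve through FY2002, 25% from FY2003 on) come from the slide, while the dollar figure in the example is a hypothetical placeholder, not a CMS number.

```python
# Sketch of the CAS contingency proposal. Only the percentages come from
# the slide; the $1.0M base cost below is a hypothetical example value.

US_SUPPORT_FACTOR = 0.25  # extra US-specific software support on top of the CMS share

def management_reserve_rate(fiscal_year: int) -> float:
    """Fixed management-reserve percentage proposed for CAS personnel."""
    return 0.10 if fiscal_year <= 2002 else 0.25

def cas_cost(fiscal_year: int, cms_share_cost: float) -> float:
    """Base cost, plus the US support add-on, plus the management reserve."""
    base = cms_share_cost * (1 + US_SUPPORT_FACTOR)
    return base * (1 + management_reserve_rate(fiscal_year))

# Hypothetical CMS-share personnel cost of $1.0M:
print(round(cas_cost(2001, 1.0), 3))  # 1.25 base factor * 1.10 reserve -> 1.375
print(cas_cost(2003, 1.0))            # 1.25 base factor * 1.25 reserve -> 1.5625
```

Note that the two factors compound: the reserve is applied on top of the US-support add-on, not on the raw CMS share.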

Slide 6: US CMS S&C profile: January 00 version vs. now

US CMS S&C expected: a 40%-25% share of the UF personnel contribution comes from FNAL (not shown here).

Slide 7: Schedule for the US-CMS S&C Project

- To match the proposed DOE/NSF funding profile more closely, as well as the LHC schedule, we have stretched out the CMS Software and Computing Project schedule.
- User Facilities:
  - now through end of FY2003: R&D phase
  - FY2004-FY2006: Implementation phase
  - FY2007 onwards: Operations phase
- This required a new hardware cost estimate and a resource-loaded WBS.

Slide 8: UF: Updated schedule

- Modified:
  - stretched hardware installation for UF and Tier 2
  - re-evaluated UF facility manpower needs; very similar results
- Tier 1 and Tier 2 center scope is:
  - proposed by MONARC,
  - defined by the CMS Computing Model,
  - reviewed by the CERN Hoffmann Review
- Hardware requirements need to remain flexible to respond to:
  - advances in technology:
    - storage: capacity, cost => disk vs. tape, store vs. recalculate
    - networking: bandwidth, latency => local vs. remote data storage and access, replication
    - computing: speed, cost => servers vs. commodity computing
  - results of the review and planning process at CERN
  - results of Grid R&D, availability of integrated 'middleware' software
- => Serious planning requires contingency here!
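The "store vs. recalculate" trade-off mentioned above is at heart a break-even comparison: storing a derived dataset wins when its one-time storage cost is lower than the cumulative cost of recomputing it on demand. A toy sketch, with all cost figures as hypothetical assumptions rather than CMS estimates:

```python
# Toy break-even model for the store-vs-recalculate trade-off.
# All numbers below are hypothetical assumptions, not CMS figures.

def cheaper_to_store(storage_cost_per_gb: float,
                     cpu_cost_per_recalc: float,
                     size_gb: float,
                     expected_accesses: int) -> bool:
    """Storing wins when the one-time storage cost beats repeated recomputation."""
    store_cost = storage_cost_per_gb * size_gb
    recalc_cost = cpu_cost_per_recalc * expected_accesses
    return store_cost < recalc_cost

# Hypothetical: a 100 GB derived dataset at $0.05/GB storage, $2 per
# recomputation, expected to be read 10 times.
print(cheaper_to_store(0.05, 2.0, 100.0, 10))  # store $5 vs recompute $20 -> True
```

The slide's point is that every input to such a comparison (storage cost, CPU cost, access patterns) was still moving in 2000, which is exactly why the plan carries contingency rather than a fixed answer.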

Slide 9: Hardware costing for User Facilities, excluding Tier 2

Slide 10: Hardware costing for User Facilities, including Tier 2 (from GriPhyN)

Slide 11: User Facility Personnel Costs

Slide 12: Total User Facility Costs

Slide 13: Total S&C Project Cost

Management reserve: 10% on manpower.

Slide 14: DOE funding vs. UF (Tier 1) + 2/3 CAS

Management reserve: 10% on manpower.

Slide 15: User Facilities staffing, from a "bottoms-up" approach

Full-time staff (technicians, support staff, computing professionals) working on User Facilities tasks, by WBS item:

- 1.1 Tier 1 Regional Center
- 1.2 System and User Support
- 1.3 Operations and Infrastructure
- 1.4 Tier 2 Regional Centers
- 1.5 Networking
- 1.6 Computing and Software R&D
- 1.7 Construction Phase Computing
- 1.8 Support of FNAL-based Computing
- User Facilities (total)
- (without Tier 2 personnel)

Slide 16: NSF funding vs. UF (Tier 2) + 1/3 CAS

Tier 2 center cost:
- start of prototype centers spread over FY01-04
- final implementation stretched over 2-3 years (FY04-06)
- dominated by potential networking cost

Slide 17: Funding Request

Slide 18: Evaluation of Profiles

- The US CMS S&C project is managed and coordinated by the L1 Project Manager, who carries budget authority.
  - Funding is shared differently for the two subprojects:
    - Tier 1 center funded by DOE
    - Tier 2 centers expected to be funded by NSF
    - CAS traditionally funded 2/3 DOE + 1/3 NSF
- NSF part: under-funded by ~$1M/year
  - Tier 2 center cost dominated by networking cost
  - uncertainty in the networking cost for Tier 2 sites
    - site network connectivity must be an argument in the selection process
  - leveraging from university sites must contribute to the cost of Tier 2 centers
- DOE part:
  - dominated by Tier 1 personnel cost
  - leveraging from FNAL infrastructure? Relevant items:
    - part of WBS 1.3: Operations & Infrastructure
    - part of WBS 1.5: Networking: LAN, WAN, monitoring, security
    - part of WBS 1.8: Desktop systems, support, remote control room
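The traditional agency split described above (DOE: Tier 1 plus 2/3 of CAS; NSF: Tier 2 plus 1/3 of CAS) can be sketched in a few lines. The split ratios come from the slide; the dollar amounts in the example are hypothetical placeholders, not project figures.

```python
# Sketch of the traditional DOE/NSF split for US CMS S&C costs.
# Only the 2/3 - 1/3 CAS split comes from the slide; the example
# costs are hypothetical.

def agency_shares(tier1_cost: float, tier2_cost: float, cas_cost: float):
    """Return (DOE share, NSF share) under the traditional split."""
    doe = tier1_cost + cas_cost * 2 / 3  # Tier 1 plus 2/3 of CAS
    nsf = tier2_cost + cas_cost / 3      # Tier 2 plus 1/3 of CAS
    return doe, nsf

# Hypothetical annual costs in $M:
doe, nsf = agency_shares(tier1_cost=5.0, tier2_cost=3.0, cas_cost=3.0)
print(doe, nsf)  # 7.0 4.0
```

Under this split, any Tier 2 networking cost overrun lands entirely on the NSF side, which is why the slide flags the ~$1M/year NSF shortfall.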

Slide 19: Conclusion (1)

- The project needs a project office with:
  - two project engineers supporting the L1 PM:
    - coordinating activities at all levels
    - overseeing purchases
    - establishing MoUs and SOWs
    - preparing for reviews
    - tracking the budget
    - tracking project progress
  - not yet added to the previous numbers
- Need a management reserve / contingency for manpower (10% proposed, shown in the graphs).
- The back-loaded funding profile creates big problems now; additional help is needed.
- It is essential that we keep the flexibility to use funds where they are needed most; only in this way can we take advantage of matching funds.

Slide 20: Summary

- The profile recognizes the needs for operating S&C.
- The profiles are heavily back-loaded towards 2005.
- Result:
  - funding does not allow R&D and implementation to finish for 2006
  - very hard to reach milestones:
    - deliverables for Core Application Software
    - contributions of the US share to Mock Data Challenges, production exercises for trigger and physics studies, and test beam analyses for detector optimization
  - preliminary analysis: implementation ready by 2007
- Severe impact for US-based CMS data analysis:
  - will not be able to fully contribute to CMS analysis
  - will compromise the readiness of CMS and US CMS for physics
  - the disadvantage can only slowly be corrected because of operational needs