U.S. ATLAS Project Overview John Huth, Harvard University LHC Computing Review, FNAL, November 2001

November 01 John Huth, LHC Computing 2 Outline
- International and U.S. ATLAS
  - Organization: U.S. ATLAS, external groups
- Project Management Plan
- Milestones
- Status
  - Software
  - Facilities
  - Grid efforts
  - Physics
- Funding profile
- Impact

November 01 John Huth, LHC Computing 3 International ATLAS
- Computing Oversight Board
- Computing Steering Group
  - Matrix of detector/task orientation
- PBS structure gives tasks, schedules, resource loading
  - Maps directly onto the U.S. ATLAS WBS
- Planning officer is now Torre Wenaus (U.S. ATLAS Software Manager)
- Software deliverables delineated in Software Agreements
  - Major milestones associated with releases and data challenges
- Data Challenge coordinator: Gilbert Poulard (CERN)

November 01 John Huth, LHC Computing 4 ATLAS Detector/Task Matrix
Columns: Offline Coordinator | Reconstruction | Simulation | Database
- Chair: N. McCubbin | D. Rousseau | K. Amako | D. Malon
- Inner Detector: D. Barberis | D. Rousseau | F. Luehring | S. Bentvelsen
- Liquid Argon: J. Collot | S. Rajagopalan | M. Leltchouk | S. Simion / R. Sobie
- Tile Calorimeter: A. Solodkov | F. Merritt | A. Solodkov | T. LeCompte
- Muon: To Be Named | J.F. Laporte | A. Rimoldi | S. Goldfarb
- LVL 2 Trigger / Trigger DAQ: S. George | S. Tapprogge | M. Weilers | A. Amorim
- Event Filter: F. Touchard, M. Bosman
Physics Coordinator: F. Gianotti; Chief Architect: D. Quarrie

November 01 John Huth, LHC Computing 5 International ATLAS Computing Organization (organization chart)
Bodies shown in the chart: Computing Oversight Board; Computing Steering Group; National Computing Board; Physics; detector systems; Technical Group; Architecture team; QC group; coordinators for simulation, reconstruction, database, and event filter.

November 01 John Huth, LHC Computing 6 Recent Events
- Progress toward a coherent, integrated effort
  - First Software Agreement signed (control/framework)
  - Second one in progress (QA/AC)
  - Data Challenge Coordinator named (Gilbert Poulard)
  - Lund physics meeting (Lund Athena release)
- ARC report
  - Endorsement of Athena
- Upcoming Data Challenge 0
  - Continuity test with the November Athena release
- Personnel changes
  - D. Malon now sole data management leader
  - Helge Meinhard (planning) moves to IT Division; now replaced by Torre Wenaus

November 01 John Huth, LHC Computing 7 Project Core SW FTE

November 01 John Huth, LHC Computing 8 FTE Fraction of Core SW

November 01 John Huth, LHC Computing 9 U.S. ATLAS Goals
- Deliverables to International ATLAS and the LHC projects
  - Software
    - Control/framework (software agreement signed)
    - Portion of data management (event store)
    - Collaboratory tools
    - Detector subsystem reconstruction
    - Grid integration
- Computing resources devoted to data analysis and simulation
  - Tier 1 and Tier 2 centers
- Support of U.S. ATLAS physicists
  - Computing resources
  - Support functions (librarian, nightly builds, site support)

November 01 John Huth, LHC Computing 10 U.S. ATLAS Project

November 01 John Huth, LHC Computing 11 Project Management Plan
- New version with extensive revisions from last year
- Description of institutional MOUs
  - Two draft institutional MOUs exist
- Liaison list
- Performance metrics established
  - Personnel effort (FTEs)
  - Hardware: fraction of the turn-on facility functional per year
- Change control
- Reporting
  - Quarterly reports
- Transition to "Research Program" in FY 2007

November 01 John Huth, LHC Computing 12 U.S. ATLAS WBS Structure
- 2.1 Physics
  - Support of event generators
  - Data challenge support in the U.S.
- 2.2 Software
  - Core software (framework, database)
  - Subsystem efforts
  - Training
- 2.3 Facilities
  - Tier 1
  - Tier 2, infrastructure (networking)
- 2.4 Project Management

November 01 John Huth, LHC Computing 13 U.S. ATLAS Developments
- Athena (control/framework)
  - Lund release done
  - DC0 release
  - Incorporation of the Geant4 interface
- Database
  - Effort augmented
  - Coordination of Oracle, Objectivity, and ROOT evaluations
- Facilities
  - Ramp delayed by the funding profile; DC preparation at reduced scope
- Common grid plan worked out
- Modest personnel ramp: BNL software/facilities, ANL
- Librarian support and nightly builds now at BNL (moved from CERN)

November 01 John Huth, LHC Computing 14 External Groups
- iVDGL funding (Tier 2 personnel, hardware) approved
  - But a 50% cut in hardware relative to original planning
- PPDG effort in progress
- ITR funding of Indiana (grid telemetry)
- Integrated planning on software and facilities for grids
- Liaisons named in the PMP
  - GriPhyN/iVDGL: R. Gardner (J. Schopf, CS liaison)
  - PPDG: T. Wenaus (J. Schopf, CS liaison)
  - EU DataGrid: C. Tull
  - HEP networking: S. McKee

November 01 John Huth, LHC Computing 15 Software
- Migration from SRT to CMT begun
  - Effort redirection, but long-term benefit
- Upcoming November release of Athena
  - Support of DC0
- Funding shortfall impacts
  - Shift of D. Day (USDP, Python scripting); postdoc hire to fill the gap
  - Possible delayed FY03 hire at BNL; loss of ROOT persistency expertise
- Data management architecture proposed (not product specific); see the sketch after this slide
  - ROOT I/O service
- Geant4 integration into Athena
- Development of "pacman" for deployment of software at remote sites (BU)
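To make the "not product specific" data management idea concrete, here is a minimal Python sketch of the layering this slide describes: client code talks to an abstract event-store interface, and a technology-specific backend (a ROOT I/O service, Objectivity, etc.) can be swapped in without touching the clients. All class and method names are hypothetical, not the actual Athena or StoreGate interfaces; Python's pickle stands in for ROOT so the sketch runs standalone.

```python
# Illustrative sketch only: a product-neutral persistency layer in the spirit
# of the proposed data management architecture. Names are hypothetical.
import pickle
from abc import ABC, abstractmethod


class EventStoreBackend(ABC):
    """Technology-specific layer (ROOT I/O service, Objectivity, ...)."""

    @abstractmethod
    def write(self, key: str, event: dict) -> None: ...

    @abstractmethod
    def read(self, key: str) -> dict: ...


class PickleBackend(EventStoreBackend):
    """Stand-in backend so the sketch runs without ROOT installed."""

    def __init__(self):
        self._db = {}

    def write(self, key, event):
        self._db[key] = pickle.dumps(event)

    def read(self, key):
        return pickle.loads(self._db[key])


class EventStore:
    """Product-neutral interface seen by reconstruction/analysis code."""

    def __init__(self, backend: EventStoreBackend):
        self._backend = backend

    def record(self, key, event):
        self._backend.write(key, event)

    def retrieve(self, key):
        return self._backend.read(key)


if __name__ == "__main__":
    store = EventStore(PickleBackend())   # a ROOT-backed service would drop in here
    store.record("evt_000001", {"run": 1, "ntracks": 42})
    print(store.retrieve("evt_000001"))
```

The point of the design is that only the backend class knows about the storage product, which is what lets the Oracle/Objectivity/ROOT evaluations proceed without rewriting client code.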

November 01 John Huth, LHC Computing 16 Facilities Schedule
- LHC start-up projected to be a year later: 2005/2006 shifts to 2006/2007
  - 30% facility in 2006, 100% facility in 2007
- ATLAS Data Challenges (DCs) have, so far, stayed fixed (rough scale arithmetic is sketched after this slide)
  - DC0, Nov/Dec 2001, 10^5 events: software continuity test
  - DC1, Feb-Jul 2002, 10^7 events: ~1% scale test; data used for US ATLAS grid testbed integration tests
  - DC2, Jan-Sep 2003, 10^8 events: ~10% scale test; a serious functionality and capacity exercise
- A high level of US ATLAS facilities participation is deemed very important
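For orientation, a back-of-the-envelope calculation of what the quoted event counts and scale fractions imply. The ~1 MB/event size is an assumption chosen only to illustrate the arithmetic; it is not a number taken from the talk.

```python
# Rough scale implied by the Data Challenge figures quoted above.
# The per-event size is an assumed round number for illustration only.
EVENT_SIZE_MB = 1.0  # assumption: ~1 MB/event

challenges = {
    "DC0": (1e5, None),   # software continuity test; no scale fraction quoted
    "DC1": (1e7, 0.01),   # ~1% scale test
    "DC2": (1e8, 0.10),   # ~10% scale test
}

for name, (events, fraction) in challenges.items():
    volume_tb = events * EVENT_SIZE_MB / 1e6
    line = f"{name}: {events:.0e} events, ~{volume_tb:.1f} TB at {EVENT_SIZE_MB} MB/event"
    if fraction is not None:
        line += f"; implies ~{events / fraction:.0e} events at full scale"
    print(line)
```

Note that DC1 (10^7 events at ~1%) and DC2 (10^8 events at ~10%) both point to roughly 10^9 events per year at full scale, which is why the facility ramp matters so much for DC2 preparation.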

November 01 John Huth, LHC Computing 17 Facilities
- Tier 1 particularly hard hit by the budget shortfall
  - Delays in hiring
  - Scalable online storage prototype work delayed approximately 7 months
  - DC2 capability reduced relative to plan (1% vs. 5%)
  - Small increments ($300k) can help substantially
  - Year-end funding of $284k from DOE (Aug 2001)
  - Major usage of Tier 1 for shielding calculations; anticipate major usage in DCs and in grid tests
  - Examination of tape vs. disk for the event store at the start of data taking
- Tier 2
  - Selection of first prototype centers (Indiana U., Boston U.)
  - iVDGL funding of prototype hardware
  - Deployment of software on testbed sites in progress

November 01 John Huth, LHC Computing 18 CPU Capacity (kSi95)

November 01 John Huth, LHC Computing 19 US ATLAS Persistent Grid Testbed (site map)
- Sites: Brookhaven National Laboratory, Argonne National Laboratory, LBNL-NERSC, UC Berkeley, Indiana University, Boston University, U. Michigan, Oklahoma University
- Prototype Tier 2s and HPSS sites indicated on the map
- Networks: ESnet, Abilene, CalREN, MREN, NTON, NPACI

November 01 John Huth, LHC Computing 20 Grid Efforts/Physics
- Many sources of shared effort: GriPhyN, iVDGL, PPDG, EU activities, new CERN management
- Common U.S. ATLAS plan
  - Use existing tools as much as possible
  - Use existing platforms as much as possible
- Gain experience in (a toy illustration follows this slide)
  - Replica catalogs
  - Metadata description
  - Deployment/release of tools
- Philosophy is to gain expertise, not to await a grand international synthesis
- Physics: support hire for the generator interface and data challenges
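As a toy illustration of the "replica catalog" and "metadata description" concepts the testbed work exercises: a logical file name maps to one or more physical replicas plus a small metadata record. This sketch is purely pedagogical; it is not the interface of Magda, Globus, or any other grid tool named in the talk, and the file names are invented.

```python
# Toy replica catalog: logical file name (LFN) -> physical replicas + metadata.
# Purely pedagogical; not the API of any real grid middleware.
from dataclasses import dataclass, field


@dataclass
class CatalogEntry:
    replicas: list = field(default_factory=list)   # physical file names (PFNs)
    metadata: dict = field(default_factory=dict)   # e.g. dataset name, event count


class ReplicaCatalog:
    def __init__(self):
        self._entries = {}

    def register(self, lfn, pfn, **metadata):
        entry = self._entries.setdefault(lfn, CatalogEntry())
        entry.replicas.append(pfn)
        entry.metadata.update(metadata)

    def lookup(self, lfn):
        return self._entries[lfn]


if __name__ == "__main__":
    cat = ReplicaCatalog()
    cat.register("dc1.simul.0001", "gsiftp://tier1.example.org/data/0001.root",
                 dataset="dc1.simul", events=5000)
    cat.register("dc1.simul.0001", "gsiftp://tier2.example.edu/data/0001.root")
    print(cat.lookup("dc1.simul.0001"))
```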

November 01 John Huth, LHC Computing 21 Networking
- Networking is a crucial component for the success of the grid model of distributed computing
- It has not been included in the project funding profile (agency guidance); nevertheless, it must be planned for
- Transatlantic planning group report (H. Newman, L. Price)
- Tier 1 to Tier 2 connectivity requirements (see the talk by S. McKee, and the rough bandwidth arithmetic after this slide)
- Scale and requirements are being established
- Funding sources must be identified
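To show the kind of sizing that drives Tier 1 to Tier 2 connectivity requirements, a hedged back-of-the-envelope estimate: the sample size, event size, and transfer window below are assumptions chosen only to illustrate the arithmetic, not figures from the planning group report.

```python
# Illustrative Tier 1 -> Tier 2 bandwidth estimate for replicating a data
# challenge sample. Every input below is an assumption for illustration only.
events        = 1e7    # assume a DC1-sized sample is replicated to a Tier 2
event_size_mb = 1.0    # assumed average event size
window_days   = 30     # assumed transfer window

total_bits   = events * event_size_mb * 8e6   # MB -> bits
seconds      = window_days * 24 * 3600
required_bps = total_bits / seconds

print(f"~{required_bps / 1e6:.0f} Mb/s sustained to move "
      f"{events:.0e} events in {window_days} days")
```

Under these assumptions the answer is a few tens of Mb/s sustained per Tier 2, which is the scale at which last-mile and transatlantic capacity become budget items.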

November 01 John Huth, LHC Computing 22 Major Milestones

November 01 John Huth, LHC Computing 23 Comments on Budgeting
- The agencies have worked hard to come up with the funding profile, but given budget issues this has been difficult and the profile has come up short
  - Construction project borrowing
- We cannot plan based on our wishes, but rather on realistic expectations of the current budget profile
- Increase in grid activities: long-term benefit, short-term redirection of effort
- Networking costs for Tier 1 to Tier 2 connections must be factored in (note the transatlantic report); expected to come via external funds, but must still be budgeted
- Relief would come in the form of overall NSF funding for the "Research Program"
  - Overall profile for M&O, upgrades, and computing
  - Major components in computing: Tier 2 sites, grid integration with ATLAS software, software professionals located at CERN (hired through universities)

November 01 John Huth, LHC Computing 24 Funding Guidance

November 01 John Huth, LHC Computing 25 Budget Profile by Item

November 01 John Huth, LHC Computing 27 FTE Profile

November 01 John Huth, LHC Computing 28 FTE by Category in 02

November 01 John Huth, LHC Computing 29 FTEs in FY07

November 01 John Huth, LHC Computing 30 Matching of Profiles

November 01 John Huth, LHC Computing 31 Risks
- Software
  - Loss of expertise in control/framework (scripting); new hire as mitigation
  - Loss of Ed Frank (U. Chicago): data management
  - Delay of a new hire at BNL: ROOT persistency expertise
  - Support questions in tools (e.g., CMT)
- Facilities
  - Slowed ramp-up in personnel and hardware
  - Facility preparation for DC2 implies a reduced scale
- Grid
  - Some shift of effort from core areas into grid development, following the development of an integrated model of computing centers

November 01 John Huth, LHC Computing 32 NSF Proposal
- Covers computing, upgrades, and M&O
- Computing:
  - 3 software FTEs located at CERN (hired through universities)
    - Alleviates a shortfall of 2 FTEs; covers the remaining portion in the out years
  - Physics support person
    - Support of generator interfaces and data challenges
  - Main source of Tier 2 funding
    - Tier 2 hardware
    - Personnel
  - Common team (with US CMS) to debug last-mile networking problems
- Alleviates shortfalls in the program
- N.B. this also frees up DOE funds for the labs, allowing a better Tier 1 ramp and preparation for data challenges

November 01 John Huth, LHC Computing 34 Summary
- Much progress on many fronts
- Funding profile is still an issue
  - Personnel ramp
  - Ramp of the facility
- Some possible solutions
  - Funds for loan payback are incremental
  - The "Research Program" NSF proposal is necessary
- Progress in international collaboration
  - Software agreements
- The single biggest help from the committee would be a favorable recommendation on the computing request in the NSF proposal
  - In addition, endorsement of the proposed project scope, schedule, budgets, and management plan