U.S. ATLAS Computing Project: Budget Profiles, Milestones. Jim Shank, Boston University. Physics and Computing Advisory Panel Review, LBNL, 14-16 Nov. 2002.


Slide 1: U.S. ATLAS Computing Project: Budget Profiles, Milestones. Jim Shank, Boston University. Physics and Computing Advisory Panel Review, LBNL, 14-16 Nov. 2002.

Slide 2: The last 2 days…
- Workshop to prepare the 2003 NSF ITR proposal
  - Large ITR: $15M over 5 years
  - Joint ATLAS/CMS + Computing Sciences
- The workshop was charged with identifying a critical area of computing needed for the LHC that is not being funded by existing programs
  - Participants from a wide spectrum: the LCG project, the EDG project, Trillium (GriPhyN, iVDGL, PPDG), the CMS/ATLAS communities, some other physics experiments…
- Two working groups: the BIG Picture group and the ITR group

Slide 3: Categories of missing pieces
- Transition to production-level grids: middleware support, error recovery, robustness, 24x7 operation, monitoring and system usage optimization, strategy and policy for resource allocation, authentication and authorization, simulation of grid operations, tools for optimizing distributed systems (a minimal sketch of the error-recovery idea follows this slide)
- Globally Enabled Analysis Communities (WG2)
- Enabling Global Collaboration (a medium ITR?)
Slide from WG1 Summary (I. Gaines)
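Error recovery, robustness, and monitoring are the heart of the production-grid bullet above. The following Python sketch shows the shape of such a submit/monitor/resubmit loop; the submit() and poll() functions are entirely hypothetical stand-ins, not the API of any real middleware of the era (Globus, Condor-G, etc.):

```python
import random
import time

MAX_RETRIES = 3        # resubmission budget before a human is alerted
POLL_INTERVAL_S = 0.1  # shortened for the sketch; real systems poll far less often

def submit(job_id):
    """Stand-in for a middleware submission call; returns a job handle."""
    return {"id": job_id, "attempts": 0}

def poll(handle):
    """Stand-in for a status query; fails randomly to exercise the recovery path."""
    return random.choice(["done", "done", "failed"])

def run_with_recovery(job_ids):
    """Submit jobs, monitor them, and automatically resubmit failures."""
    queue = [submit(j) for j in job_ids]
    finished, abandoned = [], []
    while queue:
        time.sleep(POLL_INTERVAL_S)
        survivors = []
        for handle in queue:
            status = poll(handle)
            if status == "done":
                finished.append(handle["id"])
            elif handle["attempts"] < MAX_RETRIES:
                handle["attempts"] += 1         # error recovery: automatic resubmission
                survivors.append(handle)
            else:
                abandoned.append(handle["id"])  # escalate instead of retrying forever
        queue = survivors
        # monitoring hook: in production this would feed a dashboard
        print(f"running={len(queue)} done={len(finished)} abandoned={len(abandoned)}")
    return finished, abandoned

if __name__ == "__main__":
    run_with_recovery(range(10))
```

The point is only that failures are retried a bounded number of times and then escalated rather than silently lost; everything else (status sources, bookkeeping) would come from the grid information system.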

Slide 4: Who fills in the missing pieces?
- Experiments, from existing budgets (but remember the gap between the bare-bones funding level and the originally proposed leadership funding level)
- LCG (from existing funding)
- Current grid projects (both HEP-connected and more general projects)
- Near-future grid projects
- This large ITR: Globally Enabled Analysis Communities
- An additional medium ITR (?): Enabling Global Collaboration
Slide from WG1 Summary (I. Gaines)

Slide 5: Globally Empowered Analysis Communities: think globally, act locally
- User-grid interactions (Mike, Bolek, Craig, Shaowen): data browsing tools, user interfaces, visualisation tools, education and outreach, interactive tools, developing automation (higher-level services, AI)
- Dynamic resource control (Sridhara, Kaushik, John): resource scheduling, job scheduling, environment control, resource auditing, priority and privileges (a toy scheduler sketch follows this slide)
- Data provenance and workflow (Mike, Rick, David Adams): community sharing and collaboration
- Metadata management and tools (Greg, Jaideep, David Malon): specification of data sets, specification of user analysis queries (metadata browser), equivalence
- Data management (Ian, Torre): storage and data management, data management optimisations (object level)
Slide from WG2 Summary (R. Cavanaugh)
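One concrete reading of the "resource scheduling ... resource auditing, priority" items is fair-share scheduling across analysis communities. A toy stdlib-only Python sketch under that assumption; the class name, share values, and job labels are all invented for illustration and correspond to no tool named in the talk:

```python
from collections import deque

class FairShareScheduler:
    """Toy fair-share scheduler: each community owns a fraction of the
    resource, and the community furthest below its share runs next."""

    def __init__(self, shares):
        self.shares = shares                        # e.g. {"atlas": 0.6, "cms": 0.4}
        self.used = {g: 0.0 for g in shares}        # resource auditing
        self.queues = {g: deque() for g in shares}  # per-community FIFO queues

    def submit(self, group, job):
        self.queues[group].append(job)

    def next_job(self, cost=1.0):
        """Dispatch from the community with the lowest usage/share ratio."""
        candidates = [g for g in self.queues if self.queues[g]]
        if not candidates:
            return None
        group = min(candidates, key=lambda g: self.used[g] / self.shares[g])
        self.used[group] += cost                    # charge the community
        return group, self.queues[group].popleft()

if __name__ == "__main__":
    sched = FairShareScheduler({"atlas": 0.6, "cms": 0.4})
    for i in range(5):
        sched.submit("atlas", f"a{i}")
        sched.submit("cms", f"c{i}")
    print([sched.next_job() for _ in range(8)])  # dispatch order tracks the shares
```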

Slide 6: Template for summarising the different topics
- Make a strong connection to the use cases
- How it is different from existing projects; why it is new and revolutionary
- Status of the topic
- Description of abilities
- Generalisation beyond HEP
- How it ties into the title
Slide from WG2 Summary (R. Cavanaugh)

Slide 7: Still need...
- $15M spread over 5 years
  - Scope needs to be consistent with this
  - Need deliverables
- Need designated editors for text: Rob as editor, Ruth assists
- Need list of participants
Slide from WG2 Summary (R. Cavanaugh)

Slide 8: Project Core SW FTE (chart not transcribed)

Slide 9: ATLAS Subsystem/Task Matrix

  Subsystem                   Chair         Reconstruction   Simulation      Database
  Overall coordination        N. McCubbin   D. Rousseau      A. Dell'Acqua   D. Malon
  Inner Detector              D. Barberis   D. Rousseau      F. Luehring     S. Bentvelsen / D. Calvet
  Liquid Argon                J. Collot     S. Rajagopalan   M. Leltchouk    H. Ma
  Tile Calorimeter            A. Solodkov   F. Merritt       V. Tsulaya      T. LeCompte
  Muon                        J. Shank      J.F. Laporte     A. Rimoldi      S. Goldfarb
  LVL2 Trigger / Trigger DAQ  S. George     S. Tapprogge     M. Weilers      A. Amorim / F. Touchard
  Event Filter                V. Vercesi    F. Touchard

Physics Coordinator: F. Gianotti; Chief Architect: D. Quarrie
Computing Steering Group members/attendees: 4 of 19 from the US (Malon, Quarrie, Shank, Wenaus)

Slide 10: Budget Profile Overview
- What has happened since the last review
  - The bare-bones profile from last summer
- The construction project stretch-out
- The NSF M&O/Computing proposal
- New profile estimate

Slide 11: Bare-Bones Budget from June 2002 (budget table not transcribed)

Slide 12: Recent BCP, recently approved (budget table not transcribed)

Slide 13: NSF M&O/Computing proposal budget, submitted Oct. 2002 (budget table not transcribed)

Slide 14: Nov. 2002 Profile Estimate (budget table not transcribed)

Slide 15: Nov. 2002 Profile Estimate Breakout (budget table not transcribed)

Slide 16: Profile comparison (chart not transcribed)

Slide 17: Software Project Developments
- Recent software progress has been driven by the ATLAS data challenges; this will come up in many other talks at this review
- Some details of the ongoing and soon-to-start DCs
- The US ATLAS Testbed
- Review of ATLAS milestones

Slide 18: ATLAS DC1 Phase 1: July-August 2002 (A. Putzer)
Participating countries: Australia, Austria, Canada, CERN, Czech Republic, France, Germany, Israel, Italy, Japan, Nordic, Russia, Spain, Taiwan, UK, USA

Slide 19: ATLAS DC1 Phase 2: October-November 2002
- Pile-up production, at high and low luminosity (a rough sketch of the mixing idea follows this slide)
  - About the same CPU needed as for Phase 1
  - 70 TByte of files
- Additional countries/institutes will join
- Large-scale Grid test foreseen end of November
  - As many sites involved as possible
  - Stability test: ~1-2 weeks
  - Test of the worldwide computing model
- Next steps (2003): reconstruction (scheduled activities), then analysis ('chaotic' access: the most demanding)
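"Pile-up production" means overlaying a luminosity-dependent number of minimum-bias events on each signal event. A stdlib-only Python sketch of the mixing idea; the means below are illustrative values of the order quoted for LHC low- and high-luminosity running, not the actual DC1 production parameters:

```python
import math
import random

# Illustrative average pile-up interactions per crossing (assumed, not DC1 values).
MU = {"low": 4.6, "high": 23.0}

def poisson(mu):
    """Draw a Poisson variate (Knuth's method); stdlib-only on purpose."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def mix_event(signal_event, minbias_pool, luminosity="high"):
    """Overlay a Poisson number of minimum-bias events on one signal event."""
    n = poisson(MU[luminosity])
    overlay = [random.choice(minbias_pool) for _ in range(n)]
    return {"signal": signal_event, "pileup": overlay}

if __name__ == "__main__":
    pool = [f"minbias-{i}" for i in range(1000)]
    mixed = mix_event("signal-0", pool, "high")
    print(len(mixed["pileup"]), "minimum-bias events overlaid")
```

The CPU cost of Phase 2 comes from re-digitizing every signal event with its overlaid minimum-bias events, which is why it is comparable to Phase 1 despite reusing the simulated samples.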

Slide 20: ATLAS DC2: October 2003 - March 2004
- Use Geant4
- Perform large-scale physics analysis
- Use LCG common software
- Make wide use of Grid middleware
- Further test of the computing model
- ~ same amount of data as for DC1
ATLAS DC3: end 2004 - begin 2005
- 5 times more data than for DC2
ATLAS DC4: end 2005 - begin 2006
- 2 times more data than for DC3
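For scale, chaining the slide's own factors: DC2 involves roughly the DC1 data volume, DC3 five times that, and DC4 2 x 5 = 10 times the DC1 volume.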

Slide 21: Summary: Major Milestones (milestone chart not transcribed; legend: green = done, gray = original date, blue = current date)

Slide 22: Major Milestones (milestone chart not transcribed). One DC per year until startup.

Slide 23: Summary
- Great progress/success with the Data Challenges
- The US ATLAS Testbed has become MUCH more functional, driven by the DCs and by the SuperComputing 2002 demonstrations taking place next week
- Interactions with CERN: the ATLAS interaction with the LCG needs strengthening, although the US is quite active; International ATLAS is responding positively to our pressure (SIT)