
U.S. ATLAS Software WBS 2.2 S. Rajagopalan July 8, 2003 DOE/NSF Review of LHC Computing

Outline

- Organizational Issues
  - ATLAS & U.S. ATLAS software
- Current Affairs
  - Current resource allocation, including LCG contributions
  - Major milestones met
- FY04 Planning
  - Planning and coordination with international ATLAS
  - Near-term milestones
  - Priorities and request for FY04
- Conclusions

Organizational Issues

New Computing Organization

[Organization chart of the new ATLAS computing organization.]

Computing Management Board

- Coordinates and manages computing activities; sets priorities and takes executive decisions
- Membership:
  - Computing Coordinator (chair)
  - Software Project Leader (D. Quarrie, LBNL)
  - TDAQ Liaison
  - Physics Coordinator
  - International Computing Board Chair
  - Grid, Data Challenge and Operations Coordinator
  - Planning & Resources Coordinator (T. LeCompte, ANL)
  - Data Management Coordinator (D. Malon, ANL)
- Meets bi-weekly

Software Project Management Board

- Coordinates the coherent development of software (core, applications and software support)
- Membership:
  - Software Project Leader (chair): D. Quarrie
  - Simulation Coordinator
  - Event Selection, Reconstruction & Analysis Tools Coordinator
  - Core Services Coordinator (D. Quarrie)
  - Software Infrastructure Team Coordinator
  - LCG Applications Liaison (T. Wenaus, BNL)
  - Physics Liaison
  - TDAQ Liaison
  - Sub-system coordinators: Inner Detector, Liquid Argon, Tile, Muon
    - Liquid Argon: S. Rajagopalan (BNL); Muon: S. Goldfarb (U. Michigan)
- Meets bi-weekly

US ATLAS Software Organization

- Software Project (WBS 2.2): S. Rajagopalan
  - Core Services (WBS 2.2.2): D. Quarrie
  - Data Management (WBS 2.2.3): D. Malon
  - Application Software (WBS 2.2.4): F. Luehring
  - Software Support (WBS 2.2.5): A. Undrus
- US ATLAS software WBS scrubbed, consistent with the ATLAS WBS
- Resource loading and reporting established at Level 4
- Major change compared to the previous WBS: Production and Grid Tools & Services moved under Facilities Coordination (WBS 2.2.1)

Current Affairs

WBS 2.2 Coordination

- David Quarrie (LBNL):
  - ATLAS Software Project Manager
  - ATLAS Chief Architect
  - U.S. ATLAS Core Services Level 3 Manager
- David Malon (ANL):
  - ATLAS Data Management Coordinator
  - U.S. ATLAS Data Management Level 3 Manager
- Other U.S. ATLAS personnel playing leading roles in ATLAS:
  - S. Goldfarb (Muon), T. LeCompte (Planning), S. Rajagopalan (LAr), T. Wenaus (LCG Liaison)

WBS 2.2.2 Core Services (D. Quarrie)

- P. Calafiura (LBNL): framework support, event merging, EDM infrastructure
- M. Marino (LBNL): SEAL plug-in and component support
- W. Lavrijsen (LBNL): user interfaces, Python scripting, binding to the dictionary, integration with GANGA (see the sketch below)
- C. Leggett (LBNL): conditions infrastructure, G4 service integration in Athena, histogramming support; redirected to other tasks in FY04
- H. Ma, S. Rajagopalan (BNL) (base program): EDM infrastructure
- C. Tull (LBNL) (PPDG): Athena grid integration coordination
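
As a concrete illustration of the scripting item above, here is a minimal mock-up of Python-driven job steering of the kind these bindings enable. Every class, method and property name here is hypothetical; this is not the actual Athena or GANGA API, just the shape of the interaction.

```python
# Illustrative mock-up only: hypothetical names, not the Athena/GANGA API.

class Algorithm:
    """Stand-in for a framework algorithm exposed to Python."""
    def __init__(self, name, **properties):
        self.name = name
        self.properties = dict(properties)  # tunable from the Python prompt

class AppMgr:
    """Stand-in for an application manager steered interactively."""
    def __init__(self):
        self.top_algs = []

    def add_algorithm(self, alg):
        self.top_algs.append(alg)

    def run(self, n_events):
        # The real framework dispatches C++ algorithms per event; this just
        # shows the control flow a user would script from Python.
        for i in range(n_events):
            for alg in self.top_algs:
                print(f"event {i}: executing {alg.name} {alg.properties}")

app = AppMgr()
app.add_algorithm(Algorithm("CaloCellMaker", MaxCells=200000))
app.run(2)
```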

WBS 2.2.2 Key Accomplishments

- Python-based user interfaces to CMT, Athena, and ROOT
- Interval of Validity Service to allow time-based retrieval of conditions data into transient memory (sketched below)
- Support for the plug-in manager in LCG/SEAL
- gcc 3.2 support, multithreading support, pile-up support
- Services to upload persistent addresses for on-demand retrieval of data objects
- Common material definition across sub-systems; creation of G4 geometries from this description demonstrated
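
The Interval of Validity Service item lends itself to a short sketch: conditions payloads are registered with a [start, end) validity range and retrieved by event time. This is a conceptual sketch with hypothetical names, not the interface of the ATLAS service.

```python
# Conceptual sketch of interval-of-validity lookup (hypothetical names).
import bisect

class IOVStore:
    def __init__(self):
        self._starts = []   # sorted interval start times
        self._entries = []  # (start, end, payload), parallel to _starts

    def add(self, start, end, payload):
        i = bisect.bisect_left(self._starts, start)
        self._starts.insert(i, start)
        self._entries.insert(i, (start, end, payload))

    def get(self, time):
        # Rightmost interval whose start is <= time, then a bounds check.
        i = bisect.bisect_right(self._starts, time) - 1
        if i >= 0:
            start, end, payload = self._entries[i]
            if start <= time < end:
                return payload
        raise KeyError(f"no conditions valid at t={time}")

store = IOVStore()
store.add(0, 1000, {"hv_setting": 2000.0})
store.add(1000, 5000, {"hv_setting": 1995.5})
assert store.get(1500)["hv_setting"] == 1995.5
```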

WBS 2.2.3 Data Management (D. Malon)

- S. Vanyachine (ANL): database services & servers, NOVA database
- Kristo Karr (ANL): new hire, replaces S. Eckmann; collections, catalogs and metadata
- Valeri Fine (BNL): integration of POOL with Athena
- David Adams (BNL): event datasets
- Victor Perevoztchikov (BNL): POOL evaluation, foreign object persistence in ROOT

WBS 2.2.3 Key Accomplishments

- ATLAS-specific:
  - Athena-POOL conversion service prototype; will be available to end users in July (tied to the POOL release)
  - Support for the NOVA database (primary source of detector description for simulation)
  - Support for interval of validity
  - NOVA automatic object generation
  - Data additions, embedded MySQL support for G4
  - Authentication, access to databases behind firewalls
- LCG contributions:
  - Delivered the POOL collections/metadata work package interface, documentation, and unit tests
  - Delivered a relational implementation of POOL explicit collections (see the sketch below)
  - Delivered MySQL and related package support
  - Foreign object persistence
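
To make the collections/metadata deliverable concrete: an explicit event collection stores event references together with a few tag attributes, so events can be selected by querying the tags without reading full event data. The sketch below is illustrative Python, not the POOL API; the relational implementation keeps such rows in database tables.

```python
# Conceptual sketch of an explicit event collection (illustrative names).

class EventCollection:
    def __init__(self):
        self._rows = []  # (event reference, tag-attribute dict)

    def insert(self, event_ref, **attributes):
        self._rows.append((event_ref, attributes))

    def select(self, predicate):
        # Query the tags only; full event data is never touched here.
        return [ref for ref, attrs in self._rows if predicate(attrs)]

coll = EventCollection()
coll.insert("file1:event0042", n_jets=3, missing_et=55.0)
coll.insert("file1:event0043", n_jets=1, missing_et=12.0)
picked = coll.select(lambda a: a["n_jets"] >= 2 and a["missing_et"] > 30.0)
print(picked)  # ['file1:event0042']
```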

WBS 2.2.4 Application Software (F. Luehring)

- Geant3 simulation support: BNL
- Calorimeter (LAr & Tile) software incl. calibration: ANL, BNL, Nevis Labs, U. Arizona, U. Chicago, U. Pittsburgh, SMU
- Pixel, TRT detector simulation & digitization: Indiana U., LBNL
- Muon reconstruction and database: BNL, Boston U., LBNL, U. Michigan
- Hadronic calibration, tau and jet reconstruction: U. Arizona, U. Chicago, ANL, BNL, LBNL
- Electron-gamma reconstruction: BNL, Nevis Labs, SMU
- High Level Trigger software: U. Wisconsin
- Physics analysis with new software: U.S. ATLAS-wide

WBS 2.2.5 Software Support (A. Undrus)

- Release and maintenance of ATLAS and all associated external software (including LCG software and LHCb Gaudi builds) at the Tier 1 Facility
- Deployment of a nightly build system at BNL and CERN, now used by LCG as well (a rough sketch of such a driver follows)
- Testing releases with new compilers (gcc 3.2, SUN 5.2)
- Software Infrastructure Team: forum for discussing issues related to support of ATLAS software and associated tools; A. Undrus is a member of this body
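
A nightly build system of the kind described can be pictured as a small driver: a dated log area and a checkout/build/test sequence, with each step logged and gated on the previous one. The commands below are echo placeholders and the layout is hypothetical, not that of the actual BNL system.

```python
# Rough sketch of a nightly-build driver (placeholder commands and paths).
import datetime
import pathlib
import subprocess

def run_step(name, cmd, log_dir):
    """Run one build step, capture its output, report pass/fail."""
    log = log_dir / f"{name}.log"
    with open(log, "w") as out:
        rc = subprocess.call(cmd, stdout=out, stderr=subprocess.STDOUT)
    status = "OK" if rc == 0 else f"FAILED (rc={rc})"
    print(f"{name}: {status}")
    return rc == 0

def nightly(workdir):
    stamp = datetime.date.today().isoformat()
    log_dir = pathlib.Path(workdir) / "logs" / stamp
    log_dir.mkdir(parents=True, exist_ok=True)
    ok = run_step("checkout", ["echo", "cvs checkout ..."], log_dir)
    ok = ok and run_step("build", ["echo", "cmt broadcast gmake ..."], log_dir)
    ok = ok and run_step("test", ["echo", "run unit tests ..."], log_dir)
    return ok

if __name__ == "__main__":
    nightly("/tmp/nightly")
```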

US FY03 Contribution in International Context

[Table: FTE by category (Framework, EDM, Detector Description, Data Management, Graphics, SW Infrastructure), with US, non-US, total, and LCG columns; US total 9.05 FTE.]

* Excludes the coordination-role contributions of David Quarrie and Torre Wenaus

LCG Application Component

- US effort in SEAL: 1.0 FTE (FY03)
  - Plug-in manager (M. Marino, 0.75 FTE, LBNL)
    - In internal use by POOL now; full integration into Athena in Q (a minimal sketch of the plug-in idea follows)
  - Scripting services (W. Lavrijsen, 0.25 FTE, LBNL)
    - Python support and integration
- US effort in POOL: 1.2 FTE (FY03)
  - Principal responsibility for the POOL collections and metadata work package
    - D. Malon, K. Karr, S. Vanyachine (0.5 FTE, ANL)
  - POOL datasets (D. Adams, 0.2 FTE, BNL)
  - Common data management software
    - V. Perevoztchikov: ROOT I/O foreign object persistence (0.3 FTE, BNL)
    - POOL MySQL package and server configurations (0.2 FTE, ANL)
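
The plug-in manager idea (components located and instantiated by name, so the framework has no compile-time dependency on the implementations) reduces to a registry of factories. Below is a minimal sketch in Python; the real SEAL plug-in manager is C++ and discovers components in shared libraries.

```python
# Minimal plug-in registry sketch (illustrative, not the SEAL C++ API).

_registry = {}

def register(name):
    """Decorator: publish a factory under a string key."""
    def decorator(factory):
        _registry[name] = factory
        return factory
    return decorator

def create(name, *args, **kwargs):
    """Instantiate a component by name, with no static dependency on it."""
    if name not in _registry:
        raise KeyError(f"no component registered under '{name}'")
    return _registry[name](*args, **kwargs)

@register("HistogramSvc")
class HistogramSvc:
    def book(self, title):
        print(f"booked histogram: {title}")

svc = create("HistogramSvc")  # client code knows only the string key
svc.book("jet pT")
```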

US ATLAS Contribution in LCG

[Chart: snapshot (June 2003) of US ATLAS contributions to the LCG Applications Area only.]

ATLAS Interactions with LCG

- Lack of manpower has made ATLAS participation weaker than we would like
- Little or no effort available to:
  - Participate in design discussions of POOL & SEAL components for which we are not directly responsible
  - Evaluate and test new features
  - Write ATLAS acceptance tests for POOL releases and for specifically requested features
  - Ensure that ATLAS priorities stay prominent in LCG plans (ATLAS does this, but our voice has at times seemed not as loud as that of our sister experiments)
- Less development contributed to the collections/metadata work package (for which we are responsible) than we would have liked, though this should improve soon with the recent hire at ANL

FY04 Plans

International ATLAS Planning

- ATLAS now has a planning officer: T. LeCompte (ANL)
- The current focus is on defining the WBS and establishing coherent short-term plans
  - The US WBS is being used as a starting point!
- Responsible for monitoring all deliverables, including non-ATLAS components (such as LCG), and assessing the impact of any delays
- Responsible for establishing software agreements and scope with international ATLAS institutions

ATLAS Computing Timeline

- Jul 03: POOL/SEAL release (NOW)
- Jul 03: ATLAS release 7 (with POOL persistency)
- Aug 03: LCG-1 deployment
- Dec 03: ATLAS completes Geant4 validation
- Mar 04: ATLAS release 8
- Apr 04: DC2 Phase 1: simulation production
- Jun 04: DC2 Phase 2: reconstruction (the real challenge!)
- Jun 04: Combined test beams (barrel wedge)
- Dec 04: Computing Model paper
- Jul 05: ATLAS Computing TDR and LCG TDR
- Oct 05: DC3: produce data for the PRR and test LCG-n
- Nov 05: Computing Memorandum of Understanding
- Jul 06: Physics Readiness Report
- Oct 06: Start commissioning run
- Jul 07: GO!

Major Near-Term Milestones

- July to Dec 2003: SEAL/POOL/PI deployment by LCG
- Sept 2003: Geant4-based simulation release
- Dec 2003: Validate the Geant4 release for DC2 and the test beam
- Dec 2003: First release of the full ATLAS software chain using LCG components and Geant4, for use in DC2 and the combined test beam
- Spring 2004: Combined test-beam runs
- Spring 2004: Data Challenge 2
  - The principal means by which ATLAS will test and validate its proposed Computing Model
- Dec 2004: Computing Model document released

U.S. Scope Issues

- Goal: develop sufficient core software infrastructure to deploy and exercise a reasonable prototype of the ATLAS Computing Model
  - ATLAS is quite far from being able to do this
  - Now is not the time to sacrifice core software development; doing so puts the TDR, and hence readiness for LHC turn-on, at risk
- The U.S. was asked to lead the effort in coordinating, developing and deploying the ATLAS architecture (from ground zero in 1999)
  - Leadership roles in the Software Project, Architecture and Data Management, plus major responsibilities, but minimal resources to work with
  - We are responsible for ensuring the success of the ATLAS architecture
- Efforts continue to encourage and recruit non-US institutions and US universities to contribute to core software, and to leverage the LCG

Core Software & Physicists

- The presence of a strong core team in the U.S. has helped U.S. physicists make significant contributions to reconstruction, simulation and physics analysis, in turn allowing them to play an influential role in the overall ATLAS software program
  - Examples: LAr and Inner Detector simulation; Calorimeter and Muon reconstruction; event generation infrastructure; egamma, tau and jet reconstruction; calibration; ...
- Conversely, this has also allowed U.S. physicists to provide valuable feedback to core software and in some cases contribute to core development
  - Examples: the Event Data Model and Detector Description efforts
- This harmony is necessary for the U.S. to develop the needed expertise and contribute effectively to the physics at turn-on

Incremental Effort: Core Services

- Redirections:
  - C. Leggett (0.5 FTE from calibration infrastructure to EDM)
  - M. Marino (0.25 FTE from training to SEAL/framework)
- Additions (in decreasing priority):
  - FTE in Detector Description, WBS ... (U. Pittsburgh): new hire to work with J. Boudreau
  - FTE in Analysis Tools support, WBS ...: new hire or redirection of effort
  - FTE in Graphics, WBS ... (UC Santa Cruz): existing person (G. Taylor) who is currently making significant contributions to ATLANTIS (the ATLAS graphics package)

Detector Description

- ATLAS lacked a detector description model
  - Numbers were hardwired in reconstruction, with no commonality with simulation
- Along came Joe Boudreau (U. Pittsburgh), bringing CDF experience
  - Successfully designed, developed and deployed a prototype model for both material and readout geometry; we encouraged this!
  - Automatically handles alignments; optimized for memory (5 MB to describe the ATLAS geometry; see the sketch below); not coupled to visualization software
  - Currently resident at Oxford, helping sub-systems migrate
- No surprise: the workload on Joe has increased
  - Critical items include the Material Integration Service, a configuration utility, identifiers, and a transient model for readout geometry
- It is important to support such university-based initiatives in core software
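
Much of the quoted memory figure comes from sharing: identical volumes keep one immutable shape-and-material description, and each repeated placement stores only its transform. Here is a sketch of that flyweight idea, with illustrative names rather than the model's actual classes:

```python
# Flyweight sketch of shared geometry (illustrative names only).

class LogicalVolume:
    """One shared, immutable description of a shape plus its material."""
    def __init__(self, shape, material):
        self.shape = shape
        self.material = material

class PhysicalVolume:
    """A placement: a reference to a shared logical volume + a transform."""
    def __init__(self, logvol, transform):
        self.logvol = logvol        # shared, never copied
        self.transform = transform  # per-placement data only

cell = LogicalVolume(shape="box 4x4x470 mm", material="liquid argon")
# Thousands of placements reuse the single LogicalVolume instance:
placements = [PhysicalVolume(cell, transform=(0.0, 0.0, 4.0 * i))
              for i in range(10000)]
assert all(p.logvol is cell for p in placements)
```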

Incremental Effort: Data Management

- Our plan has always been to sustain a 6.5 FTE effort
- Recent cuts in 2002 ...
  - Ed Frank, U. Chicago
  - BNL hire: job offered but retracted due to last-minute budget cuts
- ... have impacted our ability to deliver what was promised:
  - Unable to save and restore objects from the persistent event store
  - No ATLAS interfaces to event collections, catalogs and metadata
- Approximate allocation of new effort (in decreasing priority):
  - FTE in Collections, Catalogs, and Metadata (WBS ...)
  - FTE in Common Data Management Software (WBS ...)
  - FTE in Event Store (WBS ...)
  - Redirect from WBS ... & ... (0.5 each) if no funds are available

Impact of Insufficient Funds

(Listed in descoping order, corresponding to funding Models 4, 5 and 6.)

- -1.0 FTE in Graphics: impacts our ability to have any reasonable visualization software for the test beam or Data Challenge 2
- FTE in Analysis Tools: impacts our ability to deliver a framework for analysis
- FTE in Data Management:
  - 0.5 FTE for supporting non-event data management
  - 0.5 FTE for supporting basic database services
- FTE in Detector Description: jeopardizes our ability to deliver key components, including the Material Integration Service and a common geometry for simulation and reconstruction
- FTE in Common Data Management Software: impacts contributions to POOL and integration aspects, and schema management
- FTE in Event Store: support for a persistent EDM and event selection

FY04 Ramp-Up Cost

[Table: prioritized incremental ramp-up in FTE and cost (FY04 k$), against the FY04 guidance from J. Shank.]

WBS-Personnel Summary

[Table: summary of personnel by WBS item.]

Conclusions

- Request for +5 FTE in FY04:
  - 2.5 FTE to bring Data Management to its intended level of effort
  - 1 FTE, university-based, for Detector Description
  - 0.5 FTE as a contribution to Analysis Tools
  - 1 FTE, university-based, to support Graphics
- The guidance given for FY04 can accommodate only 1.5 FTE
- The U.S. ATLAS LCG contribution will be 4.0 FTE in FY04
  - 2.0 FTE each in the Core Services and Data Management work packages