1
ATLAS and GridPP
GridPP Collaboration Meeting, Edinburgh, 5th November 2001
RWL Jones, Lancaster University
2
ATLAS Needs
- Long term, ATLAS needs a fully Grid-enabled reconstruction, analysis and simulation environment
- Short term, the first ATLAS priority is a Monte Carlo production system, building towards the full system
- ATLAS has an agreed program of Data Challenges (based on MC data) to develop and test the computing model
3
Data Challenge 0
- Runs from October to December 2001
- Continuity test of the MC code chain; only modest samples (10^5 events), essentially all in flat-file format
- All the Data Challenges will be run on Linux systems; compilers are distributed with the code if the correct version is not already installed locally
4
Data Challenge 1
- Runs in the first half of 2002
- Several sets of 10^7 events (high-level trigger studies, physics analysis)
- Intend to generate and store 8 TB in the UK, 1-2 TB in Objectivity
- Will use the M9 DataGrid deliverables and as many other Grid tools as time permits
- Tests of distributed reconstruction and analysis
- Tests of database technologies
5
Data Challenge 2
- Runs in the first half of 2003
- Will generate several samples of 10^8 events, mainly in OO databases
- Full use of Testbed 1 and Grid tools
- Complexity and scalability tests of the distributed computing system
- Large-scale distributed physics analysis using Grid tools, calibration and alignment
6
LHC Computing Model (Cloud)
[Diagram: the LHC computing cloud, with CERN at the centre, Tier-1 centres (UK, France, Italy, NL, Germany, USA FermiLab, USA Brookhaven, ...), Tier-2 centres, labs and university groups, down to physics-department desktops.]
7
Implications of Cloud Model
- Internal: need cost sharing between global regions within the collaboration
- External (on Grid services): need authentication/accounting/priority on the basis of experiment/region/team/local region/user (a minimal policy sketch follows below)
- Note: the NW believes this is a good model for Tier-2 resources as well
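As an illustration of the kind of policy the Grid services would need to enforce, here is a minimal, hypothetical Python sketch of an authorisation/priority decision keyed on experiment, region, team and user. The policy table, quotas and field names are invented for illustration; this is not any actual GridPP or DataGrid interface:

from dataclasses import dataclass

@dataclass
class Credential:
    """Who is asking for the resource (hypothetical fields)."""
    experiment: str   # e.g. "ATLAS"
    region: str       # e.g. "UK"
    team: str         # e.g. "HLT-studies"
    user: str

# Hypothetical policy table: (experiment, region) -> (quota in CPU-hours, priority)
POLICY = {
    ("ATLAS", "UK"): (10_000, 5),
    ("LHCb",  "UK"): (5_000, 3),
}

def authorise(cred: Credential, requested_cpu_hours: float, used_cpu_hours: float):
    """Return (allowed, priority) for a resource request.

    A real system would also account per team and per user; here only the
    experiment/region quota is checked, to keep the sketch short.
    """
    quota, priority = POLICY.get((cred.experiment, cred.region), (0, 0))
    allowed = used_cpu_hours + requested_cpu_hours <= quota
    return allowed, priority

if __name__ == "__main__":
    cred = Credential("ATLAS", "UK", "HLT-studies", "rjones")
    print(authorise(cred, requested_cpu_hours=500, used_cpu_hours=9_800))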
8
ATLAS Software
- Late in moving to OO, as the physics TDR etc. were given a high priority
- Generation and reconstruction are now done in C++/OO within the Athena framework
- Detector simulation is still in transition to OO/C++/Geant4; DC1 will still use Geant3
- Athena shares a common framework (Gaudi) with LHCb
9
Simulation software for DC1
- Particle-level simulation in ATHENA (GeneratorModules, C++, Linux): Pythia 6 plus code dedicated to B-physics, with PYJETS converted to HepMC; the BaBar EvtGen package to follow later
- Detector simulation in Dice (slug + Geant3, Fortran), producing GENZ + KINE banks in ZEBRA format
- Fast detector simulation in ATHENA: Atlfast++ reads HepMC and produces ntuples
- Reconstruction (C++) reads GENZ + KINE, converts to HepMC and produces ntuples
(A schematic sketch of this chain follows.)
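Purely to make the data flow in the chain concrete, here is a hypothetical Python sketch of the stages. The real chain is Fortran and C++ inside Dice and ATHENA; all function names and record formats below are invented stand-ins:

def generate_events(n_events):
    """Particle-level generation (stands in for Pythia 6 via GeneratorModules).
    Returns event records in a HepMC-like form."""
    return [{"event": i, "format": "HepMC"} for i in range(n_events)]

def simulate_detector(hepmc_events):
    """Full detector simulation (stands in for Dice: slug + Geant3).
    Produces GENZ/KINE banks in ZEBRA-like output."""
    return [{"event": e["event"], "banks": ["GENZ", "KINE"], "format": "ZEBRA"}
            for e in hepmc_events]

def fast_simulation(hepmc_events):
    """Fast simulation (stands in for Atlfast++ in ATHENA), producing ntuples."""
    return [{"event": e["event"], "ntuple": "atlfast"} for e in hepmc_events]

def reconstruct(zebra_events):
    """Reconstruction reads GENZ/KINE, converts to HepMC and produces ntuples."""
    return [{"event": e["event"], "ntuple": "reco"} for e in zebra_events]

if __name__ == "__main__":
    hepmc = generate_events(3)
    zebra = simulate_detector(hepmc)
    print(fast_simulation(hepmc))
    print(reconstruct(zebra))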
10
Requirement Capture
- Extensive use-case studies: "ATLAS Grid Use Cases and Requirements", 15/X/01
- Many more could be developed, especially in the monitoring areas
- Short-term use cases centred on immediate MC production needs
- Obvious overlaps with LHCb: joint projects
- Three main projects defined: "Proposed ATLAS UK Grid Projects", 26/X/01
11
Grid User Interface for Athena
- Completely common project with LHCb
- Obtains resource estimates and applies quota and security policies
- Queries installation tools: is the correct software installed? Install if not
- Job submission guided by the resource broker
- Run-time monitoring and job deletion
- Output to MSS and bookkeeping update (the workflow is sketched below)
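To make the sequence of steps concrete, here is a minimal, hypothetical Python sketch of such a submission workflow. The function names, broker and MSS interfaces are invented placeholders, not the actual Ganga/Athena or DataGrid APIs; in this sketch the broker chooses the site before the installation check:

def estimate_resources(job_options):
    """Rough CPU/storage estimate for the job (placeholder heuristic)."""
    return {"cpu_hours": 10 * job_options["n_events"] / 1000,
            "disk_gb": 2 * job_options["n_events"] / 1000}

def within_quota(user, estimate):
    """Apply quota and security policies (placeholder: always allow)."""
    return True

def software_installed(site, release):
    """Query the installation tools at the site (placeholder)."""
    return False

def install_release(site, release):
    print(f"installing ATLAS release {release} at {site}")

def submit_via_broker(job_options, estimate):
    """Hand the job to the resource broker; returns a job id and chosen site."""
    return "job-0001", "site-chosen-by-broker"

def monitor(job_id):
    print(f"monitoring {job_id} (could also delete it here)")

def store_output_and_update_bookkeeping(job_id):
    print(f"copying output of {job_id} to MSS and updating the bookkeeping DB")

if __name__ == "__main__":
    opts = {"n_events": 100_000, "release": "2.0.0"}
    est = estimate_resources(opts)
    if within_quota("rjones", est):
        job_id, site = submit_via_broker(opts, est)
        if not software_installed(site, opts["release"]):
            install_release(site, opts["release"])
        monitor(job_id)
        store_output_and_update_bookkeeping(job_id)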
12
Installation Tools
- Tools to automatically generate installation kits, deploy them using Grid tools and install at remote sites via a Grid job
- Should be integrated with a remote autodetection service for installed software (a minimal sketch of the detect-and-install step follows)
- Initial versions should cope with pre-built libraries and executables
- Should later deploy the development environment
- ATLAS and LHCb build environments are converging on CMT, so there is some commonality here
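As an illustration of the autodetection-plus-installation step, here is a hypothetical Python sketch. The kit layout (a tarball of pre-built libraries and executables dropping a VERSION file), the install directory and the unpack command are all assumptions for illustration; a real kit would be built and shipped with the Grid tools mentioned above:

import os
import subprocess

def installed_release(install_dir):
    """Detect which release, if any, is already installed at the site.
    Assumes (hypothetically) that a kit drops a VERSION file in its top directory."""
    version_file = os.path.join(install_dir, "VERSION")
    if os.path.exists(version_file):
        with open(version_file) as f:
            return f.read().strip()
    return None

def install_kit(kit_tarball, install_dir):
    """Unpack a pre-built installation kit (libraries and executables only)."""
    os.makedirs(install_dir, exist_ok=True)
    subprocess.run(["tar", "-xzf", kit_tarball, "-C", install_dir], check=True)

def ensure_release(required, kit_tarball, install_dir="/opt/atlas"):
    """Install the kit only if the required release is not already present."""
    current = installed_release(install_dir)
    if current == required:
        print(f"release {required} already installed")
    else:
        print(f"found {current!r}, installing {required} from kit")
        install_kit(kit_tarball, install_dir)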
13
MC Production System
- For DC1, will use the existing MC production system (Geant3), integrated with the M9 tools (aside: M9/WP8 validation and DC kit development proceed in parallel)
- Decomposition of the MC system into components: Monte Carlo job submission, bookkeeping services, metadata catalogue services, monitoring and quality-control tools (see the interface sketch below)
- Bookkeeping and data-management projects are already ongoing; will work in close collaboration, with a good link to the US projects
- Close link with Ganga developments
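Purely to illustrate the decomposition, here is a hypothetical Python sketch of the component interfaces. The class and method names are invented, not those of any existing ATLAS, Ganga or Magda package:

from abc import ABC, abstractmethod

class JobSubmission(ABC):
    """Monte Carlo job submission component."""
    @abstractmethod
    def submit(self, job_description) -> str: ...

class Bookkeeping(ABC):
    """Bookkeeping service: records what was produced, where and how."""
    @abstractmethod
    def record(self, job_id: str, outputs: list) -> None: ...

class MetadataCatalogue(ABC):
    """Metadata catalogue: maps dataset-level metadata to logical files."""
    @abstractmethod
    def register(self, dataset: str, metadata: dict) -> None: ...

class Monitoring(ABC):
    """Monitoring and quality-control tools."""
    @abstractmethod
    def check(self, job_id: str) -> bool: ...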
14
- Allow regional management of large productions
- Job script and steering generated
- Remote installation as required
- Production site chosen by the resource broker
- Generate events and store them locally
- Write log to the web
- Copy data to the local/regional store through an interface with Magda (data management)
- Copy data from local storage to the remote MSS
- Update the book-keeping database (the data-handling steps are sketched below)
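To make the tail of this workflow concrete, here is a hypothetical Python sketch of the data-handling steps after events have been generated. The Magda, MSS and bookkeeping interfaces are placeholders (prints), not the real tool APIs, and the paths are invented:

import os
import shutil

def store_locally(events_file, local_store="/data/local"):
    """Store generated events at the production site."""
    os.makedirs(local_store, exist_ok=True)
    return shutil.copy(events_file, local_store)

def write_log_to_web(job_id, message):
    """Publish the production log (placeholder: print instead of an upload)."""
    print(f"[web log] {job_id}: {message}")

def copy_to_regional_store(local_path, region="UK"):
    """Copy data to the regional store via the data-management layer
    (stands in for an interface to Magda)."""
    print(f"data-management copy: {local_path} -> regional store ({region})")

def copy_to_mss(local_path, mss="remote-MSS"):
    """Copy data from local storage to the remote mass-storage system."""
    print(f"copy {local_path} -> {mss}")

def update_bookkeeping(job_id, files):
    """Record the produced files in the book-keeping database (placeholder)."""
    print(f"bookkeeping: job {job_id} produced {files}")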
15
Work Area    PMB Allocation (FTE)   Previously Allocated (FTE)   Total Allocation (FTE)
ATLAS/LHCb   2.0                    0.0                          2.0
ATLAS        1.0                    1.5                          2.5
LHCb         1.0                    1.0                          2.0

- This will just allow us to cover the three projects
- Additional manpower must be found for monitoring tasks, testing the computing model in DC2, and simply running the Data Challenges
16
WP8 M9 Validation
- WP8 M9 validation is now beginning; Glasgow, Lancaster (and RAL?) are involved in the ATLAS M9 validation
- Validation exercises the tools using the ATLAS kit
- The software used is behind the current version; this is likely to be the case in all future tests (it decouples software changes from tool tests)
- Previous test of MC production using Grid tools was a success
- DC1 validation (essentially of the ATLAS code): Glasgow, Lancaster (Lancaster is working on tests of standard generation and reconstruction quantities to be deployed as part of the kit); Cambridge to contribute