ATLAS-Specific Activity in GridPP: EDG Integration, LCG Integration, Metadata

GridPP8 meeting, Bristol, 22 September 2003

Personalia
Alvin Tan (Birmingham) (3.1.4, 3.1.7)
- Fully funded by GridPP to work on MC and production
- Clear link with the GANGA activity, to which he contributes
Frederic Brochu (3.1.7, 3.1.9)
- Half-funded by GridPP
- Integration of EDG and LCG into the ATLAS framework
- Data Challenges

Mike Gardner (-2003)
- Fully funded post at RHUL (3.1.9): installation and packaging tools; analysis architectures
- As alluded to in the last report, Mike was ill and off work from the end of Q2
- Despite encouraging signs, Mike died of liver cancer on Sunday 14th September
- His last contribution was a very nice paper for All Hands 03; we intend to publish a fuller version, with a dedication
- We will discuss the continuation of the post over the next week or so

EDG Integration (3.1.4, 3.1.9)
The UK is integrating and validating EDG grid middleware for ATLAS.
- EDG 1.2 on the core sites revealed many problems with Resource Broker saturation and with the use of a single Replica Catalogue (solved with RLS); the job success rate was only 70%.
- A recent UK mini production used EDG 1.4, input data stored on RAL's tape server, requirements expressed in JDL, the IC Resource Broker and a boxed set of executables (see the illustrative JDL sketch below).
- [Table: jobs allocated and successes per site, for RAL, Cambridge, IC and Birmingham]
- The test took only 1 week, 1 operator and 3000 SpecInt95 days; the success rate was higher than 90%.
- Not suitable for production, but an encouraging step towards a brokered production system.
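For readers unfamiliar with how such jobs were handed to the IC Resource Broker, below is a minimal sketch, not the actual production JDL: the wrapper script name, its arguments and the requirements expression are hypothetical, and only the overall shape (sandboxes, stdout/stderr, a requirements clause used for matchmaking) reflects real EDG job descriptions.

```python
# Sketch only: write an illustrative JDL file for submission to the EDG
# Resource Broker.  Attribute values below are invented placeholders.
jdl_text = """\
Executable    = "run_atlsim.sh";
Arguments     = "dataset_name partition_number";
StdOutput     = "job.out";
StdError      = "job.err";
InputSandbox  = {"run_atlsim.sh"};
OutputSandbox = {"job.out", "job.err"};
Requirements  = Member("ATLAS", other.RunTimeEnvironment);
"""

with open("atlsim.jdl", "w") as jdl_file:
    jdl_file.write(jdl_text)

# Submission and monitoring then go through the standard EDG user
# interface commands, e.g.  edg-job-submit atlsim.jdl  and
# edg-job-status <job id>.  The exact requirements syntax depends on
# the EDG release in use, so treat the expression above as illustrative.
```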

LCG-1 Integration & Validation (3.1.4)
- Partially functional LCG-1 release in July; full functionality by October
- POOL integration underway, but bugs being fixed
- LCG-1 is only just ready for integration and validation: the deliverable will be missed because of late delivery (3 months)
- The ATLAS-LCG team (FB a key member) now becomes the ATLAS-EDG team
- EDG-2 is also just ready for integration and validation: we will also integrate this
- We will have another mini-production

Installation & Packaging (3.1.9)
Despite the loss of Mike, I&P activity continues:
- Incorporation into the ATLAS standard release (internal deliverable by December)
- Extension to analysis code deployment
- SCRAM/DAR: ATLAS has committed to remaining with CMT and pacman, so this remains an active line of development
Deliverables:
- Production of the development kit (started), expected end 12/03
- Deployment of the development kit on a test site: start 08/03, end 10/03
- Integration with Grid infrastructure: start 09/03, end 12/03
- Systematic deployment of the development kit: start 10/03, end 12/03
- Production of the full source kit: start 09/03, end 12/03
Looking at ways to make use of pacman from Ganga (see the sketch below); concentration on deployment for analysis.
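As a rough illustration of the "pacman from Ganga" idea, the sketch below shows how a Python component could drive pacman to pull an ATLAS kit onto a node before a job runs. The cache URL and kit name are invented placeholders, and the exact pacman command-line options may differ between pacman releases.

```python
import subprocess

# Hypothetical pacman cache and kit name, for illustration only.
KIT_CACHE = "http://example.org/atlas/kits/cache"
KIT_NAME = "AtlasDevelopmentKit"

def install_kit(cache: str = KIT_CACHE, package: str = KIT_NAME) -> None:
    """Fetch and install an ATLAS software kit with pacman.

    Assumes pacman's '-get cache:package' usage; option names vary
    between pacman versions, so treat this as a sketch, not a recipe.
    """
    subprocess.run(["pacman", "-get", f"{cache}:{package}"], check=True)

if __name__ == "__main__":
    install_kit()
```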

Data Challenges (3.1.9)
- Data Challenge 2 is the key test of the Grid Computing Model
- Tools must be ready and tested for intensive tests in April-June (1-week bursts)
- Unlike DC1, the data produced is not the main aim; the computing is the client
- Frederic Brochu directs the DC in the UK
- DC Tools Task Force now created: GANGA acknowledged as a key input tool to the DC (illustrated in the sketch below); Alvin Tan is on the DCTF
- Integrating AMI, MAGDA etc.
- Defining the required bookkeeping and metadata services
- A job analysis skeleton exists (SG, MG); this needs a lot more work for a production version
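To make concrete what "GANGA as a key input tool" means in practice, here is a sketch in the style of Ganga's Python job interface. The import path, class names and the executable arguments are assumptions modelled on later Ganga releases, not the system as it stood in 2003.

```python
# Sketch of defining and submitting a DC-style job through a Ganga-like
# Python interface; names are assumptions, not the 2003 implementation.
from ganga import Job, Executable, LCG   # assumed import path

j = Job(name="dc2-simul-sample")
j.application = Executable(exe="run_atlsim.sh",          # hypothetical wrapper
                           args=["dataset_name", "42"])  # hypothetical arguments
j.backend = LCG()   # hand the job to the LCG/EDG resource broker
j.submit()

# Bookkeeping hooks (AMI, MAGDA, metadata services) would record the job's
# inputs and outputs around this point; defining that integration is part
# of the DC Tools Task Force work described above.
print(j.status)
```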

Test Bench - Data Challenges
DC1 (Jul 2002 - May 2003)
- Showed the many resources available (hardware, willing people)
- Made clear the need for an integrated system
- Some tests of Grid software
- Mainly driven by HLT and Physics Workshop needs; one external driver is sustainable, two is not!
DC2 (April - July 2004)
- Real test of the computing model for the Computing TDR
- Must use Grid systems
- Analysis and calibration + reconstruction and simulation
- Pre-production period (Nov 03 ...) then 1-week intensive tests
DC3 (2005/06)
- Physics readiness TDR; big increase in scale

DC2: April - July 2004
At this stage the goals include:
- Full use of Geant4, POOL and the LCG applications
- Pile-up and digitization in Athena
- Deployment of the complete Event Data Model and the Detector Description
- Simulation of the full ATLAS detector and the 2004 combined test beam
- Tests of the calibration and alignment procedures
- Wide use of the Grid middleware and tools
- Large-scale physics analysis
- Computing model studies (document end 2004)
- Run as much of the production as possible on LCG-1
- Combined test beam operation is foreseen as concurrent with DC2 and using the same tools

DC2: Scenario & Time scale (test beam runs in parallel)
Milestones:
- September 03: Release 7
- Mid-November 03: pre-production release
- February 27th 04: Release 8 (production)
- April 1st 04
- June 1st 04: "DC2"
- July 15th 04
Activities across this period:
- Put in place, understand and validate: Geant4; POOL; LCG applications; the Event Data Model; digitization, pile-up and byte-stream; conversion of DC1 data to POOL; large-scale persistency tests and reconstruction
- Testing and validation; run test-production; start final validation
- Start simulation; pile-up and digitization; event mixing; transfer data to CERN
- Intensive reconstruction on the "Tier0"; distribution of ESD and AOD; calibration and alignment; start physics analysis
- Reprocessing

Metadata (3.1.9, GridPP2)
- Metadata definition and structure is a crucial issue for ATLAS DC2
- Metadata must describe not only (collections of) files but (collections of) events, and even sub-event-level information
- POOL provides one layer of metadata; how does this interact with the Replica Catalogue etc.?
- We require metadata services (a middleware issue) but also an instantiation and design of our own
- Metadata workshop in Oxford, July 2003 (RWLJ, AS, AT in attendance); initial designs evolving
- GANGA must allow metadata queries and dynamic collection definition (see the toy sketch below)
- Metadata service integration, query services and metadata design will be key elements of the ATLAS GridPP2 activity
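Since no concrete interface exists yet, the following is purely a toy sketch of the kind of metadata query and dynamic collection definition GANGA would need to support; the attribute names (dataset, geometry tag, event counts) and values are hypothetical, and a real system would sit on AMI and POOL collections rather than a Python list.

```python
# Purely illustrative: a toy metadata catalogue and a query that builds
# a collection dynamically.  Names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class FileRecord:
    lfn: str        # logical file name
    dataset: str    # dataset the file belongs to
    events: int     # number of events in the file
    geometry: str   # detector geometry tag used in simulation

catalogue = [
    FileRecord("lfn:dc2.simul.0001", "dc2.simul.A", 500, "GEO-TAG-01"),
    FileRecord("lfn:dc2.simul.0002", "dc2.simul.A", 500, "GEO-TAG-02"),
    FileRecord("lfn:dc2.recon.0001", "dc2.recon.B", 250, "GEO-TAG-01"),
]

def define_collection(dataset: str, geometry: str) -> list:
    """Return the LFNs matching a metadata query, i.e. a dynamic collection."""
    return [r.lfn for r in catalogue
            if r.dataset == dataset and r.geometry == geometry]

print(define_collection("dc2.simul.A", "GEO-TAG-01"))
# -> ['lfn:dc2.simul.0001']
```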

Papers/Presentations
- General ATLAS-UK Grid poster at Lepton-Photon 03
- Installation & packaging paper and talk at All Hands 03
- GANGA paper and talk at All Hands 03
- MC production system paper and poster at All Hands 03
- Integration and validation paper and poster at All Hands 03

Conclusions
- The GridPP effort for ATLAS has rightly focussed on GANGA
- The associated effort tries to realise the services needed for ATLAS production and analysis: installation and packaging; EDG and LCG integration and validation
- The DCs continue to test and validate the Grid tools
- Good progress, but Mike Gardner is a sad loss to the project
- The metadata services, schema, query tools, bookkeeping etc. are moving onto the critical path and will be a major focus in GridPP2