LHCb GRID Plans. Glenn Patrick, CLRC Grid Team, 18.07.00.

Presentation transcript:

LHCb GRID Plans (Glenn Patrick, CLRC Grid Team)

LHCb has formed a GRID technical working group to co-ordinate practical Grid developments, with representatives from the regional facilities: Liverpool, RAL, CERN, IN2P3, INFN, NIKHEF, ...
First meetings: 14th June (RAL), 5th July (CERN). Next meeting: August (Liverpool)?
A number of realistic, short-term goals have been identified which will:
- Initiate activity in this area.
- Map on to the longer-term LHCb applications in WP8.
- Provide us with practical experience of Grid tools.
Globus is to be installed and tested at CERN, RAL and Liverpool (a version is already installed at RAL and CERN).
Regional centres (e.g. CLRC) are production centres for simulated data and archive the MC data they produce.

Reminder: LHCb WP8 Application

Target application: use the MAP farm (300 CPUs) at Liverpool to generate 10^7 events over 4 months.
"Initial" data volumes transferred between facilities:
- Liverpool to RAL: 3 TB (RAW, ESD, AOD, TAG)
- RAL to Lyon/CERN: 0.3 TB (AOD and TAG)
- Lyon to LHCb institutes: 0.3 TB (AOD and TAG)
- RAL to LHCb institutes: 100 GB (ESD for systematic studies)
Physicists run jobs at the regional centre, or move the AOD and TAG data to their local institute and run jobs there. The ESD is also copied for 10% of events for systematic studies.
Formal EU production is scheduled for start-2002 to mid-2002, but we are already doing distributed MC production for the TDRs.
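As a rough cross-check of these figures (a back-of-envelope sketch added here, not part of the original slides, and assuming the transfers are spread evenly over the full 4-month production period), the implied per-event sizes and sustained network rates can be worked out:

# Back-of-envelope check of the WP8 transfer volumes quoted above.
# Dataset sizes and the 10^7-event total are taken from the slide; the
# assumption that transfers are spread evenly over 4 months is mine.

N_EVENTS = 1e7                      # events generated on the MAP farm
PERIOD_S = 4 * 30 * 24 * 3600       # ~4 months in seconds (assumed window)

TRANSFERS_TB = {
    "Liverpool -> RAL (RAW,ESD,AOD,TAG)":   3.0,
    "RAL -> Lyon/CERN (AOD,TAG)":           0.3,
    "Lyon -> LHCb institutes (AOD,TAG)":    0.3,
    "RAL -> LHCb institutes (10% ESD)":     0.1,   # covers only 10% of events
}

for route, terabytes in TRANSFERS_TB.items():
    size_bytes = terabytes * 1e12
    kb_per_event = size_bytes / N_EVENTS / 1e3      # averaged over all events
    mbit_per_s = size_bytes * 8 / PERIOD_S / 1e6
    print("%-40s %6.0f kB/event  ~%.1f Mbit/s sustained"
          % (route, kb_per_event, mbit_per_s))

On these assumptions the largest flow, Liverpool to RAL, works out at roughly 300 kB per event and a sustained rate of only a couple of Mbit/s, which gives a scale for the site-to-site benchmarking tests planned later in the talk.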

Data Challenges

Any Grid work has to fit in with ongoing production and focus on the existing data challenges (nothing "mock" about them).
TDR schedule:
  Calorimeters     Sept 2000
  RICH             Sept 2000
  Muon             Jan 2001
  Outer Tracker    March 2001
  Vertex Detector  April 2001
  Inner Tracker    Sept 2001
  Trigger          Jan 2002
  Computing        July 2002
  Physics          signals, backgrounds, analysis

Short-term plans

- Globus installed and tested at CERN, RAL and Liverpool; members of the Grid group given access to the respective testbeds.
- Cross-check that jobs can be run on each other's machines (a minimal sketch of such a test follows this slide); extend to other centres once we understand the process.
- Ensure that SICBMC can be run at CERN, RAL and Liverpool using the same executable.
- Verify that data produced by SICBMC can be shipped back to CERN and written to tape (VTP, globus-copy?). Only small event samples of 500 events.
- Benchmarking tests between sites to identify bottlenecks.
Aim to complete the basic tests by the end of September. Mainly a learning exercise using production software/systems.
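To illustrate the cross-site test mentioned above, a small wrapper around the standard Globus client commands (grid-proxy-info, globus-job-run) could submit a trivial job to each site's gatekeeper and time the round trip. This is only an illustrative sketch, not an existing LHCb tool, and the gatekeeper hostnames are placeholders.

#!/usr/bin/env python
# Illustrative cross-site Globus test: run /bin/hostname at each gatekeeper
# and time the round trip.  Gatekeeper hostnames below are placeholders.

import subprocess
import time

GATEKEEPERS = [
    "testbed.cern.ch",        # placeholder for the CERN testbed node
    "testbed.rl.ac.uk",       # placeholder for the RAL testbed node
    "map.ph.liv.ac.uk",       # placeholder for the Liverpool MAP front-end
]

def have_valid_proxy():
    # grid-proxy-info exits non-zero if no valid proxy has been created
    # with grid-proxy-init.
    return subprocess.call(["grid-proxy-info"]) == 0

def run_remote(gatekeeper, command="/bin/hostname"):
    # globus-job-run submits a single job to the remote gatekeeper and
    # returns its output; the whole round trip is timed.
    start = time.time()
    result = subprocess.run(["globus-job-run", gatekeeper, command],
                            capture_output=True, text=True)
    elapsed = time.time() - start
    status = "ok" if result.returncode == 0 else "FAILED"
    print("%-25s %-6s %6.1fs  %s"
          % (gatekeeper, status, elapsed, result.stdout.strip()))

if __name__ == "__main__":
    if not have_valid_proxy():
        raise SystemExit("No valid Grid proxy - run grid-proxy-init first")
    for gk in GATEKEEPERS:
        run_remote(gk)

Timing the same trivial job from each site in turn would already give a first feel for where the bottlenecks are before moving on to real SICBMC jobs and file transfers.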

Issues along the way

- Interfacing to the PBS, LSF and MAP batch scheduling systems. Role of "meta" batch systems?
- Extending the existing LHCb Java tools for job submission, tape management and bookkeeping to use Grid technology.
- Where to publish the data - MDS/LDAP?
- To what extent can we standardise on common architectures (RedHat 6.1 at the moment) so that other institutes can easily join the Grid? More than one operating system is needed for developing/debugging programs (e.g. NT).
- Requirement for filesystems like AFS? Token passing?
- How to fetch and access remote files - GASS server? RSL scripting, recovering log files, sending job parameters? (An RSL submission sketch follows this slide.)
Aim for a "production" run using the Grid by December.
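To make the RSL point more concrete, a job could in principle be described by a short RSL string and handed to globusrun. The sketch below is an illustration added here (not an existing LHCb tool); the SICBMC path, data-cards file, queue name and gatekeeper are all hypothetical placeholders.

#!/usr/bin/env python
# Illustrative RSL job submission for a SICBMC run via globusrun.
# Executable path, cards file, queue and gatekeeper are placeholders.

import subprocess

def make_rsl(executable, arguments, stdout_file, queue="short", count=1):
    # Build a minimal Globus RSL string describing a single batch job.
    return ("&(executable=%s)(arguments=%s)(stdout=%s)(queue=%s)(count=%d)"
            % (executable, arguments, stdout_file, queue, count))

def submit(gatekeeper, rsl):
    # Hand the RSL string to globusrun for execution at the remote site.
    return subprocess.call(["globusrun", "-r", gatekeeper, rsl])

if __name__ == "__main__":
    rsl = make_rsl(executable="/lhcb/bin/sicbmc",    # placeholder path
                   arguments="mcmain.cards",         # placeholder data cards
                   stdout_file="sicbmc_500evt.log")  # matches the 500-event samples
    print("Submitting:", rsl)
    submit("testbed.rl.ac.uk", rsl)                  # placeholder gatekeeper

Recovering the log file and bookkeeping information from such jobs is then where the existing Java tools and the GASS server question above come in.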

Which Grid Topology for LHCb(UK)?

Flexibility is important.
[Diagram: candidate topology connecting CERN, INFN, IN2P3, RAL, Liverpool, Glasgow, Edinburgh, department systems and desktop users, etc.]