LHCb Computing Activities in UK
- Current activities
- UK GRID activities
- RICH s/w activities

Architecture of LHCb Computing Model
- Based on a distributed, multi-Tier regional centre model
- Processing of real data at CERN (the production centre)
- Simulation production at the regional centres

Present facilities:
- Liverpool MAP 300-node facility
- RAL NT farm - closed in February after LHCb MC production
- RAL CSF 120-node Linux facility
- RAL datastore - IBM 3494 tape robot
- RAL NT delivered approx. 100k RAWH events/week
- MAP can deliver ~16k DST2 events/week if dedicated to LHCb

[Diagram: proposed LHCb-UK "Testbed". Sites shown: RAL DataGrid Testbed (CSF, 120 Linux CPUs; IBM 3494 tape robot), Liverpool MAP (300 Linux CPUs), CERN (pcrd25.cern.ch, lxplus009.cern.ch), and the institutes RAL (PPD), Bristol, Imperial College, Oxford, Cambridge and Glasgow/Edinburgh "Proto-Tier 2"; each site is marked as existing or planned.]

- 250 PC99 machines, JREI funding of ~£0.4M

Prototype “testbed” architecture
- Based around existing production facilities
- Intel PCs running Linux Red Hat 6.1
- Mixture of batch systems (LSF at CERN, PBS at RAL, FCS at MAP); Globus everywhere
- Standard file transfer tools (e.g. globus-rcp, GSIFTP)
- GASS servers for secondary storage?
- Java tools for controlling production, bookkeeping, etc.
- MDS/LDAP for bookkeeping database(s)
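As an illustration of how jobs would flow through such a testbed, the sketch below (in modern Python, for readability) wraps the standard Globus GRAM client globus-job-submit to push a production script to one site's batch system. The gatekeeper contact strings, jobmanager names and script path are hypothetical placeholders, not the real site configuration.

    # Minimal sketch: submit a production script to a testbed site via Globus GRAM.
    # Assumes the globus-job-submit client is on the PATH; all contact strings and
    # paths below are hypothetical placeholders.
    import subprocess

    # Each site advertises a jobmanager for its local batch system
    # (LSF at CERN, PBS at RAL, FCS at MAP), as listed on the slide.
    GATEKEEPERS = {
        "CERN": "lxplus.example.cern.ch/jobmanager-lsf",
        "RAL":  "csf.example.rl.ac.uk/jobmanager-pbs",
        "MAP":  "map.example.liv.ac.uk/jobmanager-fcs",
    }

    def submit(site, script, *args):
        """Submit a script to one site's batch system; returns the GRAM job contact."""
        result = subprocess.run(
            ["globus-job-submit", GATEKEEPERS[site], script, *args],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()   # poll later with globus-job-status

    if __name__ == "__main__":
        print(submit("RAL", "/usr/local/lhcb/run_sicbmc.sh", "500"))  # hypothetical script

Selecting a site is then just a matter of changing the gatekeeper contact; the differences between the local batch systems stay hidden behind the jobmanagers.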

GRID Activities
- Globus has been used to remotely submit scripts and run SICBMC (CERN/RAL/Liverpool)
- LHCb batch (PBS) jobs run on RAL CSF via Globus
- Ongoing investigations of the Globus toolkit: globus-rcp, GSIFTP, RAL datastore as a GASS server
- Work ongoing to "open up" MAP for general use, i.e. analysis-type activities
- Co-ordinating LHCb external computing - work ongoing on the baseline data model
- Work ongoing to harness university NT resources
- Major input to the GridPP proposal (the most detailed forward look of all the UK LHC experiments) - disappointment over the emphasis of the final document
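To make the file-movement investigations concrete, here is a minimal sketch of staging a Monte Carlo output file back over GSIFTP using the standard globus-url-copy client; the host name and paths are hypothetical examples, not real LHCb storage locations, and GASS access to the RAL datastore is not shown.

    # Minimal sketch: copy one remote output file to local disk over GSIFTP.
    # Host and paths are hypothetical; assumes globus-url-copy is installed
    # and a valid Grid proxy already exists (grid-proxy-init).
    import os
    import subprocess

    def stage_out(remote_path, local_dir):
        src = "gsiftp://csf.example.rl.ac.uk" + remote_path
        dst = "file://" + os.path.join(local_dir, os.path.basename(remote_path))
        subprocess.run(["globus-url-copy", src, dst], check=True)

    # e.g. stage_out("/scratch/lhcb/sicbmc_run042.rawh", "/data/lhcb")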

Summary of current UK LHCb resources
- 2,775 SI95 shared for LHCb between RAL & Liverpool (the lion's share being MAP)
- Will increase to 3,400 SI95 in March due to the RAL upgrade
- End of 2001: the "Scotch" facility adds a further 2,250 SI95
- Total by the end of 2001: 5,650 SI95
- By the end of 2001: 3.5 Tb of disk space distributed across RAL/Edinburgh/Liverpool
- At RAL, an additional 13 Tb of robotic tape by year end

Estimation of needs by 2003
- Assume 10-15% of the full LHCb resources are needed in 2003, and a UK contribution of 20-25%
  ⇒ 10,800-20,250 SI95 needed, 7-14 Tb of disk, 23-48 Tb of "robotic" storage
- Figures are consistent with the "bottom-up" estimation performed at CERN
- Factors of 2-4 greater than the resources available by the end of 2001
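As a cross-check of the arithmetic on this slide, the short sketch below reproduces the quoted SI95 range and the "factors 2-4" shortfall. The total LHCb requirement is not stated on the slide; the ~540,000 SI95 used here is simply the value implied by the 10,800-20,250 SI95 range and should be treated as an assumption.

    # Back-of-the-envelope check of the slide's CPU numbers (illustrative only).
    TOTAL_LHCB_SI95 = 540_000                # assumed: implied by the quoted UK range
    UK_END_2001_SI95 = 3_400 + 2_250         # RAL upgrade + "Scotch" facility = 5,650

    low  = TOTAL_LHCB_SI95 * 0.10 * 0.20     # 10% needed in 2003, 20% UK share
    high = TOTAL_LHCB_SI95 * 0.15 * 0.25     # 15% needed in 2003, 25% UK share

    print(f"UK need in 2003: {low:,.0f}-{high:,.0f} SI95")   # 10,800-20,250
    print(f"Shortfall factor: {low/UK_END_2001_SI95:.1f}-{high/UK_END_2001_SI95:.1f}")  # ~1.9-3.6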

RICH software
- UK is co-ordinating the s/w effort
- The current FORTRAN simulation was written by the UK
- The FORTRAN simulation is still used in current RICH studies (e.g. alignment studies & optimisation wrt T11)
- Since the TDR, a move towards OO - UK prominent (e.g. GEANT4 studies, interfacing the current FORTRAN databanks to the OO framework…)
- Investigation of fast RICH reconstruction

RICH simulation with OO software (work just getting underway)
- Simulation of the testbeam setup - testing of Čerenkov radiation within GEANT4
- Interfacing current information from the FORTRAN simulation into the OO environment
- Interest in photodetector simulation in the OO environment

Fast online RICH particle identification
- Developing a PID algorithm complementary to the offline method, for use in online applications
- The online PID searches for potential Čerenkov rings for a single track; much less computation is involved, and it is therefore faster
- Initial results with the algorithm are encouraging, and studies into potential online applications are progressing
- Possible gains in global PID performance when the online results are merged into the global algorithm, for example for background suppression
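The ring-search idea can be illustrated with a toy sketch: for each mass hypothesis, predict the Čerenkov ring radius from the track momentum and count the photon hits falling near that radius around the predicted ring centre. This is only a schematic illustration, not the actual LHCb online algorithm; the refractive index, focal length, tolerance and hit format are made-up placeholders.

    # Toy single-track ring search (schematic, not the LHCb algorithm).
    import math

    MASSES_GEV = {"pion": 0.1396, "kaon": 0.4937, "proton": 0.9383}
    N_REFRACTIVE = 1.0014      # placeholder radiator refractive index
    FOCAL_LENGTH_MM = 1000.0   # placeholder: ring radius ~ focal length * Cherenkov angle
    TOLERANCE_MM = 5.0         # placeholder ring-width cut

    def expected_radius_mm(p_gev, mass_gev):
        """Predicted ring radius for one mass hypothesis (None if below threshold)."""
        beta = p_gev / math.hypot(p_gev, mass_gev)
        cos_theta = 1.0 / (N_REFRACTIVE * beta)
        if cos_theta >= 1.0:               # below Cherenkov threshold: no ring
            return None
        return FOCAL_LENGTH_MM * math.acos(cos_theta)

    def identify(p_gev, ring_centre, hits):
        """Return the hypothesis whose predicted ring collects the most photon hits."""
        cx, cy = ring_centre
        best, best_count = None, -1
        for name, mass in MASSES_GEV.items():
            r_exp = expected_radius_mm(p_gev, mass)
            if r_exp is None:
                continue
            n_hits = sum(1 for (x, y) in hits
                         if abs(math.hypot(x - cx, y - cy) - r_exp) < TOLERANCE_MM)
            if n_hits > best_count:
                best, best_count = name, n_hits
        return best, best_count

Because only one track's predicted rings are tested against nearby hits, the computation is far lighter than a global likelihood over all tracks and hypotheses, which is the point made on the slide.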

Summary
- UK is in a prominent position in LHCb, leading the Grid effort
- It is vital that this momentum is maintained and built upon through the "GridPP" effort
- UK has a leading position in RICH s/w - the UK effort is now moving into OO s/w and the use of the RICH "online"