1 GridPP: Building a UK Computing Grid for Particle Physics

2 CERN Large Hadron Collider
The world's highest-energy particle physics accelerator, due to turn on in 2007. It will look for the Higgs boson, supersymmetry, quark-gluon plasmas, micro black holes and more.

3 How will the LHC work? The LHC will collide beams of protons at an energy of 14 TeV. Using the latest superconducting technologies, it will operate at about –270°C, just above absolute zero. With its 27 km circumference, the accelerator will be the largest superconducting installation in the world.

4 The 4 LHC experiments
ATLAS: a general-purpose detector studying the origin of mass, supersymmetry and micro black holes; 2,000 scientists from 34 countries.
CMS: a general-purpose detector with muon tracking, an electromagnetic calorimeter, central tracking and a hadron calorimeter.
LHCb: studies the differences between matter and antimatter, producing over 100 million b and b-bar mesons each year.
ALICE: studies heavy-ion collisions to create quark-gluon plasmas, with 50,000 particles in each collision.

5 The LHC Data Challenge
A particle collision = an event. The physicist's goal is to count, trace and characterise all the particles produced and to fully reconstruct the process. Among all the tracks, the presence of "special shapes" is the signature of an interesting interaction. One way to find the Higgs boson: look for its characteristic decay pattern producing 4 muons.

6 The LHC Data Challenge
Starting from this event, we are looking for this "signature". Selectivity: 1 in 10^13, like looking for 1 person in a thousand world populations, or for a needle in 20 million haystacks!

7 LHC data
[Figure: a stack of DVDs holding one year of LHC data (~20 km) compared with a balloon at 30 km, Concorde at 15 km and Mont Blanc at 4.8 km.]
40 million collisions per second. After filtering, 100 collisions of interest per second. One megabyte of data is digitised for each collision, giving a recording rate of 0.1 gigabytes/sec. 10^10 collisions are recorded each year, i.e. 10 petabytes/year of data.
Units: 1 gigabyte (GB) = 1000 MB (a CD album); 1 terabyte (TB) = 1000 GB (world annual book production); 1 petabyte (PB) = 1000 TB (annual production of one LHC experiment); 1 exabyte (EB) = 1000 PB (world annual information production).
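A quick back-of-the-envelope check of these rates, as a minimal Python sketch that uses only the figures quoted on the slide:

# Back-of-the-envelope check of the LHC data rates quoted above.
# All figures come from the slide itself; nothing else is assumed.
events_per_second = 100          # collisions of interest per second, after filtering
bytes_per_event   = 1e6          # ~1 megabyte digitised per collision
events_per_year   = 1e10         # collisions recorded each year

recording_rate_gb_s = events_per_second * bytes_per_event / 1e9
annual_volume_pb    = events_per_year * bytes_per_event / 1e15

print(f"recording rate: {recording_rate_gb_s:.1f} GB/s")   # 0.1 GB/s
print(f"annual volume:  {annual_volume_pb:.0f} PB/year")   # 10 PB/year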

8 Data processing
Simulation: start from theory and the detector characteristics and compute what the detector should have seen.
Reconstruction: transform signals from the detector into physical properties (energies, charges of particles, ...).
Analysis: find collisions with similar features and use complex algorithms to extract the physics.
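Purely for illustration, here is a minimal Python sketch of that simulation / reconstruction / analysis chain. None of this is real experiment software: the event record, the "signal" values and the selection cut are invented to show the shape of the pipeline, not its content.

import random

def simulate(n_events):
    # Simulation: from a (toy) physics model, produce the raw signals the
    # detector "should have seen" for each generated collision.
    return [{"raw_signals": [random.gauss(0.0, 1.0) for _ in range(10)]}
            for _ in range(n_events)]

def reconstruct(event):
    # Reconstruction: turn raw detector signals into physical quantities
    # (here just a stand-in "energy"; real reconstruction yields tracks,
    # energies, charges, ...).
    event["energy"] = sum(abs(s) for s in event["raw_signals"])
    return event

def analyse(events, energy_cut=10.0):
    # Analysis: select events with similar features and extract the physics
    # (here simply: count events above an arbitrary energy threshold).
    selected = [e for e in events if e["energy"] > energy_cut]
    return len(selected), len(events)

events = [reconstruct(e) for e in simulate(1000)]
passed, total = analyse(events)
print(f"{passed} of {total} toy events pass the selection")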

9 LHC Computing Grid (LCG)
By 2007: 100,000 CPUs at more than 200 institutions worldwide, building on middleware developed in advanced grid technology projects in both Europe and the USA. Currently running on around 200 sites.

10 LCG monitoring applet
The applet monitors resource brokers and virtual organisations (ATLAS, CMS, LHCb, DTeam and others) by issuing SQL queries to the logging and book-keeping database, as sketched below.
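The slide does not show the actual schema of the logging and book-keeping database, so the table and column names below are hypothetical; this is only a sketch of the kind of per-VO job-status query such an applet might issue, run here against a local SQLite stand-in.

import sqlite3

# Hypothetical stand-in for the logging & book-keeping database; the real
# schema is not given on the slide, so table and column names are invented.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (job_id TEXT, vo TEXT, state TEXT, broker TEXT)")
db.executemany("INSERT INTO jobs VALUES (?, ?, ?, ?)", [
    ("j1", "atlas", "Running",   "rb01"),
    ("j2", "cms",   "Scheduled", "rb02"),
    ("j3", "lhcb",  "Done",      "rb01"),
    ("j4", "dteam", "Running",   "rb02"),
])

# The kind of aggregate query a monitoring applet might run: jobs per
# virtual organisation and state, grouped by resource broker.
for row in db.execute(
        "SELECT broker, vo, state, COUNT(*) FROM jobs "
        "GROUP BY broker, vo, state ORDER BY broker, vo"):
    print(row)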

11 GridPP Middleware Development
GridPP develops middleware in six areas: Grid Data Management, Network Monitoring, Workload Management, Information Services, Security, and Storage Interfaces.

12 R-GMA (Relational Grid Monitoring Architecture)
An information and monitoring tool developed at Rutherford Appleton Laboratory. It matches up users' jobs with suitable resources, monitors the progress of jobs, and is deployed in the gLite middleware. A sketch of its relational publish-and-query model follows.
[Figure: R-GMA structure]
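R-GMA presents monitoring information as a virtual relational database: producers publish tuples into named tables and consumers retrieve them with SQL-style queries. The Python sketch below only illustrates that model; the class name, methods and table schema are hypothetical stand-ins, not the real R-GMA API.

# Illustrative producer/consumer model in the spirit of R-GMA.
# VirtualTable, publish() and query() are hypothetical: the real R-GMA
# APIs differ, this only shows the relational publish-then-query idea.
class VirtualTable:
    def __init__(self, columns):
        self.columns = columns
        self.rows = []

    def publish(self, **tuple_values):
        # A producer inserts a tuple into the virtual table.
        self.rows.append({c: tuple_values.get(c) for c in self.columns})

    def query(self, predicate=lambda row: True):
        # A consumer's query: conceptually "SELECT * FROM table WHERE ..."
        return [row for row in self.rows if predicate(row)]

# Producer side: a site publishes the state of its jobs.
job_status = VirtualTable(["job_id", "site", "state"])
job_status.publish(job_id="j1", site="RAL", state="Running")
job_status.publish(job_id="j2", site="Glasgow", state="Done")

# Consumer side: a monitor asks which jobs are still running.
print(job_status.query(lambda row: row["state"] == "Running"))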

13 Enabling Grids for E-sciencE
Goals: deliver a 24/7 Grid service to European science; build a consistent, robust and secure Grid network that will attract additional computing resources; continuously improve and maintain the middleware in order to deliver a reliable service to users; attract new users from industry as well as science and ensure they receive the high standard of training and support they need. Funded by the EU with 100 million euros over 4 years, involving more than 400 software engineers plus service support across 70 European partners.

14 The Grid: what the user sees
[Figure: the user's view. From the grid user interface (gridui), the user submits a job described in JDL together with an input dataset; the Grid runs it and returns an output dataset.]
A sketch of such a JDL job description is given below.
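For concreteness, here is a minimal sketch of what such a JDL job description might look like, written out from Python. The attribute names (Executable, InputSandbox, OutputSandbox, Requirements, Rank, ...) are standard EDG/gLite JDL, but the job itself, the script and file names, and the Requirements/Rank expressions shown are hypothetical examples, not taken from the presentation.

# Write a minimal, hypothetical JDL job description of the kind a user
# submits from the gridui. The job, script and file names are invented.
jdl_text = """\
Executable    = "run_analysis.sh";
Arguments     = "input.dat";
StdOutput     = "stdout.log";
StdError      = "stderr.log";
InputSandbox  = {"run_analysis.sh", "input.dat"};
OutputSandbox = {"stdout.log", "stderr.log", "results.root"};
VirtualOrganisation = "atlas";
Requirements  = other.GlueCEPolicyMaxCPUTime > 1440;
Rank          = -other.GlueCEStateEstimatedResponseTime;
"""

with open("analysis.jdl", "w") as f:
    f.write(jdl_text)

# The file would then be handed to the middleware's submission tool on the
# gridui, e.g. something like: edg-job-submit analysis.jdl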

15 What the Grid does
[Figure: behind the scenes, the submitted JDL and input dataset pass through the main Grid services. The Resource Broker consults the Information Service and the Replica Catalog to choose where the job should run; the Job Submission Service sends it to the Computing Nodes, which read and write data on Storage; the job status is tracked throughout via the Logging & Bookkeeping service, and the output dataset is returned to the user at the gridui.]

16 Interoperability
LCG collaborates with NorduGrid and OSG to form the Worldwide LHC Computing Grid.

17 GridPP – the UK’s contribution to LCG
19 UK universities, CCLRC (RAL & Daresbury) and CERN. Funded by the UK Particle Physics and Astronomy Research Council (PPARC) with £33m.

18 UK Tier-1/A Centre – Rutherford Appleton Laboratory
High-quality data services with a national and international role; the UK focus for international Grid development. Resources: 1000 dual CPUs, 200 TB disk, 220 TB tape (capacity 1 PB). Hosts the Grid Operations Centre.

19 UK Tier-2 Centres
ScotGrid: Durham, Edinburgh, Glasgow
NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield
SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick
LondonGrid: Brunel, Imperial, QMUL, RHUL, UCL

20 Applications – US experiments
BaBar (SLAC): investigating the asymmetry between matter and antimatter; uses LCG to run Monte Carlo event production and analysis of real data.
SAM and SamGrid (Fermilab): used by the D0 and CDF experiments and being tested by MINOS; combines SAM (a data handling system) with JIM (Job Information and Management).

21 Applications – other experiments
ZEUS: a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. It needs the Grid to respond to increasing demand for Monte Carlo production: up to 40 million events per month on the Grid since August 2004.
QCDGrid: for Quantum Chromodynamics (QCD); currently a 6-site data grid with a multi-terabyte storage facility. Key technologies used: Globus Toolkit 2.4, gLite, the eXist XML database and the ILDG Browser Client 1.5.

22 Applications – LHC
ATLAS: GANGA software framework (developed jointly with LHCb); data challenges producing Monte Carlo data, using 10 million CPU hours per year.
LHCb: DIRAC software to submit analysis jobs using the Grid; in 2006 the analysis job completion efficiency improved to 91%.
CMS: Monte Carlo production, data transfer and job submission; CMS transfers have topped a petabyte a month for the last three months.

