GridPP Building a UK Computing Grid for Particle Physics www.gridpp.ac.uk A PPARC funded project
CERN Large Hadron Collider
The world's highest-energy particle physics accelerator. Due to turn on in 2007. Looking for the Higgs boson, supersymmetry, quark-gluon plasmas, micro black holes…
How will LHC work?
LHC will collide beams of protons at an energy of 14 TeV. Using the latest superconducting technologies, it will operate at about –270 °C, just above absolute zero. With its 27 km circumference, the accelerator will be the largest superconducting installation in the world.
4 LHC Experiments
CMS - general purpose detector - muon tracking, electromagnetic calorimeter, central tracking and hadron calorimeter
ATLAS - general purpose: origin of mass, supersymmetry, micro black holes - 2,000 scientists from 34 countries
LHCb - to study the differences between matter and antimatter - producing over 100 million b and b-bar mesons each year
ALICE - heavy ion collisions, to create quark-gluon plasmas - 50,000 particles in each collision
The LHC Data Challenge
A particle collision = an event. The physicists' goal is to count, trace and characterise all the particles produced and fully reconstruct the process. Among all the tracks, the presence of special shapes is the sign of an interesting interaction. One way to find the Higgs boson: look for a characteristic decay pattern producing 4 muons.
The LHC Data Challenge
Starting from this event… we are looking for this signature.
Selectivity: 1 in 10^13
Like looking for 1 person in a thousand world populations, or for a needle in 20 million haystacks!
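Both comparisons follow from the same ratio; a quick check (the world-population and straws-per-haystack figures are assumptions, not from the slides):

```python
selectivity = 1e13  # one interesting signature per 10^13 collisions

# Assumed figures for the two comparisons:
world_population = 6.5e9     # mid-2000s world population (assumption)
straws_per_haystack = 5e5    # straws in one haystack (assumption)

print(selectivity / world_population)     # ~1538: about a thousand world populations
print(selectivity / straws_per_haystack)  # 2e7: 20 million haystacks
```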
LHC data
40 million collisions per second. After filtering, 100 collisions of interest per second. A megabyte of data digitised for each collision = a recording rate of 0.1 Gigabytes/sec. 10^10 collisions recorded each year = 10 Petabytes/year of data.
(Illustration: a DVD stack holding 1 year of LHC data would be ~20 km tall - compare Concorde at 15 km, a balloon at 30 km, Mt Blanc at 4.8 km.)
1 Gigabyte (1 GB) = 1000 MB - a CD album
1 Terabyte (1 TB) = 1000 GB - world annual book production
1 Petabyte (1 PB) = 1000 TB - annual production of one LHC experiment
1 Exabyte (1 EB) = 1000 PB - world annual information production
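The arithmetic behind these figures can be checked in a few lines, using the round numbers quoted above:

```python
# Decimal units, as used on the slide.
MB = 1.0
GB = 1000 * MB
TB = 1000 * GB
PB = 1000 * TB

kept_per_sec = 100        # collisions of interest kept after filtering
event_size = 1 * MB       # data digitised per collision

rate = kept_per_sec * event_size / GB    # recording rate in GB/s
yearly = 1e10 * event_size / PB          # volume for 10^10 collisions, in PB

print(rate)    # 0.1 GB/s
print(yearly)  # 10.0 PB/year
```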
LHC processing
Simulation: start from theory and detector characteristics and compute what the detector should have seen.
Reconstruction: transform signals from the detector into physical properties (energies, charges of particles, …).
Analysis: find collisions with similar features; use complex algorithms to extract the physics.
LHC Computing Grid (LCG)
By 2007: 100,000 CPUs and 300 institutions worldwide, building on software being developed in advanced grid technology projects in both Europe and the USA. Currently running on around 200 sites.
LCG monitoring applet
Monitors:
- resource brokers
- virtual organisations: ATLAS, CMS, LHCb, DTeam, Other
Uses SQL queries to the logging and book-keeping database.
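The kind of SQL query such an applet might issue can be sketched as follows; the schema here (table and column names, job states) is purely hypothetical, not the real LCG book-keeping schema:

```python
import sqlite3

# Hypothetical stand-in for the logging and book-keeping database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (vo TEXT, state TEXT)")
db.executemany("INSERT INTO jobs VALUES (?, ?)",
               [("atlas", "Done"), ("atlas", "Running"),
                ("cms", "Done"), ("lhcb", "Aborted")])

# Count finished jobs per virtual organisation.
rows = db.execute(
    "SELECT vo, COUNT(*) FROM jobs "
    "WHERE state = 'Done' GROUP BY vo ORDER BY vo"
).fetchall()
print(rows)  # [('atlas', 1), ('cms', 1)]
```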
R-GMA
An information and monitoring tool developed at Rutherford Appleton Laboratory. Matches up users' jobs with suitable resources and monitors the progress of jobs. Deployed in the LCG middleware and in gLite.
(Diagram: R-GMA structure)
Enabling Grids for E-sciencE
Deliver a 24/7 Grid service to European science:
- build a consistent, robust and secure Grid network that will attract additional computing resources
- continuously improve and maintain the middleware in order to deliver a reliable service to users
- attract new users from industry as well as science, and ensure they receive the high standard of training and support they need
€100 million over 4 years, funded by the EC. >400 software engineers plus service support; 70 European partners.
GridPP – the UK's contribution to LCG
19 UK universities, CCLRC (RAL & Daresbury) and CERN. Funded by the UK Particle Physics and Astronomy Research Council (PPARC): £33m over 2001-2007.
Applications - LHC
ATLAS - GANGA software framework (jointly with LHCb) - data challenges producing Monte Carlo data
LHCb - DIRAC software to submit jobs using the Grid - 2004 data challenge: 190m events, 65 TB
CMS - Monte Carlo production, data transfer, job submission - data challenge: 75 million events, 150 TB
Applications – US experiments
BaBar (SLAC) - investigating the asymmetry between matter and antimatter - aims to use LCG to run Monte Carlo events and analysis of real data
SAM and SamGrid (Fermilab) - used by the D0 and CDF experiments and being tested by MINOS - combines SAM (a data-handling system) with JIM (Job information and management)
Applications – other experiments
ZEUS - a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg - needs the Grid to respond to increasing demand for MC production - 1 million events on the Grid since August 2004
QCDGrid - for Quantum Chromodynamics (QCD) - currently a 6-site data grid with a multi-terabyte storage facility - key technologies: Globus Toolkit 2.4, European Data Grid, eXist XML database
Middleware Development
- Network Monitoring
- Security
- Information Services
UK Tier-1/A Centre – Rutherford Appleton Laboratory
High-quality data services; national and international role; UK focus for international Grid development; Grid Operations Centre.
1,000 dual-CPU machines, 200 TB disk, 220 TB tape (capacity 1 PB)
UK Tier-2 Centres
ScotGrid: Durham, Edinburgh, Glasgow
NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield
SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick
LondonGrid: Brunel, Imperial, QMUL, RHUL, UCL