
1 The LHC Computing Grid — Frédéric Hemmer, IT Department Head. Visit of Professor Jerzy Szwed, Under Secretary of State, Ministry of Science and Higher Education, Poland, Tuesday 23rd February 2010. CERN IT Department, CH-1211 Genève 23, Switzerland, www.cern.ch/it

2 The ATLAS experiment: 7,000 tons and 150 million sensors generating data 40 million times per second, i.e. about a petabyte per second.

3 A collision at the LHC.

4 The Data Acquisition (Ian.Bird@cern.ch).

5 Tier 0 at CERN: acquisition, first-pass processing, storage and distribution; 1.25 GB/sec (ions).

6 The LHC Data Challenge
– The accelerator will run for 10–15 years
– Experiments will produce about 15 million gigabytes of data each year (about 20 million CDs!)
– LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors
– Many cooperating computer centres are needed, as CERN can only provide ~20% of the capacity
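A back-of-envelope check of these figures (illustrative Python; the 15 million GB/year, ~100,000-processor and ~20% numbers come from the slide, while the ~700 MB CD capacity is an assumption):

```python
# Back-of-envelope check of the LHC data-challenge numbers (illustrative only).

ANNUAL_DATA_GB = 15_000_000   # ~15 million gigabytes of data per year (slide figure)
CD_CAPACITY_GB = 0.7          # assumed ~700 MB per CD
TOTAL_CPUS = 100_000          # ~100,000 of today's fastest PC processors (slide figure)
CERN_SHARE = 0.20             # CERN can provide only ~20% of the capacity (slide figure)

cds_per_year = ANNUAL_DATA_GB / CD_CAPACITY_GB
cpus_outside_cern = TOTAL_CPUS * (1 - CERN_SHARE)

print(f"~{cds_per_year / 1e6:.0f} million CDs' worth of data per year")
print(f"~{cpus_outside_cern:,.0f} processors must come from partner centres")
```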

7 Computing Resources (charts of CPU, disk and tape capacity).

8 Solution: the Grid
– Use the Grid to unite the computing resources of particle physics institutes around the world
– The World Wide Web provides seamless access to information stored in many millions of different geographical locations
– The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe

9 How does the Grid work?
– It makes multiple computer centres look like a single system to the end-user
– Advanced software, called middleware, automatically finds the data the scientist needs and the computing power to analyse it
– Middleware balances the load on different resources; it also handles security, accounting, monitoring and much more
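The match-making idea can be sketched as a toy broker that routes a job to a site holding the required dataset and having the most free capacity. This is only a conceptual illustration; the real middleware stack (job brokering, data catalogues, security, accounting) is far more elaborate, and all site and dataset names below are invented:

```python
# Toy illustration of grid match-making: route a job to a site that has the
# required dataset and the most free computing capacity. Purely illustrative;
# not the actual WLCG middleware.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    datasets: set        # datasets replicated at this site
    free_cores: int      # currently idle processing cores

def broker(job_dataset: str, sites: list[Site]) -> Site:
    candidates = [s for s in sites if job_dataset in s.datasets]
    if not candidates:
        raise RuntimeError(f"No site holds dataset {job_dataset!r}")
    return max(candidates, key=lambda s: s.free_cores)  # simple load balancing

sites = [
    Site("Tier1-A", {"run2010A"}, free_cores=120),
    Site("Tier2-B", {"run2010A", "mc-sample"}, free_cores=450),
    Site("Tier2-C", {"mc-sample"}, free_cores=900),
]
print(broker("run2010A", sites).name)   # -> Tier2-B
```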

10 Tier 0 – Tier 1 – Tier 2
– Tier-0 (CERN): data recording, initial data reconstruction, data distribution
– Tier-1 (11 centres): permanent storage, re-processing, analysis
– Tier-2 (~130 centres): simulation, end-user analysis

11 LCG Service Hierarchy
– Tier-0, the accelerator centre: data acquisition and initial processing; long-term data curation; distribution of data to the Tier-1 centres
– Tier-1 centres: Canada – TRIUMF (Vancouver); France – IN2P3 (Lyon); Germany – Forschungszentrum Karlsruhe; Italy – CNAF (Bologna); Netherlands – NIKHEF/SARA (Amsterdam); Nordic countries – distributed Tier-1; Spain – PIC (Barcelona); Taiwan – Academia Sinica (Taipei); UK – CLRC (Oxford); US – FermiLab (Illinois) and Brookhaven (NY)
– Tier-1 role: "online" to the data acquisition process, hence high availability; managed mass storage and grid-enabled data service; data-heavy analysis; national and regional support
– Tier-2: ~140 centres in ~35 countries; simulation; end-user analysis, batch and interactive
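The tier roles listed above can be summarised as a simple lookup structure; this is just an illustrative restatement of the slide, not an operational WLCG configuration:

```python
# The WLCG tier roles from this slide, captured as a simple lookup table
# (illustrative summary only).
TIER_ROLES = {
    "Tier-0 (CERN)": [
        "data acquisition & initial processing",
        "long-term data curation",
        "distribution of data to the Tier-1 centres",
    ],
    "Tier-1 (11 centres)": [
        "online to the data acquisition process (high availability)",
        "managed mass storage / grid-enabled data service",
        "data-heavy analysis",
        "national and regional support",
    ],
    "Tier-2 (~140 centres in ~35 countries)": [
        "simulation",
        "end-user analysis, batch and interactive",
    ],
}

for tier, roles in TIER_ROLES.items():
    print(tier)
    for role in roles:
        print(f"  - {role}")
```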

12 The CERN Tier-0
– 24x7 operator support and system administration services to support 24x7 operation of all IT services
– Hardware installation and retirement (~7,000 hardware movements/year)
– Management and automation framework for large-scale Linux clusters
– Installed capacity: 6,300 systems, 39,000 processing cores (CPU servers, disk servers, infrastructure servers); tenders planned or in progress for 2,400 systems, 16,000 processing cores
– Disk: 13,900 TB usable on 42,600 disk drives; tenders planned or in progress for 19,000 TB usable on 20,000 disk drives
– Tape: 34,000 TB on 45,000 tape cartridges (56,000 slots), 160 tape drives
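Dividing the quoted totals gives the average unit sizes implied by these figures; the calculation below is only a derived back-of-envelope view, not a quoted specification:

```python
# Average unit sizes implied by the installed-capacity figures on this slide
# (simple division of the quoted totals; not vendor specifications).
disk_tb, disk_drives = 13_900, 42_600
tape_tb, cartridges  = 34_000, 45_000
systems, cores       = 6_300, 39_000

print(f"~{disk_tb / disk_drives * 1000:.0f} GB usable per disk drive")
print(f"~{tape_tb / cartridges * 1000:.0f} GB per tape cartridge")
print(f"~{cores / systems:.1f} cores per system")
```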

13 Frédéric Hemmer, CERN, IT Department (image slide; no further transcript text).

14 The European Network Backbone
– LCG working group with Tier-1s and national/regional research network organisations
– New GÉANT 2 research network backbone, strongly correlated with the major European LHC centres
– Swiss PoP at CERN

15 Overall summary
– November: ongoing productions; cosmics data taking
– November–December: beam data and collisions; productions plus analysis
– December–February: ongoing productions; cosmics
– The WLCG service has been running according to the defined procedures
– Reporting and follow-up of problems at the same level
– Middleware process (updates and patches) proceeding as planned

16 2009 Physics Data Transfers
– Periods covered: final readiness test (STEP'09), preparation for LHC startup, LHC physics data
– Nearly 1 petabyte/week
– More than 8 GB/s peak transfers from CASTOR file servers at CERN
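As a rough consistency check, a petabyte per week is an average of about 1.7 GB/s, which sits comfortably below the >8 GB/s peaks quoted above (back-of-envelope Python, assuming 1 PB = 10^6 GB):

```python
# Convert "nearly 1 petabyte/week" into an average transfer rate (rough estimate).
SECONDS_PER_WEEK = 7 * 24 * 3600
PETABYTE_GB = 1_000_000          # 1 PB expressed in GB

average_rate = PETABYTE_GB / SECONDS_PER_WEEK
print(f"~{average_rate:.1f} GB/s sustained average")   # ~1.7 GB/s, vs. >8 GB/s peaks
```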

17 Reliabilities
– This is not the full picture: experiment-specific measures give a complementary view
– They need to be used together with some understanding of the underlying issues

18 From the APEL accounting portal, Aug.'08 to Jan.'09; numbers in MSI2k

             Alice    ATLAS    CMS      LHCb     Total
   Tier-1s    6.24    32.03    30.73     2.50     71.50   (34.3%)
   Tier-2s    9.61    52.23    55.04    20.14    137.02   (65.7%)
   Total     15.85    84.26    85.77    22.64    208.52

– Main outstanding issues are related to service/site reliability
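The Tier-1/Tier-2 shares in the last column follow directly from the row totals; a quick check, using the values copied from the table above:

```python
# Verify the Tier-1 / Tier-2 shares from the accounting table (values in MSI2k).
tier1_total = 6.24 + 32.03 + 30.73 + 2.50    # Alice + ATLAS + CMS + LHCb at Tier-1s
tier2_total = 9.61 + 52.23 + 55.04 + 20.14   # same experiments at Tier-2s
grand_total = tier1_total + tier2_total

print(f"Tier-1s: {tier1_total:.2f} MSI2k ({tier1_total / grand_total:.1%})")  # 71.50, 34.3%
print(f"Tier-2s: {tier2_total:.2f} MSI2k ({tier2_total / grand_total:.1%})")  # 137.02, 65.7%
```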

19 GRID COMPUTING NOW

20 Impact of the LHC Computing Grid in Europe
– LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
– EGEE is now a global effort, and the largest Grid infrastructure worldwide
– Co-funded by the European Commission (cost ~170 M€ over 6 years, of which ~100 M€ funded by the EU)
– EGEE is already used for >100 applications, including archeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high-energy physics, life sciences, multimedia, material sciences, and more
– Scale: >250 sites, 48 countries, >50,000 CPUs, >20 petabytes, >10,000 users, >150 VOs, >150,000 jobs/day

21 Health-e-Child (application areas): similarity search; temporal modelling; visual data mining; genetics profiling; treatment response; inferring outcome; biomechanical models; tumor growth modelling; semantic browsing; personalised simulation; surgery planning; RV and LV automatic modelling; measurement of the pulmonary trunk.

22 Example: The Grid Attacks Avian Flu
– The Grid has been used to analyse 300,000 potential drug compounds against the bird flu virus, H5N1
– 2,000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran for four weeks – the equivalent of 100 years on a single computer
– BioSolveIt donated 6,000 FlexX licences
– Results: avian flu – 20% of compounds better than Tamiflu; malaria – 6 of 30 compounds similar to or better than pepstatin A; ongoing tests with compounds from later calculations
– Image: neuraminidase, one of the two major surface proteins of influenza viruses, facilitating the release of virions from infected cells (courtesy Ying-Ta Wu, Academia Sinica)

23 Example: GeoCluster industrial application
– The first industrial application (GeoCluster) successfully running on EGEE
– Developed by the Compagnie Générale de Géophysique (CGG) in France, which does geophysical simulations for the oil, gas, mining and environmental industries
– EGEE technology helps CGG federate its computing resources around the globe

24 Sustainability
– Need to prepare for a permanent Grid infrastructure
– Ensure a high quality of service for all user communities
– Independent of short project funding cycles
– Infrastructure managed in collaboration with National Grid Initiatives (NGIs)
– European Grid Initiative (EGI)

25 Thank you for your kind attention! For more information about the Grid: www.cern.ch/lcg, www.eu-egee.org, www.eu-egi.org/, www.gridcafe.org

26 The LHC Computing Grid, February 2010.

