
The LHC Computing Grid Visit of Her Royal Highness




1 The LHC Computing Grid Visit of Her Royal Highness
Princess Maha Chakri Sirindhorn of the Kingdom of Thailand. Monday 16th March 2009. Frédéric Hemmer, IT Department Head

2 The ATLAS experiment
7,000 tons, 150 million sensors generating data 40 million times per second, i.e. a petabyte per second.

3 A collision at LHC

4 The Data Acquisition

5 Tier 0 at CERN: Acquisition, First pass processing, Storage & Distribution
The next two slides illustrate what happens to the data as it moves out from the experiments. CMS and ATLAS each produce data at the rate of one DVD-worth every 15 seconds or so, while the rates for LHCb and ALICE are somewhat lower. However, during the part of the year when the LHC accelerates lead ions rather than protons, ALICE (an experiment dedicated to this kind of physics) alone will produce data at over 1 gigabyte per second – 1.25 GB/sec (ions), or one DVD every 4 seconds. Initially the data is sent to the CERN Computer Centre – the Tier 0 – for storage on tape. Storage also implies guardianship of the data for the long term: the lifetime of the LHC, at least 20 years. This is not passive guardianship but requires migrating data to new technologies as they arrive. We need large-scale, sophisticated mass storage systems that not only manage the incoming data streams, but also allow for evolution of technology (tapes and disks) without hindering access to the data. The Tier 0 centre provides the initial level of data processing – calibration of the detectors and the first reconstruction of the data.
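The DVD comparisons above can be checked with a few lines of arithmetic. This is a back-of-the-envelope sketch; the 4.7 GB DVD capacity is an assumption, not a figure from the slide.

```python
# Sanity-check the data rates quoted above, assuming a single-layer
# DVD holds 4.7 GB (an assumption, not stated on the slide).
DVD_GB = 4.7

def seconds_per_dvd(rate_gb_per_s):
    """How long it takes to fill one DVD at a given data rate."""
    return DVD_GB / rate_gb_per_s

# ALICE during lead-ion runs: 1.25 GB/s
print(f"ALICE (ions): one DVD every {seconds_per_dvd(1.25):.1f} s")  # ~3.8 s

# CMS or ATLAS: one DVD-worth every ~15 s implies roughly this rate:
print(f"CMS/ATLAS rate: ~{DVD_GB / 15:.2f} GB/s")  # ~0.31 GB/s
```

The 1.25 GB/s ion rate indeed fills a DVD roughly every 4 seconds, consistent with the slide.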

6 The LHC Data Challenge
The accelerator was completed in 2008 and will run for many years. The experiments will produce about 15 million gigabytes of data each year (about 20 million CDs!). LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors. This requires many cooperating computer centres, as CERN can only provide ~20% of the capacity. The LHC Computing Grid, March 2009
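The "20 million CDs" figure follows directly from the annual data volume. A quick check, assuming a ~700 MB CD (an assumed capacity, not given on the slide):

```python
# Rough check of the "about 20 million CDs" claim:
# 15 million GB per year at ~0.7 GB (700 MB) per CD.
annual_data_gb = 15e6   # 15 million gigabytes per year
cd_gb = 0.7             # assumed CD capacity

cds_per_year = annual_data_gb / cd_gb
print(f"~{cds_per_year / 1e6:.0f} million CDs per year")  # ~21 million
```

About 21 million CDs, in line with the slide's rounded figure of 20 million.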

7 Solution: the Grid
Use the Grid to unite the computing resources of particle physics institutes around the world. The World Wide Web provides seamless access to information stored in many millions of different geographical locations. The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe.

8 How does the Grid work?
It makes multiple computer centres look like a single system to the end user. Advanced software, called middleware, automatically finds the data the scientist needs and the computing power to analyse it. Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
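The matchmaking idea described above can be illustrated with a toy sketch: route a job to a centre that holds the required dataset and currently has the lowest load. The site names, datasets and load numbers here are invented for illustration; real WLCG middleware is far more sophisticated.

```python
# Toy sketch of middleware "matchmaking": pick the least-loaded
# centre that hosts the dataset a job needs. All names and numbers
# below are hypothetical, purely for illustration.
sites = {
    "CERN": {"datasets": {"atlas-raw", "cms-raw"}, "load": 0.90},
    "FNAL": {"datasets": {"cms-raw"},              "load": 0.40},
    "RAL":  {"datasets": {"atlas-raw"},            "load": 0.55},
}

def match_site(dataset):
    """Return the least-loaded site that hosts the given dataset."""
    candidates = [name for name, info in sites.items()
                  if dataset in info["datasets"]]
    if not candidates:
        raise LookupError(f"no site hosts {dataset}")
    return min(candidates, key=lambda name: sites[name]["load"])

print(match_site("cms-raw"))  # FNAL: load 0.40 beats CERN's 0.90
```

In practice the middleware also weighs security policies, queue lengths, accounting and network proximity, as the slide notes.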

9 Tier 0 – Tier 1 – Tier 2
Tier-0 (CERN): data recording, initial data reconstruction, data distribution. Tier-1 (11 centres): permanent storage, re-processing, analysis. Tier-2 (~130 centres): simulation, end-user analysis. The Tier 0 centre at CERN stores the primary copy of all the data. A second copy is distributed between the 11 so-called Tier 1 centres. These are large computer centres in different geographical regions of the world that also have a responsibility for long-term guardianship of the data. The data is sent from CERN to the Tier 1s in real time over dedicated network connections. To keep up with the data coming from the experiments, this transfer must be capable of running at around 1.3 GB/s continuously, equivalent to a full DVD every 3 seconds. The Tier 1 sites also provide the second level of data processing and produce data sets which can be used to perform the physics analysis. These data sets are sent from the Tier 1 sites to the around 130 Tier 2 sites. A Tier 2 is typically a university department or physics laboratory; they are located all over the world in most of the countries that participate in the LHC experiments. Often, Tier 2s are associated with a Tier 1 site in their region. It is at the Tier 2s that the real physics analysis is performed.
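The Tier-0 to Tier-1 transfer figures above imply a daily volume that can be worked out directly. This sketch assumes a 4.7 GB DVD and an even split across the 11 Tier 1 centres, neither of which is stated on the slide.

```python
# Implications of the 1.3 GB/s sustained Tier-0 -> Tier-1 rate.
# Assumptions (not from the slide): 4.7 GB per DVD, and data split
# evenly across the 11 Tier 1 centres.
rate_gb_s = 1.3
dvd_gb = 4.7
tier1_count = 11

print(f"one DVD-worth every {dvd_gb / rate_gb_s:.1f} s")  # ~3.6 s
per_day_tb = rate_gb_s * 86400 / 1000
print(f"{per_day_tb:.0f} TB/day total, "
      f"~{per_day_tb / tier1_count:.0f} TB/day per Tier 1")
```

Over 100 TB leaves CERN every day at this rate, which is why dedicated network connections to the Tier 1s are needed.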

10 WLCG Grid Activity
WLCG ran ~44 million jobs in 2007, and the workload has continued to increase. Over May the total was 10.5 million jobs, an average of 340k jobs/day. ATLAS averaged >200k jobs/day; CMS averaged >100k jobs/day with peaks up to 200k. This is the level needed for 2008/9. Data distribution from CERN to Tier-1 sites: all experiments exceeded the required rates for extended periods, and did so simultaneously. The 1.3 GB/s target was well exceeded, with above 2 GB/s achievable. All Tier 1s achieved (or exceeded) their target acceptance rates. The latest tests in May show that the data rates required for LHC start-up have been reached and can be sustained over long periods. Frédéric Hemmer, CERN, IT Department
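The May job statistics quoted above are self-consistent, as a quick division shows (taking May's 31 days):

```python
# Check that 10.5 million jobs over the 31 days of May matches
# the ~340k jobs/day average quoted on the slide.
total_jobs = 10.5e6
days_in_may = 31
print(f"~{total_jobs / days_in_may / 1000:.0f}k jobs/day")  # ~339k
```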

11 Example: The Grid Attacks Avian Flu
The Grid has been used to analyse 300,000 potential drug compounds against the bird flu virus, H5N1. 2,000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran for four weeks – the equivalent of 100 years on a single computer. BioSolveIt donated 6,000 FlexX licenses. Results – avian flu: 20% of compounds better than Tamiflu; malaria: 6/30 compounds similar to or better than pepstatin A. Tests are ongoing with compounds from later calculations. Neuraminidase is one of the two major surface proteins of influenza viruses, facilitating the release of virions from infected cells. Image courtesy of Ying-Ta Wu, Academia Sinica.

12 For more information about the Grid:
Thank you for your kind attention!

