The Worldwide LHC Computing Grid
Dr Ian Bird, LCG Project Leader
1st March 2011
Visit of Dr Manuel Eduardo Baldeón, National Secretary of Higher Education, Science, Technology and Innovation, Republic of Ecuador

The LHC Computing Challenge
- Signal/Noise: (10^-9 offline)
- Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
- Compute power: event complexity × number of events × thousands of users → 100k of (today's) fastest CPUs and 45 PB of disk storage
- Worldwide analysis & funding: computing funded locally in major regions & countries, efficient analysis everywhere → GRID technology
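As a rough cross-check of the 15 PB/year figure, the sketch below multiplies an assumed per-experiment recording rate and event size by a typical LHC live time; the numerical inputs are illustrative assumptions, not values taken from the slide.

```python
# Order-of-magnitude cross-check of "15 PetaBytes of new data each year".
# The rate, event size and live time are illustrative assumptions only.
RECORD_RATE_HZ = 300      # events written to storage per second, per experiment (assumed)
EVENT_SIZE_MB = 1.5       # raw event size in megabytes (assumed)
LIVE_SECONDS = 1.0e7      # ~10^7 seconds of data taking per year (assumed)
N_EXPERIMENTS = 4         # ALICE, ATLAS, CMS, LHCb

bytes_per_year = RECORD_RATE_HZ * EVENT_SIZE_MB * 1e6 * LIVE_SECONDS * N_EXPERIMENTS
print(f"Estimated new data per year: {bytes_per_year / 1e15:.0f} PB")  # ~18 PB, same order as the 15 PB quoted
```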

150 million sensors deliver data … 40 million times per second

A collision at LHC

The Data Acquisition

Tier 0 at CERN: acquisition, first-pass reconstruction, storage & distribution. 1.25 GB/sec (ions)

WLCG data processing model
- Tier-0 (CERN): data recording, initial data reconstruction, data distribution
- Tier-1 (11 centres): permanent storage, re-processing, analysis
- Tier-2 (~130 centres): simulation, end-user analysis
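A minimal sketch of the tiered division of labour described above; the mapping of tasks to tiers follows the slide, while the dictionary and helper function are purely illustrative.

```python
# Illustrative model of the WLCG tier roles listed on the slide.
TIER_ROLES = {
    "Tier-0 (CERN)": {"data recording", "initial reconstruction", "data distribution"},
    "Tier-1 (11 centres)": {"permanent storage", "re-processing", "analysis"},
    "Tier-2 (~130 centres)": {"simulation", "end-user analysis"},
}

def tiers_for(task: str) -> list[str]:
    """Return the tiers responsible for a given type of task."""
    return [tier for tier, roles in TIER_ROLES.items() if task in roles]

print(tiers_for("simulation"))      # ['Tier-2 (~130 centres)']
print(tiers_for("re-processing"))   # ['Tier-1 (11 centres)']
```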

WLCG Collaboration Status: Tier 0; 11 Tier 1s; 64 Tier 2 federations.
Tier 0 and Tier 1 sites: Amsterdam/NIKHEF-SARA, Barcelona/PIC, Bologna/CNAF, CA-TRIUMF, CERN, DE-FZK, Lyon/CCIN2P3, NDGF, Taipei/ASGC, UK-RAL, US-BNL, US-FNAL.
Today we have 49 MoU signatories, representing 34 countries: Australia, Austria, Belgium, Brazil, Canada, China, Czech Rep., Denmark, Estonia, Finland, France, Germany, Hungary, Italy, India, Israel, Japan, Rep. Korea, Netherlands, Norway, Pakistan, Poland, Portugal, Romania, Russia, Slovenia, Spain, Sweden, Switzerland, Taipei, Turkey, UK, Ukraine, USA.

Worldwide resources
- Today: >140 sites, ~250k CPU cores, ~100 PB disk
- We said we would have: 15 PB of new data/year, 100 (or 200)k of "today's fastest" CPUs, 45 PB of disk

Successes:
- We have a working grid infrastructure
- Experiments have truly distributed models
- Has enabled physics output in a very short time
- Network traffic close to that planned, and the network is extremely reliable
- Significant numbers of people doing analysis (at Tier 2s)

1st year of LHC data
- Writing up to 70 TB/day to tape (~70 tapes per day)
- Tier 0 storage: accepts data at an average of 2.6 GB/s, with peaks > 7 GB/s; serves data at an average of 7 GB/s, with peaks > 18 GB/s; CERN Tier 0 moves ~1 PB of data per day
- Stored ~15 PB in 2010: ~2 PB/month to tape for pp running, ~4 PB to tape in the heavy-ion run, >5 GB/s to tape during HI
[Charts: data written to tape (GB/day); disk server throughput (GB/s)]
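The daily volumes quoted above follow from the average rates on the slide; the short check below does the arithmetic (the only added input is the number of seconds in a day).

```python
# Cross-check of the Tier 0 throughput figures quoted on the slide.
SECONDS_PER_DAY = 86_400
accept_avg_gb_s = 2.6   # average ingest rate (from the slide)
serve_avg_gb_s = 7.0    # average serving rate (from the slide)

ingest_tb_day = accept_avg_gb_s * SECONDS_PER_DAY / 1_000
served_tb_day = serve_avg_gb_s * SECONDS_PER_DAY / 1_000
print(f"Ingested per day: ~{ingest_tb_day:.0f} TB")   # ~225 TB
print(f"Served per day:   ~{served_tb_day:.0f} TB")   # ~605 TB
print(f"Total moved:      ~{(ingest_tb_day + served_tb_day) / 1_000:.1f} PB")  # ~0.8 PB, consistent with ~1 PB/day
```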

CPU usage (July): Significant use of Tier 2s for analysis; the frequently expressed concern that too much analysis would be done at CERN is not borne out. Tier 0 capacity is underused in general, but this is expected to change as luminosity increases.

Data transfer
- Data transfer capability today is able to manage much higher bandwidths than expected/feared/planned
- Fibre cut during STEP'09: redundancy meant no interruption
- SW: gridftp, FTS (interacts with endpoints, handles recovery), experiment layer
- HW: light paths, routing, coupling to storage
- Operational: monitoring
- …and the academic/research networks for Tier 1/2!
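The "recovery" role attributed to the transfer software can be illustrated with a simple retry loop. This is a conceptual sketch only, not the gridftp or FTS API; the function names, endpoints and failure rate are all hypothetical.

```python
import random
import time

def attempt_transfer(source_url: str, dest_url: str) -> bool:
    """Hypothetical single transfer attempt (stand-in for a gridftp/FTS-managed copy)."""
    return random.random() > 0.2   # pretend ~20% of attempts fail

def transfer_with_recovery(source_url: str, dest_url: str,
                           max_attempts: int = 3, backoff_s: float = 1.0) -> bool:
    """Retry failed transfers before escalating, illustrating the recovery behaviour
    the slide assigns to the transfer software layer."""
    for attempt in range(1, max_attempts + 1):
        if attempt_transfer(source_url, dest_url):
            return True
        time.sleep(backoff_s * attempt)   # back off, then retry
    return False   # hand over to operations / monitoring

# Hypothetical endpoints, for illustration only.
ok = transfer_with_recovery("gsiftp://tier0.example.ch/data/run.raw",
                            "gsiftp://tier1.example.org/data/run.raw")
print("transfer succeeded" if ok else "transfer failed after retries")
```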

Data transfers
- 2009: STEP'09 (final readiness test) plus preparation for LHC startup
- LHC running, April – September 2010: LHC physics data at nearly 1 petabyte/week
- Traffic on the OPN up to 70 Gb/s, driven by ATLAS early reprocessing campaigns
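For scale, "nearly 1 petabyte/week" corresponds to a sustained average of roughly 13 Gb/s, comfortably below the 70 Gb/s OPN peaks quoted above; a one-line check (the week length is the only assumption):

```python
# Convert ~1 PB/week into an average network rate for comparison with the 70 Gb/s OPN peak.
PB_PER_WEEK = 1.0
SECONDS_PER_WEEK = 7 * 24 * 3600
avg_gbit_s = PB_PER_WEEK * 1e15 * 8 / SECONDS_PER_WEEK / 1e9
print(f"Average rate: ~{avg_gbit_s:.0f} Gb/s")   # ~13 Gb/s
```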

Impact of the LHC Computing Grid
WLCG has been leveraged on both sides of the Atlantic to benefit the wider scientific community:
- Europe: Enabling Grids for E-sciencE (EGEE) → European Grid Infrastructure (EGI)
- USA: Open Science Grid (OSG) (+ extension?)
Many scientific applications: archaeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, …

Grid in Latin America
- The ROC-LA (Regional Operations Center – Latin America) started in September 2009 as a joint initiative between the Brazilian Center for Research in Physics (CBPF, Brazil), the Institute of Nuclear Science of UNAM (ICN-UNAM, Mexico) and the Universidad de los Andes (Uniandes, Colombia)
- Latin American groups in ALICE, ATLAS, CMS and LHCb have grid sites supported by the ROC-LA
- Previously, EC-funded projects (such as EELA) had collaborated with Latin American groups; Ecuador participated in the 2nd phase of EELA