WLCG – Status and Plans. Ian Bird, WLCG Project Leader. openlab Board of Sponsors, CERN, 23rd April 2010.

... And now at 7 TeV

Collisions at the LHC: summary

pp collisions at 14 TeV at 10^34 cm^-2 s^-1
How to extract this (Higgs → 4 muons) from this: ~20 proton-proton collisions overlap in each event, and this repeats every 25 ns…
A very difficult environment…
For comparison: a Z event at LEP (e+e−)
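To get a feel for the rates implied by "every 25 ns" and "~20 overlapping collisions", here is a minimal Python sketch; it assumes the nominal case where every 25 ns crossing is filled, which is an illustrative simplification rather than a figure from the slide.

# Rough interaction rate implied by the bunch spacing and pile-up quoted above.
bunch_spacing_ns = 25
crossings_per_second = 1 / (bunch_spacing_ns * 1e-9)   # ~40 million crossings per second (nominal)
pileup = 20                                             # overlapping pp collisions per crossing

pp_interactions_per_second = crossings_per_second * pileup
print(f"{crossings_per_second:.0e} crossings/s, ~{pp_interactions_per_second:.0e} pp interactions/s")
# prints: 4e+07 crossings/s, ~8e+08 pp interactions/s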

The LHC Computing Challenge
– Signal/Noise: (10^-9 offline)
– Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
– Compute power: event complexity × number of events × thousands of users → 200 k of (today's) fastest CPUs, 45 PB of disk storage
– Worldwide analysis & funding: computing funded locally in major regions & countries; efficient analysis everywhere → GRID technology
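As an illustration of how a figure of order 15 PB/year arises, here is a back-of-the-envelope sketch in Python; the per-experiment recording rate, event size and live time are assumed round numbers for illustration, not values taken from the slide.

# Illustrative check of the annual LHC raw-data volume (assumed round numbers).
recorded_rate_hz = 200          # events written to storage per second, per experiment (assumed)
event_size_mb = 1.5             # raw event size in megabytes (assumed)
experiments = 4
live_seconds_per_year = 1e7     # effective data-taking time per year (common rule of thumb)

bytes_per_year = recorded_rate_hz * event_size_mb * 1e6 * experiments * live_seconds_per_year
print(f"~{bytes_per_year / 1e15:.0f} PB of new raw data per year")   # ~12 PB, same order as the 15 PB quoted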

WLCG – what and why?
A distributed computing infrastructure to provide the production and analysis environments for the LHC experiments
Managed and operated by a worldwide collaboration between the experiments and the participating computer centres
The resources are distributed – for funding and sociological reasons
Our task is to make use of the resources available to us – no matter where they are located
– We know it would be simpler to put all the resources in 1 or 2 large centres
– This is not an option... today

WLCG Today
Tier 0 at CERN; 11 Tier 1s; 61 Tier 2 federations (121 Tier 2 sites)
Tier 1 sites: De-FZK, US-FNAL, Ca-TRIUMF, NDGF, Barcelona/PIC, Lyon/CCIN2P3, US-BNL, UK-RAL, Taipei/ASGC, Amsterdam/NIKHEF-SARA, Bologna/CNAF
Today we have 49 MoU signatories, representing 34 countries: Australia, Austria, Belgium, Brazil, Canada, China, Czech Rep, Denmark, Estonia, Finland, France, Germany, Hungary, Italy, India, Israel, Japan, Rep. Korea, Netherlands, Norway, Pakistan, Poland, Portugal, Romania, Russia, Slovenia, Spain, Sweden, Switzerland, Taipei, Turkey, UK, Ukraine, USA.

Today WLCG is:
Running increasingly high workloads:
– Jobs in excess of 650k / day; anticipate millions / day soon
– CPU equivalent of ~100k cores
Workloads are:
– Real data processing
– Simulations
– Analysis – more and more (new) users (e.g. CMS: number of users doing analysis)
Data transfers at unprecedented rates → next slide
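A rough consistency check of the two workload figures above (650k jobs/day against ~100k cores), sketched in Python; it simply asks what average job length those figures imply if the cores are kept busy, which is an assumption for illustration rather than something stated on the slide.

# Implied average job length if ~100k cores run ~650k jobs per day at full occupancy (illustrative).
jobs_per_day = 650_000
cores = 100_000
seconds_per_day = 86_400

avg_job_hours = cores * seconds_per_day / jobs_per_day / 3600
print(f"implied average job length: ~{avg_job_hours:.1f} hours")   # ~3.7 hours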

Data transfers
2009: final readiness test (STEP'09) and preparation for LHC startup; LHC physics data (real data) from 30/3
Nearly 1 petabyte/week
Castor traffic last week: > 4 GB/s input, > 13 GB/s served
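To relate the per-second and per-week figures on this slide, a small unit-conversion sketch in Python; it assumes the quoted rates are sustained constantly, purely for the conversion.

# Convert between GB/s sustained rates and PB/week totals (assumes constant rates).
GB = 1e9
PB = 1e15
seconds_per_week = 7 * 24 * 3600

castor_input_gb_s = 4.0        # "> 4 GB/s input"
weekly_pb = castor_input_gb_s * GB * seconds_per_week / PB
print(f"4 GB/s sustained for a week is about {weekly_pb:.1f} PB")   # ~2.4 PB

# Conversely, "nearly 1 petabyte/week" corresponds to roughly:
rate_gb_s = 1 * PB / seconds_per_week / GB
print(f"1 PB/week is about {rate_gb_s:.2f} GB/s sustained")          # ~1.65 GB/s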

Fibre cut during STEP'09: redundancy meant no interruption

Service quality: defined in MoU
The MoU defines key performance and support metrics for Tier 1 and Tier 2 sites
– Reliabilities are an approximation for some of these
– Also metrics on response times, resources, etc.
The MoU has been an important tool in bringing services to an acceptable level
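The slide notes that reliabilities only approximate the MoU metrics. As a sketch of the kind of calculation involved, here is a minimal Python example using the definitions commonly quoted in WLCG site reporting (availability = uptime / total time; reliability = uptime / (total time minus scheduled downtime)); the hour figures are invented for illustration, not taken from any site report.

# Availability vs. reliability for a hypothetical site month (illustrative numbers).
def availability(up_hours, total_hours):
    """Fraction of total time the site was up."""
    return up_hours / total_hours

def reliability(up_hours, total_hours, scheduled_down_hours):
    """Like availability, but scheduled downtime is excused."""
    return up_hours / (total_hours - scheduled_down_hours)

# Hypothetical month: 720 h total, 10 h scheduled maintenance, 5 h unscheduled outage.
total, scheduled_down, unscheduled_down = 720, 10, 5
up = total - scheduled_down - unscheduled_down

print(f"availability: {availability(up, total):.1%}")                 # ~97.9%
print(f"reliability:  {reliability(up, total, scheduled_down):.1%}")  # ~99.3%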

From testing to data
Focus on real and continuous production use of the service over several years (simulations since 2003, cosmic ray data, etc.). Data and service challenges exercised all aspects of the service – not just data transfers, but workloads, support structures etc.
Independent experiment data challenges: e.g. DC04 (ALICE, CMS, LHCb) and DC2 (ATLAS) in 2004 saw the first full chain of the computing models on grids.
Service Challenges proposed in 2004 to demonstrate service aspects:
– Data transfers for weeks on end
– Data management
– Scaling of job workloads
– Security incidents ("fire drills")
– Interoperability
– Support processes
SC1, SC2: basic transfer rates
SC3: sustained rates, data management, service reliability
SC4: nominal LHC rates, disk → tape tests, all Tier 1s, some Tier 2s
CCRC'08: readiness challenge, all experiments, ~full computing models
STEP'09: scale challenge, all experiments, full computing models, tape recall + analysis

Readiness of the computing (CMS, ATLAS, LHCb)
Has meant very rapid data distribution and analysis
– Data is processed and available at Tier 2s within hours!

And physics output...

Grids & HEP: common history
CERN and the HEP community have been involved with grids from the beginning; recognised as a key technology for implementing the LHC computing model
HEP work with the EC-funded EDG/EGEE in Europe and iVDGL/Grid3/OSG etc. in the US has been of clear mutual benefit:
– Infrastructure development driven by HEP needs
– Robustness needed by WLCG is benefitting other communities
– Transfer of technology from HEP: Ganga, AMGA, etc. used by many communities now

Large scale = long times
The LHC, the experiments, and the computing have taken ~20 years to build and commission; they will run for at least 20 years
We must be able to rely on long-term infrastructures:
– Global networking
– Strong and stable NGIs (or their evolution) that should eventually be self-sustaining
– Long-term sustainability must come out of the current short-term project funding cycles

Longer term future
We have achieved what we set out to do – provide an environment for LHC computing – and we have spun off significant general science grid infrastructures. BUT: is it sustainable in the long term?
Long-term sustainability of the infrastructure: need to adapt to changing technologies
– Major re-think of storage and data access
– Virtualisation as a solution for job management
– Complexity of the middleware compared to the actual use cases
Network infrastructure
– This is the most reliable service we have
– Invest in networks and make full use of the distributed system (i.e. leave data where it is)?

Sustainability
Grid middleware
– Is still dependent upon project funding – but this is a very risky strategy now
– Limited development support in EMI (for example)
Must continue to push for mainstream, industrial solutions:
– Messaging and Nagios for monitoring are good examples
– Fabric and job management are good candidates for non-HEP-specific solutions
Because... data management is not solved
– We must invest significant effort here to improve reliability and overall usability; must reduce complexity (e.g. SRM – functionality and implementations)
– But we are not alone – other sciences expect to have significant data volumes soon
– Must take care not to have special solutions

CERN, WLCG and EGI – the future
WLCG needs to be able to rely on strong and stable global e-science infrastructures
– In Europe this means the NGIs and EGI
WLCG is a very structured, large user community
– It can serve as a model for others – they can also learn from our mistakes
CERN has connections to the other EIROs, which are also large scientific communities, several of them associated with ESFRI projects
– CERN can play a role in bringing these to EGI
CERN also supports other visible communities, e.g. UNOSat

LHC is not alone
HEP has been a leader in needing and building global collaborations in order to achieve its goals
It is no longer unique – many other sciences now have similar needs
– Life sciences, astrophysics, ESFRI projects
– They anticipate huge data volumes and need global collaborations
There are important lessons from our experience
– HEP was able to do this because it has a long history of global collaboration, which many other sciences lack
We must also collaborate on common solutions where possible

Summary
The LHC is operational and producing physics!
WLCG is successfully delivering computing – at the scales foreseen
Time to consider how to address long-term sustainability