Visit of US House of Representatives Committee on Appropriations

Presentation transcript:

The LHC Computing Grid
Visit of the US House of Representatives Committee on Appropriations
Dr Ian Bird, LCG Project Leader
28 March 2008

The LHC Data Challenge
The accelerator will be completed in 2008 and will run for 10-15 years. The experiments will produce about 15 million gigabytes of data each year (about 20 million CDs!). LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors, which means many cooperating computer centres are needed: CERN alone can provide only ~20% of the capacity.
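
These figures are easy to sanity-check. A minimal back-of-the-envelope sketch in Python (the ~700 MB CD capacity is an assumption, not stated on the slide):

```python
# Back-of-the-envelope check of the slide's figures.
# The 700 MB CD capacity is an assumption, not stated on the slide.
annual_data_gb = 15e6        # ~15 million gigabytes per year
cd_capacity_gb = 0.7         # ~700 MB per CD (assumed)

cds_per_year = annual_data_gb / cd_capacity_gb
print(f"CD equivalent: {cds_per_year:.2e} CDs/year")  # ~2.1e+07, about 20 million

cern_share = 0.20            # CERN provides only ~20% of the capacity
print(f"Capacity needed from partner centres: {1 - cern_share:.0%}")  # 80%
```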

[Chart: projected CPU, disk and tape capacity requirements]

Solution: the Grid
The Grid unites the computing resources of particle physics institutions around the world. Just as the World Wide Web provides seamless access to information stored in many millions of different geographical locations, the Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe.

How does the Grid work?
It makes multiple computer centres look like a single system to the end user. Advanced software, called middleware, automatically finds the data a scientist needs and the computing power to analyse it. The middleware balances the load on different resources, and also handles security, accounting, monitoring and much more. This is an area of strong EU-US collaboration: the Virtual Data Toolkit and ETICS-NMI both have NSF funding.
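
To make the matchmaking idea concrete, here is a purely illustrative Python sketch. The site names, load figures and data layout are hypothetical, and real middleware such as gLite does far more (security, accounting, monitoring, retries):

```python
# Illustrative matchmaking: send a job to the least-loaded site that
# holds the dataset it needs. All names and numbers are hypothetical.
sites = {
    "CERN":  {"load": 0.90, "datasets": {"atlas-run7", "cms-run3"}},
    "FNAL":  {"load": 0.55, "datasets": {"cms-run3"}},
    "IN2P3": {"load": 0.40, "datasets": {"atlas-run7"}},
}

def match_site(dataset: str) -> str:
    """Return the least-loaded site that hosts the requested dataset."""
    candidates = [name for name, info in sites.items()
                  if dataset in info["datasets"]]
    if not candidates:
        raise LookupError(f"no site hosts {dataset!r}")
    return min(candidates, key=lambda name: sites[name]["load"])

print(match_site("atlas-run7"))  # -> IN2P3 (lower load than CERN)
print(match_site("cms-run3"))    # -> FNAL
```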

[Image: view of the ATLAS detector (under construction)] 150 million sensors deliver data 40 million times per second.


The LHC Computing Grid project (LCG)
More than 140 computing centres in 35 countries. Twelve large centres handle primary data management: CERN (Tier-0) and eleven Tier-1s, including BNL and FNAL in the US. There are 38 federations of smaller Tier-2 centres, with 12 Tier-2s in the US spanning 16 universities and 1 national lab.

LCG Service Hierarchy
Tier-0 (CERN), the accelerator centre: data acquisition and initial processing, long-term data curation, and distribution of data to the Tier-1 centres.
Tier-1, "online" to the data acquisition process and therefore requiring high availability: managed mass storage with a grid-enabled data service, data-heavy analysis, and national and regional support. The eleven Tier-1 centres are: Canada – TRIUMF (Vancouver); France – IN2P3 (Lyon); Germany – Forschungszentrum Karlsruhe; Italy – CNAF (Bologna); Netherlands – NIKHEF/SARA (Amsterdam); Nordic countries – a distributed Tier-1; Spain – PIC (Barcelona); Taiwan – Academia Sinica (Taipei); UK – CLRC (Oxford); US – Fermilab (Illinois) and Brookhaven (NY).
Tier-2, ~140 centres in ~35 countries: simulation, and end-user analysis, both batch and interactive.
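
A minimal data-structure view of this hierarchy, with the roles paraphrased from the slide (an illustration, not a real WLCG configuration format):

```python
# Minimal model of the LCG tier hierarchy described above.
# Roles are paraphrased from the slide; this is not a real WLCG config.
TIERS = {
    "Tier-0": {"centres": ["CERN"],
               "roles": ["data acquisition & initial processing",
                         "long-term data curation",
                         "distribution of data to the Tier-1s"]},
    "Tier-1": {"centres": ["TRIUMF", "IN2P3", "FZK", "CNAF",
                           "NIKHEF/SARA", "Nordic (distributed)", "PIC",
                           "Academia Sinica", "CLRC", "Fermilab",
                           "Brookhaven"],
               "roles": ["managed mass storage, grid-enabled data service",
                         "data-heavy analysis",
                         "national and regional support"]},
    "Tier-2": {"centres": ["~140 centres in ~35 countries"],
               "roles": ["simulation",
                         "end-user analysis, batch and interactive"]},
}

for tier, info in TIERS.items():
    print(f"{tier}: {len(info['centres'])} entries; "
          f"roles: {'; '.join(info['roles'])}")
```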

WLCG Collaboration
The collaboration comprises the 4 LHC experiments and ~140 computing centres (12 large Tier-0/Tier-1 centres and 38 federations of smaller Tier-2 centres) in ~35 countries. A Memorandum of Understanding was agreed in October 2005 and is now being signed. On resources, the collaboration focuses on the needs of the four LHC experiments, commits resources each October for the coming year with a 5-year forward look, agrees on standards and procedures, and relies on EGEE and OSG (and other regional efforts).

Data Transfer
Data distribution from CERN to the Tier-1 sites: the 2008 target rate was achieved in 2006 under test conditions; in autumn 2007 and during CCRC'08, under more realistic experiment testing, the target rate was reached and sustained with ATLAS and CMS active.
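
For scale, the average rate implied by the ~15 petabytes/year quoted earlier can be worked out directly (a rough sketch that ignores burstiness and the split of data across the eleven Tier-1s):

```python
# Rough average-rate estimate from the ~15 PB/year figure quoted earlier.
# Illustrative only: ignores burstiness and the per-Tier-1 split.
petabytes_per_year = 15
seconds_per_year = 365 * 24 * 3600
mb_per_second = petabytes_per_year * 1e9 / seconds_per_year  # 1 PB = 1e9 MB
print(f"~{mb_per_second:.0f} MB/s sustained on average")     # ~476 MB/s
```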

Grid activity
WLCG ran ~44 million jobs in 2007, and the workload has continued to increase, now standing at ~165k jobs/day. The distribution of work across Tier-0, Tier-1 and Tier-2 sites really illustrates the importance of the grid system: the Tier-2 contribution is around 50%, and more than 85% of the work is external to CERN.
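
The two job figures are easy to reconcile: 44 million jobs over 2007 averages to roughly 120k jobs/day, so ~165k jobs/day now is consistent with continued growth:

```python
# Reconciling the job counts quoted above (illustrative arithmetic).
jobs_2007 = 44e6                      # ~44 million jobs in 2007
avg_2007 = jobs_2007 / 365            # average jobs per day in 2007
print(f"2007 average: ~{avg_2007/1e3:.0f}k jobs/day")  # ~121k/day
```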

Impact of the LHC Computing Grid in Europe
LCG has been the driving force for EGEE (Enabling Grids for E-sciencE), the European multi-science grid. EGEE is now a global effort and the largest grid infrastructure worldwide, co-funded by the European Commission (cost ~130 M€ over 4 years, of which the EU funds ~70 M€). EGEE is already used for more than 100 applications, including bio-informatics, education and training, and medical imaging.

Impact of the LCG in the US
LCG is a major stakeholder in the Open Science Grid (OSG), the US national infrastructure. OSG is funded by the National Science Foundation and the Department of Energy's SciDAC program at $6M/year for 5 years, starting in 2006. More than 20 US LHC universities (Tier-2s and Tier-3s) are members. OSG supports other sciences, such as weather forecasting (B. Etherton, RENCI), and provides education, training and common software. The NSF further contributes through grid middleware projects (Globus, Condor) and other collaborative projects such as DISUN.

The EGEE project
EGEE started in April 2004 and is now in its second phase, with 91 partners in 32 countries; a third phase (2008-2010) is in preparation. Its objectives are a large-scale, production-quality grid infrastructure for e-Science, attracting new resources and users from industry as well as science, and maintaining and further improving the gLite grid middleware.
EGEE-II partners in the USA: the University of Chicago; the University of Southern California; the Board of Regents of the University of Wisconsin System (Madison, WI); and the Renaissance Computing Institute (RENCI, Chapel Hill, NC), a joint institute of the University of North Carolina at Chapel Hill, Duke University and North Carolina State University.

Registered Collaborating Projects
25 projects have registered as of September 2007. They fall into three groups: Applications (improved services for academia, industry and the public), Support Actions (key complementary functions), and Infrastructures (geographical or thematic coverage).

Collaborating infrastructures

Applications span archaeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high-energy physics, life sciences, multimedia, materials science and more.
Scale: >250 sites in 48 countries, >50,000 CPUs, >20 petabytes of storage, >10,000 users, >150 VOs, >150,000 jobs/day.

In silico drug discovery
Diseases such as HIV/AIDS, SARS and bird flu are a threat to public health due to the worldwide exchange and circulation of people. Grids open new perspectives for in silico drug discovery, reducing costs and accelerating the search for new drugs. International collaboration is required for early detection, epidemiological watch, prevention, the search for new drugs, and the search for vaccines. [Image: avian influenza bird casualties]

WISDOM http://wisdom.healthgrid.org/

Example: the Geocluster industrial application
Geocluster was the first industrial application to run successfully on EGEE. Developed by the Compagnie Générale de Géophysique (CGG) in France, it performs geophysical simulations for the oil, gas, mining and environmental industries. EGEE technology helps CGG federate its computing resources around the globe.

OSG and the LCG
The 17 US Tier-2 centres are funded by the NSF and participate in the LCG through the OSG. Common software, distributed through the Virtual Data Toolkit, is used by many different grids.
US ATLAS Tier-2s: Midwest Tier-2 (Indiana University, University of Chicago); Southeast Tier-2 (Oklahoma University, University of Texas Arlington, Langston University, University of New Mexico); Northeast Tier-2 (Boston University, Harvard University); Great Lakes Tier-2 (University of Michigan, Michigan State); Stanford Linear Accelerator Center.
US CMS Tier-2s: Caltech, Florida, MIT, Nebraska, Purdue, U. of Wisconsin, U. of California San Diego.
[Chart: WLCG ATLAS and CMS processing usage for the past year (WLCG APEL reports)]

Sustainability
We need to prepare for a permanent grid infrastructure in Europe and worldwide: one that ensures a high quality of service for all user communities and is independent of short project funding cycles, with the infrastructure managed in collaboration with National Grid Initiatives (NGIs) under a European Grid Initiative (EGI). The future of projects like OSG, NorduGrid and others remains an open question.

For more information: www.cern.ch/lcg, www.eu-egee.org, www.opensciencegrid.org, www.gridcafe.org, www.eu-egi.org/
Thank you for your kind attention!