IT Department and The LHC Computing Grid

Visit of Mr. François Gounand, Adviser to the Administrator General of the CEA and Adviser for Very Large Research Facilities at the Ministry of Research
Frédéric Hemmer, Deputy Head, IT Department
October 4, 2006

Outline
- IT Department in brief
- Fabrics
- The LCG Project
  - The Challenge
  - The (W)LCG Project
  - The LCG Infrastructure
  - The LCG Service
- Beyond LCG
- Real-Time LCG monitor

Services provided by IT Dept (I)
- Basic Services
  - Campus and external networking, Internet Exchange Point
  - Productivity tools (Windows, Linux, Mac, Exchange, office tools, other applications)
  - PC shop via Stores, printing, backups, phones, faxes
  - Security, user support
- Administrative Computing
- Engineering Support
  - Databases (Oracle), EDMS, maths tools
  - Electrical and mechanical engineering
  - Simulation

Services provided by IT Dept (II)
- Computing for Physics
  - Software process support, database applications
  - Interactive and batch services (Linux & Solaris)
  - Central data recording, mass storage
  - Linux support, system management
  - Control systems (SCADA, PLCs, fieldbuses, …)
- Major projects: LCG, EGEE, openlab
  - Tier-0/Tier-1 centre at CERN
  - Data challenges, grid deployment & operations, middleware
  - Advanced high-speed networks, transatlantic connections
  - Service Challenges
  - Collaboration with industry

A few IT Department numbers
- Around 280 staff positions; adding fellows, associates, students and visitors, nearly 500 people in total
- Materials budget of ~50 MCHF in 2006, with the largest fraction devoted to LHC computing, dropping to ~40 MCHF/year
- ~3,000 central computers, 1,500 TB of disk storage, 10 PB of tape storage capacity (being increased for LHC)
- ~15,000 accounts

CERN openlab
- Concept
  - Partners/contributors sponsor the latest hardware, software and brainware (young researchers)
  - CERN provides experts plus testing and validation in a Grid environment
  - Partners: 500,000 €/year for 3 years; contributors: 150,000 € for 1 year
- Current activities
  - Platform competence centre
  - Grid interoperability centre
  - Security activities
  - Joint events

Computer Center

Computer Centre Upgrade (photos from 2004, 2005 and 2006)

Computer Centre Automation
Systems management tools developed in-house:
- Quattor – automated installation, configuration & management of clusters
- Lemon – computer centre monitoring system
- LEAF – hardware and state management
- CASTOR – advanced storage manager, moving data from disk to tape and vice versa
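To make the disk-to-tape idea behind CASTOR concrete, here is a minimal, purely illustrative Python sketch of a migration-and-garbage-collection policy; the class names, watermarks and messages are assumptions of this sketch and not part of CASTOR's actual interface.

```python
# Purely illustrative sketch of a disk-to-tape migration policy, in the spirit of a
# hierarchical storage manager such as CASTOR. All names and thresholds are made up.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DiskFile:
    name: str
    size_gb: float
    on_tape: bool = False            # does a tape copy already exist?

@dataclass
class DiskPool:
    capacity_gb: float
    files: List[DiskFile] = field(default_factory=list)

    def used_gb(self) -> float:
        return sum(f.size_gb for f in self.files)

def migrate_and_garbage_collect(pool: DiskPool, high=0.9, low=0.7) -> None:
    """Copy new files to tape, then free disk copies until usage drops below the low watermark."""
    for f in pool.files:                             # 1) migration: ensure every file has a tape copy
        if not f.on_tape:
            print(f"migrating {f.name} ({f.size_gb} GB) to tape")
            f.on_tape = True
    if pool.used_gb() > high * pool.capacity_gb:     # 2) garbage-collect disk copies when the pool is full
        for f in sorted(pool.files, key=lambda x: x.size_gb, reverse=True):
            if pool.used_gb() <= low * pool.capacity_gb:
                break
            print(f"releasing disk copy of {f.name}")
            pool.files.remove(f)

pool = DiskPool(capacity_gb=100, files=[DiskFile("run1.raw", 60), DiskFile("run2.raw", 35)])
migrate_and_garbage_collect(pool)    # migrates both files, then frees run1.raw (95 GB > 90 GB watermark)
```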

The LHC Computing Grid

New frontiers in data handling
- ATLAS experiment: ~150 million channels read out at 40 MHz, i.e. ~10 million Gigabytes per second
- Massive data reduction on-line
- Still ~1 Gigabyte per second to handle
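A back-of-the-envelope check of these numbers as a small Python sketch; the ~2 bytes per channel sample is an assumption made here for illustration, not a figure from the slide.

```python
# Back-of-the-envelope check of the ATLAS numbers above.
# The ~2 bytes per channel sample is an assumption for illustration only.
channels = 150e6                 # ~150 million readout channels
crossing_rate = 40e6             # 40 MHz bunch-crossing rate
bytes_per_channel = 2            # assumed raw sample size

raw_rate_gb_s = channels * crossing_rate * bytes_per_channel / 1e9
print(f"raw detector output ~ {raw_rate_gb_s:.1e} GB/s")      # ~1e7 GB/s, i.e. ~10 million GB/s

recorded_gb_s = 1.0                                            # ~1 GB/s left after on-line selection
print(f"on-line reduction factor ~ {raw_rate_gb_s / recorded_gb_s:.0e}")  # data volume cut by ~10 million
```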

The Data Challenge
- The accelerator will be completed in 2007 and will run for 10-15 years
- The LHC experiments will produce 10-15 million Gigabytes of data each year (about 20 million CDs!)
- LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors
- This requires many cooperating computer centres, with CERN providing only ~20% of the computing resources
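The comparisons on this slide can be sanity-checked with a couple of lines of arithmetic; the 700 MB CD capacity is the usual figure, the rest comes straight from the slide.

```python
# Sanity check of the comparisons above (700 MB per CD is the usual figure).
annual_data_gb = 15e6                      # up to ~15 million Gigabytes per year
cd_capacity_gb = 0.7
print(f"CDs per year       ~ {annual_data_gb / cd_capacity_gb / 1e6:.0f} million")  # ~21 million

total_processors = 100_000                 # ~100,000 of today's fastest PC processors
cern_fraction = 0.20                       # CERN provides only ~20% of the resources
print(f"processors at CERN ~ {total_processors * cern_fraction:,.0f}")              # ~20,000
```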

Data Handling and Computation for Physics Analysis
[Diagram: data flows from the detector through the event filter (selection & reconstruction) into raw data; reconstruction produces event summary data; batch physics analysis and event reprocessing produce analysis objects (extracted by physics topic); event simulation feeds the same chain; interactive physics analysis works on the analysis objects. Diagram: les.robertson@cern.ch]

LCG Service Hierarchy
- Tier-0 – the accelerator centre
  - Data acquisition & initial processing
  - Long-term data curation
  - Data distribution to the Tier-1 centres: Canada – TRIUMF (Vancouver); France – IN2P3 (Lyon); Germany – Karlsruhe; Italy – CNAF (Bologna); Netherlands – NIKHEF/SARA (Amsterdam); Nordic countries – distributed Tier-1; Spain – PIC (Barcelona); Taiwan – Academia Sinica (Taipei); UK – CLRC (Oxford); US – FermiLab (Illinois) and Brookhaven (NY)
- Tier-1 – "online" to the data acquisition process, hence high availability
  - Managed mass storage, providing a grid-enabled data service
  - All re-processing passes
  - Data-heavy analysis
  - National and regional support
- Tier-2 – ~100 centres in ~40 countries
  - Simulation
  - End-user analysis – batch and interactive
  - Services, including data archive and delivery, from the Tier-1s
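For illustration only, the tier hierarchy above can be pictured as a simple data structure; the Python classes and role strings below are a sketch based on the slide, not LCG software.

```python
# Illustrative sketch of the Tier-0 / Tier-1 / Tier-2 model described above.
# Class and field names are my own; only the roles come from the slide.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Site:
    name: str
    tier: int
    roles: List[str]
    feeds: List["Site"] = field(default_factory=list)    # sites this one distributes data to

tier0 = Site("CERN", 0, ["data acquisition & initial processing",
                         "long-term data curation",
                         "data distribution to Tier-1 centres"])
tier1 = Site("IN2P3 (Lyon)", 1, ["managed mass storage", "re-processing passes",
                                 "data-heavy analysis", "national & regional support"])
tier2 = Site("a Tier-2 centre", 2, ["simulation", "end-user analysis (batch & interactive)"])

tier0.feeds.append(tier1)     # raw data flows Tier-0 -> Tier-1
tier1.feeds.append(tier2)     # data archive & delivery services flow Tier-1 -> Tier-2

def show(site: Site, depth: int = 0) -> None:
    print("  " * depth + f"Tier-{site.tier} {site.name}: " + "; ".join(site.roles))
    for downstream in site.feeds:
        show(downstream, depth + 1)

show(tier0)
```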

Summary of Computing Resource Requirements
All experiments, 2008 (from the LCG TDR, June 2005) – in total, ~100K of today's fastest processors

                        CERN   All Tier-1s   All Tier-2s   Total
CPU (MSPECint2000s)       25            56            61     142
Disk (PetaBytes)           7            31            19      57
Tape (PetaBytes)          18            35             -      53

WLCG Collaboration
- The Collaboration
  - 4 LHC experiments
  - ~120 computing centres: 12 large centres (Tier-0, Tier-1) and 38 federations of smaller "Tier-2" centres
  - Growing to ~40 countries
- Memorandum of Understanding
  - Agreed in October 2005, now being signed
- Resources
  - Commitments made each October for the coming year
  - 5-year forward look

The new European Network Backbone
- LCG working group with the Tier-1s and national/regional research network organisations
- New GÉANT 2 research network backbone, strongly correlated with the major European LHC centres
- Swiss PoP at CERN

LHC Computing Grid Project – a Collaboration
Building and operating the LHC Grid is a global collaboration between:
- the physicists and computing specialists from the LHC experiments
- the national and regional projects in Europe and the US that have been developing Grid middleware
- the regional and national computing centres that provide resources for LHC
- the research networks
In short: researchers, computer scientists & software engineers, and service providers.

LCG depends on two major science grid infrastructures
The LCG service runs on, and relies on, grid infrastructure provided by:
- EGEE – Enabling Grids for E-sciencE
- OSG – the US Open Science Grid

LCG Service planning
- Pilot services – stable service from 1 June 2006
- 2006: cosmics; LHC service in operation from 1 October 2006, with a ramp-up to full operational capacity & performance over the following six months
- 2007: LHC service commissioned by 1 April 2007, in time for first physics
- 2008: full physics run

Service Challenge 4 (SC4) – the Pilot LHC Service from June 2006
- A stable service on which the experiments can make a full demonstration of their offline chain
  - DAQ → Tier-0 → Tier-1: data recording, calibration, reconstruction
  - Offline analysis – Tier-1 ↔ Tier-2 data exchange: simulation, batch and end-user analysis
- And on which sites can test their operational readiness
  - LCG services – monitoring → reliability
  - Grid services
  - Mass storage services, including magnetic tape
  - Extension to most Tier-2 sites
- Targets for the service by end September
  - Service metrics ≥ 90% of MoU service levels
  - Data distribution from CERN to tape at the Tier-1s at nominal LHC rates

CERN Tier-1 Data Distribution Sustaining ~900 MBytes/sec The data rate required during LHC running for all four experiments is 1.6 GBytes/sec – -- which was demonstrated during a test period in April ATLAS alone has moved 1 PetaByte of data during its data challenge between 19 June and 7 August CMS has moved 3.8 PetaBytes of data between sites during a four months period this week The LHC Computing Grid – October 2006

CMS Data Transfers

Production Grids for LHC – what has been achieved
- Basic middleware: a set of baseline services agreed, with initial versions in production
- Pro-active grid operation, distributed across several sites
- All major LCG sites active: ~50K jobs/day, and >10K simultaneous jobs during prolonged periods on the EGEE grid
- Reliable data distribution service demonstrated at 1.6 GB/sec, CERN → Tier-1s, mass storage to mass storage, i.e. the nominal LHC data rate

Impact of the LHC Computing Grid in Europe
- LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
- EGEE is now a global effort, and the largest Grid infrastructure worldwide
- Co-funded by the European Commission (~130 M€ over 4 years)
- EGEE is already used for >20 applications, including bio-informatics, education and training, and medical imaging

The EGEE Project
- Infrastructure operation
  - Currently includes >200 sites across 40 countries
  - Continuous monitoring of grid services & automated site configuration/management: http://gridportal.hep.ph.ic.ac.uk/rtm/launch_frame.html
- Middleware
  - Production-quality middleware distributed under a business-friendly open-source licence
- User support – a managed process from first contact through to production usage
  - Training, documentation, expertise in grid-enabling applications, online helpdesk, networking events (User Forum, conferences, etc.)
- Interoperability
  - Expanding interoperability with related infrastructures

EGEE Grid Sites: Q1 2006
- Steady growth in sites and CPU over the lifetime of the project
- EGEE: >180 sites in 40 countries, >24,000 processors, ~5 PB storage

Applications on EGEE
More than 20 applications from 7 domains:
- Astrophysics: MAGIC, Planck
- Computational Chemistry
- Earth Sciences: Earth observation, solid Earth physics, hydrology, climate
- Financial Simulation: E-GRID
- Fusion
- Geophysics: EGEODE
- High Energy Physics: the 4 LHC experiments (ALICE, ATLAS, CMS, LHCb); BaBar, CDF, DØ, ZEUS
- Life Sciences: bioinformatics (drug discovery, GPS@, Xmipp_MLrefine, etc.); medical imaging (GATE, CDSS, gPTM3D, SiMRI 3D, etc.)
- Multimedia
- Material Sciences
- …

Example: EGEE Attacks Avian Flu
- EGEE was used to analyse 300,000 potential drug compounds against the bird flu virus H5N1
- 2,000 computers at 60 computer centres in Europe, Russia, Taiwan and Israel ran for four weeks in April – the equivalent of 100 years on a single computer
- Potential drug compounds are now being identified and ranked
(Image: neuraminidase, one of the two major surface proteins of influenza viruses, which facilitates the release of virions from infected cells. Image courtesy of Ying-Ta Wu, Academia Sinica.)

Example: Geocluster industrial application
- The first industrial application successfully running on EGEE
- Developed by the Compagnie Générale de Géophysique (CGG) in France, which performs geophysical simulations for the oil, gas, mining and environmental industries
- EGEE technology helps CGG federate its computing resources around the globe

Evolution towards European e-Infrastructure Coordination
[Timeline diagram: EDG → EGEE → EGEE-II → EGEE-III, evolving from testbeds through routine usage towards a utility service]
