WLCG and the India-CERN Collaboration
David Collados, CERN Information Technology, 27 February 2014


Overview
– The WLCG Project
– India-CERN Collaboration

The ATLAS experiment: 7,000 tons, with 150 million sensors generating data 40 million times per second, producing 1 petabyte per second.
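As a rough sanity check (not part of the talk), the quoted petabyte per second is consistent with on the order of one bit per sensor per bunch crossing; the per-channel payload assumed below is an illustrative guess, not a figure from the slide.

```python
# Back-of-envelope check of the ATLAS raw data rate quoted above.
# Assumption (not from the slide): ~1 bit per sensor per bunch crossing.
sensors = 150e6          # readout channels
crossings_per_s = 40e6   # LHC bunch-crossing rate, 40 MHz
bits_per_sample = 1      # assumed order of magnitude

bytes_per_s = sensors * crossings_per_s * bits_per_sample / 8
print(f"~{bytes_per_s / 1e15:.1f} PB/s")  # ~0.8 PB/s, i.e. ~1 PB/s
```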

A collision at the LHC
[Figure: an LHC beam collision recorded by the ATLAS experiment]

The Data Acquisition
Data from ATLAS:
– 1 PB/s from all sub-detectors
– 1 GB/s of raw data sent to the Data Centre
– a reduction factor of 1 million
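The reduction factor follows directly from the two rates on the slide; a quick worked check (illustrative only), which also shows the daily volume implied by 1 GB/s:

```python
# Worked numbers for the slide: trigger reduction and resulting daily volume.
raw = 1e15        # 1 PB/s off the detector
to_tape = 1e9     # 1 GB/s shipped to the Data Centre
print(f"reduction factor: {raw / to_tape:.0e}")           # 1e+06
print(f"daily volume: {to_tape * 86400 / 1e12:.0f} TB")   # ~86 TB/day
```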

CERN Computer Centre (Tier-0): acquisition, first-pass reconstruction, storage & distribution.
Data rates from the experiments to Tier-0 (nominal, with 2012 peaks):
– ALICE: 1.25 GB/s (ions); 2012: ~4 GB/s
– ATLAS: ~320 MB/s; 2012: ~1 GB/s
– CMS: ~220 MB/s; 2012: ~300 MB/s
– LHCb: ~50 MB/s
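For a sense of scale, the nominal rates above can be combined into a daily ingest volume. The sketch below is purely illustrative: in practice the ALICE heavy-ion rate does not run concurrently with the proton-proton rates, so a straight sum overstates any single running period.

```python
# Illustrative combination of the nominal experiment-to-Tier-0 rates above.
rates_mb_s = {"ALICE": 1250, "ATLAS": 320, "CMS": 220, "LHCb": 50}

total_mb_s = sum(rates_mb_s.values())
daily_tb = total_mb_s * 86400 / 1e6   # MB/day -> TB/day
print(f"combined: {total_mb_s} MB/s ≈ {daily_tb:.0f} TB/day")
```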

WLCG: Worldwide LHC Computing Grid
– Typical simultaneous jobs: more than 250,000
– Global transfer rate: billions of bytes per second

WLCG: Worldwide LHC Computing Grid
– Tier 0 (CERN): data recording, initial data reconstruction, data distribution
– Tier 1 (12 centres): permanent storage, re-processing, analysis; connected by 10 Gbit/s links
– Tier 2 (~140 centres): simulation, end-user analysis
Overall: ~150 sites in 39 countries, 250,000 cores, 173 PB of storage, 2 million jobs/day.
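A minimal sketch (not WLCG software) of the tiered model described above, mapping each tier to the roles listed on the slide:

```python
# The WLCG tier model as a simple data structure; roles taken from the slide.
tiers = {
    "Tier-0 (CERN)": ["data recording", "initial reconstruction", "distribution"],
    "Tier-1 (12 centres)": ["permanent storage", "re-processing", "analysis"],
    "Tier-2 (~140 centres)": ["simulation", "end-user analysis"],
}

for tier, roles in tiers.items():
    print(f"{tier}: {', '.join(roles)}")
```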

WLCG sites in India
2 WLCG Tier-2 sites in India:
– VECC (Kolkata)
– TIFR (Mumbai)
Used by the ALICE & CMS experiments.

India-CERN Collaboration
– Cooperation agreement signed in 1991: research & development for accelerators, detectors, computing, and high-energy physics
– Around 30 addenda have been completed, in recent years on LHC and NAT projects (CTF3, CLIC, Linac4, computing) with RRCAT & BARC
– Solid-state modulator for Linac4
– Vacuum chamber and magnet development, CTF3 controls software development

Collaboration in Computing
– Protocol P060/LHC, Computing and Grid Technology for LHC
– 76 FTEs involved since 2002 (26 since 2008)
– Wide range of computing areas:
  – Fabric management
  – Databases
  – Grid monitoring
  – Visualization
  – Reporting
  – Cloud infrastructure management

Computing areas (last 2 years)
Visualisation of grid monitoring data:
– Status, availability, and reliability of grid services and sites
– Tools for WLCG management
Cloud computing:
– Graphical management portal for OpenNebula and OpenStack backends
– Identifying CERN and WLCG quota management requirements
– Integration of the WLCG accounting schema
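To make the quota-management work concrete, here is a hypothetical sketch of reading a project's quotas from OpenStack's Compute (Nova) API, the kind of data such a portal would aggregate. The endpoint is the standard Nova os-quota-sets call; NOVA_URL, TOKEN, and PROJECT_ID are placeholders, not values from the talk.

```python
# Hypothetical sketch: fetch a project's compute quotas from the Nova API.
import requests

NOVA_URL = "https://nova.example.org/v2.1"  # assumed endpoint (placeholder)
TOKEN = "gAAAA..."                          # Keystone auth token (placeholder)
PROJECT_ID = "my-project-id"                # placeholder

resp = requests.get(
    f"{NOVA_URL}/os-quota-sets/{PROJECT_ID}",
    headers={"X-Auth-Token": TOKEN},
    timeout=10,
)
resp.raise_for_status()

quota = resp.json()["quota_set"]
print(f"cores: {quota['cores']}, instances: {quota['instances']}, "
      f"ram: {quota['ram']} MB")
```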

Computing (starting now)
Projects between BARC and CERN, 2 FTEs each:
– Centralised multi-level user quota management system for OpenStack
– OpenStack enhancements for physics
– GeantV optimisations

Conclusions
– Major Indian participation in the LHC
– Very valuable collaboration in areas of common interest
– We want to continue this in the future