Developing Countries' Access to Scientific Knowledge, Ian Willers, CERN, Switzerland


1 Developing Countries' Access to Scientific Knowledge Ian Willers CERN, Switzerland

2 Structure of talk
- Brief introduction to CERN
- CERN's relationship with different countries
- The experiments
- The computing challenge
- Special example of Pakistan and the CMS experiment
- Conclusion

3 Twenty Member States of CERN
OBSERVERS: UNESCO, EU, Israel, Turkey
SPECIAL OBSERVERS (for LHC): USA, Japan, Russia

4 International Collaboration for LHC construction
Gross NMS contributions: US: 200 M$, Russia: 100 MCHF, Japan: 170 MCHF, Canada: 30 MCHF, India: 25 M$
Cost sharing for LHC (BCHF):
- MS, Material: 2.1
- MS, Personnel: 1.1 (approx.)
- Host States: 0.2
- NMS (net): 0.6 (≈15%)
- Total: 4.0

5 Aerial view

6 From LEP to the LHC

7 LHC Experiments
- ATLAS, CMS: Higgs boson(s), SUSY particles, ...??
- ALICE: Quark Gluon Plasma
- LHC-B: CP violation in B

8 CMS Magnet Yoke

9 Some examples The LHC dipole n. 360 from Novosibirsk CMS feet from Pakistan LHC corrector magnet from India

10 Access to CERN
It may be tempting to make "access to large facilities" dependent on "membership", but particle physicists have been able to follow a different approach.
- Experiments running on our facilities tend to be based on very large ( person) collaborations
- This allows people from economically weaker countries to join with those from stronger regions
- So we tend not to look at the passport of the people making proposals
- But (in general) we expect people who have not funded the lab infrastructure to contribute more than their "fair share" to the cost of the experiment
- The contribution can take many forms, such as assembly effort, software, ... look for the "win-win"

11 The LHC Computing Challenge
- New levels of data acquisition
- New levels of event complexity
- Enormous quantities of data
- Access worldwide

12 Tier 0 at CERN
[Chart: estimated CPU capacity required at CERN, in K SI95; Jan 2000: 3.5K SI95; LHC requirements shown against other experiments]
Moore's law gives some measure of the capacity that technology advances provide for a constant number of processors or constant investment.
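The capacity projection implied by the slide can be sketched numerically. This is a minimal illustration, not a CERN planning figure: it assumes an 18-month doubling period (a common reading of Moore's law) and starts from the 3.5K SI95 installed in January 2000; the function name and the chosen horizon years are the author's own for illustration.

```python
def projected_capacity(start_si95: float, years: float,
                       doubling_months: float = 18.0) -> float:
    """Capacity after `years`, if it doubles every `doubling_months` months.

    Assumes constant investment, with all growth coming from
    technology advances (the Moore's-law reading on the slide).
    """
    doublings = years * 12.0 / doubling_months
    return start_si95 * 2.0 ** doublings

if __name__ == "__main__":
    # Starting point from the slide: ~3.5K SI95 in Jan 2000.
    for year in (0, 3, 5, 7):
        print(f"Jan {2000 + year}: ~{projected_capacity(3500, year):,.0f} SI95")
```

Under these assumptions the same budget buys roughly four times the capacity every three years, which is why the LHC curve on the chart can rise steeply without a proportional rise in cost.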

13 Five Emerging Models of Networked Computing (from The Grid)
- Distributed Computing – synchronous processing
- High-Throughput Computing – asynchronous processing
- On-Demand Computing – dynamic resources
- Data-Intensive Computing – databases
- Collaborative Computing – scientists

14 CERN's Network in the World
267 institutes in Europe, 4603 users; 208 institutes elsewhere, 1632 users (some points = several institutes)

15 Monitoring tools

16 LHC Computing Grid prototype service
Tier 0: CERN
Tier 1 Centres: Brookhaven National Lab, CNAF Bologna, Fermilab, FZK Karlsruhe, IN2P3 Lyon, Rutherford Appleton Lab (UK), University of Tokyo, CERN
Other Centres: Academia Sinica (Taipei), Barcelona, Caltech, GSI Darmstadt, Italian Tier 2s (Torino, Milano, Legnaro), Manno (Switzerland), Moscow State University, NCP, NUST, Pinstech, Islamabad (soon), NIKHEF Amsterdam, Ohio Supercomputing Centre, Sweden (NorduGrid), Tata Institute (India), TRIUMF (Canada), UCSD, UK Tier 2s, University of Florida – Gainesville, University of Prague, ...

17

18 Benefits – see talk by Arshad Ali

19 High Level Involvement
- Scientist works hard, builds up relationship
- Rector visits CERN
- Minister signs agreement
- President gives blessing

20 Success in Particle Physics Collaboration
Some important features:
- The scientific goals are of the highest importance
- The research requires technological advances ... of value to all
- The foundations lie in a network of competent institutes worldwide
- The facilities are open to everyone, but the results must be published

21 Summary
Coming together is a beginning; keeping together is progress; working together is success.
CERN demonstrates that successful worldwide international collaboration is possible.
We intend to keep it that way.