Slide 1: The Grid – The Future of Scientific Computing
Andreas Gellrich, DESY
IT Training, DESY, Hamburg
EGEE-III – Enabling Grids for E-sciencE (www.eu-egee.org); EGEE and gLite are registered trademarks

Slide 2: Collaborating e-Infrastructures
Potential for linking ~80 countries by 2008

Slide 3: The need for Grid in HEP
CERN: the world's largest particle physics laboratory
Particle physics requires special tools to create and study new particles: accelerators and detectors
[Aerial photo: Mont Blanc (4810 m), downtown Geneva]
Large Hadron Collider (LHC):
– One of the most powerful instruments ever built to investigate matter
– 4 experiments: ALICE, ATLAS, CMS, LHCb
– 27 km circumference tunnel
– Due to start up mid 2008

Slide 4: Highlights of EGEE-II
>200 VOs from several scientific domains:
– Astronomy & Astrophysics
– Civil Protection
– Computational Chemistry
– Computational Fluid Dynamics
– Computer Science/Tools
– Condensed Matter Physics
– Earth Sciences
– Fusion
– High Energy Physics
– Life Sciences
Further applications under evaluation
98k jobs/day
Applications have moved from testing to routine, daily usage
~80–90% efficiency

Slide 5: Biomedical applications
Biomedicine is also a pilot application area
More than 20 applications deployed and being ported
Three sub-domains:
– Medical image processing
– Bioinformatics
– Drug discovery
Use the Grid as a platform for collaboration (does not need the same massive processing power or storage as HEP)

Slide 6: Application example – WISDOM
Grid-enabled drug discovery process for neglected diseases:
– In silico docking → compute the probability that potential drugs dock with a target protein
– Goal: speed up and reduce the cost of developing new drugs
WISDOM (World-wide In Silico Docking On Malaria):
– First biomedical data challenge
– 46 million ligands docked in 6 weeks
– 1 TB of data produced
– 1000 computers in 15 countries → equivalent to 80 CPU years
Second data challenge on avian flu in April 2006:
– 300,000 possible drug compounds tested
– 8 different targets
– 2000 computers used for 4 weeks
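A rough cross-check helps put these WISDOM figures in perspective. The sketch below is a back-of-envelope calculation only; treating the "1000 computers" as single-CPU worker nodes and deriving their average duty cycle are simplifications of mine, not numbers from the slide.

```python
# Back-of-envelope check of the WISDOM numbers quoted above (a rough sketch,
# not an official accounting; "1000 computers" is taken as ~1000 single-CPU
# worker nodes, and the duty cycle is derived, not given on the slide).

ligands     = 46_000_000        # ligands docked
wall_weeks  = 6                 # duration of the data challenge
nodes       = 1000              # worker nodes used
cpu_years   = 80                # CPU time quoted on the slide

cpu_hours   = cpu_years * 365 * 24
per_docking = cpu_hours * 3600 / ligands          # CPU seconds per docking

available_years = nodes * wall_weeks * 7 / 365    # node-years offered
duty_cycle      = cpu_years / available_years     # fraction actually used

print(f"~{per_docking:.0f} s of CPU per ligand docking")
print(f"~{duty_cycle:.0%} average utilisation of the 1000 nodes over 6 weeks")
```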

Slide 7: Astroparticle physics – MAGIC
Major Atmospheric Gamma Imaging Cherenkov telescope (MAGIC):
– Origin of VHE γ-rays (30 GeV to TeV energies)
  → Active Galactic Nuclei (AGN)
  → Supernova remnants
  → Unidentified EGRET sources
  → Gamma-ray bursts
– Huge hadronic background → MC simulations
  → To simulate the background of one night, 70 CPUs (P4, 2 GHz) need to run for days
– Observation data are big too!
MAGIC Grid:
– Uses three national Grid centres as backbone
– All are members of EGEE
– Work to build a second telescope is currently in progress
→ Towards a virtual observatory for VHE γ-rays
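Background Monte Carlo of this kind parallelises naturally on a grid: one night of background is cut into many independent jobs that differ only in their random seed. The sketch below illustrates that splitting step generically; the class, function and job parameters are hypothetical and not taken from the actual MAGIC production system.

```python
# Minimal sketch of splitting one night of background MC into independent
# grid jobs (hypothetical names and parameters, not the real MAGIC production).
from dataclasses import dataclass

@dataclass
class MCJob:
    job_id: int
    seed: int         # independent random seed per job
    n_showers: int    # hadronic showers to generate in this job

def split_night(total_showers: int, n_jobs: int, base_seed: int = 1) -> list[MCJob]:
    """Divide the night's shower statistics into n_jobs independent jobs."""
    per_job = total_showers // n_jobs
    return [MCJob(job_id=i, seed=base_seed + i, n_showers=per_job)
            for i in range(n_jobs)]

if __name__ == "__main__":
    jobs = split_night(total_showers=10_000_000, n_jobs=70)
    for job in jobs[:3]:
        # Each entry would become one grid job, e.g. an air-shower simulation
        # run with this seed.
        print(job)
```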

Slide 8: Astroparticle physics – PLANCK
PLANCK satellite mission:
– Measure the cosmic microwave background (CMB)
  → At even higher resolution than previous missions
– Launch in 2008; duration >1 year
Application:
– N simulations of the whole Planck/LFI mission
  → Different cosmological and instrumental parameters
– Full sky map for frequencies from 30 up to 850 GHz (two complete sky surveys)
– 22 channels for LFI, 48 for HFI
→ >12 times faster on the Grid, but ~5% job failure rate
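A failure rate of a few percent is typical for large grid productions and is normally absorbed by automatic resubmission of the failed jobs. The toy sketch below models that bookkeeping; it is purely illustrative and not the actual Planck/LFI pipeline.

```python
# Toy model of absorbing a ~5% per-attempt failure rate by resubmitting
# failed jobs until the whole simulation set is complete (illustrative only).
import random

def run_production(n_jobs: int, failure_rate: float = 0.05, seed: int = 42) -> int:
    """Return the total number of job submissions needed to finish n_jobs."""
    rng = random.Random(seed)
    pending = list(range(n_jobs))
    submissions = 0
    while pending:
        submissions += len(pending)
        # Keep only the jobs that failed this round; they get resubmitted.
        pending = [j for j in pending if rng.random() < failure_rate]
    return submissions

if __name__ == "__main__":
    total = run_production(n_jobs=1000)
    print(f"1000 jobs finished after {total} submissions "
          f"(~{total / 1000 - 1:.1%} overhead)")
```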

Slide 9: Computational Chemistry
GEMS (Grid Enabled Molecular Simulator) application:
– Calculation and fitting of electronic energies of atomic and molecular aggregates (using high-level ab initio methods)
– Use of statistical kinetics and dynamics to study chemical processes
Virtual monitors:
– Angular distributions
– Vibrational distributions
– Rotational distributions
– Many-body systems
End-user applications:
– Nanotubes
– Life sciences
– Statistical thermodynamics
– Molecular virtual reality

Slide 10: Fusion
Large nuclear fusion installations:
– E.g. the International Thermonuclear Experimental Reactor (ITER)
– Distributed data storage and handling needed
– Computing power needed for:
  → Making decisions in real time
  → Solving kinetic transport (particle orbits)
  → Stellarator optimisation (magnetic field to contain the plasma)

Slide 11: Earth Science Applications
Community:
– Many small groups that aggregate for projects (and separate afterwards)
The Earth:
– Complex system
– Independent domains with interfaces: solid Earth – ocean – atmosphere
– Physics, chemistry and/or biology
Applications:
– Earth observation by satellite
– Seismology
– Hydrology
– Climate
– Geosciences
– Pollution
– Meteorology, space weather
– Mars atmosphere
– Database collection

Slide 12: Earthquake analysis
Seismic software application determines epicentre, magnitude and mechanism
→ May make it possible to predict future earthquakes
→ Assess potential impact on specific regions
Analysis of the Indonesian earthquake (28 March 2005):
– Data from the French seismic sensor network GEOSCOPE transmitted to IPGP within 12 hours after the earthquake
– Solution found within 30 hours after the earthquake occurred
  → 10 times faster on the Grid than on local computers
– Results:
  → Not an aftershock of the December 2004 earthquake
  → Different location (different part of the fault line, further south)
  → Different mechanism
Rapid analysis of earthquakes is important for relief efforts

Slide 13: Industrial applications
EGEODE:
– Industrial application from Compagnie Générale de Géophysique (CGG) running on the EGEE infrastructure
  → Seismic processing platform
  → Based on the industrial application Geocluster© used at CGG
  → Being ported to EGEE for industry and academia
OpenPlast project:
– French R&D programme to develop and deploy a Grid platform for the plastics industry (SMEs)
– Based on experience from EGEE (supported by CS)
– Next: interoperability with other Grids

Slide 14: HEP
High Energy Physics is a pilot application domain for EGEE:
– Large datasets
– Large computing requirements
→ Major need for Grid technology to support distributed communities
Support for the LHC experiments through the LHC Computing Grid (LCG):
– ATLAS, CMS, LHCb, ALICE
Also support for other major international HEP experiments:
– BaBar (US)
– CDF (US)
– DØ (US)
– H1 and ZEUS (Germany)

Slide 15: LCG Tier Model
[Tier-model diagram; labels as on the slide:]
– Online system; offline farm ~20 TIPS; Tier 0: CERN Computer Centre >20 TIPS
– Tier 1: GridKa, US, French and Italian regional centres
– Tier 2: Tier-2 centres of ~1 TIPS each (e.g. DESY ~1 TIPS)
– Tier 3: institutes ~0.25 TIPS; workstations
– Link labels: ~PBytes/sec, ~100 MBytes/sec, ~Gbits/sec or air freight, ~Gbits/sec, Mbits/sec
Notes on the slide:
– One bunch crossing per 25 ns; 100 triggers per second; each event is ~1 MByte
– Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels; data for these channels should be cached by the institute server (physics data cache)
– 1 TIPS = 25,000 SpecInt95; a PC (1999) = ~15 SpecInt95

Slide 16: LHC Data
– 40 million collisions per second
– After filtering, 100 collisions of interest per second
– A megabyte of data for each collision = recording rate of 0.1 Gigabytes/sec
– 10¹⁰ collisions recorded each year
→ When the LHC starts operation it will generate ~15 Petabytes/year of data*
*corresponding to more than 20 million CDs!
[Figure: a CD stack holding 1 year of LHC data (~20 km) compared with a balloon (30 km), Concorde (15 km) and Mt. Blanc (4.8 km)]
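These figures can be reproduced with a short back-of-envelope calculation. The sketch below uses only the numbers quoted on the slide; the closing remark that the ~15 PB/year also covers derived and simulated data is my own gloss, not text from the slide.

```python
# Back-of-envelope reproduction of the LHC data numbers above (a sketch,
# not an official accounting).

collision_rate  = 40e6     # collisions delivered per second
trigger_rate    = 100      # collisions of interest kept per second
event_size_mb   = 1.0      # ~1 MByte recorded per collision
events_per_year = 1e10     # collisions recorded each year (slide figure)

rejection       = collision_rate / trigger_rate          # trigger rejection factor
recording_gb_s  = trigger_rate * event_size_mb / 1000    # recording rate in GB/s
raw_pb_per_year = events_per_year * event_size_mb / 1e9  # raw data volume in PB

print(f"trigger keeps 1 in {rejection:.0e} collisions")
print(f"recording rate: {recording_gb_s:.1f} GB/s")
print(f"raw data: ~{raw_pb_per_year:.0f} PB/year "
      "(the ~15 PB/year quoted also covers derived and simulated data)")
```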

Slide 17: LHC Computing Grid
Aim: to develop, build and maintain a distributed computing environment for the storage and analysis of data from the four LHC experiments
→ Ensure the computing service
→ … and common application libraries and tools
"Tier" infrastructure with Tier-0 at CERN, 11 Tier-1 centres and more than 100 Tier-2 and Tier-3 centres
– Phase I: development & planning
– Phase II: deployment & commissioning of the initial services
→ LCG is not a development project – it relies on EGEE (and other Grid projects) for Grid middleware development, application support, Grid operation and deployment
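From the user's point of view, work enters this infrastructure as jobs described in gLite's Job Description Language (JDL) and handed to the Workload Management System. The sketch below writes a minimal JDL file from Python; the attribute names follow gLite 3.x as I recall them, and the script name, VO and the submission command in the final comment are illustrative assumptions rather than material from the slides.

```python
# Minimal sketch: write a gLite JDL file for a simple job (illustrative only;
# the executable name and VO are hypothetical placeholders).
from textwrap import dedent

def write_jdl(path: str, executable: str, vo: str) -> None:
    jdl = dedent(f"""\
        Executable          = "{executable}";
        StdOutput           = "std.out";
        StdError            = "std.err";
        InputSandbox        = {{"{executable}"}};
        OutputSandbox       = {{"std.out", "std.err"}};
        VirtualOrganisation = "{vo}";
        """)
    with open(path, "w") as f:
        f.write(jdl)

if __name__ == "__main__":
    write_jdl("hello.jdl", executable="hello.sh", vo="desy")
    # Submission would then go through the gLite WMS command-line tools on a
    # User Interface node, e.g. (from memory): glite-wms-job-submit -a hello.jdl
```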

Slide 18: HEP success stories
Fundamental activity in preparation for LHC start-up:
– Physics
– Computing systems
Examples:
– LHCb: ~700 CPU years in 2005 on the EGEE infrastructure
– ATLAS: over 20,000 jobs per day
  → Comprehensive analysis: see S. Campana et al., "Analysis of the ATLAS Rome Production Experience on the EGEE Computing Grid", e-Science 2005, Melbourne, Australia
– A lot of activity in all involved applications (including, as usual, a lot of activity within non-LHC experiments like BaBar, CDF and DØ)
[Plots labelled ATLAS and LHCb]

Slide 19: Monitoring