Presentation on theme: "Tour of CERN Computer Center and the Grid at CERN" (CERN Information Technologies Department). Presentation transcript:
CERN IT Department1 / 17 Tour of CERN Computer Center and the Grid at CERN Information Technologies Department Tour of CERN Computer Center and the Grid at CERN Welcome!
Slide 2 / 17: Computing at CERN (IT Department)
1. General purpose computing environment
2. Administrative computing services
3. Physics and engineering computing
4. Consolidation, coordination and standardization of computing activities
5. Physics applications (e.g., for data acquisition/offline analysis)
6. Accelerator design and operations
Slide 3 / 17: LHC Data Every Year
- 40 million collisions per second
- After filtering, 100 collisions of interest per second
- > 1 Megabyte of data digitised per collision; recording rate > 1 Gigabyte/sec
- ~10^10 collisions recorded each year; stored data > 15 Petabytes/year
- (The four experiments: ALICE, ATLAS, CMS, LHCb)
Scale of units:
- 1 Megabyte (1 MB): a digital photo
- 1 Gigabyte (1 GB) = 1000 MB; 5 GB = a DVD movie
- 1 Terabyte (1 TB) = 1000 GB: world annual book production
- 1 Petabyte (1 PB) = 1000 TB: annual data production of one LHC experiment
- 1 Exabyte (1 EB) = 1000 PB; 3 EB = world annual information production
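As a back-of-the-envelope check, the annual volume follows directly from the slide's rounded figures (these are the slide's approximate numbers, not official values):

```python
# Rough consistency check of the annual data volume quoted on this slide.
# All figures are the slide's rounded values, not official numbers.

MB = 1e6      # bytes, decimal units as used on the slide
PB = 1e15

events_per_year = 1e10        # ~10^10 collisions recorded each year
bytes_per_event = 1 * MB      # > 1 MB digitised per collision

stored = events_per_year * bytes_per_event
print(f"~{stored / PB:.0f} PB/year")  # ~10 PB/year, same order as the >15 PB quoted
```

With both inputs being lower bounds ("> 1 MB", "~10^10"), landing on the same order of magnitude as the quoted > 15 PB/year is what one would expect.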
Slide 4 / 17: LHC Data Every Year (continued)
- LHC data correspond to about 20 million CDs each year
- A CD stack holding one year of LHC data would be ~20 km tall!
- For scale: balloon altitude (30 km), Concorde cruising altitude (15 km), Mont Blanc (4.8 km)
- Where will the experiments store all of these data?
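The CD comparison can be sanity-checked as follows (the ~700 MB capacity and ~1.2 mm disc thickness are my assumed standard CD figures, not from the slide):

```python
# Sanity check of the CD-stack comparison on this slide.
# Assumptions (not from the slide): a CD-ROM holds ~700 MB and is ~1.2 mm thick.

MB = 1e6
PB = 1e15

n_cds = 20e6                  # ~20 million CDs per year (from the slide)
cd_capacity = 700 * MB        # assumed CD-ROM capacity
cd_thickness_mm = 1.2         # assumed thickness of one disc

total = n_cds * cd_capacity
stack_km = n_cds * cd_thickness_mm / 1e6   # mm -> km
print(f"capacity ~{total / PB:.0f} PB, stack ~{stack_km:.0f} km")
```

This gives roughly 14 PB and a stack in the low tens of kilometres, matching the > 15 PB/year and ~20 km figures quoted above.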
Slide 5 / 17: LHC Data Processing
- LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors
- Where will the experiments find such computing power?
Slide 6 / 17: Computing Power Available at CERN
- High-throughput computing based on reliable "commodity" technology
- More than 8,500 CPUs in about 3,500 boxes (Linux)
- 4 Petabytes on 14,000 drives (NAS disk storage)
- 10 Petabytes on 45,000 tape slots with 170 high-speed drives
- Nowhere near enough!
Slide 7 / 17: Computing for LHC
- Problem: even with the Computer Centre upgrade, CERN can provide only a fraction of the necessary resources
- Europe: 267 institutes, 4,603 users
- Elsewhere: 208 institutes, 1,632 users
- Solution: computing centres, which were isolated in the past, will be connected, uniting the computing resources of particle physicists worldwide
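Combining the figures from the two previous slides gives a rough sense of that fraction (both inputs are the slides' rounded numbers):

```python
# Rough estimate of the share CERN alone could cover,
# using the rounded figures from slides 5 and 6.

cern_cpus = 8500          # CPUs at CERN (slide 6)
cpus_needed = 100000      # PC processors needed for LHC analysis (slide 5)

fraction = cern_cpus / cpus_needed
print(f"CERN alone covers ~{fraction * 100:.1f}% of the required computing power")
```

Under ten percent from CERN alone is exactly why the remaining capacity has to come from federated centres worldwide.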
Slide 8 / 17: What is the Grid?
- The World Wide Web provides seamless access to information that is stored in many millions of different geographical locations
- In contrast, the Grid is an emerging infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe
Slide 9 / 17: One Web but Many Grids
- US projects: NASA Information Power Grid, DOE Science Grid, NSF National Virtual Observatory, NSF GriPhyN, DOE Particle Physics Data Grid, NSF TeraGrid, DOE ASCI Grid, DOE Earth Systems Grid, DARPA CoABS Grid, NEESGrid, DOH BIRN, NSF iVDGL
- European projects: DataGrid (CERN, ...), EuroGrid (Unicore), DataTag (CERN, ...), Astrophysical Virtual Observatory, GRIP (Globus/Unicore), GRIA (industrial applications), GridLab (Cactus Toolkit), CrossGrid (infrastructure components), EGSO (solar physics)
- National initiatives: UK e-Science Grid; Netherlands: VLAM, PolderGrid; Germany: UNICORE, Grid proposal; France: Grid funding approved; Italy: INFN Grid; Ireland: Grid proposals; Switzerland: network/Grid proposal; Hungary: DemoGrid, Grid proposal; Norway and Sweden: NorduGrid
- Grid development has been initiated by the academic, scientific and research community, but industry is also interested.
Slide 10 / 17: What Are the Principles Behind the Grid? Five Big Ideas
1. Sharing resources on a global scale: the main issues are trust, differing management policies, virtual organisations, and 24-hour access and support.
2. Security: the main issues are well-defined yet flexible rules, authentication, authorisation, compatibility and standards.
3. Balancing the load: this is more than just cycle scavenging; middleware is needed to monitor and broker resources.
4. The death of distance: networks delivered 56 Kb/s ten years ago; now we have 155 Mb/s, and for the LHC we anticipate 10 Gb/s.
5. Open standards: Grid standards are converging, and include Web services, the Globus Toolkit, and various protocols.
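The bandwidth figures in point 4 imply striking growth factors (a quick sketch using the slide's numbers, decimal prefixes assumed):

```python
# Growth factors between the network bandwidths quoted on this slide
# (decimal prefixes assumed: Kb = 1e3 bits, Mb = 1e6, Gb = 1e9).

kb, mb, gb = 1e3, 1e6, 1e9

then_bps = 56 * kb        # ten years ago
now_bps = 155 * mb        # today
lhc_bps = 10 * gb         # anticipated for the LHC

print(f"56 Kb/s -> 155 Mb/s: ~{now_bps / then_bps:.0f}x")
print(f"155 Mb/s -> 10 Gb/s: ~{lhc_bps / now_bps:.0f}x")
```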
Slide 11 / 17: Grid Applications for Science
- Medical/healthcare: imaging, diagnosis and treatment
- Bioinformatics: study of the human genome and proteome to understand genetic diseases
- Nanotechnology: design of new materials from the molecular scale
- Engineering: design optimization, simulation, failure analysis, and remote instrument access and control
- Natural resources and the environment: weather forecasting, earth observation, modeling and prediction of complex systems
Slide 12 / 17:
- CERN projects: LHC Computing Grid (LCG)
- EU-funded projects led by CERN: Enabling Grids for E-SciencE (EGEE), and others
- Industry-funded projects: CERN openlab for DataGrid applications
Slide 13 / 17: LCG: LHC Computing Grid
Timeline:
- 2002: project started
- 2003: service opened (LCG-1 started in September with 12 sites)
- 2004: LCG-2 released; the LCG environment deployed
- 2006-2008: build and operate the LCG service
As of April 2007, the biggest Grid project in the world:
- 177 sites in more than 30 countries
- 30,000 processors
- 14 million Gigabytes of storage
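For comparison with the earlier slides, the storage figure converts as follows (decimal prefixes assumed, as elsewhere in the deck):

```python
# The storage figure from this slide expressed in larger units
# (decimal prefixes, consistent with the unit table on slide 3).

GB, PB = 1e9, 1e15

storage = 14e6 * GB            # 14 million Gigabytes (from the slide)
print(f"~{storage / PB:.0f} PB")  # ~14 PB, roughly one year of LHC data
```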
Slide 14 / 17: The EGEE Vision
- Access to a production-quality Grid will change the way science, and much else, is done in Europe.
- A geneticist at a conference, inspired by a talk she hears, will be able to launch a complex biomolecular simulation from her mobile phone.
- A team of engineering students will be able to run the latest 3D rendering programs from their laptops using the Grid.
- An international network of scientists will be able to model a new flood of the Danube in real time, using meteorological and geological data from several centres across Europe.
Slide 15 / 17: openlab for DataGrid Applications
Objectives:
- Build an ultra-high-performance computer cluster
- Link it to the DataGrid and test its performance
- Evaluate the potential of future commodity technology for LCG
Slide 16 / 17: Computer Centre Tour
Ground floor:
- openlab: equipment of the future
- CIXP: CERN Internet Exchange Point
Basement:
- Storage silos: > 10 petabytes
- PC farm: > 3,000 PCs aligned
IMPORTANT: FOR YOUR OWN SAFETY, PLEASE DO NOT TOUCH EQUIPMENT OR CABLES DURING THE VISIT.
Slide 17 / 17: To know more about the Grid...