Birmingham Particle Physics Masterclass, 23rd April 2008
The Grid: What & Why?
Presentation by: Alvin Tan

...a quick recap
The BIG question:
– What actually happened at the Big Bang?
Other questions:
– How does gravity work?
– How do particles have mass?
– Where is the rest of the universe?

15 PB?!
To help answer these questions, the LHC experiments have built truly massive machines that will create about 15 PB (petabytes) of data every year!

[Slide graphic: a year's data pictured as a stack of iPods 5.461 km (3.39 miles) high, far taller than Snowdon.]
That's more than any single system can handle at the moment!

The Solution: What do we need?
The solution must be:
– able to handle massive amounts of data,
– able to process large computing tasks,
– relatively inexpensive,
– simple to use,
– accessible 24/7, and
– easily upgraded.

The Solution: One possibility
What about supercomputers? e.g. Blue Gene/L, Red Storm, BGW, ASC Purple, MareNostrum, etc.
So, can't we just keep building them bigger and more powerful? No:
– Expensive
– Inaccessible
– Easily outdated

The Solution: The Internet
The network that connects the world's computers, allowing them to communicate.

The Solution: Current tools on the Internet
e.g.
– World Wide Web
– File-sharing networks
– BOINC
The Web was specifically designed by scientists at CERN to help with their work.

The Solution: A new tool for the Internet
Building a new tool:
– Computers in the various institutions are already connected
– They already share files
How about sharing everything?

The Computing Grid and the Electric Grid
– Always on
– Use as much (or as little) as you need, on demand
– You do not need to know about the source
– Just plug in and go

A little history...
Distributed computing has been around for some time, but:
– the use of different sites has to be negotiated by individual scientists,
– separate access accounts are needed for each system, and
– computing tasks have to be submitted and results retrieved by hand from individual systems.
Distributed computing requires a significant amount of co-ordination work before any useful work can be done.

Today...
Grid middleware lets you submit tasks to the Grid without having to know where the data is located or where your tasks will run. You can now submit a task and retrieve the results from the same place!
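To make the middleware idea concrete, here is a toy Python sketch of the brokering step, written for this transcript rather than taken from any real Grid software: every site name, dataset and number below is invented, and real middleware (gLite, and later DIRAC) does this matching with far more information. The point is only that the user describes the job, and the broker decides where it runs.

# Toy illustration of what Grid middleware does behind the scenes:
# a "resource broker" picks a suitable site for each job, so the user
# never needs to know where the data lives or where the job runs.
# Purely illustrative -- all names and numbers are made up.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    free_cpus: int
    datasets: set   # datasets held at this site's storage

@dataclass
class Job:
    owner: str
    dataset: str    # input data the job needs
    cpus: int       # CPUs requested

def broker(job, sites):
    """Pick a site that holds the job's input data and has enough free CPUs."""
    candidates = [s for s in sites
                  if job.dataset in s.datasets and s.free_cpus >= job.cpus]
    if not candidates:
        raise RuntimeError("No site holds the data with enough free CPUs")
    best = max(candidates, key=lambda s: s.free_cpus)  # simplest policy: most free CPUs
    best.free_cpus -= job.cpus
    return best.name

sites = [
    Site("Birmingham", 120, {"atlas-sim-2008"}),
    Site("RAL",        800, {"atlas-sim-2008", "lhcb-raw"}),
    Site("Glasgow",     40, {"lhcb-raw"}),
]

job = Job("student", "atlas-sim-2008", 50)
print("Job sent to:", broker(job, sites))  # the user just submits; the broker decides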

GridPP
GridPP is a collaboration of particle physicists and computing scientists from 20 UK universities and CERN, building the UK arm of the Grid.
People and hardware:
– computing power equivalent to more than 10,000 desktop computers
– 280 TB of storage

GridPP
University of Birmingham; University of Bristol; Brunel University; University of Cambridge; University of Durham; University of Edinburgh; University of Glasgow; Imperial College London; University of Lancaster; University of Liverpool; University of Manchester; University of Oxford; Queen Mary, University of London; Royal Holloway, University of London; Rutherford Appleton Laboratory; University of Sheffield; University of Sussex; Swansea University; University College London; University of Warwick

We are members of two Grid projects, including the LHC Computing Grid.

Beyond Particle Physics
WISDOM challenges:
– Avian Flu: 100 years' work done in 4 weeks
– Malaria: 50% of the computing power provided by GridPP
Inferno Grid:
– Humanities project at Montclair University, New Jersey; texts currently available on the system include Aristotle, Galen, Plato, Marcus Aurelius, and Commentaries.

Conclusion
Large physics problems → large detector machines → unprecedented data and computing requirements.
The worldwide Grid is needed to meet these new challenges.
In the UK, GridPP is heavily involved in providing the manpower and hardware to make this all possible.
Grid computing is heavily used by the particle physics community, and many other disciplines recognise its potential as well.