Slide 1: The Grid for Particle Physic(ist)s – What is it and how do I use it?
Steve Lloyd, Queen Mary, University of London
IoP Dublin, March 2005

Slide 2: LHC Data Challenge
Starting from this event... we are looking for this signature.
Selectivity: ~1 in 10^13 – like looking for 1 person in a thousand world populations, or for a needle in 20 million haystacks!
~100,000,000 electronic channels; 800,000,000 proton-proton interactions per second, of which only a tiny fraction produce a Higgs.
10 PBytes of data a year (10 Million GBytes = 14 Million CDs).
Computing solution: the Grid.
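The slide's data-volume figures can be checked on the back of an envelope. A minimal sketch, using Python purely as a calculator; the ~700 MB CD capacity and the mid-2000s world population of ~6.4 billion are assumptions, not from the slide:

```python
# Back-of-envelope check of the numbers on this slide.
PB = 10**15          # bytes in a petabyte (decimal convention)
GB = 10**9           # bytes in a gigabyte
CD = 700 * 10**6     # assumed capacity of one CD (~700 MB)

data_per_year = 10 * PB
print(data_per_year // GB)               # 10,000,000 GB = "10 Million GBytes"
print(round(data_per_year / CD / 1e6))   # ~14 million CDs

world_population = 6.4e9                 # assumption: mid-2000s estimate
# A thousand world populations ~ 6.4e12 people,
# i.e. a selectivity of roughly 1 in 10^13:
print(f"{1000 * world_population:.1e}")
```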

Slide 3: The Grid
Ian Foster / Carl Kesselman: "A computational Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive and inexpensive access to high-end computational capabilities."
'Grid' means different things to different people – but all agree it's a funding opportunity!

Slide 4: Electricity Grid
Analogy with the electricity power grid: power stations, a distribution infrastructure, and a 'standard interface' for consumers.

Slide 5: Computing Grid
The computing analogue: computing and data centres, connected by the fibre optics of the Internet.

Slide 6: Middleware
On a single PC, your programs (Word/Excel, the Web, games, your own program) sit on an operating system, which drives the hardware (CPU, disks etc.).
On the Grid, your program sits on middleware, which drives the distributed resources: CPU clusters and disk servers, reached from a User Interface machine via a Resource Broker, an Information Service, a Replica Catalogue and a Bookkeeping Service.
Middleware is the operating system of a distributed computing system.
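The Resource Broker's role described above can be caricatured in a few lines: match a job's requirement against what each site advertises to the Information Service, then pick a suitable site. A toy sketch only – the site names, numbers and the `broker` function are all made up for illustration, not part of any real middleware API:

```python
# Toy model of a Resource Broker consulting an Information Service:
# each site advertises free CPUs and installed software tags.
sites = [
    {"name": "siteA", "free_cpus": 12, "software": {"VO-atlas-release-9.0.4"}},
    {"name": "siteB", "free_cpus": 0,  "software": {"VO-atlas-release-9.0.4"}},
    {"name": "siteC", "free_cpus": 40, "software": {"VO-cms-release-x"}},
]

def broker(job_requirement, sites):
    """Return the matching site with the most free CPUs, or None."""
    candidates = [s for s in sites
                  if job_requirement in s["software"] and s["free_cpus"] > 0]
    return max(candidates, key=lambda s: s["free_cpus"], default=None)

print(broker("VO-atlas-release-9.0.4", sites)["name"])  # → siteA
```

Real brokering matches the JDL `Requirements` expression against site "GlueHost" attributes, but the shape of the decision is the same.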

Slide 7: GridPP
19 UK Universities, CCLRC (RAL & Daresbury) and CERN, funded by the Particle Physics and Astronomy Research Council (PPARC).
GridPP1: £17m, "From Web to Grid". GridPP2: £16m, "From Prototype to Production".
Not planning GridPP3 – the aim is to incorporate Grid activities and facilities into the baseline programme.

Slide 8: International Collaboration
EU DataGrid (EDG) – middleware development project
US and other Grid projects – interoperability
LHC Computing Grid (LCG) – Grid deployment project for the LHC
EU Enabling Grids for e-Science (EGEE) – Grid deployment project for all disciplines

Slide 9: GridPP Support
Manpower for experiments.
Manpower for middleware development: metadata, storage, workload management, security, information and monitoring, and networking (networking not directly supported, but using LCG).
Hardware and manpower at RAL (LHC Tier-1, BaBar Tier-A).
Manpower for system support at institutes (Tier-2s).
Manpower for LCG at CERN (under discussion).

Slide 10: Paradigm Shift?
LHCb Monte Carlo production, non-Grid : Grid share by month:
May: 89% : 11%
Jun: 80% : 20%
Jul: 77% : 23%
Aug: 27% : 73%

Slide 11: Tier Structure
Tier 0: CERN computer centre (online system and offline farm)
Tier 1: national centres (RAL in the UK; France, Italy, Germany, USA)
Tier 2: regional groups (ScotGrid, NorthGrid, SouthGrid, London)
Tier 3: institutes (e.g. Glasgow, Edinburgh, Durham within ScotGrid)
Tier 4: workstations
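The tier structure above is a simple tree, which can be sketched as nested data. Illustrative only – just the example sites named on the slide, with the UK branch expanded:

```python
# The tier hierarchy as a nested dict (subset from the slide).
grid = {
    "Tier 0 (CERN computer centre)": {
        "Tier 1 (RAL, UK)": {
            "Tier 2 (ScotGrid)": {
                "Tier 3 (Glasgow)": {},
                "Tier 3 (Edinburgh)": {},
                "Tier 3 (Durham)": {},
            },
            "Tier 2 (NorthGrid)": {},
            "Tier 2 (SouthGrid)": {},
            "Tier 2 (London)": {},
        },
    },
}

def show(node, depth=0):
    """Return the tree as a list of indented lines."""
    lines = []
    for name, children in node.items():
        lines.append("  " * depth + name)
        lines.extend(show(children, depth + 1))
    return lines

print("\n".join(show(grid)))
```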

Slide 12: Resource Discovery at Tier-1
[Charts comparing the Tier-1 farm load pre-Grid (October 2000) with the Grid load in July 2004.] Full again in 8 hours!

Slide 13: UK Tier-2 Centres
ScotGrid: Durham, Edinburgh, Glasgow
NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield
SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick
London: Brunel, Imperial, QMUL, RHUL, UCL
Mostly funded by HEFCE.

Slide 14: Tier-2 Resources
[Table of committed CPU and disk resources at each Tier-2 (London, NorthGrid, ScotGrid, SouthGrid) in 2007, per experiment (ALICE, ATLAS, CMS, LHCb), against an experiment's requirement of a Tier-2 in 2008; figures omitted.]
Doesn't include SRIF3 – need SRIF3 resources! Experiment shares are determined by the institutes who bought the kit.
Overall LCG shortfall: ~30% in CPU, ~50% in disk (all Tiers).

Slide 15: The LCG Grid
123 sites, 33 countries, 10,314 CPUs, 3.3 PBytes of disk.

Slide 16: Grid Demo

Slide 17: Getting Started
1. Get a digital certificate – this is authentication: who you are.
2. Join a Virtual Organisation (VO) – this is authorisation: what you are allowed to do. For the LHC, join LCG and choose a VO.
3. Get access to a local User Interface (UI) machine and copy your files and certificate there.

Slide 18: Job Preparation
Prepare a file of Job Description Language (JDL):

############# athena.jdl #################
# Script to run:
Executable = "athena.sh";
StdOutput = "athena.out";
StdError = "athena.err";
# Input files: my C++ code, job options, build files:
InputSandbox = {"athena.sh", "MyJobOptions.py", "MyAlg.cxx", "MyAlg.h",
                "MyAlg_entries.cxx", "MyAlg_load.cxx", "login_requirements",
                "requirements", "Makefile"};
# Output files:
OutputSandbox = {"athena.out", "athena.err", "ntuple.root",
                 "histo.root", "CLIDDBout.txt"};
# Choose the ATLAS version (satisfied by ~32 sites):
Requirements = Member("VO-atlas-release-9.0.4",
                      other.GlueHostApplicationSoftwareRunTimeEnvironment);
################################################
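Once one JDL file works, the natural next step is to generate many of them for a batch of runs. A sketch of that pattern – the template below is a cut-down version of the JDL above, and the run-number scheme and file names are hypothetical, not part of any middleware convention:

```python
# Generate one JDL file per run from a template (illustrative sketch).
import os
import tempfile

JDL_TEMPLATE = """Executable = "athena.sh";
StdOutput = "athena_{run}.out";
StdError = "athena_{run}.err";
InputSandbox = {{"athena.sh", "MyJobOptions.py"}};
OutputSandbox = {{"athena_{run}.out", "athena_{run}.err", "ntuple_{run}.root"}};
Requirements = Member("VO-atlas-release-9.0.4",
    other.GlueHostApplicationSoftwareRunTimeEnvironment);
"""

def make_jdl(run: int) -> str:
    """Fill the template for one run; doubled braces become literal JDL braces."""
    return JDL_TEMPLATE.format(run=run)

outdir = tempfile.mkdtemp()          # throwaway directory for the demo
for run in range(3):
    with open(os.path.join(outdir, f"athena_{run}.jdl"), "w") as f:
        f.write(make_jdl(run))
print(sorted(os.listdir(outdir)))    # three generated .jdl files
```

Each generated file would then be submitted with its own `edg-job-submit` call.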

Slide 19: Job Submission
Make a copy of your certificate to send out (~once a day):

~/atlas]$ grid-proxy-init
Your identity: /C=UK/O=eScience/OU=QueenMaryLondon/L=Physics/CN=steve lloyd
Enter GRID pass phrase for this identity:
Creating proxy ... Done
Your proxy is valid until: Thu Mar 17 03:25:

Submit the job (--vo: the VO; athena.jdl: the JDL file; -o jobIDfile: a file to hold job IDs):

~/atlas]$ edg-job-submit --vo atlas -o jobIDfile athena.jdl
Selected Virtual Organisation name (from --vo option): atlas
Connecting to host lxn1188.cern.ch, port 7772
Logging to host lxn1188.cern.ch, port 9002
================================ edg-job-submit Success ====================================
The job has been successfully submitted to the Network Server.
Use edg-job-status command to check job current status.
Your job identifier (edg_jobId) is:
- https://lxn1188.cern.ch:9000/0uDjtwbBbj8DTRetxYxoqQ
The edg_jobId has been saved in the following file:
/home/lloyd/atlas/jobIDfile
============================================================================================
~/atlas]$

Slide 20: Job Status
Find out its status:

~/atlas]$ edg-job-status -i jobIDfile
1 : https://lxn1188.cern.ch:9000/tKlZHxqEhuroJUhuhEBtSA
2 : https://lxn1188.cern.ch:9000/IJhkSObaAN5XDKBHPQLQyA
3 : https://lxn1188.cern.ch:9000/BMEOq90zqALvkriHdVeN7A
4 : https://lxn1188.cern.ch:9000/l6wist7SMq6jVePwQjHofg
5 : https://lxn1188.cern.ch:9000/wHl9Yl_puz9hZDMe1OYRyQ
6 : https://lxn1188.cern.ch:9000/PciXGNuAu7vZfcuWiGS3zQ
7 : https://lxn1188.cern.ch:9000/0uDjtwbBbj8DTRetxYxoqQ
a : all
q : quit
Choose one or more edg_jobId(s) in the list - [1-7]all: 7
*************************************************************
BOOKKEEPING INFORMATION:
Status info for the Job : https://lxn1188.cern.ch:9000/0uDjtwbBbj8DTRetxYxoqQ
Current Status: Done (Success)
Exit code: 0
Status Reason: Job terminated successfully
Destination: lcg00125.grid.sinica.edu.tw:2119/jobmanager-lcgpbs-short
reached on: Wed Mar 16 17:45:
*************************************************************
~/atlas]$

This job ran in Taiwan; the batch was spread over RAL, Valencia, CERN and Taiwan.
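In practice one scripts this polling rather than answering the menu by hand. A sketch of the loop, with a stand-in status function – a real script would parse `edg-job-status -i jobIDfile` output instead of calling `check_status`, and the state names are the ones visible on this slide plus obvious failure states assumed by analogy:

```python
# Poll a set of jobs until every one reaches a terminal state (sketch).
import time

TERMINAL = {"Done (Success)", "Done (Failed)", "Aborted", "Cancelled"}

def wait_for_jobs(job_ids, check_status, poll_seconds=0):
    """check_status(job_id) -> current status string; loop until all terminal."""
    states = {j: None for j in job_ids}
    while not all(s in TERMINAL for s in states.values()):
        for j in job_ids:
            states[j] = check_status(j)
        time.sleep(poll_seconds)
    return states

# Fake status source standing in for edg-job-status, for demonstration:
fake = iter(["Scheduled", "Running", "Done (Success)"])
result = wait_for_jobs(["job-1"], lambda j: next(fake))
print(result)  # {'job-1': 'Done (Success)'}
```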

Slide 21: Job Retrieval
Retrieve the output:

~/atlas]$ edg-job-get-output --dir . -i jobIDfile
Retrieving files from host: lxn1188.cern.ch
( for https://lxn1188.cern.ch:9000/0uDjtwbBbj8DTRetxYxoqQ )
*********************************************************************************
JOB GET OUTPUT OUTCOME
Output sandbox files for the job:
- https://lxn1188.cern.ch:9000/0uDjtwbBbj8DTRetxYxoqQ
have been successfully retrieved and stored in the directory:
/home/lloyd/atlas/lloyd_0uDjtwbBbj8DTRetxYxoqQ
*********************************************************************************
~/atlas]$ ls -lt /home/lloyd/atlas/lloyd_0uDjtwbBbj8DTRetxYxoqQ
total
-rw-r--r-- 1 lloyd hep  224 Mar 17 10:47 CLIDDBout.txt
-rw-r--r-- 1 lloyd hep      Mar 17 10:47 ntuple.root
-rw-r--r-- 1 lloyd hep 5372 Mar 17 10:47 athena.err
-rw-r--r-- 1 lloyd hep      Mar 17 10:47 athena.out
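A useful habit after retrieval is to check that everything named in the JDL's OutputSandbox actually came back. A minimal sketch, assuming the sandbox list from Slide 18; the helper name and the demo directory are illustrative:

```python
# Check the retrieved directory against the JDL OutputSandbox (sketch).
import os
import tempfile

EXPECTED = ["athena.out", "athena.err", "ntuple.root", "histo.root", "CLIDDBout.txt"]

def missing_outputs(outdir, expected=EXPECTED):
    """Return the expected sandbox files that are absent from outdir."""
    return [f for f in expected if not os.path.exists(os.path.join(outdir, f))]

# Demo against a throwaway directory containing only two of the files:
with tempfile.TemporaryDirectory() as d:
    for name in ("athena.out", "ntuple.root"):
        open(os.path.join(d, name), "w").close()
    print(missing_outputs(d))  # ['athena.err', 'histo.root', 'CLIDDBout.txt']
```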

Slide 22: Conclusions
The Grid is here – it works!
Currently it is difficult to install and maintain the middleware and the experiments' software, but it is straightforward to use.
There are huge resources available: last week LXBATCH had 6,500 ATLAS jobs queued, while LCG had 3,017 free CPUs.
Need to scale to full size: from ~10,000 to ~100,000 CPUs.
Need stability, robustness, security (a hacker's paradise!) etc.
Need continued funding beyond the start of the LHC!
Use it!

Slide 23: Further Info
