



1. 9th GridPP Collaboration Meeting (NeSC, 4/2/04)
QCDgrid: Status and Future
Alan Irving, University of Liverpool (UKQCD GridPP)

2. UKQCD and the Grid: QCDgrid architecture
PPARC support:
– GridPP1, Phase 1: data grid
– GridPP1, Phase 2: pilot scheme for distributed processing
– GridPP2: full distributed processing
– GridPP2: International Lattice Data Grid (ILDG) activities

3. QCDOC: Columbia + IBM + UKQCD + BNL
– 10,000+ processors
– 10 Tflops, £6.6M (July); procs Nov 03

4. Stop press...
Following exhaustive tests of the ASIC, orders have now been placed for some 14,720 ASICs for:
– a 2048-node development machine (> 1 Tflop sustained), for assembly in March
– a 12,000+ node main machine, for assembly in May

5. UKQCD computing strategy with QCDOC
– Distributed computing: Grid; international standards (ILDG); SCIDAC (the US strategy); local compute/data resources
– Tier 1: Edinburgh; Tier 2: Edinburgh, Liverpool, Swansea, Southampton (+ RAL)
– QCDOC front end as a Grid node: UKQCD-approved simulations; international cooperation with MILC, Columbia, ...
– Data grid for configuration acquisition and storage; international nodes available
– Job submission software (JSS) for homogeneous physics analysis within UKQCD
– Need for significant clusters at computational nodes (Liverpool, RAL, ...)

6. Basics of the QCDgrid datagrid
– Currently 4 sites with 7 RAID disk nodes
– Main design and implementation by EPCC (James Perry)
– Admin by C Maynard (Physics, Edinburgh) + local sysadmins
– User requirements/testing driven by Liverpool (C McNeile)
– File replication managed by custom-written software built on Globus 2
– A central control thread ensures at least 2 copies of each file, at different sites
– Replica catalogue maps logical names to physical locations
– Metadata catalogue associates physical parameters with files: one XML document per data file, stored in an eXist XML database and queried via XPath
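The replication rule above (at least two copies of every file, held at different sites) can be sketched as the kind of check the control thread might run. This is a minimal illustration only: the catalogue structure and function names are hypothetical, not QCDgrid's actual software.

```python
# Sketch of the replication invariant: every logical file must have at
# least two physical copies at *different* sites. The replica catalogue
# is modelled as a dict mapping logical names to (site, path) pairs;
# all names here are illustrative, not QCDgrid's real API.

MIN_SITES = 2

def files_needing_replication(replica_catalogue):
    """Return logical names whose copies span fewer than MIN_SITES sites."""
    deficient = []
    for logical_name, replicas in replica_catalogue.items():
        sites = {site for site, _path in replicas}
        if len(sites) < MIN_SITES:
            deficient.append(logical_name)
    return deficient

catalogue = {
    "cfg_0001": [("edinburgh", "/raid1/cfg_0001"),
                 ("liverpool", "/raid0/cfg_0001")],
    "cfg_0002": [("edinburgh", "/raid1/cfg_0002"),
                 ("edinburgh", "/raid2/cfg_0002")],  # two copies, one site
}

print(files_needing_replication(catalogue))  # ['cfg_0002']
```

Note that counting distinct sites, not copies, is what makes the check match the slide's "at different sites" condition: two copies on one RAID node would not survive a site failure.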

7. Operation of the QCDgrid datagrid
– Initial queries via browser GUI
– Production running via command-line tools
– Current developments:
   – simple interface for data/metadata submission (under development)
   – grid administration tools
   – grid recovery tools, including switching of the control thread
   – EDG software for virtual-organisation management and security
   – data binding in QCDOC codes
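To illustrate the XPath-style metadata queries the datagrid supports, here is a minimal sketch using Python's standard library against a made-up per-configuration XML document. The element names (configuration, beta, volume) are assumptions for illustration, not the actual QCDgrid/ILDG metadata schema, and eXist itself accepts richer XPath than the subset shown here.

```python
import xml.etree.ElementTree as ET

# A made-up metadata document covering two gauge configurations; the
# schema is illustrative, not the real QCDgrid/ILDG one.
doc = """
<configurations>
  <configuration name="cfg_0001">
    <beta>5.2</beta><volume>16x16x16x32</volume>
  </configuration>
  <configuration name="cfg_0002">
    <beta>5.6</beta><volume>24x24x24x48</volume>
  </configuration>
</configurations>
"""

root = ET.fromstring(doc)

# ElementTree supports a limited XPath subset; select configurations
# whose <beta> element carries the requested text value.
matches = [c.get("name")
           for c in root.findall(".//configuration")
           if c.findtext("beta") == "5.6"]
print(matches)  # ['cfg_0002']
```

In the real system the query would be sent to the eXist database holding one such document per data file, rather than parsing XML locally.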

8. QCDgrid metadata browser

9. Pilot version of job submission software
– Globus toolkit; EDG software for VO management and security
– Integrated with datagrid software
– Pilot running on the test grid at EPCC
– Command-line job submission; job I/O can go to the user console; output files returned automatically
Soon:
– deploy on main grid
– integrate with batch systems (PBS, ...)
– better user interface (GUI, ...)
GridPP2:
– full system with real analysis code

10. Job submission test

gridwork]$ qcdgrid-job-submit \
    /home/alan/gridwork/testrn \
    -input /home/alan/gridwork/in_seed.dat
Storing results in local directory qcdgridjob
Storing results in remote directory /tmp/qcdgridjob
RSL=&(executable=/opt/qcdgrid/qcdgrid-job-controller)
    (arguments=/tmp/qcdgridjob000024/jobdesc)
    (environment=(LD_LIBRARY_PATH /opt/globus/lib:/opt/qcdgrid))
Connecting to port
OUTPUT: iter r.n r.n. seeds written to out_seed.dat
testrn: finished Ok!
Job has completed
Retrieving jobdesc
Retrieving controller.log
Retrieving wrapper.log
Retrieving stdout
Retrieving stderr
Retrieving out_seed.dat
gridwork]$
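The RSL line in the transcript above uses Globus GRAM's Resource Specification Language, a conjunction of (attribute=value) relations prefixed by "&". As a sketch, the string can be assembled like this; the attribute names and paths are copied from the transcript, while the helper function itself is hypothetical, not part of the QCDgrid tools.

```python
# Assemble a Globus RSL string of the shape shown in the job-submission
# transcript: &(executable=...)(arguments=...)(environment=(VAR value)).
# The helper is illustrative; only the attribute syntax is Globus's.

def make_rsl(executable, arguments, environment):
    parts = [f"(executable={executable})", f"(arguments={arguments})"]
    env = " ".join(f"({k} {v})" for k, v in environment.items())
    parts.append(f"(environment={env})")
    return "&" + "".join(parts)

rsl = make_rsl(
    executable="/opt/qcdgrid/qcdgrid-job-controller",
    arguments="/tmp/qcdgridjob000024/jobdesc",
    environment={"LD_LIBRARY_PATH": "/opt/globus/lib:/opt/qcdgrid"},
)
print(rsl)
```

The resulting string matches the RSL= line printed by the pilot job submission tool, with the job controller as the executable and the job description file as its argument.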

11. International Lattice Data Grid
– UKQCD launched this in 2002 in Boston
– Participants from USA (SciDAC), Japan, Germany, ...
– Aims: enable data sharing; agree standards
– Steering group of national reps; working groups
– Metadata WG: XML schema; gauge formats etc.
– Middleware WG: web-service standards; Storage Resource Manager
– Feb 3: CP-PACS (Japan) launch ILDG node at

12. 3-continent file browsing: JLAB / UKQCD (LATT03)

13. ILDG file browser

14. QCDgrid and GridPP2
– Extend Job Submission Software: resource brokering, ...
– XML mark-up within main QCDOC production codes
– Web-services implementation of replica and metadata catalogues
– Web-services ILDG replica and metadata catalogues
– Web-services-based compute grid using UK and non-UK nodes

15. QCDgrid websites
– QCDgrid home page (at GridPP?):
– QCDgrid project page at the NeSCForge development site:
– ILDG project page at JLAB, USA:

16. CONCLUSIONS
– UKQCD has an operational data grid (QCDgrid)
– QCDOC preparations are well advanced
– Tier 2 nodes have been (are being) installed
– Work continues on XML tools
– Prototype job submission software exists and is being developed
– International activity is increasing
– Open software development via NeSCForge
