Presentation transcript:

UKQCD QCDgrid Richard Kenway

UKQCD Nov 2001 – QCDgrid 2

why build a QCD grid?
the computational problem is too big for current computers
– configuration generation is tightly coupled → a few teraflops machines
– post-processing is highly diverse and distributed
it involves multinational collaborations
– virtual organisations are well established and growing
many terabytes of data should be federated
– validity + security are essential, so data provenance is a vital issue
extensive software exists and should be more widely exploited
– must be correct, portable and efficient
[figure: QCD configurations]

UKQCD Nov 2001 – QCDgrid 3

QCDgrid infrastructure
[diagram: QCDOC (Edinburgh, BNL, Columbia), APEmille (Swansea), apeNEXT (DESY/INFN); TB disk farms at Glasgow, Edinburgh, Liverpool and Swansea; US Tier II centers; German/Italian Tier II centres; ScotGrid; Ulgrid; labelled Tier I / Tier II / Tier III]

UKQCD Nov 2001 – QCDgrid 4

data mark up
1. label configurations by physics parameters and history
data in XML format
– extensible schema
– propose to SciDAC as US standard
– translation programs: UKQCD → Columbia → US → …
UKQCD data catalogue → logical file name → EDG replica management system → physical file name
milestone Feb 02: UKQCD data catalogue implemented on top of EDG TB1
D1 primary data: L, β, {c}, {κ_sea}, {history}   several × 1000 × 1 GB
D2 secondary data: L, β, {c}, {κ_sea}, κ_valence, {history}   several × 1000 × 5 × 5 GB
D3 tertiary data: L, β, {c}, {κ_sea}, {κ_valence}, {f}, {history}   many × 20 MB
total UKQCD data ~ hundred TB per year
progress: XML schema working party to report in Dec
[diagram labels: physics, history, software]
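To make the catalogue idea above concrete, here is a minimal sketch in Python of labelling a configuration by physics parameters in XML and resolving a logical file name to a physical copy. The element names, the toy replica dictionary and the example paths are illustrative assumptions; they are not the actual UKQCD/QCDml schema or the EDG replica management interface.

    import xml.etree.ElementTree as ET

    def make_metadata(logical_name, L, beta, kappa_sea, history):
        """Build a toy XML record labelling a configuration by physics parameters.
        Element names are illustrative, not the real UKQCD schema."""
        root = ET.Element("gaugeConfiguration")
        ET.SubElement(root, "logicalFileName").text = logical_name
        params = ET.SubElement(root, "physicsParameters")
        ET.SubElement(params, "latticeSize").text = str(L)
        ET.SubElement(params, "beta").text = str(beta)
        ET.SubElement(params, "kappaSea").text = str(kappa_sea)
        ET.SubElement(root, "history").text = history
        return ET.tostring(root, encoding="unicode")

    # Toy catalogue: logical file name -> physical replicas (stands in for the
    # EDG replica management system, which the slide only references).
    replica_catalogue = {
        "lfn://ukqcd/b5p20/cfg_001": [
            "gsiftp://edinburgh.example/disk1/cfg_001",
            "gsiftp://glasgow.example/disk3/cfg_001",
        ],
    }

    def resolve(logical_name):
        """Map a logical file name to one physical file name (first replica)."""
        return replica_catalogue[logical_name][0]

    print(make_metadata("lfn://ukqcd/b5p20/cfg_001", 16, 5.20, 0.1350, "generated on APEmille"))
    print(resolve("lfn://ukqcd/b5p20/cfg_001"))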

UKQCD Nov 2001 – QCDgrid 5

data storage and distribution
2. federate (global) data with open + restricted access
data generation at Edinburgh, Columbia and Swansea
data replication across TB RAID disk farms at Edinburgh, Glasgow, Liverpool and Swansea
– to be fully operational well before our existing data store closes in Sep 02
– possibly extending to the rest of the US, Germany and Italy
milestone Jun 02: functioning distributed data store
1. users store data according to physics parameters and update catalogue
2. replica management system generates a geographically separate copy
3. users fetch data by physics parameters
4. replica management system optimises data retrieval
progress: RAID arrays installed at Edinburgh, Glasgow & Liverpool
EDG TB1 tests beginning this week (Certification Authority?)
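A minimal sketch of the four-step store/replicate/fetch cycle above, using a toy in-memory catalogue. The class name, the "first free site" replication rule and the "prefer a local copy" retrieval rule are illustrative assumptions, not the behaviour of the EDG replica manager.

    class ReplicaStore:
        def __init__(self, sites):
            self.sites = sites          # e.g. ["Edinburgh", "Glasgow", ...]
            self.catalogue = {}         # physics parameters -> {site: path}

        def store(self, params, site, path):
            # 1. user stores data according to physics parameters and updates catalogue
            self.catalogue.setdefault(params, {})[site] = path

        def replicate(self, params):
            # 2. replica management generates a geographically separate copy
            copies = self.catalogue[params]
            target = next(s for s in self.sites if s not in copies)
            copies[target] = copies[next(iter(copies))]  # pretend the file was copied
            return target

        def fetch(self, params, user_site):
            # 3./4. user fetches by physics parameters; retrieval prefers a local copy
            copies = self.catalogue[params]
            return copies.get(user_site, next(iter(copies.values())))

    store = ReplicaStore(["Edinburgh", "Glasgow", "Liverpool", "Swansea"])
    store.store(("L16", "beta5.20"), "Edinburgh", "/raid/cfg_001")
    store.replicate(("L16", "beta5.20"))
    print(store.fetch(("L16", "beta5.20"), "Glasgow"))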

UKQCD Nov 2001 – QCDgrid 6

user interface
3. provide a high-level interface to data, codes and machines
web portal with increasing grid functionality to automate and distribute data analysis jobs
– to be operational for QCDOC data analysis beginning in 2003
milestones:
– Jun 02: portal supports data management
– Oct 02: portal supports local job submission via scripts
– Jun-Oct 03: portal supports distributed data analysis
progress: use cases specified (important input to schema)
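The slide does not say how the portal would submit jobs; purely as an illustration of "local job submission via scripts", here is a small Python wrapper a portal back end might use. The qsub command, the script contents and the example analysis command are assumptions, not the portal's actual mechanism.

    import os
    import subprocess
    import tempfile

    def submit_local_job(analysis_cmd):
        """Illustrative only: write a shell script and hand it to a local scheduler.
        'qsub' stands in for whatever local submission command the site provides."""
        fd, path = tempfile.mkstemp(suffix=".sh")
        with os.fdopen(fd, "w") as f:
            f.write("#!/bin/sh\n" + analysis_cmd + "\n")
        os.chmod(path, 0o755)
        result = subprocess.run(["qsub", path], capture_output=True, text=True)
        return result.stdout

    # e.g. submit_local_job("measure_correlators --config lfn://ukqcd/b5p20/cfg_001")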

UKQCD Nov 2001 – QCDgrid 7

resources required
2 FTE of programmer effort over the next 24 months
– assumes appropriate EDG middleware is available via GridPP
data catalogue and user interface require dedicated effort (1 FTE)
– managed by Alan Irving
grid implementation shared with GridPP (1 FTE)
– how will UKQCD project management draw on (distributed) GridPP resources?
QCDgrid is a relatively simple grid application which implements generic grid functionality and must support real users by the end of 2003