Slide 1: A QCD Grid: 5 Easy Pieces?
Richard Kenway, University of Edinburgh

Slide 2: the problem of quark confinement
quarks come in six flavours
quarks are confined by the strong force (QCD) into bound states called hadrons
we cannot directly measure the decay of one quark flavour into another
– which may conceal clues to why matter dominates antimatter
we need reliable simulations of the strong force between the quarks in a hadron
– to infer quark properties from hadron properties

Slide 3: lattice QCD
quantum mechanics + special relativity
– probabilities from averaging over many realisations
– treat space and time on an equal footing → hypercubic lattice
performance ∝ number of processors
the lattice spacing must be extrapolated to zero while keeping the box large enough
– halving the lattice spacing → ~500 times more computer power
[figure: lattice spacing a inside a box of size L]
need of the order of teraflops-years of computing
QCDOC
– UK–US project
– ASIC = PowerPC + 1 Gflops FPU + 4 MB memory + 12 links
– 5 Tflops at $1 per sustained Mflops by end 2002
– SciDAC funding for software
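The "halving the lattice spacing → ~500 times more computer power" statement corresponds to a steep power-law growth of cost as the spacing a is reduced. A short back-of-the-envelope consistent with the slide's own number; the power-law form and the exponent derived from it are an illustrative assumption, not something stated on the slide:

```latex
% Assume, at fixed physical box size L, that the cost grows as a power of 1/a:
%   cost(a) \propto a^{-z}.
% The slide's factor of ~500 for a -> a/2 then fixes the exponent:
\[
  \frac{\mathrm{cost}(a/2)}{\mathrm{cost}(a)} \;=\; 2^{z} \;\approx\; 500
  \quad\Longrightarrow\quad
  z \;\approx\; \log_{2} 500 \;\approx\; 9 .
\]
```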

Slide 4: why build a QCD grid?
the computational problem is too big for current computers
– configuration generation is tightly coupled → a few petaflops machines, not metacomputing (yet)
– post-processing is highly diverse and distributed
it involves multinational collaborations
– (overlapping) virtual organisations are well established and growing
many terabytes of data should be federated
– validity + security are essential, so data provenance is a vital issue
extensive software exists and should be more widely exploited
– it must be correct, portable and efficient
[figure: QCD configurations]

Slide 5: 5 easy pieces(?)
1. data provenance
– label configurations by physics parameters and history
– planning to write data in XML format (a metadata sketch follows this slide)
2. data grid
– federate global data with open + restricted access via physics parameters
– have translation codes: UKQCD ↔ Columbia ↔ MILC ↔ NERSC
– planning data mirroring/caching across TByte RAID disk farms at Glasgow, Edinburgh, Liverpool and Swansea, possibly extending to the US and Germany
3. application code library
– validate and maintain an open-source + restricted code base
– UKQCD + Columbia are building a CVS repository
– other US groups will contribute to/exploit open-source codes (SciDAC)
UKQCD objective: grid functionality, conforming to standards (e.g. Globus, SRB), by exploiting leverage with other projects
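Piece 1 proposes marking up each configuration's physics parameters and generation history in XML. Below is a minimal, hypothetical sketch of such a provenance record, built with Python's standard xml.etree.ElementTree; all element names and parameter values are illustrative assumptions, not the schema UKQCD ultimately adopted:

```python
# Hypothetical sketch of per-configuration provenance metadata in XML.
# Element names and values are illustrative only, not the real schema.
import xml.etree.ElementTree as ET


def make_provenance_record():
    cfg = ET.Element("gaugeConfiguration")

    # physics parameters that label the ensemble
    physics = ET.SubElement(cfg, "physics")
    ET.SubElement(physics, "action").text = "Wilson"        # gauge/fermion action
    ET.SubElement(physics, "beta").text = "5.2"             # gauge coupling
    ET.SubElement(physics, "kappa").text = "0.1350"         # hopping parameter
    ET.SubElement(physics, "lattice").text = "16x16x16x32"  # spatial^3 x time sites

    # generation history of this particular configuration
    history = ET.SubElement(cfg, "history")
    ET.SubElement(history, "collaboration").text = "UKQCD"
    ET.SubElement(history, "machine").text = "example-machine"
    ET.SubElement(history, "algorithm").text = "HMC"
    ET.SubElement(history, "trajectory").text = "5000"      # Monte Carlo time

    return ET.tostring(cfg, encoding="unicode")


if __name__ == "__main__":
    print(make_provenance_record())
```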

Slide 6: 5 easy pieces(?)
4. QCD portal
– provide a high-level interface to codes, data and machines
– considering a web form to write job scripts, construct I/O files, submit and monitor jobs, and archive data (a job-script sketch follows this slide)
– also to construct data analysis codes from library routines
5. computation grid
– controlled access to global computers + a charging mechanism
– computational steering for unexplored parameter regions: a long way off!
– farming post-processing jobs across UKQCD sites is a feasible starting point
a QCD grid requires similar functionality to other grids (e.g. LHC, virtual observatory) but is on a more manageable scale and is low risk
insufficient human resources are available to UKQCD today, but the opportunity exists
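To make piece 4 concrete: the portal's core step is turning validated web-form fields into a batch job script. A minimal sketch in Python; the PBS-style template, queue name and field names are assumptions for illustration, not the actual UKQCD portal implementation:

```python
# Hypothetical portal back-end step: turn web-form fields into a batch job script.
# The template, queue name and field names are illustrative assumptions.
from string import Template

JOB_TEMPLATE = Template("""\
#!/bin/sh
#PBS -N $job_name
#PBS -l nodes=$nodes
#PBS -q $queue
cd $workdir
$executable $input_file > $output_file
""")


def build_job_script(form: dict) -> str:
    """Fill the batch-script template from validated web-form fields."""
    return JOB_TEMPLATE.substitute(
        job_name=form["job_name"],
        nodes=form["nodes"],
        queue=form.get("queue", "qcd"),
        workdir=form["workdir"],
        executable=form["executable"],
        input_file=form["input_file"],
        output_file=form["output_file"],
    )


if __name__ == "__main__":
    example_form = {
        "job_name": "meson_corr",
        "nodes": 8,
        "workdir": "/data/ukqcd/run01",
        "executable": "./analyse_correlators",
        "input_file": "params.in",
        "output_file": "corr.out",
    }
    print(build_job_script(example_form))
```

The same template-filling approach would extend naturally to the slide's other portal tasks (constructing I/O files and monitoring submitted jobs), with each web form mapping onto one template.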