The GridPP DIRAC project: DIRAC for non-LHC communities.

Presentation transcript:

The GridPP DIRAC project: DIRAC for non-LHC communities

13/04/2015 The GridPP DIRAC project 2 Overview GridPP is a collaboration of particle physicists and computer scientists, originally proposed to support the LHC experiments, but it now also supports non-LHC experiments and researchers from other disciplines. The LHC experiments have moved, or are moving, away from the EMI WMS. DIRAC is being considered as a replacement for the EMI WMS. DIRAC also offers a file catalogue – a possible replacement for the LFC, which is still widely used by small VOs. We have set up and are maintaining a DIRAC instance at Imperial College London.

Supported VOs The first installation was done in April. We are currently supporting the following VOs: na62.vo.gridpp.ac.uk, t2k.org, snoplus, cernatschool.org, comet.j-parc.jp, pheno, northgrid, londongrid, gridpp. 13/04/2015 The GridPP DIRAC project 3

DIRAC services DIRAC offers many services; some of them are particularly attractive for small VOs: a WMS with pilot jobs; the DFC – a file and metadata catalogue (the LFC is just a file catalogue, plus a comment field…); synchronous and asynchronous file transfers integrated with the DFC; job submission to CREAM and ARC CEs and to cloud and VAC sites. 13/04/2015 The GridPP DIRAC project 4
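To make the WMS entry point concrete, here is a minimal sketch of describing and submitting a job through the DIRAC Python API. The job name, executable and CPU-time request are placeholder values, not part of the GridPP setup; matching the payload to a site is left to the pilot mechanism.

```python
# Minimal sketch: describe and submit a job via the DIRAC Python API.
# The name, executable and CPU time below are placeholders, not the
# GridPP configuration; pilot jobs at the sites pick up the payload.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)   # initialise the DIRAC client environment

from DIRAC.Interfaces.API.Job import Job
from DIRAC.Interfaces.API.Dirac import Dirac

job = Job()
job.setName('hello-dirac')
job.setExecutable('/bin/echo', arguments='Hello from a pilot slot')
job.setCPUTime(500)                          # requested CPU time in seconds

dirac = Dirac()
result = dirac.submitJob(job)                # older releases expose this as dirac.submit(job)
print(result)                                # {'OK': True, 'Value': <job ID>} on success
```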

NA62 VO and DIRAC NA62 use the Grid to run their MC jobs; the output files are moved to central Castor storage at CERN and RAL. We have a custom-written job submission and monitoring system which allows shifters to distribute jobs over the Grid and monitor their progress. Submission can be done either to the EMI WMS or to the GridPP DIRAC service. The output files are copied to CERN and RAL using a File Transfer Controller (a Web Service based FTS client), which decouples the MC simulation task from data movement. We are planning to replace this service with the DIRAC asynchronous file transfer service (e.g. using dirac-dms-replicate-and-register-request); we need to upgrade DIRAC to allow this. We also plan to replace the LFC with the DFC. 13/04/2015 The GridPP DIRAC project 5
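As a rough illustration of the planned move to DIRAC data management, the sketch below replicates an already-registered MC output file to a second storage element with the DataManager API; the LFN and SE name are hypothetical. The CLI route mentioned above (dirac-dms-replicate-and-register-request) is intended to file an asynchronous request handled by DIRAC's request machinery, whereas this call performs the copy directly.

```python
# Sketch: replicate a registered MC output file to another storage element
# and record the new replica in the catalogue. The LFN and SE name are
# hypothetical examples, not the NA62 production configuration.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC.DataManagementSystem.Client.DataManager import DataManager

lfn = '/na62.vo.gridpp.ac.uk/mc/run0001/output.root'          # hypothetical LFN
result = DataManager().replicateAndRegister(lfn, 'RAL-DISK')  # hypothetical destination SE
if not result['OK']:
    print('Replication failed: %s' % result['Message'])
```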

Cernatschool VO From the website: "CERN@school is a programme for school students - and driven by school students - that brings technology from CERN into the classroom to inspire the next generation of scientists and engineers." They used the GridPP DIRAC instance to implement an example workflow: Upload a dataset: 1. raw data from the detectors; 2. add metadata to the dataset using the Python API. 13/04/2015 The GridPP DIRAC project 6
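A minimal sketch of the upload step, assuming the DIRAC client APIs: putAndRegister() uploads a local file and registers it under an LFN, and setMetadata() attaches key/value metadata to the catalogue entry. The LFN, local file name, storage element and metadata fields are all hypothetical examples.

```python
# Sketch of the dataset upload: put a raw detector file on a storage
# element, register it in the DFC, then attach metadata to the entry.
# LFN, local file, SE and metadata keys are hypothetical; the metadata
# fields are assumed to have been defined in the DFC beforehand.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC.DataManagementSystem.Client.DataManager import DataManager
from DIRAC.Resources.Catalog.FileCatalogClient import FileCatalogClient

lfn = '/cernatschool.org/data/frame_0001.txt'            # hypothetical LFN
DataManager().putAndRegister(lfn, 'frame_0001.txt', 'UKI-LT2-IC-HEP-disk')

fc = FileCatalogClient()
fc.setMetadata(lfn, {'detector': 'MX-10', 'run': 42})    # hypothetical metadata
```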

Cernatschool VO, contd. Process the dataset on the Grid: 1. select data files of interest using a metadata query; 2. run software (distributed via CVMFS) on the selected data; 3. write the output to a selected Storage Element; 4. add metadata to the generated data. Run an analysis on the processed data: 1. select data of interest using a metadata query; 2. retrieve the output from the grid based on the selection. Work coordinated by Tom Whyntie from QMUL. 13/04/2015 The GridPP DIRAC project 7
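The selection and retrieval steps can be sketched with the same client APIs, again assuming hypothetical metadata keys and a hypothetical catalogue path: findFilesByMetadata() returns the LFNs matching a metadata query, and each match is then downloaded through the Dirac API.

```python
# Sketch of the analysis step: select files by metadata query and fetch
# the matches locally. Metadata keys/values and the search path are
# hypothetical examples of the kind of query described above.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC.Resources.Catalog.FileCatalogClient import FileCatalogClient
from DIRAC.Interfaces.API.Dirac import Dirac

fc = FileCatalogClient()
result = fc.findFilesByMetadata({'detector': 'MX-10', 'run': 42},
                                '/cernatschool.org/data')
if result['OK']:
    dirac = Dirac()
    for lfn in result['Value']:
        dirac.getFile(lfn)       # download each selected file to the working directory
```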

DIRAC instance maintenance Conservative DIRAC version updates to maintain a balance between new features and stability. The DIRAC registry is updated manually: 1. defining new users and groups; 2. adding new sites and activating/deactivating them; 3. adding site resources (CEs, SEs). We are working on automating this at Imperial (see the GridPP DIRAC Project talk on Thursday). We use a mailing list to give support to our users. Wiki: 13/04/2015 The GridPP DIRAC project 8

Site Availability Monitoring We periodically run SAM tests to check whether sites are accepting jobs. 13/04/2015 The GridPP DIRAC project 9 Thanks to Andrew McNab, University of Manchester.
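A per-site probe could look roughly like the sketch below, which pins a trivial job to each site with setDestination() and submits it. The site names are examples only; the actual GridPP SAM test machinery (credited above) is maintained separately.

```python
# Rough sketch of a site-availability probe: force a trivial job to each
# site and submit it; the returned job IDs can later be polled for status.
# The site names are examples only, not the monitored GridPP site list.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC.Interfaces.API.Job import Job
from DIRAC.Interfaces.API.Dirac import Dirac

sites = ['LCG.UKI-LT2-IC-HEP.uk', 'LCG.UKI-NORTHGRID-MAN-HEP.uk']  # example site names

dirac = Dirac()
for site in sites:
    probe = Job()
    probe.setName('site-probe')
    probe.setExecutable('/bin/hostname')     # trivial payload: report the worker node
    probe.setDestination(site)               # restrict matching to this site only
    print('%s: %s' % (site, dirac.submitJob(probe)))
```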