The GridPP DIRAC project: DIRAC for non-LHC communities
Janusz Martyniak, Imperial College London, 5th DIRAC User Workshop, 27/05/2015

Overview
GridPP is a collaboration of particle physicists and computer scientists originally proposed to support the LHC experiments, but it now also supports non-LHC experiments and researchers from other disciplines. The LHC experiments have moved, or are moving, away from the EMI WMS, and DIRAC is being considered as a replacement. DIRAC also offers a file catalogue, a possible replacement for the LFC, which is still widely used by small VOs. We have set up and are maintaining a DIRAC instance at Imperial College London.

Supported VOs
The first installation was done in April. We are currently supporting the following VOs:
na62.vo.gridpp.ac.uk, t2k.org, snoplus, cernatschool.org, comet.j-parc.jp, pheno, northgrid, londongrid, gridpp

DIRAC services
DIRAC offers many services; some of them are particularly attractive for small VOs:
- WMS with pilot jobs
- DFC, a file and metadata catalogue (the LFC is just a file catalogue, plus a comment field...)
- synchronous and asynchronous file transfers integrated with the DFC
- job submission to CREAM and ARC CEs and to cloud and VAC sites
A minimal submission example is sketched after this list.
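As an illustration, here is a minimal sketch of submitting a job through the DIRAC v6-era Python API; the job name, executable and CPU time are placeholders, not taken from the GridPP configuration:

```python
# Minimal DIRAC job submission sketch (DIRAC v6-era Python API).
# The job name and executable below are illustrative placeholders.
from DIRAC.Core.Base import Script
Script.parseCommandLine()  # initialise DIRAC before importing the APIs

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName('gridpp-hello')
job.setExecutable('/bin/echo', arguments='Hello from GridPP DIRAC')
job.setCPUTime(3600)  # requested CPU time in seconds

dirac = Dirac()
result = dirac.submit(job)  # returns an S_OK/S_ERROR dictionary
if result['OK']:
    print('Submitted job %s' % result['Value'])
else:
    print('Submission failed: %s' % result['Message'])
```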

NA62 VO and DIRAC
NA62 use the Grid to run their MC jobs; the output files are moved to central Castor storage at CERN and RAL. We have a custom-written job submission and monitoring system which allows shifters to distribute jobs over the Grid and monitor their progress. Submission can be done either to the EMI WMS or to the GridPP DIRAC service. The output files are copied to CERN and RAL using a File Transfer Controller (a Web Service based FTS client); this decouples the MC simulation task from data movement. We are planning to replace this service with the DIRAC asynchronous file transfer service (e.g. using dirac-dms-replicate-and-register-request), which needs a DIRAC upgrade; a replication sketch follows. We also plan to replace the LFC with the DFC.
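The asynchronous path goes through DIRAC's Request Management System via dirac-dms-replicate-and-register-request; as a hedged sketch, the synchronous equivalent with the DIRAC v6 DataManager API looks roughly like this, where the LFN and the SE name are illustrative placeholders:

```python
# Sketch: replicate an existing LFN to a second SE and register the
# new replica in the catalogue (synchronous counterpart of the
# planned asynchronous transfers). LFN and SE name are placeholders.
from DIRAC.Core.Base import Script
Script.parseCommandLine()

from DIRAC.DataManagementSystem.Client.DataManager import DataManager

dm = DataManager()
result = dm.replicateAndRegister('/na62.vo.gridpp.ac.uk/mc/run123/output.root',
                                 'RAL-DISK')
if not result['OK']:
    print('Replication failed: %s' % result['Message'])
```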

Cernatschool VO
From the website: "CERN@school is a programme for school students - and driven by school students - that brings technology from CERN into the classroom to inspire the next generation of scientists and engineers."
They used the GridPP DIRAC instance to implement an example workflow. Upload a dataset:
1. raw data from the detectors;
2. add metadata to the dataset using the Python API (sketched below).
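A hedged sketch of the upload steps, assuming the DIRAC v6 DataManager and DFC client APIs; the LFN, local file, SE name and metadata keys are illustrative placeholders:

```python
# Sketch of "upload a dataset": put a raw detector file on an SE,
# register it in the DFC, then attach metadata. All names below
# (LFN, local path, SE, metadata keys) are hypothetical.
from DIRAC.Core.Base import Script
Script.parseCommandLine()

from DIRAC.DataManagementSystem.Client.DataManager import DataManager
from DIRAC.Resources.Catalog.FileCatalogClient import FileCatalogClient

lfn = '/cernatschool.org/data/run42/frame001.txt'

# Step 1: upload the raw file to a Storage Element and register it
dm = DataManager()
dm.putAndRegister(lfn, '/tmp/frame001.txt', 'UKI-LT2-IC-HEP-disk')

# Step 2: attach metadata to the uploaded file
fc = FileCatalogClient()
fc.setMetadata(lfn, {'detector': 'MX-10', 'run': 42})
```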

Cernatschool VO, contd.
Process the dataset on the Grid:
1. select data files of interest using a metadata query
2. run software (distributed via CVMFS) on the selected data
3. write output to a selected Storage Element
4. add metadata to the generated data
Run an analysis on the processed data:
1. select data of interest using a metadata query
2. retrieve output from the grid based on the selection
A sketch of the metadata query and retrieval is shown after this list. Work coordinated by Tom Whyntie from QMUL.
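Continuing the sketch above, selecting files by a metadata query and retrieving them locally might look like this; the metadata keys and catalogue path remain placeholders:

```python
# Sketch: find files matching a metadata query in the DFC, then
# download each selected file. Keys and paths are hypothetical.
from DIRAC.Core.Base import Script
Script.parseCommandLine()

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Resources.Catalog.FileCatalogClient import FileCatalogClient

fc = FileCatalogClient()
result = fc.findFilesByMetadata({'detector': 'MX-10', 'run': 42},
                                path='/cernatschool.org/data')
if result['OK']:
    dirac = Dirac()
    for lfn in result['Value']:
        dirac.getFile(lfn)  # fetch each matching file to the local directory
```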

DIRAC instance maintenance
Conservative DIRAC version updates, to maintain a balance between new features and stability. The DIRAC registry is updated manually:
1. defining new users and groups
2. adding new VOs
3. adding new sites and activating/deactivating them
4. adding site resources (CEs, SEs)
We are working on automating this at Imperial (some details later). We use a mailing list to give support to our users. Wiki:

Site Availability Monitoring
We periodically run SAM tests to check whether sites are accepting jobs. Thanks to Andrew McNab, University of Manchester.

GridPP DIRAC

11 Users & Groups Agent The Users and Groups agent fetches user lists from the VOMS server and adds them to the DIRAC authentication DB. Multi-VO modifications: Multiple VOMS Servers (one per VO) Extended for multiple VOMS Roles -> Group Mappings

AutoBDII2CS Agent
A DIRAC base agent searches the BDII for all CE and SE resources and emails the admin about any of these that are not yet configured: fully automated adding of resources. It deals with multiple SEs and creates a path mapping section for each one, and it ignores resources matching a set of regular expressions (currently work in progress); the filter step is sketched below.
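A small sketch of the ignore-list step: filter BDII-discovered endpoints against admin-supplied regular expressions before proposing them for the CS. The patterns and endpoint names are illustrative.

```python
# Sketch: drop BDII-discovered endpoints that match any ignore
# pattern; only the survivors are proposed to the admin.
import re

IGNORE_PATTERNS = [r'.*\.example\.org', r'^test-ce\..*']  # illustrative

def interesting(endpoint, patterns=IGNORE_PATTERNS):
    """Return True if the endpoint should be proposed for the CS."""
    return not any(re.match(p, endpoint) for p in patterns)

discovered = ['ce01.site.ac.uk', 'test-ce.site.ac.uk', 'se.example.org']
new_resources = [e for e in discovered if interesting(e)]
print(new_resources)  # -> ['ce01.site.ac.uk']
```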

Some problems…
- DIRAC can't submit to ARC CEs when gridengine is used, because the endpoint contains an extra "/" that the CS can't cope with. [Opened Aug 2014]
- A security alert was sent out requesting sites to upgrade to v6r13, but there are no released versions of v6r13 available; when will this be released? [Today's production version is v6r12p35]
- Asynchronous file transfers not operational in v6r11p24, v6r12p30?
- An upgrade broke the WMS (no pilots).