Status of MC production on the grid

Status of MC production on the grid
MICE Collaboration
Dimitrije Maletic, Institute of Physics, University of Belgrade
MICE collaboration meeting 49, 2nd of October 2017.

Outline
- Introduction
- Reminder: MC production on the grid information pages
- Migration from WMS to DIRAC is done
- Productions done since last CM
- Conclusions

Introduction
MC production on the grid comprises G4BeamLine (G4BL) beam simulation and Geant4 detector simulation using MAUS. MAUS detector-simulation production, using already produced G4BL libraries, has been run on request without delays since March last year. As of March this year, G4BeamLine production itself was restarted on the grid.
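For orientation, a minimal sketch of the two-stage chain described above is given below: a G4BeamLine run producing a beam library, followed by a MAUS detector simulation reading it. The executable names, file names and options are illustrative assumptions, not the actual MICE production scripts (those live in the mice-mc-batch-submission package on Launchpad).

```python
# Illustrative sketch only: chain a G4BeamLine run with a MAUS detector
# simulation via subprocess. The command names, file names and flags below
# are assumptions for illustration; the real production uses the scripts
# from https://launchpad.net/mice-mc-batch-submission.
import subprocess

def run_g4bl(deck="beamline.in"):
    # Hypothetical G4BeamLine input deck producing a beam library
    subprocess.check_call(["g4bl", deck])

def run_maus_simulation(datacard="simulation_datacard.py"):
    # Hypothetical MAUS detector-simulation step, driven by a datacard that
    # points at the pre-made G4BL beam library (flag name is an assumption)
    subprocess.check_call(["python", "simulate_mice.py",
                           "--configuration_file", datacard])

if __name__ == "__main__":
    run_g4bl()
    run_maus_simulation()
```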

MC production on the grid: information pages
- Information about finished and running MC productions on the grid: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProduction
- Information about (and examples of) MC production requests: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProductionRequests
- MC production request entries in CDB: the MCSerialNumber entries can be checked at http://147.91.87.158/cgi-bin/get_mcserial (a minimal query sketch follows below).
- The scripts used for MC production on the grid are available on Launchpad: https://launchpad.net/mice-mc-batch-submission
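As an illustration, the sketch below simply fetches and prints the MCSerialNumber listing from the CGI page referenced above; any parsing is left out, since the exact output format of the script is not described here.

```python
# Minimal sketch: fetch the MCSerialNumber listing from the CDB CGI page and
# print it. Only the standard library is used.
try:
    from urllib.request import urlopen          # Python 3
except ImportError:
    from urllib2 import urlopen                 # Python 2 fallback

MCSERIAL_URL = "http://147.91.87.158/cgi-bin/get_mcserial"

def fetch_mcserial_listing(url=MCSERIAL_URL):
    response = urlopen(url, timeout=30)
    return response.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    print(fetch_mcserial_listing())
```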

Comment on requests
Ideally, every request should indicate the exact path of the G4BL list file, set the proper configuration, and be preceded by a small test production before batch production is requested. Sometimes requests are incomplete, the G4BL list files are not at the indicated paths, and tests first have to be run on the grid; some email exchange is then necessary before the full production. All of this slows the productions down.

Migration from WMS to DIRAC
Over the last year and this year the Workload Management System (WMS) service was stopped at several sites as outdated. As a result, Computing Elements (CEs) at some sites supporting the MICE Virtual Organisation (VO) no longer accepted jobs, and production could only run on a shrinking number of sites. Migrating from WMS to DIRAC, its successor workload management system, was therefore needed in order to use all sites supporting the MICE VO, and had to be concluded by the end of 2017, since WMS will not operate after that. The migration was successful: DIRAC submission has been used for all grid MC productions since the last CM.
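For illustration, a minimal DIRAC job-submission sketch using the DIRAC Python API is shown below. It assumes a configured DIRAC client with a valid MICE VO proxy; the job name, wrapper script and sandbox files are placeholders, not the actual production scripts from mice-mc-batch-submission, which presumably wrap similar calls.

```python
# Minimal sketch of job submission with the DIRAC Python API; assumes a
# configured DIRAC client and a valid MICE VO proxy. Job name, wrapper
# script and sandbox files are placeholders.
from DIRAC.Core.Base import Script
Script.parseCommandLine()  # initialise the DIRAC configuration for a script

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("mice_mc_test")                                   # placeholder name
job.setExecutable("run_mc_production.sh")                     # placeholder wrapper script
job.setInputSandbox(["run_mc_production.sh", "datacard.py"])  # placeholder inputs
job.setOutputSandbox(["std.out", "std.err"])

dirac = Dirac()
result = dirac.submitJob(job)
if result["OK"]:
    print("Submitted job ID: %s" % result["Value"])
else:
    print("Submission failed: %s" % result["Message"])
```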

Productions since last CM
- 19 new productions: 427 GB in 22,621 files (out of 73 productions in total: 1.5 TB in 78,072 files).
- 5 new G4BL libraries, each with ~8 GB in ~1000 JSON files.

Conclusions
- Many new MAUS productions and some G4BL productions have been done since the last CM.
- The migration from WMS to DIRAC was successful.
- The transfer of the big tarball for each production to RAL Castor tape should be done after the current data taking. All setups are done, and there are no more problems with uploading files to Castor.
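A minimal sketch of such a transfer is given below, using the standard DIRAC command dirac-dms-add-file through a small Python wrapper. The LFN prefix and the storage-element name are placeholders; the real destination SE for RAL Castor tape is whatever is configured for the MICE VO.

```python
# Minimal sketch: upload a production tarball to a grid storage element with
# the standard DIRAC command dirac-dms-add-file (arguments: LFN, local file,
# SE). The tarball name, LFN and SE name below are placeholders.
import subprocess

def upload_tarball(local_path, lfn, storage_element):
    # Copies the local file to the SE and registers the LFN in the DIRAC
    # file catalogue; requires a valid MICE VO proxy.
    subprocess.check_call(["dirac-dms-add-file", lfn, local_path, storage_element])

if __name__ == "__main__":
    upload_tarball(
        "production_tarball.tar",                      # hypothetical tarball name
        "/mice/MC/production_tarball.tar",             # hypothetical LFN
        "RAL-TAPE-SE",                                 # hypothetical SE name
    )
```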

THANK YOU!