
30/07/2005, Symmetries and Spin, Praha 05
Slide 1: Monte Carlo simulations in a GRID environment for the COMPASS experiment
Antonio Amoroso for the COMPASS Collaboration, University of Turin and I.N.F.N., Dept. of General Physics "A. Avogadro"

Slide 2: Outline
- The GRID project: what is GRID; GRID structure; INFN Production GRID and LCG2.
- The COMPASS experiment: experimental apparatus; Λ polarization; the importance of Monte Carlo simulation.
- Software packages; software installation procedure.
- GRID results; transverse Λ polarization Monte Carlo analysis.

Slide 3: The COMPASS physics program
COmmon Muon and Proton Apparatus for Structure and Spectroscopy: ~250 physicists from 28 institutes in 11 countries.
Physics with the muon beam: gluon polarization; longitudinal and transverse spin distributions; Λ and Λ̄ polarization.
Physics with the hadron beam: study of charm baryons; study of gluonic systems; hadron structure with virtual photons; exotic hadrons.
The beam intensity is about 2·10^8 μ+ per spill (4.8 s). Data taking started in 2002 and will continue from 2006 until at least 2010.

Slide 4: COMPASS experimental apparatus
First spectrometer (LAS): geometrical acceptance θ > 30 mrad; magnet gap 172 × 229 cm²; integral field 1 Tm; analyzed momentum p < 60 GeV/c.
Second spectrometer (SAS): geometrical acceptance θ < 30 mrad; magnet gap 200 × 100 cm²; integral field 4.4 Tm; analyzed momentum p > 10 GeV/c.

Slide 5: Λ polarization
Semi-Inclusive DIS (SIDIS) reactions on a transversely polarized target give access to the transversely polarized quark distributions of the nucleon. The Λ polarization allows a measurement of Δ_T q_i(x), the transversity distribution function.
(Diagram: Λ produced at the primary vertex V1(x1,y1,z1) decays to p π− at the secondary vertex V2(x2,y2,z2).)

Slide 6: The importance of MC simulation
- To correct for the experimental acceptance shaped by the analysis cuts;
- to test the reconstruction software;
- to estimate the reconstruction efficiency;
- to study systematic effects.
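The first point can be made concrete with a small numerical sketch: the per-bin acceptance is estimated from MC as the fraction of generated events that survive reconstruction and cuts, and the measured data yield is divided by it. All numbers below are invented for illustration; they are not COMPASS values.

```python
# MC-driven acceptance correction, minimal sketch (illustrative numbers only).
mc_generated     = [10000, 10000, 10000]   # MC events generated per kinematic bin
mc_reconstructed = [6200,  4800,  3100]    # MC events surviving reconstruction + cuts
data_yield       = [1240,  960,   465]     # observed real-data counts per bin

# Acceptance = fraction of generated MC events that survive, per bin.
acceptance = [r / g for r, g in zip(mc_reconstructed, mc_generated)]

# Corrected yield = measured yield divided by the acceptance.
corrected = [n / a for n, a in zip(data_yield, acceptance)]

print(acceptance)                     # [0.62, 0.48, 0.31]
print([round(c) for c in corrected])  # [2000, 2000, 1500]
```

The MC statistics must be large enough that the statistical uncertainty on the acceptance is negligible next to that of the data, which is one motivation for producing very large MC samples.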

Slide 7: What is GRID?
The huge demand for computing power of the upcoming HEP experiments cannot be satisfied by traditional computing systems. GRID is a computing environment that combines and shares the heterogeneous resources of a large number of collaborating institutions and computing centres.
The LHC experiments (CMS, ATLAS, ALICE, LHCb) will collect 6-8 PB of data per year; COMPASS has collected more than 270 TB per year since the start of data taking in 2002.

Slide 8: GRID infrastructure
UI: User Interface; CE: Computing Element; SE: Storage Element; WN: Worker Node; RB: Resource Broker; MDS: Metadata Directory System; BDII: Berkeley Database Information Index; GIIS: GRID Information Index Server; GRIS: GRID Resources Information Server; JDL: Job Description Language.
(Diagram: from the UI, a job described in JDL is handled by the middleware (submit, list-match, get-output) through the RB, which queries the MDS/BDII information system and dispatches the job to the CE of one of the farms; the job runs on the WNs and its output, e.g. a .root file, is stored on the SE. Each farm publishes its status via its GIIS and GRIS servers.)
The experiment software is installed by the "VO-sgm" user in the /opt/exp_software directory of the CE.
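The job flow above starts from a JDL file on the UI. A minimal sketch of such a file, purely illustrative: the wrapper script name, event count, output file, and CE hostname are hypothetical, not the actual COMPASS production JDL.

```
Executable    = "run_lambda_trans.sh";
Arguments     = "20";
StdOutput     = "std.out";
StdError      = "std.err";
InputSandbox  = {"run_lambda_trans.sh"};
OutputSandbox = {"std.out", "std.err", "output.root"};
Requirements  = other.GlueCEUniqueID == "ce.example.infn.it:2119/jobmanager-pbs-long";
```

edg-job-list-match lists the CEs whose published information satisfies the Requirements expression, and edg-job-submit with the --resource option forces execution on a chosen CE.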

Slide 9: LCG2 and INFN Production GRID
The INFN Production GRID release is a superset of the LCG release (status as of 01/06/2005).
(Map: LCG2 sites include CERN (Tier 0) and RAL, Lyon, Taipei (Tier 1); INFN Production GRID sites include CNAF (Tier 1) and Napoli, Pisa, Trieste, Torino, Milano, LNL, Padova (Tier 2).)

Slide 10: The Torino LCG / INFN-GRID farm
Dual-processor hosts (P4 Xeon 2.4 GHz, 1 GB RAM) running the LCG release on Scientific Linux: 1 CE, 1 SE (2 TB), 1 UI, 35 WNs.
Installation and management done as Turin site manager.

Slide 11: Analysis chain for transverse Λ production
LEPTO: event generation → COMGEANT: detector simulation and hit production → CORAL: digitization and reconstruction → PHAST: analysis of the mDST.
One job processes 2·10^4 events and produces an mDST of ~65 MB in ~11 h (P4 2.4 GHz CPU); ~10^8 events are needed in total.

Slide 12: Simulation vs analysis times for one job (2·10^4 events), CPU P4 2.4 GHz, 1 GB RAM
Simulation: LEPTO < 4 min; COMGEANT ~300 min; CORAL ~350 min; total simulation chain ~660 min per CPU.
Analysis: PHAST < 5 min.
A complete simulation of the required ~10^8 events takes ~ h/CPU. 2·10^4 simulated and reconstructed events give an mDST of ~65 MB.
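A back-of-envelope check of the production size follows directly from these figures; it is a naive lower bound, since resubmissions and overhead add to the ~63,900 CPU hours reported later for the actual production.

```python
# Back-of-envelope production estimate from the per-job figures on this slide.
events_total    = 1e8   # required Lambda events
events_per_job  = 2e4   # events simulated per job
minutes_per_job = 660   # full LEPTO + COMGEANT + CORAL chain, P4 2.4 GHz

jobs = events_total / events_per_job          # number of jobs needed
cpu_hours = jobs * minutes_per_job / 60       # total CPU time, hours

print(int(jobs), int(cpu_hours))  # 5000 55000
```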

Slide 13: Installation procedure on GRID resources
edg-job-list-match: search for the resources where the SGM user wants to install the software.
edg-job-submit --resource: submit the job, forcing it onto a particular resource.
Installation job steps:
Step 0: check for /var/spool/pbs/server_name
Step 1: remove any old lambda_trans.tgz
Step 2: check disk space availability
Step 3: download the archive lambda_trans.tgz
Step 4: unpack the archive lambda_trans.tgz
Step 5: set the environment variables
Step 6: remove the archive lambda_trans.tgz
Finally, submit a test job (20-event simulation).
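Steps 1-6 can be sketched as follows. This is not the actual installation script: paths, file names, and the environment variable are placeholders, the PBS check of step 0 is omitted, and the "download" of step 3 is simulated by building a dummy tarball locally so the sketch is self-contained.

```python
# Illustrative sketch of the installation job's steps (placeholder paths/names).
import os
import shutil
import tarfile
import tempfile

workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, "lambda_trans.tgz")
install_dir = os.path.join(workdir, "exp_software")  # stands in for /opt/exp_software

# Step 1: remove any old archive.
if os.path.exists(archive):
    os.remove(archive)

# Step 2: check disk space availability (here: require at least 1 MB free).
assert shutil.disk_usage(workdir).free > 1_000_000, "not enough disk space"

# Step 3: "download" the archive (simulated: build a dummy tarball).
payload = os.path.join(workdir, "lambda_trans")
os.makedirs(payload)
with open(os.path.join(payload, "setup.sh"), "w") as f:
    f.write("export LAMBDA_TRANS=1\n")
with tarfile.open(archive, "w:gz") as tgz:
    tgz.add(payload, arcname="lambda_trans")

# Step 4: unpack the archive into the software directory.
os.makedirs(install_dir)
with tarfile.open(archive) as tgz:
    tgz.extractall(install_dir)

# Step 5: set the environment variables.
os.environ["LAMBDA_TRANS_DIR"] = os.path.join(install_dir, "lambda_trans")

# Step 6: remove the archive.
os.remove(archive)

print(sorted(os.listdir(os.environ["LAMBDA_TRANS_DIR"])))  # ['setup.sh']
```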

Slide 14: mDST production, 20/04/2004, transverse configuration
Job distribution: 5356 jobs submitted on GRIDIT resources; Torino 44%, LNL 41%, Cagliari 9%, Ferrara 6%.

Slide 15: Transverse Λ production using the INFN Production GRID infrastructure
CPU used: up to 126 simultaneous CPUs; CPU time: ~63,900 h (P4 2.4 GHz).
Events produced: ~10^8 Λ events generated.
Disk usage: ~320 GB on the SE grid009.to.infn.it.
RBs: CNAF, Catania, Padova.
Software releases: LEPTO, COMGEANT, CORAL (prod), PHAST 5.130.

Slide 16: Event selection
Requested: incoming and outgoing muon; a V0 vertex (Λ → p π−, BR ~64%).
Background: fake V0s; γ → e+ e−; K0 → π+ π−; Λ̄ → p̄ π+.
Cuts (optimized using the MC simulation): primary vertex z1 in one of the target cells (-100 < z1 < -40 cm or -30 < z1 < 30 cm); secondary vertex V2 > 35 cm; pT > 23 MeV/c; momentum of the proton and pion > 1 GeV/c; collinearity angle θ < 10 mrad.
(Diagram: μ → μ′ at the primary vertex V1; the V0 decays to h+ h− at the secondary vertex V2.)
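The cuts above can be sketched as a single selection function. The field names are invented for illustration and the candidate is a dummy; units are cm, GeV/c, and rad, matching the slide.

```python
# Sketch of the V0 selection cuts listed above (hypothetical field names).
def passes_selection(c):
    in_target  = (-100 < c["z1"] < -40) or (-30 < c["z1"] < 30)  # either target cell
    downstream = c["z2"] > 35                  # decay vertex downstream of the target
    pt_cut     = c["pt"] > 0.023               # transverse momentum > 23 MeV/c
    mom_cut    = c["p_proton"] > 1 and c["p_pion"] > 1
    angle_cut  = c["theta"] < 0.010            # collinearity angle < 10 mrad
    return in_target and downstream and pt_cut and mom_cut and angle_cut

# Dummy candidate that passes all cuts.
candidate = {"z1": -60, "z2": 80, "pt": 0.05,
             "p_proton": 4.2, "p_pion": 1.5, "theta": 0.004}
print(passes_selection(candidate))  # True
```

In the real analysis such cuts run inside PHAST over the mDSTs; the point of the MC production is precisely to tune these thresholds and measure what fraction of true Λ decays they keep.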

Slide 17: Λ (pπ−) invariant mass for MC events
The data set contains only events with at least one Λ in the final state.


Slide 19: Conclusions
Work done:
- Installation and management of the grid farm at the Torino site;
- development of the software for job-running control and remote installation on GRID resources;
- setup and testing of the COMPASS software on the grid farm;
- testing of the simulation and analysis codes with the longitudinally polarized data;
- production of 10^8 unpolarized SIDIS Λ events in the transversely polarized target configuration;
- analysis of the MC samples with the COMPASS code;
- comparison with real data.