HEC Beam Test Software


HEC Beam Test Software: schematic view

[Schematic: data flow between the beam test data sources and the transient data store (TDS)]
- HEC raw beam test data (EPIO) -> EPIO-TDS converter -> TDS
- HEC raw beam test data (EPIO) -> EPIO-ASCII converter -> ASCII data file -> ASCII-TDS converter -> TDS
- MC events (HEC-MC, GEANT3) -> MC-TDS converter -> TDS
- TDS -> user analysis, within the ATHENA framework
- hec_adc framework: raw data -> hec_adc -> monitoring ntuple -> user analysis

M. Lefebvre
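Every input path in the schematic (raw EPIO data, the ASCII intermediate file, and GEANT3 MC events) ends up as keyed objects in the transient data store, and the user analysis talks only to the TDS. The following is a minimal sketch of that pattern; the TransientDataStore class and the "HECDigits" key are invented stand-ins for illustration, not the real Athena/StoreGate classes.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Stand-in for the transient data store: converters fill it with keyed
// containers, user analysis retrieves them.  Not the real Athena API.
struct TransientDataStore {
    std::map<std::string, std::vector<double> > contents;

    bool retrieve(const std::string& key, std::vector<double>& out) const {
        std::map<std::string, std::vector<double> >::const_iterator it = contents.find(key);
        if (it == contents.end()) return false;
        out = it->second;
        return true;
    }
};

// User analysis sees only the TDS, so it does not care whether the digits
// came from the EPIO-TDS, ASCII-TDS or MC-TDS converter.
void analyseEvent(const TransientDataStore& tds) {
    std::vector<double> adcCounts;
    if (!tds.retrieve("HECDigits", adcCounts)) {   // "HECDigits" is an illustrative key
        std::cerr << "no HEC digits in the TDS for this event\n";
        return;
    }
    double sum = 0.0;
    for (std::size_t i = 0; i < adcCounts.size(); ++i) sum += adcCounts[i];
    std::cout << "summed ADC counts: " << sum << "\n";
}

int main() {
    TransientDataStore tds;
    // In the real software a converter would fill this from EPIO, ASCII or MC input.
    double fake[3] = { 101.0, 98.5, 103.2 };
    tds.contents["HECDigits"] = std::vector<double>(fake, fake + 3);
    analyseEvent(tds);
    return 0;
}
```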

HEC Beam Test OO Software

Recent changes:
- MySQL bookkeeping added; the software will be committed this week

Immediate work:
- Objectivity converter (Annecy)
- EPIO converter: milestone for the July LARG Week

Long term jobs:
- couple the code into the LARG software: detector description, calibration, reconstruction
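Until the EPIO converter is available, the ASCII data file is the working route from raw data into the TDS, and the converter-side parsing can stay very small. Below is a minimal sketch of reading one event's ADC samples from such a file; the one-line-per-channel layout (channel number followed by its samples) is an assumption made for illustration, and the real beam test ASCII format will differ in detail.

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// One HEC readout channel and its ADC samples for a single event.
struct HecDigit {
    int channel;
    std::vector<int> samples;
};

// Read one event block: one line per channel, terminated by a blank line
// or end of file.  (Assumed layout, for illustration only.)
std::vector<HecDigit> readAsciiEvent(std::istream& in) {
    std::vector<HecDigit> digits;
    std::string line;
    while (std::getline(in, line) && !line.empty()) {
        std::istringstream ls(line);
        HecDigit d;
        if (!(ls >> d.channel)) break;
        int sample;
        while (ls >> sample) d.samples.push_back(sample);
        digits.push_back(d);
    }
    return digits;
}

int main(int argc, char** argv) {
    if (argc < 2) {
        std::cerr << "usage: " << argv[0] << " <ascii data file>\n";
        return 1;
    }
    std::ifstream in(argv[1]);
    std::vector<HecDigit> digits = readAsciiEvent(in);
    std::cout << "read " << digits.size() << " channels for the first event\n";
    return 0;
}
```

An ASCII-TDS converter would then record the resulting digit container in the TDS under a known key, exactly as in the schematic above.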

LARG Database

Many activities, but little to no manpower:
- detector description DB: write an interface providing access to the cell locations stored in Objectivity or XML (a sketch of such an interface follows below)
- Objectivity DB: need to put LArHits into Objectivity (by end of June)
- production DB: (no time)
- calibration DB: the EM testbeam group is using MySQL for constants
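For the detector description item above, the interface can be as small as an abstract cell-location service with interchangeable backends, so that reconstruction code never depends on whether the geometry lives in Objectivity or in XML. A minimal sketch follows; all names (ICellLocationSvc, CellLocation, XmlCellLocationSvc) are hypothetical.

```cpp
#include <iostream>
#include <string>

// Simplified cell geometry; the real detector description carries much more.
struct CellLocation {
    double eta;
    double phi;
    double r;    // radial position, cm
};

// The abstract interface that reconstruction and analysis code would use.
class ICellLocationSvc {
public:
    virtual ~ICellLocationSvc() {}
    virtual CellLocation locate(int cellId) const = 0;
};

// One possible backend: cell positions taken from an XML detector description.
// An Objectivity-backed service would implement the same interface.
class XmlCellLocationSvc : public ICellLocationSvc {
public:
    explicit XmlCellLocationSvc(const std::string& xmlFile) : m_file(xmlFile) {}
    virtual CellLocation locate(int cellId) const {
        // Real code would look the cell up in the parsed XML; here we fabricate
        // a value so the sketch runs on its own.
        CellLocation loc;
        loc.eta = 1.5 + 0.1 * (cellId % 4);
        loc.phi = 0.1 * (cellId / 4);
        loc.r   = 475.0;
        return loc;
    }
private:
    std::string m_file;
};

int main() {
    ICellLocationSvc* svc = new XmlCellLocationSvc("hec_cells.xml");
    CellLocation loc = svc->locate(42);
    std::cout << "cell 42: eta=" << loc.eta << " phi=" << loc.phi << " r=" << loc.r << "\n";
    delete svc;
    return 0;
}
```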

Grid Research - I

- We have created a small Grid testbed between Victoria and TRIUMF using the Globus toolkit (Bob Kowalewski, Asoka De Silva, Ashok Agarwal, Jan van Uytven)
- We are currently testing how to handle I/O from a simple job that writes out random numbers in parallel (a sketch of such a job is shown below)
- We are learning how to use CONDOR on the testbed; CONDOR is a batch scheduler used in our Linux cluster, and it is "grid enabled"
- Our short term goal is to be able to run a simple particle physics generator or simulation job over the grid
- Our long term goal is to do BaBar MC production over the Grid; this requires the use of Objectivity
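The simple test job itself can be a few lines of C++: each grid job instance is given its own seed and output file, so what is really being exercised is how the output gets back, not the computation. The command-line convention below (seed, output file, optional count) is invented for illustration.

```cpp
#include <cstdlib>
#include <fstream>
#include <iostream>

int main(int argc, char** argv) {
    if (argc < 3) {
        std::cerr << "usage: " << argv[0] << " <seed> <output file> [count]\n";
        return 1;
    }
    unsigned int seed = static_cast<unsigned int>(std::atoi(argv[1]));
    std::ofstream out(argv[2]);
    long count = (argc > 3) ? std::atol(argv[3]) : 100000;

    std::srand(seed);   // a different seed for each parallel job instance
    for (long i = 0; i < count; ++i)
        out << std::rand() / (RAND_MAX + 1.0) << "\n";

    std::cout << "wrote " << count << " random numbers to " << argv[2] << "\n";
    return 0;
}
```

Each parallel instance would then be submitted with a different seed, leaving it to the Globus/CONDOR layer to schedule the jobs and stage the output files back.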

Grid Research - II

- We are collaborating with the HPC Group at NRC, Ottawa, and helping researchers in Windsor to establish their Grid; their goals are a bit different from ours: they are interested in creating a distributed parallel processing facility
- We would like to increase our effort by hiring a dedicated person and by building a more realistic testbed
- We have been funded by the C3.ca Association and (I hope) by NSERC
- We are discussing funding with C3.ca and CANARIE; Bob and I will likely submit a grant application to NSERC for Grid research
- See wwwgrid.phys.uvic.ca