The GlueX Collaboration Meeting October 4-6, 2012 Jefferson Lab Curtis Meyer.

The Review

Was held on June 7-8, 2012 at Jefferson Lab. Reviewed all four halls as well as online and the computer center. Mark Ito, David Lawrence, and Curtis Meyer made presentations. Overall the review was very positive, but a LOT of work!

10/4/12 GlueX Collaboration Meeting

Presentations

Morning session:
– Overview of the Experiment – Curtis
– Overview of the Offline System – Mark

Afternoon parallel with Hall B:
– Details of the Offline System – David

“…special thanks to Hall D for early and comprehensive documentation.”

General Recommendations

Presentations in future reviews should cover end-user utilization of and experience with the software in more detail. Talks from end users on their experience with the software and analysis infrastructure would be beneficial.

Hall D Observations

– The framework is written in C++ and is decomposed into a series of components. The event-processing framework does event-level parallelism, which is the appropriate scope for their problem domain.
– They do a nightly build of the software, which signals experts in case of problems. In addition, they run twice-weekly regression tests using known MC samples in order to find performance problems in new code.
– Calibration and alignment software are in a rather advanced state. Nonetheless, completing the development of the calibration software is estimated to be the largest remaining offline software effort in terms of FTE-years.
– They have implemented a run-based calibration system, CCDB, which can use ASCII or MySQL back ends. The system keeps a full history and makes it easy to create a new calibration “era” by cloning an existing era and modifying only the calibrations of interest. It also includes an easy-to-use Python shell interface.
– HDDS as the single source of detector geometry description for MC and reconstruction looks very promising. This guarantees that reconstruction and simulation stay synchronized; in addition, it should aid the migration from Geant3 to Geant4.
– Innovative use is made of GPUs for partial wave analysis.
– They are evaluating the event-visualization library built by CLAS12.
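The era-cloning model noted above is the interesting design choice in CCDB: a new era starts as a cheap copy and overrides only the tables you touch. The sketch below is not the real CCDB API (the class and method names are invented for illustration); it shows why cloning is cheap, since a clone falls back to its parent era for every table it has not overridden.

```python
class CalibEra:
    """Hypothetical sketch of a copy-on-write calibration era (not the real CCDB API)."""

    def __init__(self, parent=None):
        self.parent = parent   # era this one was cloned from, if any
        self.tables = {}       # only the tables overridden in this era

    def clone(self):
        # A clone starts empty and defers to its parent for everything.
        return CalibEra(parent=self)

    def put(self, table, constants):
        self.tables[table] = constants

    def get(self, table):
        # Walk up the chain of parent eras until the table is found.
        era = self
        while era is not None:
            if table in era.tables:
                return era.tables[table]
            era = era.parent
        raise KeyError(table)


# Usage: clone the default era and modify only one table.
default = CalibEra()
default.put("BCAL/gains", [1.00, 1.02, 0.98])
default.put("FCAL/pedestals", [100, 101, 99])

study = default.clone()
study.put("BCAL/gains", [1.05, 1.05, 1.05])  # override one table

# The clone sees its own BCAL gains but inherits the FCAL pedestals,
# and the default era is left untouched.
```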

Findings

– Workflow tools for handling bulk processing on the batch farm are only at the conceptual stage.
– The detector simulation is based on Geant3.
– The collaboration has explored using grid resources; this is worthwhile so long as the required manpower stays low, to improve flexibility and capability in analysis.
– The JANA framework is very stable; they report the code has not changed in a year.
– The JANA framework is not specific to GlueX and could be adopted by others, though it has not been thus far.
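JANA's event-level parallelism (noted in the observations above) means each event is reconstructed independently, so a pool of workers can process events concurrently with no shared state. JANA itself is C++; the Python sketch below is only an illustration of the pattern, with a stand-in `reconstruct` function, not JANA code.

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct(event):
    """Stand-in for per-event reconstruction: each event is independent."""
    hits = event["hits"]
    return {"event_id": event["id"], "n_tracks": len(hits) // 2}

# A toy event stream; real events would come from the DAQ or simulation.
events = [{"id": i, "hits": list(range(i % 7))} for i in range(100)]

# Events are dispatched to workers independently; map() returns the
# results in input order even though processing is concurrent.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(reconstruct, events))
```

Because no state is shared between events, the same code scales from one worker to many simply by changing `max_workers`.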

Recommendations

– A series of scale tests, ramping up on JLab’s LQCD farm, should be planned and conducted.
– The data volume and processing scale of GlueX is substantial, but plans for data-management and workload-management systems supporting the operational scale were not made clear. They should be carefully developed.
– Consider ROOT (with its schema-evolution capabilities) as a possible alternative to the HDDM DST format.
– To ensure a smooth transition from development and deployment to operations, …, an explicitly planned program of data challenges, directed both at exercising the performance of the full analysis chain and at exercising the scaling behavior and effectiveness of the computing model at scales progressively closer to operating scale, is recommended. We heard more explicit plans from Hall D…
– This data-challenge program should be underway now, and should not await the full completion of the offline software.

Response of GlueX

We have set up our “data challenge” working group to move us toward progressively larger data challenges.
– Jake’s work with 3pi in 2011: ~10 hours of beam time.
– The next data challenge should try to increase this by about a factor of 10.
– Ultimately reach ~1 month of simulated data.

The first issue we encountered was software performance.
– A tremendous push within the collaboration to understand the problems with tracking; we are now in much better shape than we were.
– Substantial work has gone into photon reconstruction in both the BCAL and the FCAL.
– A major push on more global handling of particle identification has been implemented.
– Our software is in substantially better shape now than it was one year ago!
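The ramp implied above, from ~10 hours of simulated beam time toward ~1 month, can be made concrete with a little arithmetic. The durations are taken from the slide; the factor-of-10 step size is the one stated there.

```python
import math

start_hours = 10        # the 2011 3pi challenge: ~10 hours of beam time
target_hours = 30 * 24  # ~1 month of simulated data = 720 hours

# Number of factor-of-10 steps needed to reach (or pass) the target.
steps = math.ceil(math.log10(target_hours / start_hours))

ramp = [start_hours * 10**k for k in range(steps + 1)]
# Two factor-of-10 challenges (10 h -> 100 h -> 1000 h) already
# overshoot the one-month goal of 720 hours.
```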

Response of GlueX

The limiting issue we encountered was data storage.
– Richard has developed the REST format for DSTs, which is smaller than what we had estimated.
– Paul Mattione has developed an analysis package and has been stress-testing the REST format.
– We are “close” to having the real online format for the experiment.
– We are now trying to refine the size estimates of our raw data.

Can we process data on the JLab farms?
– What happens when we submit ~1000 jobs to the farm?
– Can we track the results?
– Can we recover from errors?
– How does the software perform?
– What tools do we need to manage this?
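The questions above — tracking ~1000 farm jobs and recovering from errors — are exactly what even a minimal workflow tool must answer. The toy sketch below is not any real JLab farm tool (all names are invented); it just shows the bookkeeping: count attempts per job, separate finished jobs from ones that exhausted their retries, and list what needs resubmission.

```python
class JobTracker:
    """Toy bookkeeping for a batch of farm jobs (illustrative only)."""

    def __init__(self, job_ids, max_retries=2):
        self.max_retries = max_retries
        self.attempts = {j: 0 for j in job_ids}
        self.done, self.failed = set(), set()

    def record(self, job_id, success):
        self.attempts[job_id] += 1
        if success:
            self.done.add(job_id)
        elif self.attempts[job_id] > self.max_retries:
            self.failed.add(job_id)  # exhausted retries: flag for an expert

    def to_resubmit(self):
        # Jobs that were attempted but neither finished nor gave up.
        return [j for j in self.attempts
                if j not in self.done and j not in self.failed
                and self.attempts[j] > 0]


tracker = JobTracker(range(1000))
for j in range(1000):
    tracker.record(j, success=(j % 50 != 0))  # simulate a 2% failure rate
# 980 jobs succeed on the first pass; the 20 failures are queued for retry.
```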

Response of GlueX

Generate a very large set of PYTHIA background events on the grid and make them available in REST format.
– Many of our analysis efforts are now limited by the amount of PYTHIA background we have.
– We in principle have access to a very large amount of core time on the grid; again, we are limited by data storage.
– Once we are happy with REST, we are ready to roll on this.

Generate a very large set of PYTHIA background events at Jefferson Lab.
– This effort is aimed more at understanding the issues of processing raw data at Jefferson Lab.
– We are currently building up to run this and have requested tape and disk resources.
– We are waiting for the raw data.
– We still need to define what will be kept in this challenge.

Calibrations

We identified calibrations as the largest hole in our offline effort. We also identified that the natural time for this work to move forward is after detector construction is complete. We are moving into that phase now and need to start moving toward calibrations for GlueX.

Summary

The software review was a very useful exercise that made us look carefully at what we had, what we needed, and how we could get to where we needed to be. It is fair to say that we are in good shape in this regard, but that does not by any means mean we are ready for data. We have responded appropriately to the review, and I think we have made substantial progress since June. I hope that we will not be going through another review like this anytime soon!