Preparations for the Software "Review"
Paul Dauncey, 19 July 2007

Why A Software Review?
We now have experience of real data and the first major analysis results
–What have we learned?
–How should that change what we do next time?
Roman is pivotal to the whole software effort
–The software requires his expertise to develop
–And also to run the code, particularly the reco
–Should Roman really be running the reco jobs?
–How do we reduce the burden on him so his skills can be used for higher-level tasks?
An open and detailed discussion of the software
–Get collaboration agreement to decisions which affect everyone
–Generate some level of documentation for code developers

Base on Collaboration Goals
For software, the aims of CALICE are
–To use the physics prototypes to compare data to MC physics models and find the "best" agreement
–To use these models to develop with confidence an optimised detector, both for the global detector and for the technological prototype design
We must be able to compare data and MC with confidence
–As much code as possible must be common to both
We must be able to handle data from different conditions and places (DESY/CERN/FNAL, missing layers, etc.)
–As much as possible should be automated
We must be confident we can apply the tuned models to the global detector simulations without extra differences
–As much as possible should be common

Some examples of issues
Some random things which came to mind
–To give some idea of what we would discuss
–But there will be plenty more
You may have strong (good!) ideas on what to do with these
–Informal discussions show these views are not unique
–We don't want to solve these issues today; that is what the review is for
–I hope people will come along and give their opinions…

Reconstruction
Code from several/many developers
–Different solutions to similar issues
–Should there be more commonality?
Needs to be more automated
–Dependence on steering files is very high
–Difficult to be sure it is right (even for Roman); almost impossible for users
–Move all parameters into the database or the reco files?
Data and MC reco should be common as far as possible
–Need the same "semi-raw data" format for data and MC for every detector?
–Need MC reco for all items, e.g. the trigger?
–Should data be converted (= mapped) to physical position indices, or MC to electronics indices?
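The "move all parameters into the database" option above could look like the following minimal sketch: reco parameters live in a versioned store keyed by run, so a job reads a frozen snapshot instead of a hand-edited steering file. All names here (ParameterStore, the parameter keys, the run number) are invented for illustration and are not CALICE code.

```python
# Hypothetical sketch: parameters per run in a versioned store, so any
# reco job can be reproduced exactly without tracking steering-file edits.

class ParameterStore:
    """Holds reco parameter snapshots keyed by (run, version)."""
    def __init__(self):
        self._snapshots = {}  # (run, version) -> dict of parameters

    def commit(self, run, version, params):
        # Snapshots are frozen once committed, never edited in place.
        self._snapshots[(run, version)] = dict(params)

    def get(self, run, version):
        # Jobs always read a frozen snapshot, never a live file.
        return dict(self._snapshots[(run, version)])

store = ParameterStore()
store.commit(run=300123, version="v1",
             params={"ecal_mip_cut": 0.5, "hcal_gain": 1.02})

# A user rerunning the reco later retrieves the identical parameters:
params = store.get(run=300123, version="v1")
```

The point of the design is that the question "which cuts were used for this reco?" has a single, queryable answer, which even a non-expert user can retrieve.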

Systematics
We should be systematics limited
–Else we should have taken more data!
Every analysis will need to do systematic studies
–Many will involve varying parameters used in reconstruction, e.g. what is the effect of the uncertainty on the ECAL calibration on the analysis result?
–There could be many such effects to be studied in each analysis
We must be guaranteed to reproduce the original reco results if nothing is changed
–Needs all cuts, constants, etc. used in the reco to be available
–Many cuts are in steering files; others are in the database (but these could have been updated after the reco was done)
How should this all be organised?
–Users rerun the whole reco from raw data for each systematic?
–Rerun just the specific part of the reco for each?
–Produce standard variations centrally for each run?
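The "standard variations" option above can be sketched as rerunning only the calibration-dependent step with each shifted constant, everything else fixed. The run_reco stand-in, the hit values, and the ±3% shift are all invented for illustration; the real variation sizes would come from the calibration studies.

```python
# Hedged sketch of centrally produced standard variations: rerun the
# calibration-dependent part of the reco with each shifted constant.

def run_reco(hits, ecal_calib):
    # Stand-in for the ECAL part of the reconstruction chain.
    return sum(h * ecal_calib for h in hits)

nominal_calib = 42.0          # illustrative calibration constant
hits = [1.0, 2.5, 0.7]        # illustrative raw hit amplitudes

results = {}
for label, scale in [("nominal", 1.00), ("calib_up", 1.03), ("calib_down", 0.97)]:
    results[label] = run_reco(hits, nominal_calib * scale)

# Systematic uncertainty estimated from the spread of the variations:
syst = max(abs(results["calib_up"] - results["nominal"]),
           abs(results["calib_down"] - results["nominal"]))
```

Producing these variations centrally, once per run, would mean every analysis uses identical shifted samples rather than each user rerunning the full reco.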

Databases
We are using a database with a "conditions" data structure
–Entries are organised as values (e.g. temperature) at time t
The tools implicitly assume only one experiment
–Only one value for each item at time t; a unique database folder is set at the start of the job
–We have simultaneous values because we work at different sites (DESY/CERN/FNAL), so we must subdivide the database and use the correct folder; this is not known until the data file is read in, after the start of the job
–We also use it for MC values; again, whether a file is data or MC is not known until it is read
Currently handled by specifying the folders by hand depending on the run being used
–Should we rewrite the tools?
–Read the file and restart the job?
–Should we use a different database?
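The folder problem above amounts to a decision that can only be made after the file header is read. A minimal sketch of automating the current by-hand step, assuming a simple site-to-folder mapping (the folder paths, header fields, and function names are all hypothetical, not the actual database layout):

```python
# Illustrative sketch: choose the conditions folder from the data file
# header (site or MC flag) instead of by hand before the job starts.

CONDITIONS_FOLDERS = {
    "DESY": "/cd_calice/Desy",   # hypothetical folder names
    "CERN": "/cd_calice/Cern",
    "FNAL": "/cd_calice/Fnal",
    "MC":   "/cd_calice/MC",
}

def select_folder(file_header):
    # In the current workflow this decision is made by hand per job;
    # reading the header first and then configuring would automate it.
    key = "MC" if file_header.get("is_mc") else file_header["site"]
    return CONDITIONS_FOLDERS[key]

folder = select_folder({"site": "CERN", "is_mc": False})
```

This is essentially the "read file and restart job" option: one cheap pass to read the header, then the real job configured with the right folder.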

Databases (cont.)
We are also using the database for storing "configuration" data
–Related to run structures, i.e. more like how we actually work
–Imposing this differently structured data on a conditions database is not always easy
–E.g. what is the beam energy for run x? We need to scan times to find when run x happened and then find the beam energy; this takes minutes per run
Should this be changed? How?
–Have a second database with a "configuration" data structure?
–Do the lookup once for every run and store the result in the run header in the reco files?
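The second option above (look up once per run, store in the run header) is just caching: the minutes-long time scan happens once, and every later analysis reads the stored value. The lookup function, run numbers, and energies below are invented for illustration.

```python
# Sketch of "do the lookup once per run and store it in the run header".

def slow_beam_energy_lookup(run):
    # Stand-in for the minutes-long scan over validity times
    # in the conditions database. Values are invented.
    return {300123: 30.0, 300124: 50.0}[run]  # GeV

_run_header_cache = {}  # plays the role of the run header in reco files

def beam_energy(run):
    if run not in _run_header_cache:
        _run_header_cache[run] = slow_beam_energy_lookup(run)  # done once
    return _run_header_cache[run]

e = beam_energy(300123)        # first call per run: slow scan
e_again = beam_energy(300123)  # later analyses: read from the cache
```

The trade-off is the usual one for caching: the stored value goes stale if the underlying database entry is corrected after the reco was produced, which is the same reproducibility concern raised on the Systematics slide.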

Global detector studies
We have (uniquely?) the whole breadth of knowledge within CALICE, from the detector hardware data to the PFA implementation
–How do we optimise our contribution to the concept groups?
–LDC is clear, SiD is not (GLD I don't know…)
Use a common (Mokka) simulation?
–One implementation of the detector model, but this may have less impact in concept meetings
Use the concept groups' "native" simulations?
–Requires two independent implementations; are they guaranteed to be equivalent to each other and to the beam test results?
This may become more critical as the detector concepts move to collaborations…