TILC09, 17-21 April 2009, Tsukuba. P. Mato / CERN.



 Former LHCb core software coordinator ◦ Architect of the GAUDI framework
 Applications Area manager of the Worldwide LHC Computing Grid (WLCG) ◦ Develops and maintains the part of the physics applications software, and associated infrastructure, that is common among the LHC experiments
 Leader of the Software Development for Experiments (SFT) group in the Physics (PH) Department at CERN ◦ Projects: Geant4, ROOT, GENSER, SPI, CernVM, etc.
19/4/09 TILC09, April 2009, Tsukuba -- P. Mato/CERN

 The main theme of this short presentation is a review of what can be learned from, and reused of, the LHC software for the ILC
 Emphasis on interoperability, reuse and enabling independent developments ◦ Level 1 - Application Interfaces/Formats ◦ Level 2 - Software Integrating Elements
 Review of some of the interfaces and common packages developed in the context of the LCG

[Diagram: layered software structure - Applications built on an Experiment Framework (event model, detector description, calibration), which sits on Core Libraries, specialized domains (Simulation, Data Management, Distributed Analysis) and non-HEP-specific software packages]
Every experiment has a framework for basic services and various specialized frameworks: event model, detector description, visualization, persistency, interactivity, simulation, calibration
Many non-HEP libraries are widely used (e.g. Xerces, GSL, Boost, etc.)
Applications are built on top of the frameworks and implement the required algorithms (e.g. simulation, reconstruction, analysis, trigger)
Core libraries and services that are widely used provide basic functionality (e.g. ROOT, CLHEP, HepMC, …)
Specialized domains that are common among the experiments (e.g. Geant4, COOL, generators, etc.)

 Foundation Libraries ◦ Basic types ◦ Utility libraries ◦ System isolation libraries
 Mathematical Libraries ◦ Special functions ◦ Minimization, random numbers
 Data Organization ◦ Event data ◦ Event metadata (event collections) ◦ Detector description ◦ Detector conditions data
 Data Management Tools ◦ Object persistency ◦ Data distribution and replication
 Simulation Toolkits ◦ Event generators ◦ Detector simulation
 Statistical Analysis Tools ◦ Histograms, N-tuples ◦ Fitting
 Interactivity and User Interfaces ◦ GUI ◦ Scripting ◦ Interactive analysis
 Data Visualization and Graphics ◦ Event and geometry displays
 Distributed Applications ◦ Parallel processing ◦ Grid computing
One or more implementations of each component exist for the LHC

 C++ is used almost exclusively by all LHC experiments ◦ The experiments with an initial FORTRAN code base completed the migration to C++ a long time ago
 Large common software projects in C++ have been in production for many years ◦ ROOT, Geant4, …
 FORTRAN is still in use, mainly by the MC generators ◦ Large development efforts are going into the migration to C++ (Pythia8, Herwig++, Sherpa, …)
 Java is almost nonexistent ◦ The exception is the ATLAS event display, ATLANTIS

 Scripting has been an essential component of HEP analysis software for decades ◦ PAW macros (kumac) in the FORTRAN era ◦ The C++ interpreter (CINT) in the C++ era ◦ Python is widely used by 3 out of 4 LHC experiments
 Most statistical data analysis and final presentation is done with scripts ◦ Interactive analysis ◦ Rapid prototyping to test new ideas
 Scripts are also used to "configure" the complex C++ programs developed and used by the experiments ◦ "Simulation" and "Reconstruction" programs with hundreds or thousands of configuration options
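The "configuration by script" idea can be sketched in a few lines of Python. This is a hypothetical, much-simplified analogue of a job-options file (the property names are invented for illustration and do not come from any real experiment framework):

```python
# Hypothetical job-options-style configuration: a plain mapping of
# component properties, of the kind a framework might consume before
# instantiating its C++ components. All names here are illustrative.
defaults = {
    "EventSelector.Input": "simulated_events.root",
    "Reconstruction.MaxTracks": 500,
    "MessageSvc.OutputLevel": "INFO",
}

def configure(base, overrides):
    """Merge user overrides into the default configuration."""
    merged = dict(base)
    merged.update(overrides)
    return merged

# A user script overrides just the options it cares about.
cfg = configure(defaults, {"Reconstruction.MaxTracks": 1000})
assert cfg["Reconstruction.MaxTracks"] == 1000
assert cfg["MessageSvc.OutputLevel"] == "INFO"
```

In a real framework the merged options would be pushed into the C++ components at initialization time; the point is that the many hundreds of options live in scripts, not in compiled code.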

 The Python language is interesting for two main reasons: ◦ High-level programming language  Simple, elegant, easy to learn  Ideal for rapid prototyping  Used for scientific programming ◦ Framework to "glue" different functionalities  Any two pieces of software can be glued at runtime if they offer a Python interface  With PyROOT, any C++ class can easily be used from Python
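The "software bus" role can be illustrated without ROOT at all: two independently developed pieces, a toy event generator and an analyzer, are glued together at runtime purely through their Python interfaces. The class and key names below are invented for the sketch, not a real HEP API:

```python
# Toy "software bus": two independently developed components are
# glued at runtime because both expose a Python interface.
import random

random.seed(42)  # make the sketch deterministic

class ToyGenerator:
    """Stands in for an MC generator exposed to Python."""
    def events(self, n):
        for _ in range(n):
            yield {"energy": random.gauss(100.0, 10.0)}

class MeanAnalyzer:
    """Stands in for an analysis tool exposed to Python."""
    def __init__(self):
        self.total, self.count = 0.0, 0
    def fill(self, event):
        self.total += event["energy"]
        self.count += 1
    def mean(self):
        return self.total / self.count

gen, ana = ToyGenerator(), MeanAnalyzer()
for event in gen.events(1000):
    ana.fill(event)
# The sample mean comes out close to the generated 100.0
assert 95.0 < ana.mean() < 105.0
```

With PyROOT the generator side could equally be a C++ class loaded from a shared library; the gluing code would look the same.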

 The experiments have developed software frameworks ◦ General architecture of any event-processing application (simulation, trigger, reconstruction, analysis, etc.) ◦ To achieve coherency and to facilitate software re-use ◦ Hide technical details from the end-user physicists ◦ Help the physicists focus on their physics algorithms
 Applications are developed by customizing the framework ◦ By the "composition" of elemental algorithms to form complete applications ◦ Using third-party components wherever possible and configuring them
 ALICE: AliROOT; ATLAS+LHCb: Athena/Gaudi; CMS: CMSSW
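The "composition of algorithms" pattern can be sketched in a few lines. This is a deliberately minimal, hypothetical model of a Gaudi-like event loop (class names are illustrative, not the real Gaudi API):

```python
# Minimal sketch of framework-style composition: an application is a
# configured sequence of Algorithms that the framework runs per event.
class Algorithm:
    """Base interface every algorithm implements."""
    def execute(self, event):
        raise NotImplementedError

class ToyTracker(Algorithm):
    """Illustrative algorithm: derives tracks from raw hits."""
    def execute(self, event):
        event["tracks"] = len(event["hits"]) // 2

class EventCounter(Algorithm):
    """Illustrative algorithm: counts processed events."""
    def __init__(self):
        self.n = 0
    def execute(self, event):
        self.n += 1

class AppMgr:
    """Runs the configured algorithm sequence over all events."""
    def __init__(self, algorithms):
        self.algorithms = algorithms
    def run(self, events):
        for event in events:
            for alg in self.algorithms:
                alg.execute(event)

counter = EventCounter()
app = AppMgr([ToyTracker(), counter])
events = [{"hits": [1, 2, 3, 4]}, {"hits": [5, 6]}]
app.run(events)
assert counter.n == 2
assert events[0]["tracks"] == 2
```

Swapping one algorithm for another, or re-using a reconstruction algorithm inside the trigger, is then a configuration change rather than a code change.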

 GAUDI is a mature software framework for event data processing used by several HEP experiments ◦ ATLAS, LHCb, HARP, GLAST, Daya Bay, Minerva, BES III, …
 The same framework is used for all applications ◦ All applications behave the same way (configuration, logging, control, etc.) ◦ Re-use of 'Services' (e.g. detector description) ◦ Re-use of 'Algorithms' (e.g. reconstruction -> HLT)
 Equivalent to MARLIN

 Agreement on 'standard formats' for input and output data is essential for the independent development of different detector concepts ◦ Detector description, configuration data, MC generator data, raw data, reconstructed data, statistical data, etc.
 This has been the direction for the ILC software so far (e.g. lcdd, lcio, stdhep, gear, …)
 Unfortunately, the LHC has been standardizing on other formats/interfaces (e.g. HepMC, GDML, ROOT files, etc.)

 To facilitate the integration of independently developed components to build a coherent [single] program
 Dictionaries ◦ Dictionaries provide metadata (reflection) to allow introspection of, and interaction with, objects in a generic manner ◦ The LHC strategy has been a single reflection system (Reflex)
 Scripting languages ◦ Interpreted languages are ideal for rapid prototyping ◦ They allow integration of independently developed software modules (software bus) ◦ Standardized on the CINT and Python scripting languages
 Component model and plug-in management ◦ Modeling the application as a set of components with well-defined interfaces ◦ Loading the required functionality at runtime

[Diagram: dictionary generation - a C++ header (X.h) is processed by rootcint or genreflex into a dictionary library (XDict.so) containing ROOT/Reflex metadata; the dictionaries serve object I/O, scripting (CINT, Python via PyROOT), plug-in management, etc.]

 A highly optimized (speed & size), platform-independent I/O system, developed over more than 10 years ◦ Able to write/read any C++ object (event-model independent) ◦ Almost no restrictions (a default constructor is needed)
 Makes use of 'dictionaries'
 Self-describing files ◦ Support for automatic and complex 'schema evolution' ◦ Usable without 'user libraries'
 All the LHC experiments will rely on ROOT I/O for many years to come
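The "self-describing" property is the key point: the class description travels with the data, so a file stays readable without the original user libraries. ROOT does this with C++ dictionaries and streamers; a very loose Python analogy (not ROOT's actual mechanism) using introspection and JSON:

```python
# Loose analogy for self-describing persistency: the member names are
# stored next to the values and the class name, so a reader needs no
# access to the original class definition. Names are illustrative.
import json

class Hit:
    """Illustrative event-model class."""
    def __init__(self, x, t):
        self.x = x
        self.t = t

def to_self_describing(obj):
    # vars(obj) introspects the instance, playing the role of a
    # dictionary-generated streamer in ROOT.
    return json.dumps({"class": type(obj).__name__,
                       "members": vars(obj)})

record = to_self_describing(Hit(1.5, 0.02))

# A "reader" with no Hit class available can still decode everything.
data = json.loads(record)
assert data["class"] == "Hit"
assert data["members"]["x"] == 1.5
```

Schema evolution then amounts to mapping stored member names onto the current class layout, which the stored description makes possible.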

 FILES - based on ROOT I/O ◦ Complex data structures: event data, analysis data ◦ Management of object relationships: file catalogues ◦ Interface to Grid file catalogs and Grid file access
 Relational databases - Oracle, MySQL, SQLite ◦ Suitable for conditions, calibration, alignment and detector-description data ◦ Complex use cases are difficult to satisfy with a single solution ◦ Isolating applications from the database implementations with a standardized relational database interface (CORAL)  facilitates the life of the developers  no change in the application to run in different environments  encodes "good practices" once and for all
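CORAL's isolation idea has a familiar analogue in Python's DB-API: application code is written against a generic connection interface and the concrete backend is injected, so switching from SQLite to MySQL or Oracle needs a different driver, not different application code. A sketch using the standard-library sqlite3 backend (table and column names are illustrative):

```python
# Backend-agnostic application code: it receives a connection factory
# and never names a concrete database technology.
import sqlite3

def store_conditions(connect, rows):
    """Write (name, value) condition pairs via any DB-API connection."""
    con = connect()
    con.execute("CREATE TABLE IF NOT EXISTS conditions (name TEXT, value REAL)")
    con.executemany("INSERT INTO conditions VALUES (?, ?)", rows)
    return con

# Here the injected backend happens to be in-memory SQLite.
con = store_conditions(lambda: sqlite3.connect(":memory:"),
                       [("temperature", 291.3), ("hv", 1500.0)])
value, = con.execute(
    "SELECT value FROM conditions WHERE name = 'hv'").fetchone()
assert value == 1500.0
```

(The analogy is loose: unlike CORAL, the DB-API still exposes backend-specific SQL dialects and placeholder styles, which CORAL also hides.)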

[Diagram: simulation data flow - MC generators (e.g. Pythia) produce HepMC events, stored in an MCDB; the geometry is described in GDML (xml, editable with a text editor), high-level descriptions (xml) and ROOT TGeom, and consumed by Geant4 and Fluka (via Flugg); MC truth is stored in ROOT files]

 Geometry Description Markup Language (XML)
 Low level (materials, shapes, volumes and placements) ◦ Quite verbose to edit directly
 Directly understood by Geant4 and ROOT ◦ Long-term commitment
 Has been extended by lcdd with sensitive volumes, regions, visualization attributes, etc.
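To give a flavour of the low-level style, a minimal GDML fragment (a sketch only: a complete file also carries the XML schema reference and explicit material definitions, omitted here):

```xml
<gdml>
  <solids>
    <!-- shapes: a 10 m box used as the world volume -->
    <box name="worldBox" x="10" y="10" z="10" lunit="m"/>
  </solids>
  <structure>
    <!-- volumes: pair a material with a solid -->
    <volume name="World">
      <materialref ref="G4_AIR"/>
      <solidref ref="worldBox"/>
    </volume>
  </structure>
  <setup name="Default" version="1.0">
    <world ref="World"/>
  </setup>
</gdml>
```

Even this trivial geometry needs three nested sections, which is why full detectors described directly in GDML become verbose and are usually generated from a higher-level description.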

 An event record written in C++ for HEP MC generators ◦ Many extensions over HEPEVT (the FORTRAN HEP standard common block)
 An agreed interface between MC authors and clients (LHC experiments, Geant4, …)
 I/O support ◦ Reads and writes ASCII files - handy but not very efficient ◦ HepMC events can also be stored with ROOT I/O

 Unfortunately, very little has been done in common between the LHC experiments ◦ Different event models ◦ Different data-processing frameworks
 The equivalent of the LCIO event data model does not exist
 Within a given experiment, alternative sets of algorithms have been developed and could be evaluated thanks to the common event model and framework

 ROOT is the de facto standard and repository for all this common functionality ◦ Histogramming, random numbers, linear algebra, numerical algorithms, fitting, multivariate analysis, statistical classes, etc.
 The aim is to continue to provide and support a coherent set of mathematical and statistical functions needed for analysis, reconstruction and simulation

 Re-using existing software packages saves development effort but complicates "software configuration" ◦ Need to hide this complexity
 A configuration is a combination of packages and versions that are coherent and compatible
 E.g. the LHC experiments build their application software on a given "LCG/AA configuration" ◦ Interfaced to the experiments' configuration systems (SCRAM, CMT)

 CernVM is a virtual appliance being developed to provide a portable and complete software environment for all LHC experiments ◦ Decouples the experiment software from the system software and hardware
 It comes with a read-only file system optimized for software distribution ◦ Operational in offline mode for as long as you stay within the cache ◦ Reduces the effort to install, maintain and keep the experiment software up to date

 LHC and ILC have been standardizing on different interfaces and packages ◦ In some cases the ILC is relying on packages considered 'obsolete' by the LHC community (e.g. CLHEP, AIDA, stdhep, …)
 Common interfaces/formats are good, but adopting a common framework is even better ◦ It would enable one more level of re-use
 The ILC could leverage the existing structures and support for the common LHC software