
Slide 1: LCG Applications Area Status
Torre Wenaus, BNL/CERN, LCG Applications Area Manager
LHCC Meeting, January 27, 2003

Slide 2: Outline
- Applications area scope and organization
- Architecture
- Personnel and planning
  - Little on planning, since I talked to (most of you) about it in Nov
- Status of applications area projects
  - SPI
  - POOL
  - SEAL
  - PI
  - Math libraries
  - Simulation
- Relationship to other LCG activity areas
- Conclusion

Slide 3: The LHC Computing Grid Project Structure
(Organization chart: Project Overview Board; Software and Computing Committee (SC2) providing requirements, work plan and monitoring; Project Execution Board (PEB); Project Leader; RTAGs; project work packages; grid projects)

Slide 4: LCG Areas of Work
- Fabric (Computing System)
  - Physics Data Management
  - Fabric Management
  - Physics Data Storage
  - LAN Management
  - Wide-area Networking
  - Security
  - Internet Services
- Grid Technology
  - Grid middleware
  - Standard application services layer
  - Inter-project coherence/compatibility
- Physics Applications Software
  - Application Software Infrastructure – libraries, tools
  - Object persistency, data management tools
  - Common Frameworks – Simulation, Analysis, ...
  - Adaptation of Physics Applications to Grid environment
  - Grid tools, Portals
- Grid Deployment
  - Data Challenges
  - Grid Operations
  - Network Planning
  - Regional Centre Coordination
  - Security & access policy

Slide 5: Applications Area Organization
(Organization chart: Apps Area Leader responsible for overall management, coordination and architecture; Project Leaders; Work Package Leaders; Architects Forum; direct technical collaboration between experiment participants, IT, EP, ROOT and LCG personnel)

Slide 6: Focus on Experiment Need
- Project structured and managed to ensure a focus on real experiment needs
  - SC2/RTAG process to identify, define (need-driven requirements), initiate and monitor common project activities in a way guided by the experiments themselves
  - Architects Forum to involve experiment architects in day-to-day project management and execution
  - Open information flow and decision making
  - Direct participation of experiment developers in the projects
  - Tight iterative feedback loop to gather user feedback from frequent releases
  - Early deployment and evaluation of LCG software in experiment contexts
- Success defined by experiment adoption and production deployment

Slide 7: Applications Area Projects
- Software Process and Infrastructure (SPI) (operating – A. Aimar)
  - Librarian, QA, testing, developer tools, documentation, training, ...
- Persistency Framework (POOL) (operating – D. Duellmann)
  - POOL hybrid ROOT/relational data store
- Mathematical libraries (operating – F. James)
  - Math and statistics libraries; GSL etc. as NAG-C replacement
  - Group in India will work on this (workplan in development)
- Core Tools and Services (SEAL) (operating – P. Mato)
  - Foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid-enabled services
- Physics Interfaces (PI) (launched – V. Innocente)
  - Interfaces and tools by which physicists directly use the software: interactive (distributed) analysis, visualization, grid portals
- Simulation (launch planning in progress)
  - Geant4, FLUKA, simulation framework, geometry model, ...
- Generator Services (launch as part of the simulation project)
  - Generator librarian, support, tool development
(Bold: recent developments in the last 3 months)

Slide 8: Project Relationships
(Diagram: the LCG Applications Area projects – SPI, SEAL, POOL, PI, Math Libraries, ... – and their relationships to each other, to LCG projects in other areas, and to the LHC experiments)

Slide 9: Candidate RTAG Timeline from March
(Timeline chart; blue: RTAG/activity launched, light blue: imminent)

Slide 10: LCG Applications Area Timeline Highlights
(Timeline chart, by quarter Q1-Q4, with LCG and Applications milestone tracks: LCG launch week; POOL V0.1 internal release; architectural blueprint complete; POOL first production release; distributed production using grid services; First Global Grid Service (LCG-1) available; distributed end-user interactive analysis; full Persistency Framework; LCG-1 reliability and performance targets; "50% prototype" (LCG-3); LCG TDR)

Slide 11: Architecture Blueprint
- Contents:
  - Executive summary
  - Response of the RTAG to the mandate
  - Blueprint scope
  - Requirements
  - Use of ROOT
  - Blueprint architecture design precepts (high-level architectural issues, approaches)
  - Blueprint architectural elements (specific architectural elements, suggested patterns, examples)
  - Domain decomposition
  - Schedule and resources
  - Recommendations
- RTAG established in June; experiment architects, other experts
- After 14 meetings, much ...
- A 36-page final report; accepted by SC2 October 11

Slide 12: Principal Architecture Requirements
- Long lifetime: support technology evolution
- C++ today; support language evolution
- Seamless distributed operation
- Usability off-network
- Component modularity, public interfaces
- Interchangeability of implementations
- Integration into coherent framework and experiment software
- Design for the end-user's convenience more than the developer's
- Re-use existing implementations
- Software quality at least as good as any LHC experiment
- Meet performance and quality requirements of trigger/DAQ software
- Platforms: Linux/gcc, Linux/icc, Solaris, Windows

Slide 13: Component Model
- Communication via public interfaces
- APIs targeted to end-users, embedding frameworks, internal plug-ins
- Plug-ins
  - Logical module encapsulating a service that can be loaded, activated and unloaded at run time (see the sketch below)
- Granularity driven by:
  - component replacement criteria
  - dependency minimization
  - development team organization
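
A minimal sketch of the component/plug-in pattern the blueprint describes: clients depend only on an abstract public interface, and a concrete implementation is selected and created at run time through a factory registry. All names here (IMessageService, PluginRegistry, ConsoleMessageService) are invented for illustration; this is not the actual SEAL API.

    // Sketch only: hypothetical names illustrating the blueprint's component model.
    #include <functional>
    #include <iostream>
    #include <map>
    #include <memory>
    #include <string>

    // Public interface: the only thing client code is allowed to depend on.
    struct IMessageService {
        virtual ~IMessageService() = default;
        virtual void report(const std::string& msg) = 0;
    };

    // Very small plug-in registry keyed by component name.
    class PluginRegistry {
    public:
        using Factory = std::function<std::unique_ptr<IMessageService>()>;
        static PluginRegistry& instance() { static PluginRegistry r; return r; }
        void add(const std::string& name, Factory f) { factories_[name] = std::move(f); }
        std::unique_ptr<IMessageService> create(const std::string& name) {
            return factories_.at(name)();   // throws if the plug-in is unknown
        }
    private:
        std::map<std::string, Factory> factories_;
    };

    // One concrete plug-in; a different implementation could be swapped in
    // without touching the client code below.
    struct ConsoleMessageService : IMessageService {
        void report(const std::string& msg) override { std::cout << msg << '\n'; }
    };

    int main() {
        PluginRegistry::instance().add("console",
            [] { return std::make_unique<ConsoleMessageService>(); });
        auto svc = PluginRegistry::instance().create("console");
        svc->report("component used through its public interface only");
    }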

Slide 14: Software Structure
(Layered diagram: foundation libraries – STL, ROOT libs, CLHEP, Boost, ...; basic framework providing implementation-neutral services; optional libraries – ROOT, Qt, grid middleware, ...; specialized frameworks – simulation, reconstruction, visualization, other frameworks; applications on top)

Slide 15: Distributed Operation
- Architecture should enable but not require the use of distributed resources via the Grid
- Configuration and control of Grid-based operation via dedicated services
  - Making use of optional grid middleware services at the foundation level of the software structure
  - Insulating higher-level software from the middleware (see the sketch below)
  - Supporting replaceability
- Apart from these services, Grid-based operation should be largely transparent
- Services should gracefully adapt to 'unplugged' environments
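
An illustrative sketch of the insulation idea, with invented interfaces (IReplicaCatalog, GridReplicaCatalog, LocalReplicaCatalog) rather than actual LCG code: higher-level software talks to an abstract catalog service, and either a grid-backed or a purely local implementation can be plugged in, so the same application also works when 'unplugged' from the network.

    // Sketch only: hypothetical interfaces showing middleware insulation and
    // graceful fallback to a local, non-grid implementation.
    #include <iostream>
    #include <memory>
    #include <string>

    struct IReplicaCatalog {
        virtual ~IReplicaCatalog() = default;
        // Map a logical file name to a physical location.
        virtual std::string locate(const std::string& lfn) = 0;
    };

    // Would delegate to grid middleware (e.g. a replica catalog service).
    struct GridReplicaCatalog : IReplicaCatalog {
        std::string locate(const std::string& lfn) override {
            return "gsiftp://some.grid.site/" + lfn;   // placeholder lookup
        }
    };

    // Fallback for disconnected use: a local directory.
    struct LocalReplicaCatalog : IReplicaCatalog {
        std::string locate(const std::string& lfn) override {
            return "/local/data/" + lfn;
        }
    };

    std::unique_ptr<IReplicaCatalog> makeCatalog(bool networkAvailable) {
        if (networkAvailable) return std::make_unique<GridReplicaCatalog>();
        return std::make_unique<LocalReplicaCatalog>();   // graceful degradation
    }

    int main() {
        auto catalog = makeCatalog(/*networkAvailable=*/false);
        std::cout << catalog->locate("run123/events.root") << '\n';
    }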

Slide 16: Managing Objects
- Object Dictionary
  - To query a class about its internal structure (see the sketch below)
  - Essential for persistency, data browsing, etc.
  - The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI) (timescale ~1 yr?)
- Object Whiteboard
  - Uniform access to application-defined transient objects, including in the ROOT environment
- Object definition based on C++ header files
  - Used by ATLAS and CMS
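
A hypothetical sketch of what an object dictionary enables: generic code (a persistency service, a data browser) asks a class about its members without compile-time knowledge of the type. The names (MemberInfo, ClassInfo, TrackHit) and the hand-written dictionary entry are invented for illustration; in the real system the entry would be generated from the C++ header, and the API is not the LCG/ROOT dictionary interface.

    // Sketch only: invented minimal 'dictionary' illustrating class reflection.
    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    struct MemberInfo {
        std::string name;
        std::string type;
        std::size_t offset;   // byte offset inside the object
    };

    struct ClassInfo {
        std::string name;
        std::vector<MemberInfo> members;
    };

    struct TrackHit { double x, y, z; };

    // Dictionary entry that would normally be generated from the header
    // (e.g. with a gcc-xml based tool), not written by hand.
    ClassInfo trackHitDictionary() {
        return { "TrackHit",
                 { { "x", "double", offsetof(TrackHit, x) },
                   { "y", "double", offsetof(TrackHit, y) },
                   { "z", "double", offsetof(TrackHit, z) } } };
    }

    // A generic 'browser' that uses only the dictionary, never TrackHit itself.
    void describe(const ClassInfo& ci) {
        std::cout << "class " << ci.name << '\n';
        for (const auto& m : ci.members)
            std::cout << "  " << m.type << ' ' << m.name
                      << " @ offset " << m.offset << '\n';
    }

    int main() { describe(trackHitDictionary()); }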

Slide 17: Other Architectural Elements
- Python-based Component Bus
  - Plug-in integration of components providing a wide variety of functionality
  - Component interfaces to the bus derived from their C++ interfaces (see the sketch below)
- Scripting Languages
  - Python and CINT (ROOT) to both be available
  - Access to objects via the object whiteboard in these environments
- Interface to the Grid
  - Must support convenient, efficient configuration of computing elements with all needed components
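
An illustration of deriving a Python-facing interface from a C++ one, here using Boost.Python (Boost is among the foundation libraries mentioned on the software structure slide, but whether this particular binding tool is what the project uses is an assumption of this sketch, not a statement of the plan). The Message class and module name are invented.

    // Sketch only: exposing a small C++ component to Python via Boost.Python.
    #include <boost/python.hpp>
    #include <string>

    // A small C++ component with a public interface.
    class Message {
    public:
        explicit Message(const std::string& text) : text_(text) {}
        std::string text() const { return text_; }
    private:
        std::string text_;
    };

    // The Python-facing interface mirrors the C++ one.
    BOOST_PYTHON_MODULE(lcg_example) {
        using namespace boost::python;
        class_<Message>("Message", init<std::string>())
            .def("text", &Message::text);
    }

    // In Python, after building the module:
    //   import lcg_example
    //   m = lcg_example.Message("hello")
    //   print(m.text())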

Slide 18: Domain Decomposition
(Diagram; products mentioned are examples, not a comprehensive list. Grey: not in common project scope – also event processing framework, TDAQ)

Slide 19: Use of ROOT in LCG Software
- Among the LHC experiments
  - ALICE has based its applications directly on ROOT
  - The 3 others base their applications on components with implementation-independent interfaces
    - Look for software that can be encapsulated into these components
- All experiments agree that ROOT is an important element of LHC software
  - Leverage existing software effectively and do not unnecessarily reinvent wheels
- Therefore the blueprint establishes a user/provider relationship between the LCG applications area and ROOT
  - LCG AA software will make use of ROOT as an external product
  - Draws on a great ROOT strength: users are listened to very carefully!
  - So far so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL

Slide 20: Blueprint RTAG Outcomes
- SC2 decided in October:
  - Blueprint is accepted
  - RTAG recommendations accepted to:
    - Start common project on core tools and services
    - Start common project on physics interfaces

Slide 21: Applications Area Personnel Status
- 18 LCG apps hires in place and working; +1 last week, +1 in Feb
  - Manpower ramp is on target (expected to reach 20-23)
  - Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US, India, and Russia
- ~12 FTEs from IT (DB and API groups)
- ~12 FTEs from CERN EP/SFT, experiments
- CERN established a new software group as the EP home of the LCG applications area (EP/SFT)
  - Led by John Harvey
  - Taking shape well; localized in B.32
  - Soon to be augmented by IT/API staff working in the applications area; they will move to EP/SFT
  - Will improve cohesion, sense of project participation, technical management effectiveness

Slide 22: Applications Area Personnel Summary

                                                   People   FTEs
  LCG hires working, total
    Working directly for apps area projects
    ROOT                                              2      2.0
    Grid integration work with experiments            2      2.0
  Contributions from IT/DB                            3      2.1
  Contributions from IT/API                          11      9.7
  EP/SFT + experiments, total
    Working directly for apps area projects          19      9.9
    Architecture, management                          5      2.0
  Total directly working on apps area projects
  Overall applications area total

Details at

Slide 23: Current Personnel Distribution
(Chart of the current personnel distribution, in FTE-years)

Slide 25: Personnel Resources – Required and Available
(Chart: estimate of required effort in FTEs per quarter, Sep-02 through Mar-05, broken down by project – SPI, Math libraries, Physicist interface, Generator services, Simulation, SEAL, POOL. Blue = available effort; future estimate based on 20 LCG, 12 IT, 28 EP + experiment FTEs; "Now" marker at the current quarter)

Slide 26: Schedule and Resource Tracking (example)

Slide 27: MS Project Integration – POOL Milestones

Slide 28: Apps Area Planning Materials
- Planning page linked from the applications area page
- Applications area plan spreadsheet: overall project plan
  - High-level schedule, personnel resource requirements
- Applications area plan document: overall project plan
  - Incomplete draft
- Personnel spreadsheet
  - Currently active/foreseen apps area personnel, activities
- WBS, milestones, assigned personnel resources

Slide 29: L1 Milestones (1)

Slide 30: L1 Milestones (2)

Slide 31: Software Process and Infrastructure Project (Alberto Aimar, CERN IT/API)
General service for software projects:
a. Provide general services needed by each project
  - CVS repository, Web Site, Software Library
  - Mailing Lists, Bug Reports, Collaborative Facilities
b. Provide components specific to the software phases
  - Tools, Templates, Training, Examples, etc.
  - (Lifecycle diagram: Planning, Specifications, Analysis and Design, Development, Release, Debugging, Testing, ..., Deployment and Installation, ...; Software development Support)

Slide 32: SPI Services
- CVS repositories
  - One repository per project
  - Standard repository structure and #include conventions
  - Will eventually move to the IT CVS service when it is proven
- AFS delivery area, Software Library
  - /afs/cern.ch/sw/lcg
  - Installations of LCG-developed and external software
  - LCG Software Library 'toolsmith' started in December
- Build servers
  - Machines with the needed Linux, Solaris configurations
- Project portal (similar to SourceForge)
  - Very nice new system using Savannah (savannah.gnu.org)
  - Used by POOL, SEAL, SPI, CMS, ...
  - Bug tracking, project news, FAQ, mailing lists, download area, CVS access, ...

Slide 35: SPI Components
- Code documentation, browsing: Doxygen, LXR, ViewCVS
- Testing framework: CppUnit, Oval
- Memory leaks: Valgrind
- Automatic builds: NICOS (ATLAS)
- Coding and design guidelines: RuleChecker
- Standard CVS organization
- Configuration management: SCRAM
  - Acceptance of the SCRAM decision shows 'the system works'
- Project workbook
- All components and services should be in place mid-Feb
- Missing at this point:
  - Nightly build system (being integrated with POOL for testing)
  - Software library (prototype being set up now)

Slide 36: POOL Project (Dirk Duellmann, CERN IT/DB)
- Pool of persistent objects for LHC
  - Targeted at event data but not excluding other data
- Hybrid technology approach (see the sketch below)
  - Object-level data storage using a file-based object store (ROOT)
  - RDBMS for metadata: file catalogs, object collections, etc. (MySQL)
- Leverages existing ROOT I/O technology and adds value
  - Transparent cross-file and cross-technology object navigation
  - RDBMS integration
  - Integration with Grid technology (e.g. EDG/Globus replica catalog)
  - Network- and grid-decoupled working modes
- Follows and exemplifies the LCG blueprint approach
  - Components with well-defined responsibilities
  - Communicating via public component interfaces
  - Implementation technology neutral
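
A conceptual sketch of the hybrid-store idea, with invented names rather than the real POOL API: object data go to a file-based object store (ROOT's role), while a separate catalog (the RDBMS role) maps logical identifiers to physical files, so a reference can be followed across files without hard-wiring file locations.

    // Sketch only: the catalog/object-store split behind POOL-style navigation.
    #include <iostream>
    #include <map>
    #include <string>

    // Catalog role (an RDBMS such as MySQL in POOL): logical file -> physical file.
    std::map<std::string, std::string> fileCatalog;

    // Object store role (ROOT I/O in POOL): here just a stand-in.
    void writeObjectToFile(const std::string& physicalFile, const std::string& payload) {
        std::cout << "writing '" << payload << "' into " << physicalFile << '\n';
    }

    // A 'token' identifying an object by logical file + container + entry,
    // so navigation does not depend on where the file currently lives.
    struct Token {
        std::string logicalFile;
        std::string container;
        long        entry;
    };

    int main() {
        fileCatalog["LFN:events-001"] = "/data/events-001.root";

        Token t{"LFN:events-001", "Tracks", 42};
        writeObjectToFile(fileCatalog[t.logicalFile], "Track object #42");

        // Reading back: resolve the logical file through the catalog, then ask
        // the object store for the entry - the two steps the framework glues together.
        std::cout << "token resolves to " << fileCatalog[t.logicalFile]
                  << " / " << t.container << " / entry " << t.entry << '\n';
    }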

Slide 37: POOL Release Schedule
Principal apps area milestone defined at the March LCG launch: hybrid prototype by year end.
- End September – V0.1 (released Oct 2)
  - All core components for navigation exist and interoperate
  - Assumes ROOT object (TObject) on read and write
- End October – V0.2 (released Nov 15)
  - First collection implementation
- End November – V0.3 (released Dec 18)
  - First public release
  - EDG/Globus FileCatalog integrated
  - Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  - Event metadata annotation and query
- End February – V0.4
  - Persistency of more complex objects, e.g. with STL containers
  - Support object descriptions from C++ header files (gcc-xml)
- June 2003 – Production release

Slide 38: Dictionary: Reflection / Population / Conversion
(Diagram of the dictionary reflection, population and conversion chain; legend: "In progress", "New in POOL 0.3")

Slide 39: POOL Milestones

Slide 40: Core Libraries and Services (SEAL) Project (Pere Mato, CERN EP/SFT/LHCb)
- Launched in Oct
  - 6-member (~3 FTE) team initially; build up to 8 FTEs by the summer
  - Growth mainly from experiment contributions
- Scope:
  - Foundation, utility libraries
  - Basic framework services
  - Object dictionary (taken over from POOL)
  - Grid-enabled services
- Purpose:
  - Provide a coherent and complete set of core classes and services in conformance with the overall architectural vision (Blueprint RTAG)
  - Facilitate the integration of LCG and non-LCG software to build coherent applications
  - Avoid duplication of software; use community standards
- Areas of immediate relevance to POOL given priority
- Users are software developers in other projects and the experiments

Slide 41: SEAL Work Packages
- Foundation and utility libraries
  - Boost, CLHEP, experiment code, complementary in-house development
  - Participation in the CLHEP workshop this week
- Component model and plug-in manager
  - The core expression in code of the component architecture described in the blueprint; mainly in-house development
- LCG object dictionary
  - C++ class reflection, dictionary population from C++ headers, ROOT gateways, Python binding
- Basic framework services
  - Object whiteboard, message reporting, component configuration, 'event' management
- Scripting services
  - Python bindings for LCG services, ROOT
- Grid services
  - Common interface to middleware
- Education and documentation
  - Assisting experiments with integration

Slide 42: SEAL Schedule
- Jan: Initial work plan delivered to SC2 on Jan 10th and approved
  - Including contents of version V1 alpha
- March: V1 alpha
  - Essential elements with sufficient functionality for the other existing LCG projects (POOL, ...)
  - Frequent internal releases (monthly?)
- June: V1 beta
  - Complete list of the currently proposed elements implemented with sufficient functionality to be adopted by experiments
- June: Grid-enabled services defined
  - The SEAL services which must be grid-enabled are defined and their implementation prioritized

Slide 43: Estimation of Needed Resources for SEAL

  WBS  Name                                  FTE (available / required)
  1    Foundation and Utility libraries      0.5 / 1.0
  2    Component Model and Plug-in Manager   0.5 / 0.5
  3    LCG Object Dictionary                 0.5 / 1.5
  4    Basic Framework Services              0.5 / 1.0
  5    Scripting Services                    0.5 / 1.0
  6    Grid Services                             / 1.5
  7    Education and Documentation           0.5 / 1.5
       Total                                 3.0 / 8.0

Current resources should be sufficient for V1 alpha (March).

Slide 44: Math Libraries Project (Fred James, CERN EP/SFT)
- Many different libraries in use
  - General purpose (NAG-C, GSL, ...)
  - HEP-specific (CLHEP, ZOOM, ROOT)
  - Modern libraries dealing with matrices and vectors (Blitz++, Boost, ...)
- Financial considerations: can we replace NAG with open source?
  - RTAG: yes
    - Do a comparative evaluation of NAG-C and GSL
    - Collect information on what is used/needed
    - Evaluate functionality and performance
- HEP-specific libraries expected to evolve to meet future needs

Slide 45: Math Library Recommendations & Status
- Establish a support group to provide advice and information about existing libraries, and to identify and develop new functionality
  - Group established in October
- Which libraries and modules are in use by experiments?
  - Little response to experiment requests for info; the group in India is scanning experiment code to determine usage
- A detailed study should be undertaken to assess needed functionality and how to provide it, particularly via free libraries such as GSL (see the sketch below)
  - The group in India is undertaking this study
  - Initial plan of work developed with Fred James in December
  - Targets completion of a first round of GSL enhancements for April, based on a priority needs assessment
- Work plan needs to be presented to the SC2 soon
  - Scheduled tomorrow, but will be late
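
A tiny illustration of the kind of functionality at stake in the NAG-C vs GSL comparison: a special-function call from the free GNU Scientific Library. This is a generic GSL usage example chosen here for illustration, not code taken from the evaluation work described on the slide.

    // Sketch only: calling a GSL special function from C++ (link with -lgsl -lgslcblas).
    #include <cstdio>
    #include <gsl/gsl_sf_bessel.h>   // GSL special functions

    int main() {
        double x = 5.0;
        double j0 = gsl_sf_bessel_J0(x);   // Bessel function of the first kind, order 0
        std::printf("J0(%g) = %.10g\n", x, j0);
        return 0;
    }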

Slide 46: Physicist Interface (PI) Project (Vincenzo Innocente, CERN EP/SFT/CMS)
- Interfaces and tools by which physicists will directly use the software
- Planned scope:
  - Interactive environment: the physicist's desktop
  - Analysis tools
  - Visualization
  - Distributed analysis, grid portals
- Currently developing plans and trying to clarify the grid area
  - Completed a survey of the experiments on their needs and interests
  - Talking also to grid projects, other apps area projects, ROOT, ...
- Initial workplan proposal will be presented to the PEB and SC2 this week
- Will plan inception workshops for the identified work areas
  - Identify contributors, partners, users; deliverables and schedules; personnel assignments

Slide 47: Proposed Near-Term Work Items for PI
- Abstract interface to analysis services (see the sketch below)
  - GEANT4 and some experiments do not wish to depend on a specific implementation
  - One implementation must be ROOT
  - Request for a coherent LCG analysis tool-set
- Interactive analysis environment
  - Access to experiment objects (event data, algorithms, etc.)
  - Access to high-level POOL services (collections, metadata)
  - Transparent use of LCG and grid services
    - With the possibility to 'expose' them for debugging and detailed monitoring
  - GUI (point & click) and scripting interface
- Interactive event and detector visualization
  - Integrated with the analysis environment
  - Offering a large palette of 2D and 3D rendering
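
A sketch of what an implementation-neutral analysis-service interface could look like (AIDA-like in spirit): user code books and fills histograms through an abstract interface, and concrete backends can differ. The names (IHistogram1D, SimpleHistogram1D) are invented; a trivial in-memory backend is used so the sketch is self-contained, whereas a ROOT-based implementation (required per the slide) would forward the calls to the corresponding ROOT histogram.

    // Sketch only: hypothetical abstract histogram interface with one backend.
    #include <iostream>
    #include <vector>

    struct IHistogram1D {
        virtual ~IHistogram1D() = default;
        virtual void fill(double x, double weight = 1.0) = 0;
        virtual double entries() const = 0;
    };

    // Trivial backend; a ROOT-backed one would delegate to a ROOT histogram.
    class SimpleHistogram1D : public IHistogram1D {
    public:
        SimpleHistogram1D(int bins, double lo, double hi)
            : counts_(bins, 0.0), lo_(lo), hi_(hi) {}
        void fill(double x, double weight) override {
            if (x < lo_ || x >= hi_) return;                       // ignore out-of-range
            int bin = static_cast<int>((x - lo_) / (hi_ - lo_) * counts_.size());
            counts_[bin] += weight;
            nEntries_ += 1;
        }
        double entries() const override { return nEntries_; }
    private:
        std::vector<double> counts_;
        double lo_, hi_;
        double nEntries_ = 0;
    };

    int main() {
        SimpleHistogram1D h(100, 0.0, 10.0);
        IHistogram1D& hist = h;            // analysis code sees only the interface
        hist.fill(3.7);
        hist.fill(4.2, 0.5);
        std::cout << "entries: " << hist.entries() << '\n';
    }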

Slide 48: Simulation Project
- Mandated by SC2 in Dec to initiate the simulation project, following RTAG recommendations
- Project being organized now
  - Discussions with experiments, G4, FLUKA, ROOT, ... on organization and roles in progress
  - Probable that I will lead the overall project during a startup period, working with a slate of strong subproject leaders
- Scope (these are the tentative subprojects):
  - Generic simulation framework
    - Multiple simulation engine support, geometry model, generator interface, MC truth, user actions, user interfaces, average 'GEANE' tracking, utilities
    - ALICE virtual MC as starting point if it meets requirements
  - Geant4 development and integration
  - FLUKA integration
  - Physics validation, simulation test and benchmark suite
  - Fast (shower) parameterisation
  - Generator services

Slide 49: Collaborations
- LCG apps area needs to collaborate well with projects broader than the LHC
  - See that LHC requirements are met, and provide help and support targeted at LHC priorities, while being good collaborators (e.g. not assimilators)
  - e.g. CLHEP: discussions at the workshop this week
  - e.g. Geant4 in the context of the simulation project
- We also welcome collaborators on LCG projects
  - e.g. (renewed?) interest from BaBar in POOL
- We also depend on the other LCG activity areas
  - Grid Technology: ensuring that the needed middleware is/will be there, tested, selected and of production grade
    - AA distributed software will be robust and usable only if the grid middleware it uses is so
  - Grid Deployment: AA software deployment
  - Fabrics: CERN infrastructure, AA development and test facilities

Slide 50: A Note on My Own Time
- Nearing the one-year mark of doing the apps area leader job with 75% of my time, resident at CERN
  - The other 25% I am an ATLAS/BNL person in the US (~1 week/month)
- Working well, and sustainable (the LCG job, I mean)
  - From my perspective at least!
- I will be lightening my ATLAS/US load, which will be welcome
  - Expect to hand over the US ATLAS Software Manager job to a highly capable person in a few days or weeks
  - Expect to hand over the ATLAS Planning Officer job shortly to someone else
  - Remaining formal US responsibility is BNL Physics Applications Software Group Leader, for which I have a strong deputy (David Adams)

Slide 51: Concluding Remarks
- Expected area scope essentially covered by the projects now defined
- Manpower is in quite good shape
- Buy-in by the experiments is good
  - Direct participation in leadership, development, prompt testing and evaluation, RTAGs
- EP/SFT group taking shape well as a CERN hub
- Participants remote from CERN are contributing, but it isn't always easy
- POOL and SPI are delivering, and the other projects are ramping up
  - Persistency prototype released in 2002, as targeted in March
  - Important benchmark to come: delivering production-capable POOL, scheduled for June