16th Sept '02 – Nick Brook, University of Bristol – slide 1

News from the EB & LCG
Nick Brook, University of Bristol

• EB News
• LCG News
  – Structures
  – Review of RTAGs

16th Sept '02 – Nick Brook, University of Bristol – slide 2

EB News
• Roger Barlow re-elected as deputy chair of the EB; he will take over as chair in September '03
• Reporting structures in place
  – Measures manpower effort and deadlines, but also requests the experiments' requirements of UK (Tier1/A) resources
  – First reports appear for Q2 '02 – available on the EB web page
  – Next submissions are due now

16th Sept '02 – Nick Brook, University of Bristol – slide 3

EB News
• 6 application submissions to the Sheffield All-Hands conference
  – Applications are beginning to deliver their projects
  – New format for the GridPP experiment sessions
• All but one application position is now filled (the last remaining vacant post will be filled at the beginning of October)
• Successful joint ATLAS–LHCb workshop at Coseners

16th Sept '02 – Nick Brook, University of Bristol – slide 4

Fundamental Goal of the LCG
To help the experiments' computing projects get the best, most reliable and accurate physics results from the data coming from the detectors
• Phase 1 – prepare and deploy the environment for LHC computing
• Phase 2 – acquire, build and operate the LHC computing service

16th Sept '02 – Nick Brook, University of Bristol – slide 5

Phase 1 – High-level Goals
To prepare and deploy the environment for LHC computing:
• development/support for applications – libraries, tools, frameworks, data management (inc. persistency), … common components
• develop/acquire the software for managing a distributed computing system on the scale required for LHC – the local computing fabric, integration of the fabrics into a global grid
• put in place a pilot service:
  – "proof of concept" – the technology and the distributed analysis environment
  – platform for learning how to manage and use the system
  – provide a solid service for physics and computing data challenges
• produce a TDR describing the distributed LHC computing system for the first years of LHC running
• maintain opportunities for re-use of developments outside the LHC programme

16th Sept '02 – Nick Brook, University of Bristol – slide 6

The LHC Computing Grid Project Organisation
[Organisation chart] Bodies shown, with the labels on their connections:
• Project Overview Board
• LHCC – reports, reviews
• Common Computing RRB (funding agencies) – resources
• Software and Computing Committee (SC2) – requirements, monitoring
• Project Execution Board

16th Sept '02 – Nick Brook, University of Bristol – slide 7

SC2 & PEB Roles
• SC2 includes the four experiments and the Tier 1 Regional Centres
• SC2 identifies common solutions and sets requirements for the project
  – may use an RTAG – Requirements and Technical Assessment Group
  – limited scope, two-month lifetime with an intermediate report
  – one member per experiment + experts
• PEB manages the implementation
  – organising projects, work packages
  – coordinating between the Regional Centres
  – collaborating with Grid projects
  – organising grid services
• SC2 approves the work plan and monitors progress

16th Sept '02 – Nick Brook, University of Bristol – slide 8

SC2 Monitors Progress of the Project
Receives regular status reports from the PEB
• Written status report every 6 months
  – milestones, performance, resources
  – estimates of time and cost to complete
• Organises a peer review
  – about once a year
  – presentations by the different components of the project
  – review of documents
  – review of planning data

16th Sept '02 – Nick Brook, University of Bristol – slide 9

Project Execution Organisation
Four areas – each with an area project manager:
• Applications
• Grid Technology
• Fabrics
• Grid Deployment

16th Sept '02 – Nick Brook, University of Bristol – slide 10

RTAG status
In the application software area:
• data persistency – completed, 5th April '02
• software support process – completed, 6th May '02
• mathematical libraries – completed, 2nd May '02
• detector geometry description – running
• Monte Carlo generators – running
• applications architectural blueprint – running
• detector simulation – running
In the fabric area:
• mass storage requirements – completed, 3rd May '02
In the Grid technology and deployment area:
• Grid technology use cases – completed, 7th June '02
• Regional Centre categorisation – completed, 7th June '02
Current status of RTAGs (and available reports) on

16th Sept '02 – Nick Brook, University of Bristol – slide 11

Data Persistency (RTAG1) – Technology
• The streaming layer should be implemented using the ROOT framework's I/O services
• Components with relational implementations should make no deep assumptions about the underlying technology
  – nothing intentionally proposed that precludes implementation using open source products such as MySQL
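[Editor's aside] As an illustration of what "using the ROOT framework's I/O services" for the streaming layer means in practice, here is a minimal C++ sketch that streams an object into a ROOT file and reads it back by name. The file and histogram names are invented for this example; they are not taken from the RTAG1 report.

// Minimal sketch of ROOT I/O streaming (illustrative names only).
#include "TFile.h"
#include "TH1F.h"

int main() {
  // Write: create a file, attach a histogram to it and stream it out.
  TFile out("example.root", "RECREATE");
  TH1F* h = new TH1F("h_pt", "Transverse momentum;p_{T} [GeV];entries", 100, 0., 100.);
  h->Fill(42.0);
  out.Write();   // streams objects attached to the file using their dictionaries
  out.Close();   // the file owns and deletes the attached histogram

  // Read: reopen the file and retrieve the object by name.
  TFile in("example.root", "READ");
  TH1F* hIn = nullptr;
  in.GetObject("h_pt", hIn);
  if (hIn) hIn->Print();
  in.Close();
  return 0;
}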

16th Sept '02 – Nick Brook, University of Bristol – slide 12

Data Persistency (RTAG1) Implementation – POOL
Five work package areas:
• Storage Manager & refs
• File catalog & Grid integration
• Collections & Metadata
• Dictionary & Conversion
• Infrastructure, Integration & Testing
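[Editor's aside] The "refs" in the Storage Manager work package are persistency-aware references that pull objects in from storage on demand. The sketch below is a hypothetical, heavily simplified illustration of that idea; the Token and PersistentRef names and the load() helper are invented for this example and are not the POOL interfaces.

// Hypothetical sketch of a persistency-aware reference ("ref").
// Names are invented; this only illustrates lazy, token-based loading.
#include <memory>
#include <string>
#include <utility>

struct Token {
  std::string fileId;     // logical file the object lives in
  std::string container;  // container inside the file
  long        entry = -1; // position within the container
};

template <class T>
class PersistentRef {
public:
  explicit PersistentRef(Token t) : token_(std::move(t)) {}

  // First dereference triggers a read; afterwards the cached object is used.
  T& operator*() {
    if (!object_) object_ = load();
    return *object_;
  }

private:
  std::unique_ptr<T> load() {
    // A real storage manager would consult the file catalog, open the file
    // named by token_ and deserialise the object; here we default-construct
    // a placeholder to keep the sketch self-contained.
    return std::make_unique<T>();
  }

  Token token_;
  std::unique_ptr<T> object_;
};

// Example use: the ref behaves like a pointer to a lazily loaded object.
struct Event { int run = 0; };
int main() {
  PersistentRef<Event> ref(Token{"LFN:example", "Events", 7});
  return (*ref).run;
}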

16th Sept '02 – Nick Brook, University of Bristol – slide 13

Grid Use Cases (RTAG4)
79-page report & 43 use cases
Global summary of the EDG response:
• Use case is already implemented (release 1.2) – 19
  – mostly basic job submission and basic data management
  – for half of these, WP8 agrees that the functionality is implemented in 1.2, but the implementation is quite a bit more complex than that outlined in the use case (esp. data management); the release 2.0 implementations look simpler
• Planned for release 2 – 10
• Will be considered for release 3 – 4
• Use case not detailed enough – 4
  – VO-wide resource allocation to users – HEPCAL did not make strong requirements on security
  – "Job Splitting" and "Production Job" – purposely vague in HEPCAL due to the lack of a clear vision of how massive productions will be run on the Grid: one job auto-split into thousands, or thousands of jobs somehow logically grouped into one production?
• Not planned for any release – 7
  – software publishing
  – Virtual Datasets (reliant on GriPhyN)

16th Sept '02 – Nick Brook, University of Bristol – slide 14

Regional Centres (RTAG6)
• A service-oriented view should be adopted for the categorisation of regional centres
• It could be profitable to revisit the overall computing model in terms of services around 2004
• The important aspects for categorising RCs are:
  – commitment to guarantee data management at a high QoS for the lifetime of the LHC
  – commitment to guarantee state-of-the-art network bandwidth to ensure efficient inter-operation
  – commitment to contribute to collaborative services

16th Sept '02 – Nick Brook, University of Bristol – slide 15

LCG Blueprint (RTAG8 – ongoing)
Precepts
• Software structure: STL/utilities, "core" infrastructure, "specialised" infrastructure
• Component model: APIs (embedding frameworks, "own" plug-ins, end users), physical/logical module granularity, role of abstract interfaces, …
• Service model: uniform, flexible access to basic framework functionality
• Object models: dumb vs. smart, enforced policies with run-time checking, clear and bullet-proof ownership model
• Distributed operation
• Global objects
• Dependencies: minimisation between components, run-time rather than compile-time
• Interface to external components: generic adapters – version & variant identification
• Exception handling
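[Editor's aside] A minimal sketch of the component-model and dependency precepts above: clients compile only against an abstract interface, and a concrete plug-in is selected through a registry at run time rather than at compile time. The IMessageService interface and factory registry are invented for this illustration and are not taken from the blueprint.

// Illustrative sketch: abstract interface + run-time plug-in registry.
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>

// Abstract interface: the only thing client code depends on.
class IMessageService {
public:
  virtual ~IMessageService() = default;
  virtual void report(const std::string& msg) = 0;
};

// Registry of factories keyed by component name, filled at run time.
using Factory = std::function<std::unique_ptr<IMessageService>()>;
std::map<std::string, Factory>& registry() {
  static std::map<std::string, Factory> r;
  return r;
}

// A concrete "plug-in" implementation that registers itself.
class ConsoleMessageService : public IMessageService {
public:
  void report(const std::string& msg) override { std::cout << msg << '\n'; }
};
const bool registered = [] {
  registry()["console"] = [] { return std::make_unique<ConsoleMessageService>(); };
  return true;
}();

int main() {
  // The client selects an implementation by name at run time,
  // never naming the concrete class in its own code.
  auto svc = registry().at("console")();
  svc->report("component resolved through its abstract interface");
  return 0;
}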

16th Sept '02 – Nick Brook, University of Bristol – slide 16

LCG Blueprint (RTAG8 – ongoing)
• Scripting, interpreter (ROOT CINT, Python)
• GUI toolkits (to build experiment-specific interfaces)
• Graphics (underlying general tools)
• Analysis tools (histogramming, fitting, graphical representation, …)
• Math libraries and statistics (already established)
• Job management
• Core services (platform-independent interface to system resources on LCG platforms – Linux (gnu & intel compilers), Solaris & Windows)
• Foundation and utility libraries (essentially maths libs & core services)
• Grid middleware interfaces (already an agreed two-experiment "common" project – GANGA)
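[Editor's aside] For the scripting/interpreter item above, a minimal sketch of embedding a Python interpreter in a C++ application using the standard CPython embedding API; this only illustrates the kind of functionality the blueprint domain covers and is not an LCG component.

// Minimal sketch of embedding Python from C++ (CPython embedding API).
#include <Python.h>

int main() {
  Py_Initialize();                                   // start the interpreter
  PyRun_SimpleString("print('hello from embedded Python')");
  Py_Finalize();                                     // shut it down cleanly
  return 0;
}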

16th Sept '02 – Nick Brook, University of Bristol – slide 17

LCG Blueprint (RTAG8 – ongoing)
• Object dictionary and object model (in the context of POOL)
• Persistency and data management (in the context of POOL)
• Event processing framework (possible long-term common project components)
• Event model
• Event generation (ancillary services & support)
• Detector simulation (ancillary services & support)
• Detector geometry and materials (standard tools for describing, storing & modelling detector geometry)
• Trigger/DAQ
• Event reconstruction
• Detector calibration

16th Sept '02 – Nick Brook, University of Bristol – slide 18

LCG Summary
• GridPP input to both the overall management structure of the LCG and the RTAG activities
• Activities are beginning to take off:
  – Persistency (POOL)
  – Software process & infrastructure
  – Grid Deployment Board – first meeting Oct 4th
  – Interaction with middleware providers (not just EDG)
• RTAG procedures seem to be slow to take off:
  – lack of consistency early on – addressed by the "Blueprint" RTAG
  – time consuming, with frequent overlap of the necessary experts