1 ATLAS HLT/DAQ Referee Marzo 2006
Valerio Vercesi, on behalf of all the people working

2 Italian activities
Roles:
- S. Falciano (Roma1): HLT Commissioning coordinator
- A. Negri (Irvine, Pavia): Event Filter Dataflow coordinator
- A. Nisati (Roma1): TDAQ Institute Board chair and PESA Muon Slice coordinator
- F. Parodi (Genova): PESA b-tagging coordinator
- V. Vercesi (Pavia): Deputy HLT leader and PESA (Physics and Event Selection Architecture) coordinator
Italian activities:
- Level-1 barrel muon trigger (Napoli, Roma1, Roma2)
- Level-2 muon trigger (Pisa, Roma1)
- Level-2 pixel trigger (Genova)
- Event Filter Dataflow (LNF, Pavia)
- Selection software steering (Genova)
- Event Filter muons (Lecce, Napoli, Pavia, Roma1)
- DAQ (LNF, Pavia, Roma1)
- DCS (Napoli, Roma1, Roma2)
- Monitoring (Cosenza, Napoli, Pavia, Pisa)
- Pre-series commissioning and exploitation (everybody)

3 ATLAS Trigger & DAQ
Three-level selection chain (rates and latencies per level):
- LVL1: hardware based (FPGA, ASIC); calorimeter/muon with coarse granularity; 40 MHz input, ~100 kHz output, latency 2.5 μs; pipeline memories in the front-end, data served by the Read-Out Drivers (RODs).
- LVL2: software (specialised algorithms); uses the LVL1 Regions of Interest (RoI); all sub-detectors at full granularity; emphasis on early rejection; ~100 kHz input, ~3 kHz output, ~10 ms per event; data buffered in the ROBs.
- Event Filter: offline-like algorithms, possibly seeded by the LVL2 result; works with the full event and full calibration/alignment information; ~3 kHz input, ~200 Hz output, ~1 s per event; the Event Builder cluster feeds the EF farm; local storage ~300 MB/s.

4 TDAQ Networks and Processing
(Diagram: TDAQ networks and processing.) Data of events accepted by the first-level trigger flow from the ATLAS detector (UX15) through the Read-Out Drivers (RODs, VME) over 1600 read-out links into the Read-Out Subsystems (ROSs, ~150 PCs in USA15): event data at ≤ 100 kHz, 1600 fragments of ~1 kByte each. The RoI Builder forwards the Regions of Interest from the first-level trigger. In SDX1, Gigabit Ethernet switches connect the ~500-node LVL2 farm (dual/quad-CPU nodes) with its LVL2 supervisor, the pROS (which stores the LVL2 output) and the DataFlow Manager, the ~100 SubFarm Inputs (SFIs) of the Event Builder, the ~1600-node Event Filter (EF) farm and the ~30 SubFarm Outputs (SFOs) writing to local storage at an event rate of ~200 Hz. Event data are pulled via data requests and delete commands: partial events at ≤ 100 kHz, full events at ~3 kHz. Timing, Trigger and Control (TTC) distributes clock and triggers.
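The farm sizes quoted above follow from Little's law: at a given input rate and per-event latency, the number of events in flight is rate times latency. A minimal sanity check of the slide's numbers (the rates and latencies are from these slides; the two-concurrent-decisions-per-node assumption is ours):

```python
# Back-of-envelope TDAQ farm sizing via Little's law:
# events in flight = input rate * per-event latency.

lvl2_rate_hz   = 100e3   # LVL1 accept rate into LVL2 (~100 kHz)
lvl2_latency_s = 10e-3   # average LVL2 decision time (~10 ms)
ef_rate_hz     = 3e3     # LVL2 accept rate into the EF (~3 kHz)
ef_latency_s   = 1.0     # average EF processing time (~1 s)

lvl2_in_flight = lvl2_rate_hz * lvl2_latency_s   # ~1000 concurrent events
ef_in_flight   = ef_rate_hz * ef_latency_s       # ~3000 concurrent events

# Assume ~2 concurrent decisions per dual-CPU node (our assumption).
print(f"LVL2 nodes ~ {lvl2_in_flight / 2:.0f}")  # ~500, as in the diagram
print(f"EF   nodes ~ {ef_in_flight / 2:.0f}")    # ~1500, close to the ~1600 shown
```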

5 Pre-series system in ATLAS point-1
8 racks (10% of the final dataflow, 2% of the EF):
- One ROS rack (TC rack + horizontal cooling): 12 ROS, 48 ROBINs
- RoIB rack (TC rack + horizontal cooling): 50% of the RoIB
- One full L2 rack (TDAQ rack): 30 HLT PCs
- Partial supervisor rack (TDAQ rack): 3 HE PCs
- One switch rack (TDAQ rack): 128-port GEth for L2 + EB
- Partial EFIO rack (TDAQ rack): 10 HE PCs (6 SFI, SFO, DFM)
- Partial EF rack (TDAQ rack): 12 HLT PCs
- Partial online rack (TDAQ rack): 4 HLT PCs (monitoring), 2 LE PCs (control), 2 central file servers
Location: the ROS and RoIB racks are underground in USA15; the others are on the surface in SDX1. The ROS, L2, EFIO and EF racks each have one local file server and one or more local switches. Machine park: dual Opteron and Xeon nodes, uniprocessor ROS nodes. Operating system: net-booted, diskless nodes running SLC3.

6 Commissioning and exploitation
Purpose and scope of the pre-series system: a fully functional, small-scale version of the complete HLT/DAQ, equivalent to a detector's 'module 0'.
- Pre-commissioning phase: validate the complete, integrated HLT/DAQ functionality; validate the infrastructure needed by HLT/DAQ at Point 1.
- Commissioning phase: validate a component (e.g. a ROS) or a deliverable (e.g. a Level-2 rack) prior to its installation and commissioning.
- TDAQ post-commissioning development system: validate new components (e.g. their functionality when integrated into a fully functional system); validate new software elements or software releases before moving them to the experiment.

7 Pre-series tests at Point 1
Used an integrated software release ('installation image') combining the offline release, Event Format version 2.4, the TDAQ release and the HLT release. First time that e/γ and μ selections ran in a combined menu, with muon, calorimeter and inner-detector algorithms. Example Level-2 setup: 8 ROS emulators with preloaded data; data with Level-1 simulation: di-jets (17 GeV), single e (25 GeV), single μ (100 GeV); dataflow applications instrumented to measure execution times, network access times and transferred data sizes. Used recently with up to 20 Level-2 processors, each running up to 4 applications: a factor 1.9 throughput improvement with respect to one application per node.

8 LVL2 tests
Data file | LVL2 rate (Hz) | LVL2 latency (ms) | Processing time (ms) | RoI collection (ms) | DAQ time (ms) | Data rate (MB/s) | Data size (bytes) | #Reqs/event | Data/req (bytes)
mu  | 293.1 | 3.41  | 2.78  | 0.62 | 0.19 | 0.084 | 287   | 1.3 | 223
jet | 280.3 | 3.57  | 3.26  | 0.28 | 0.09 | 0.781 | 2785  | 1.2 | 2283
e   | 58.2  | 17.18 | 15.48 | 1.66 | 0.10 | 0.921 | 15820 | 7.4 | 2147
(The e sample was prefiltered.)
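The last three columns of the table are mutually consistent: the data rate is the event rate times the data volume collected per event, and the data per request is that volume divided by the number of ROS requests. A quick cross-check, with the values copied from the table (small differences are rounding):

```python
# Cross-check of the LVL2 test table: rate [Hz], data size [bytes], requests/event.
rows = {
    "mu":  (293.1, 287, 1.3),
    "jet": (280.3, 2785, 1.2),
    "e":   (58.2, 15820, 7.4),
}
for name, (rate_hz, size_b, reqs) in rows.items():
    mb_per_s = rate_hz * size_b / 1e6   # ~0.084, 0.781, 0.921 MB/s
    b_per_req = size_b / reqs           # ~221, 2321, 2138 bytes/request
    print(f"{name}: {mb_per_s:.3f} MB/s, {b_per_req:.0f} B/req")
```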

9 Event Filter infrastructure
Main features of the EF infrastructure software:
- Complete decoupling between data flow (EFD) and data processing (PTs): safety of the data handling (see the sketch below)
- Maximum exploitation of SMP architectures
- Flexible and fully configurable design
(Diagram: implementation example. An SFI feeds the EFD process on each node of an EF subfarm; processing tasks (PT #1...#a, PT #2...#b) attach through PTIO; external PTs handle calibration; a sorting step routes events to SFO [std], SFO [debug] and SFO [calib] outputs, rejected events to the trash; storage ~300 MB/s.)
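The point of decoupling the EFD from the PTs is data safety: each event stays owned by the EFD, so a crashing processing task cannot lose it. A minimal sketch of that ownership model (the process layout and the debug-stream routing are our illustration, not the actual EFD API):

```python
# Sketch: the EFD keeps ownership of each event; PTs only borrow a copy.
# If a PT dies or hangs before answering, the event is routed to a debug
# stream instead of being lost -- the core idea behind the EFD/PT split.
import multiprocessing as mp

def processing_task(inbox: "mp.Queue", outbox: "mp.Queue") -> None:
    while True:
        event_id, data = inbox.get()
        accept = len(data) % 2 == 0          # stand-in for a real selection
        outbox.put((event_id, accept))

if __name__ == "__main__":
    inbox, outbox = mp.Queue(), mp.Queue()
    pt = mp.Process(target=processing_task, args=(inbox, outbox), daemon=True)
    pt.start()

    owned = {1: b"ev-1", 2: b"ev-22"}        # EFD-side event store
    for event_id, data in owned.items():
        inbox.put((event_id, data))

    for _ in range(len(owned)):
        try:
            event_id, accept = outbox.get(timeout=5.0)
            stream = "std" if accept else "trash"
        except Exception:                    # PT crashed or hung: event survives
            event_id, stream = next(iter(owned)), "debug"
        print(f"event {event_id} -> SFO [{stream}]")
        owned.pop(event_id, None)
```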

10 EF tests
Checks and studies of the infrastructure:
- Optimization of the communication protocol between EF and SFI/SFO: performance improvements for small (calibration) events and for remote farms
- Additional functionality added
Integration and validation of the selection algorithms:
- Algorithms derived from the offline, but with different operating conditions, e.g. adaptation of the job options to the online environment, concurrency in DB access
- The muon slice is integrated and validated; other slices are being validated
Timing test: EF-only, 9 EFDs with 2 PTs each, TrigMoore algorithm, 1 MySQL server (CERN site). All 9 nodes connect to MySQL simultaneously; all 18 PTs do not, and each PT opens not 1 but 3 connections to CDI (3 × 18 = 54: the connection count scales fast). Measured times: 6.9 ± 0.2 s (geometry), 0.1 ± 0.03 s (AMDCsimRecAthena), 0.06 ± 0.03 s (magnetic field). DB caching was used.

11 Software Installation Image
Contents: TDAQ, Offline, HLT and common software repositories; example partitions / data files; test suites; setup / installation scripts.
- Originally developed for the Large Scale Test 2005
- Contains, in one file, a consistent set of all the software needed to run a standalone HLT setup
- Completely tested before deployment by PESA, HLT and DAQ specialists
- Used for the first exploitation of the pre-series
- Useful for installations outside CERN and for new test-bed setups
- The P1 installation procedure is presently being worked out; future images will be snapshots of the P1 installation
- Project builds: ~6.5 GByte of software

12 HLT Core Software
Work plan defined for the 2005 design review:
- HLT compliant with trigger operation: integration with the most recent TDAQ software (cycling through the TDAQ state machine: start/stop/reinitialize/...); HLT trigger configuration from database; use of the conditions DB in the HLT; integration with the online services for error reporting and system monitoring. Many of these issues have a direct impact on the selection algorithms; the functionality needs to be available early in the core software to give time to the algorithm developers.
- System performance optimization: instrumentation for measuring network transfer times, data volumes and ROS access patterns (complementary to the work in the PESA group); for commissioning and readout tests.
- Basic fault tolerance; stability.
- Move to the new gcc compiler and to the SLC4 operating system.

13 Trigger Configuration Data Base
TriggerDB: stores all the information needed to configure the trigger: LVL1 menu, HLT menu, HLT algorithm parameters, HLT release information. Versions are identified with a key (Configuration and Conditions DB).
TriggerTool: GUI for DB population; menu changes for experts (HLT and LVL1); LVL1 + HLT handled as one integrated system.
Retrieval of the information for running: get the configuration via a key, either as XML/JobOption files or by direct DB read-out, for both online and offline running.
Clients of the configuration system: online running, offline shift crew, offline user, expert (via the TriggerTool, the DB population scripts, the R/O interface and the compilers).

14 Example: Data Base Schema
Keys are stored in the conditions DB and used to retrieve the information (online and offline). The schema links the LVL1 and HLT parts: the trigger menu points to the algorithms and their jobOptions, which are in turn tied to a software release. An early prototype of the HLT part has already run on a 6-node system with the muon selection algorithm. (An illustrative schema sketch follows.)
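The key mechanism means one configuration key pins a consistent (menu, algorithms, release) triple, so online and offline can reproduce exactly the same trigger configuration. The schema below is illustrative only (table and column names are ours), using sqlite3 just to make it runnable:

```python
# Illustrative TriggerDB-style schema: a single key fixes the menu,
# the algorithm parameters and the software release together.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE super_key (smk INTEGER PRIMARY KEY, menu_id INT, release_id INT);
CREATE TABLE menu      (menu_id INT, level TEXT, chain TEXT);
CREATE TABLE algorithm (menu_id INT, chain TEXT, algo TEXT, job_options TEXT);
CREATE TABLE release   (release_id INT, name TEXT);
""")
db.execute("INSERT INTO super_key VALUES (42, 1, 1)")
db.execute("INSERT INTO menu VALUES (1, 'LVL1', 'MU06'), (1, 'HLT', 'mu6')")
db.execute("INSERT INTO algorithm VALUES (1, 'mu6', 'muFast', 'PtThreshold=6*GeV')")
db.execute("INSERT INTO release VALUES (1, 'hlt-xx-yy')")

# Retrieval for a run: everything hangs off one configuration key.
smk = 42
for level, chain in db.execute(
        "SELECT m.level, m.chain FROM super_key s "
        "JOIN menu m ON m.menu_id = s.menu_id WHERE s.smk = ?", (smk,)):
    print(level, chain)
```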

15 Global Monitoring Scheme
(Diagram: data from the RODs, the ROSs and the Event Builder are sampled through the Event Monitoring Service; GNAM with detector-specific plug-ins and Athena with detector-specific monitoring algorithms produce histograms, published to the Online Histogramming Service; a Gatherer merges them; the Online Histogram Presenter (OHP), the analysis framework, the event displays and the monitoring data storage consume them.)

16 GNAM Monitoring
Principle: decouple the common actions from the monitoring algorithms and hide them in the core (see the sketch below).
GNAM core (common actions):
- synchronization with the DAQ
- event sampling
- decoding of the detector-independent part
- publication and saving of the histograms
- command handling (update, reset, rebin)
- tools for the algorithms (circular buffer, histogram flags, histogram metadata, ...)
Monitoring algorithms (dynamic libraries loaded at run time):
- detector-dependent decoding
- histogram booking and filling
- handling of specific commands
Events and commands arrive from the dataflow into the GNAM core; histograms go to the Online Histogramming Service / Event Monitoring Service and are consumed by the presenter, viewer and checker.
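The decoupling described above is a plugin pattern: the core owns sampling, publication and command handling, while the detector code only implements book/fill hooks. A Python sketch of that contract (the hook names and the example plug-in are ours; real GNAM plug-ins are C++ dynamic libraries):

```python
# Sketch of the GNAM split: the core drives the loop, user "libraries"
# only implement detector-dependent decoding and histogram filling.
from collections import Counter

class MonitoringPlugin:                     # contract for a user library
    def book(self, publish): ...
    def fill(self, fragment: bytes): ...
    def command(self, name: str): ...

class MdtOccupancy(MonitoringPlugin):       # hypothetical detector plug-in
    def book(self, publish):
        self.histo = Counter()
        self._publish = publish
    def fill(self, fragment):
        for channel in fragment:            # detector-dependent decoding
            self.histo[channel] += 1
        self._publish("MDT/occupancy", self.histo)
    def command(self, name):
        if name == "reset":                 # specific command handling
            self.histo.clear()

def gnam_core(plugins, sampled_events):     # common actions live here
    published = {}
    for p in plugins:
        p.book(published.__setitem__)       # stand-in for OHS publication
    for event in sampled_events:            # event sampling + dispatch
        for p in plugins:
            p.fill(event)
    return published

print(gnam_core([MdtOccupancy()], [b"\x01\x02\x02", b"\x02"]))
```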

17 Online Histogram Presenter (OHP)
Interactive presenter developed in close connection with the GNAM monitoring, but usable to display histograms published on the OHS by any producer. Designed for two modes:
- expert mode: a browser of all the histograms on the OHS
- shifter mode: a histogram presenter showing only predefined sets of histograms in configured tabs
Fully interactive with the GNAM core (rebin, reset, ...). Completely redesigned after the CTB experience, to minimize network traffic and to scale to the whole of ATLAS. A very useful collaboration with Computer Science students has been established.
(Screenshots: the browser part; a preconfigured set of histograms in tabs; commands to the core: rebinning, reset, ...)

18 Monitoring: commissioning
Developed a GNAM-based system for online monitoring/analysis/validation of the detectors:
- production of histograms, displayed with the Online Histogram Presenter (OHP)
- online event display (in collaboration with Saclay)
In use for commissioning since September 2005. Under development: retrieval of the detector configuration from the DB; automatic checks and alarm generation. Used by Tile and MDT; interest expressed by others.

19 ROD Crate DAQ
RCD is used as the interface to the RODs for control, configuration, monitoring and data readout (via VME). RCD development has gone through essentially two phases:
- The ReadoutApplication (the application underlying the ROD Crate DAQ, the ROS and the data-driven Event Builder) was substantially modified to accommodate all the detector requests and to be ready with all the functionality needed for commissioning: standardized access to the Information Service and Online Histogramming; data access in response to interrupts; simplified construction of the classes for module control and acquisition; definition and implementation of a data-driven event builder; libraries for standardized error-condition handling.
- Detector support for commissioning.
A new development is needed to guarantee, through a simple interface common to RAL/CORAL, that access to the configuration database is thread safe (initialization phase); see the sketch below.
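The thread-safety requirement arises because, during initialization, many threads hit the configuration database at once through the same session. A sketch of the serialization idea (the wrapper below is our illustration, not the actual RAL/CORAL API):

```python
# Illustrative thread-safe wrapper around a configuration-DB session:
# all threads share one connection and a lock serializes the queries
# during the initialization phase described above.
import threading

class ConfigDB:
    def __init__(self, connect):
        self._conn = connect()              # single underlying session
        self._lock = threading.Lock()

    def query(self, sql, args=()):
        with self._lock:                    # serialize concurrent access
            return self._conn.execute(sql, args).fetchall()

if __name__ == "__main__":
    import sqlite3
    db = ConfigDB(lambda: sqlite3.connect(":memory:", check_same_thread=False))
    db.query("CREATE TABLE cfg (k TEXT, v TEXT)")
    db.query("INSERT INTO cfg VALUES ('rod.timeout', '100')")

    def init_worker(name):                  # many readers at initialization
        rows = db.query("SELECT v FROM cfg WHERE k = ?", ("rod.timeout",))
        print(name, rows)

    threads = [threading.Thread(target=init_worker, args=(f"t{i}",)) for i in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
```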

20 RCD activities
- Detector-specific part of the ROD Crate DAQ for MDT and RPC
- Databases: cabling database (a lot of work!); configuration database; online and monitoring interfaces to both
- Detector Control System (DCS): the whole RPC DCS and the HV/LV control of the MDTs are Italian responsibilities
- Muon Sector 13: combined MDT-Tile runs triggered by scintillators; synchronization studies

21 MDT online calibration
The precision required for the t0 and r-t autocalibration needs inclusive muon rates of 0.33 kHz, not suitable for EF calibration streams; partial event building and streaming are needed (under study). Already possible using the LVL2 infrastructure with some modifications (a minimal sketch of the queue pattern follows). (Scheme, roughly: a calibration thread in each L2PU fills a memory queue, drained by a dequeue thread towards a local server (×25); local servers feed gatherers (×~20) at ~480 kB/s each; a calibration server writes ~9.6 MB/s in total to the calibration farm disk; transport via TCP/IP, UDP, etc.)
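The 'memory queue + dequeue' box is the key trick: the L2PU thread must never block on calibration output, so fragments go into a bounded in-memory queue and a separate thread drains it toward the local server. A sketch under those assumptions (queue depth and drop policy are ours):

```python
# Sketch: non-blocking hand-off of MDT calibration fragments out of the
# L2PU. The trigger thread enqueues and moves on; a dequeue thread ships
# data to the local server; overflow is dropped, never back-pressured.
import queue, threading

calib_q: "queue.Queue[bytes]" = queue.Queue(maxsize=1000)  # memory queue

def l2pu_enqueue(fragment: bytes) -> None:   # called from the trigger path
    try:
        calib_q.put_nowait(fragment)
    except queue.Full:
        pass                                 # drop: calibration is best-effort

def dequeue_loop(send) -> None:              # runs in its own thread
    while True:
        fragment = calib_q.get()
        if fragment == b"":                  # shutdown marker
            break
        send(fragment)                       # e.g. TCP/UDP to the local server

sent = []
t = threading.Thread(target=dequeue_loop, args=(sent.append,))
t.start()
for i in range(5):
    l2pu_enqueue(b"mdt-hits-%d" % i)
l2pu_enqueue(b"")                            # stop the drainer
t.join()
print(len(sent), "fragments shipped")
```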

22 Routing μ calibration data
(Figure only: routing of the muon calibration data.)

23 SDX1 - TDAQ P1
A total of 99 racks can be placed in SDX1: 49 on the lower level (LVL2, EB, ...), 50 on the upper level (EF).

24 ROS Overview
In total ~150 ROS PCs have to be installed in USA15; each ROS PC is equipped with 3 or 4 ROBIN cards and one 4-port Gigabit Ethernet NIC. (Diagram: same TDAQ layout as on slide 4, highlighting the ROBIN-equipped ROS PCs between the RODs and the 10-Gigabit Ethernet event-building network in SDX1.)

25 Hardware Procurement
- ROS PCs: 1st batch (50 PCs) ordered and received; 2nd batch (60 PCs) ordered, delivery scheduled for May; remaining ROS PCs + spares will be ordered later.
- ROBINs: German production (350 cards) ordered and received (~20 cards did not pass the production test and still need to be repaired); UK production (350 cards) ordered, delivery scheduled for March.
- 4-port NICs (Silicom): ordered, delivery scheduled for May.

26 Current Status of ROS-Racks in USA15
(Status table: for each Liquid Argon and TileCal ROS rack in USA15, whether the control switch and ROS PCs are installed, whether the power & network cables are in place, and the commissioning state; some racks are commissioned at ROS level, ROD-ROS commissioning has not yet been done; TileCal at 50%.)

27 PESA
- PESA Core SW is responsible for the implementation of the steering and control (built around standard Athena components)
- PESA Algorithms develops the HLT software using realistic data access and handling: specialized LVL2 and Event Filter algorithms, adapted for online deployment in HLT testbeds
- PESA Validation and Performance evaluates the algorithms on data samples to extract efficiencies, rates, rejection factors and physics coverage
The structure stems from the original one, laid out in parallel with the organization of the Combined Performance working groups, in 'vertical slices' (LVL1 + LVL2 + EF): electrons and photons; muons; jets / taus / ETmiss; b-jet tagging; B-physics.

28 HLT Reconstruction Algorithms
HLT feature-extraction algorithms are available for each slice:
- Calorimeter: LVL2 and EF algorithms ready for e/γ; τ implementation ready at LVL2; an offline tool adapted to the EF is ready for JetCone
- Muon: LVL2 and EF algorithms available for the barrel region; work has started on extending the LVL2 algorithm to the endcap; ID-to-muon track-matching tools available at LVL2 and EF; muon isolation studies using the calorimeters are being performed
- ID tracking: tracking with Si data ready at LVL2 and EF, with several approaches studied in parallel; tools available both for track extension into the TRT and for stand-alone TRT reconstruction; emphasis on providing a robust tool for commissioning and early running

29 Selections: e/γ
Rate and efficiency studies performed for the main physics triggers: e25i, 2e15i, e60, γ60, 2γ20i. Results in perfect agreement with the Rome results. Tools have been developed to optimize the selections. In the future, results will be provided as efficiency-versus-rejection curves, giving a continuous set of working points: essential for trigger bandwidth optimization.
Step | Eff (%) | Rate
L1 | 95.5 | 4.7 kHz
L2 Calo | 94.9 | 890 Hz
L2 ID | 91.0 | 280 Hz
L2 Match | 89.7 | 98 Hz
EF Calo | 87.6 | 65 Hz
EF ID | 81.8 | 35 Hz
EF Match | 81.0 |
Cluster composition: W→eν 21%; Z→ee 5%; direct photons or quark bremsstrahlung; e from b, c decays 37%; rest 32%.

30 Selections: LVL2 μ
Implemented the curvature radius instead of the sagitta: more suitable for the endcap, and recovers efficiency in the barrel; the same algorithm works across |η| < 2.4 (the underlying bending relation is recalled below). New LUTs for the radius. The resolution is slightly worse than with the sagitta, but OK for the standard sectors; the turn-on curves are comparable, with worse efficiency in the feet region (special sectors). Endcap extension in progress. Combined reconstruction (μComb) with the ID refines the μFast pT by means of ID data, giving a sharper 6 GeV threshold.
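For context, the connection between the measured curvature radius and the transverse momentum is the standard bending relation (textbook physics; the actual LUT encoding in μFast is more detailed):

$$ p_T\,[\mathrm{GeV}] \;\simeq\; 0.3 \, B\,[\mathrm{T}] \, R\,[\mathrm{m}] $$

Taking an average toroid field of roughly 0.5 T as an illustrative value, a radius of ~40 m corresponds to pT ≈ 6 GeV, the threshold quoted above.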

31 LVL2 cosmics μ
(Event display: X-Y and Z-R (bending plane) views of a muon track from the surface; straight-line extrapolation from y = +98.3 m; BOS, BMS, BML and BIS MDT hits, drawn at the station centers in X-Y, and RPC hits (pairs of φ/η strips).) Sample: /castor/cern.ch/user/m/muonprod/cosmics/cosmics.dig.atlas-dc3-02._0004.pool.root (Monte Carlo!). The MDT and RPC hits are there and look fine; the conversion of the RDOs to coordinates seems fine too. Next steps: MuFast modifications.

32 Selections: EF μ
Studies of the single-muon selections have been performed for two scenarios: a 6 GeV threshold at L = 10^33 cm⁻²s⁻¹ and a 20 GeV threshold at L = 10^34 cm⁻²s⁻¹. Cuts are defined so that 95% efficiency is achieved at the threshold value. Layout Q (barrel only); MuId Combined used at the EF; the MuComb rate reduction at LVL2 is still to be included. Fake rates are expected to be ~1% (~12%) of the total rate for safety factor ×1 (×5) at this threshold. In seeded mode: lower efficiency plateau, less sharp turn-on curves near the threshold; more points are needed for a better curve definition.
Muon sources at L = 10^34 (no background / s.f. ×1 / s.f. ×5):
π/K: 54 Hz, 48 Hz
b: 77 Hz, 68 Hz
c: 30 Hz, 26 Hz
W: 22 Hz, 19 Hz
t: negligible
Total: ~185 Hz / ~190 Hz / ~180 Hz

33 Jets/Taus/ETmiss
- The LVL2 calorimeter algorithm for taus was recently separated from the e/gamma one; performance studies of the selection strategies on the discriminating variables are ongoing
- At present only EM calibration is applied to the cluster energies: a tau calibration is needed (also for the EF; H1-style, as in the offline?)
- A first implementation of an EF 'seeded' TrigTauRec is already working, making use of offline tools
- Once the selection strategies are defined, trigger-aware physics analyses (studying the effect of the hadronic tau trigger) can be performed
For ETmiss, three data-preparation strategies are being considered (see the sketch below):
- read out the calorimeter and unpack the cells (the unpacking time may dominate)
- read out the calorimeter and use the Ex/Ey already calculated in the RODs (faster, but what about the resolution?)
- read out the TriggerTowers from the LVL1 preprocessor
Work is ongoing to define and study a general strategy for pre-scales, in particular for jet objects.
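The trade-off among the strategies is between precision and unpacking cost: summing every cell is exact but slow, while per-ROD Ex/Ey sums move the work into the RODs at the price of coarser corrections. A toy illustration (cell layout and numbers are invented):

```python
# Toy comparison of two missing-ET strategies: full cell unpacking vs.
# per-ROD pre-summed (Ex, Ey), as discussed above. Data are invented.
import math

# (ET, phi) of calorimeter cells, grouped by the ROD that reads them out.
rods = {
    "ROD1": [(12.0, 0.3), (5.0, 0.4)],
    "ROD2": [(8.0, 2.8), (3.0, -1.2)],
}

# Strategy 1: unpack every cell (precise; unpacking time may dominate).
ex = sum(et * math.cos(phi) for cells in rods.values() for et, phi in cells)
ey = sum(et * math.sin(phi) for cells in rods.values() for et, phi in cells)
print("cell-level MET = %.2f GeV" % math.hypot(ex, ey))

# Strategy 2: use Ex/Ey already summed in each ROD (fewer words to unpack,
# but cell-level calibration is no longer possible downstream).
rod_sums = [(sum(et * math.cos(phi) for et, phi in cells),
             sum(et * math.sin(phi) for et, phi in cells))
            for cells in rods.values()]
ex2, ey2 = map(sum, zip(*rod_sums))
print("ROD-level  MET = %.2f GeV" % math.hypot(ex2, ey2))  # identical here:
# without cell-level corrections the two sums agree by construction.
```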

34 Jet triggers and prescales
(Figure only.)

35 Selections: b-tagging
Two classes of tagging variables can be used: track variables (x_T) and collective (vertex) variables (x_V). The weight of each RoI is computed with the likelihood-ratio method (recalled below), where S_sig and S_bkg are the probability densities for signal (b-jets) and background. W_T uses the transverse (d0/σ_d0) and longitudinal (z0) impact parameters; W_V uses the secondary-vertex energy and mass (statistical approach). Recent work combines SimpleVertex (1-dimensional fit) and VKalVrt (an offline algorithm adapted to LVL2). (Plots: impact parameters alone; impact parameters + probabilistic vertex; impact parameters + VKalVrt/SimpleVertex combined.)
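In its standard form (a plausible reconstruction; the slide's exact normalization is not recoverable), the likelihood-ratio weight is the product over the tagging variables of the signal and background densities:

$$ W \;=\; \prod_{i} \frac{S_{\mathrm{sig}}(x_i)}{S_{\mathrm{bkg}}(x_i)}, \qquad X \;=\; \frac{W}{1+W} \in [0,1] $$

Here the x_i are the track (x_T) or vertex (x_V) variables above; large W, i.e. X close to 1, tags the RoI as b-jet-like. Mapping W to the bounded discriminant X is a common convention, not necessarily the one used on the slide.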

36 RoI-based B-physics
Aim: use the calorimeter to identify the regions of the event containing B-decay products: EM RoIs for e and γ, jet RoIs for hadronic B decays. Keep the multiplicity low (1-2) to minimize data transfer and CPU, whilst maximising the efficiency for the events used in the physics studies. The effect of different thresholds (EM & HAD) and of the jet RoI size on this multiplicity was studied using Rome data (1×10^33) with the new TTL LVL1 simulation and pile-up. The requirement on the multiplicity implies an ET threshold of ~2 GeV for the LVL1 EM RoI. (Plot: LVL1 EM RoI multiplicity vs. ET cut, for tower thresholds of 500 MeV (default), 750 MeV and 1000 MeV.)

37 Trigger-aware physics analysis
Analyses which use trigger information as a 'pre-processor' to correctly evaluate efficiencies, physics reach, etc. What this requires:
- Trigger decision in the AOD + Tag DB
- More detailed trigger-related information in the ESD/AOD
- Ability to re-run the hypothesis part of the event selection on the AOD
- Contribution by the physics groups to the selection tuning
The Steering supports serializing simple objects for inclusion in the LVL2 result (illustrated below); the EF result is in progress.
Muon slice: L1 RoI seeding, L2 MuFast seeding and EF TrigMoore are working; MuonFeature is serialized and passed via the L2 result; the TrigMoore output is individually persistent via POOL, so it is already in the AOD, but without HLT navigation.
Electron slice: TrigInDetTrack and TrigElectron are persistable (HLT serializer and POOL); EMShowerMinimal + CaloCluster are not compatible with the serializer, so they cannot be used for the moment. Suitable electron hypotheses are being developed, including special ones within the constraints of the demo.
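A small illustration of what 'serializing simple objects' means in practice: fixed-layout, pointer-free payloads pack directly into the byte buffer of the LVL2 result, which is exactly what container-holding objects such as EMShowerMinimal + CaloCluster cannot yet do. The field layout below is invented, not the real MuonFeature encoding:

```python
# Why only "simple" objects serialize into the LVL2 result: fixed-layout
# fields pack into bytes trivially; objects holding links/containers don't.
import struct

# Hypothetical MuonFeature-like payload: (pT, eta, phi) as three floats.
def serialize(pt: float, eta: float, phi: float) -> bytes:
    return struct.pack("<3f", pt, eta, phi)     # flat, pointer-free buffer

def deserialize(buf: bytes):
    return struct.unpack("<3f", buf)

blob = serialize(6.2, -1.1, 0.7)                # goes into the LVL2 result
print(deserialize(blob))                        # recovered downstream, e.g. in the AOD
```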

38 Trigger & Physics Weeks
To proceed with the implementation of the guidelines of the ATLAS Operation Model, it has recently been proposed to hold few-day trigger workshops in 2006, where the full experiment gets together to discuss trigger issues. It was also considered good to couple these meetings to the physics workshops organized for some time by Physics Coordination, to further strengthen the links between trigger, (combined) detector performance and physics. The aim of these weeks is to bring together trigger, detector performance and physics studies, and to expose trigger issues and strategy to a broad ATLAS audience. Defined dates: March 2006; 29 May - 1 June 2006; 30 October - 2 November 2006.

39 2006 PESA Milestones
- LVL1/HLT AODs fully available in Release 12 for trigger-aware analyses - Apr 06. Very preliminary AOD information available in Release 11; a detailed description of the Release 12 deliverables was prepared by Simon.
- HLT algorithm reviews complete - Jun 06. A detailed review of the ID LVL2 algorithms has already taken place, focusing on system performance and implementation; the results are fed back into Release 13.
- Online tests of the selection slices with preloaded mixed files and large menus - Sep 06. First production version of the trigger configuration.
- Selection software ready for the cosmic run - Oct 06. Already in PPT; its meaning needs to be refined.
- Blind test of the HLT selection - Dec 06. In discussion with physics coordination: take a sample of representative events from the initial ATLAS output and run the full menu.

40 PESA Planning
Several interactions with the PESA slice coordinators and with the algorithm developers, to bring together something that helps reinforce the content of the proposed milestones and monitor the development process; only the first iteration has been done so far. The work is always described in a 'task-oriented' fashion, to help identify weak areas and to facilitate job assignment. An attempt is being made to build a full PESA planning (Excel) from this information, to monitor progress and allow for updates, suggestions and improvements; clearly there is more detail on near-future objectives than on far-away ones.
Excerpt of the planning (Task | Comments | Expected | PPT workpackage):
- LVL1 Trigger: definition of the EDM | Done? | Dec 05 |
- Slices: e/gamma implementation in the common framework | RTT, ESD, Root analysis framework | February 2006 | DH-W101
- Develop tools for automatic optimisation of the e/gamma selections | scanning of the parameter space, minuit fitting; neural-net and multi-variant methods being developed | March 2006 |
- Check the trigger selection w.r.t. the offline selection for electrons/photons | needs new evaluations from the offline groups | |
- Establish a set of pre-scaled e-triggers using the Rome datasets | photons as well | |
- First evaluation of trigger efficiencies from data | for electrons, photons and muons | |
- Strategies for the ETmiss calculations | | |
- Revised steering configuration | | | DH-W110
- Prototype LVL2 hypothesis algorithms for all slices | examples to be further developed in validation | | DH-W147
- Provide documentation and examples for the physics community | for all selections | |
- Milestone, April 2006: LVL1/HLT AODs completely available in Release 12 for trigger-aware analyses

41 Accounting
INFN contribution to the pre-series: 140 kCHF (ROS racks, monitoring, operations, switches, file server), completely spent within 2005. For this, and for the rest, VV receives a copy of all the invoices.
CORE contributions:
- Online Computing System: ... kCHF (monitoring, operations). 45 kCHF sent to CERN in May 2005; four file servers already purchased.
- Read-Out System: ... kCHF (ROS racks). The CERN tender was completed with a considerable delay for the first tranche (50 ROS); the remaining part is being delivered (60 ROS in May). About 200 kCHF charged to INFN so far (on Roma1).
- LVL2 processors, Event Building, Event Filter processors: ... kCHF. The detailed specifications are being finalized (above all for the HLT processors); a market survey made by CERN-IT may be usable. Studies are also ongoing to evaluate the latest technologies (Moore's-law failures...).
- Infrastructure: 80 kCHF (cables, racks, cooling, ...)

42 Cost Profile (kCHF)
Item | 2004 | 2005 | 2006 | 2007 | 2008 | 2009 | Total
Pre-series | 140 | | | | | | 140
Detector R/O | | 275 | 275 | | | | 550
LVL2 Proc | | | 65 | 195 | 230 | 160 | 650
Event Builder | | | 50 | 110 | 70 | 50 | 280
Event Filter | | | 170 | 180 | 570 | 380 | 1300
Online | | 45 | 135 | | | | 180
Infrastructure | | | 80 | 20 | 60 | 40 | 200
INFN Total | 140 | 320 | 775 | 505 | 930 | 630 | 3300
TDR Total | 1048 | 3357 | 4087 | 4544 | 7522 | 4543 | 25101
INFN Percentage (%) | 13.4 | 9.5 | 19.0 | 11.1 | 12.4 | 13.9 | 13.1

43 INFN Milestones
- 30/06/2005 TDAQ - Installation, test and use of the pre-series (~10% TDAQ slice). 'Fully' achieved in October: the delays accumulated mainly on the component purchases. We propose to mark it 100% and to change the matching date.
- 24/12/2005 TDAQ - Installation and test of the Pixel, LAr, Tile and Muon ROSs (interfacing to the ROD crate and integration into the DAQ). Strong dependence on the ROS delivery date (slow tender, etc.); no problem 'in principle': the work program is clear, and the pre-series experience is directly transferable. We propose to mark it 50% at the foreseen date.
- 30/04/2006 Completion of the pre-series tests and definition of the functionality to support TDAQ commissioning.
- 31/08/2006 Commissioning of the detector ROS slices using the pre-series functionality (module 0 of the final system).
- 31/12/2006 Integrated data taking of the detectors in the pit with cosmic rays.

44 Goal of Early Commissioning
"Prepare for unexpected events..." (ATLAS Calibration Workshop, Strba, Nov. 2004)

45 … e l’unexpected è dietro l’angolo…
Referee Marzo 2006 Valerio Vercesi - INFN Pavia

