Detector Commissioning Valerio Vercesi – INFN Pavia Italo-Hellenic LHC School of Physics 2004.


1 Detector Commissioning Valerio Vercesi – INFN Pavia Italo-Hellenic LHC School of Physics 2004

2 LHC School 2004, Valerio Vercesi - INFN Pavia
Many thanks to all friends and colleagues of CMS and ATLAS who have contributed (aware or unaware) to the following. In particular M. Boonekamp, S. Cittolin, F. Gianotti, R. McPherson, L. Mapelli, P. Nevski, L. Nisati, J. Proudfoot, P. Sphicas, S. Tapprogge. Most of the examples are given for ATLAS; the approach is valid for both experiments.

3 Commissioning thoughts
What do we mean by "commissioning CMS/ATLAS"? A cloud of topics, running up to first beams: Slow Control, Magnets, LVL1 muons, Detector cooling, Had Calo RO, EM Calo electronics, Local DAQ, LVL1 timing, Cryo tests, Access tests, Cosmic data, Control room ... and many more!

4 Installation…

5 CMS parameters

  Detector   | Channels    | Control | Ev. data (kB)
  Pixel      | 60,000,000  | 1 GB    | 50
  Tracker    | 10,000,000  | 1 GB    | 650
  Preshower  | 145,000     | 10 MB   | 50
  ECAL       | 85,000      | 10 MB   | 100
  HCAL       | 14,000      | 100 kB  | 50
  Muon DT    | 200,000     | 10 MB   | 10
  Muon RPC   | 200,000     | 10 MB   | 5
  Muon CSC   | 400,000     | 10 MB   | 90
  Trigger    |             | 1 GB    | 16

Event size ~1 Mbyte; max LVL1 trigger rate 100 kHz; online rejection 99.999%; system dead time ~ %
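The per-detector event-data column can be cross-checked against the quoted ~1 Mbyte event size and 100 kHz LVL1 rate. A minimal sketch (values copied from the table above; the arithmetic, not the numbers, is mine):

```python
# Cross-check of the CMS parameter table: per-detector fragment sizes in kB.
fragment_kB = {
    "Pixel": 50, "Tracker": 650, "Preshower": 50, "ECAL": 100,
    "HCAL": 50, "Muon DT": 10, "Muon RPC": 5, "Muon CSC": 90, "Trigger": 16,
}
event_size_kB = sum(fragment_kB.values())   # ~1021 kB, consistent with "~1 Mbyte"
l1_rate_hz = 100e3                          # max LVL1 accept rate from the table
# Implied raw data flow into the DAQ at full LVL1 rate, in GB/s:
daq_throughput_GBps = l1_rate_hz * event_size_kB * 1e3 / 1e9
print(event_size_kB, daq_throughput_GBps)
```

The fragments indeed sum to roughly 1 MB, and at 100 kHz that is on the order of 100 GB/s entering the readout network, which is why the online system must reject 99.999% of crossings.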

6 ATLAS parameters
ROL = Read-Out Link = point-to-point ROD-ROB connection. ATLAS total event size = 1.5 MB; total no. of ROLs = 1628.

  Inner detector | Channels | No. ROLs | Fragment size (kB)
  Pixels         | 1.4x10^8 | 120      | 0.5
  SCT            | 6.2x10^6 | 92       | 1.2
  TRT            | 3.7x10^5 | 256      | 1.2

  Calorimetry    | Channels | No. ROLs | Fragment size (kB)
  LAr            | 1.8x10^5 | 768      | 0.75
  Tile           | 10^4     | 64       | 0.75

  Muon system    | Channels | No. ROLs | Fragment size (kB)
  MDT            | 3.7x10^5 | 192      | 0.8
  CSC            | 6.7x10^4 | 32       | 0.8
  RPC            | 3.5x10^5 | 32       | 0.38
  TGC            | 4.4x10^5 | 16       | 0.38

  Trigger        | Channels | No. ROLs | Fragment size (kB)
  LVL1           |          | 56       | 0.5
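A rough cross-check of the table above (values as quoted on the slide): the ROL counts do sum to the stated 1628, and ROLs times fragment sizes give the bulk of the stated 1.5 MB event.

```python
# (system, no. of ROLs, fragment size in kB), copied from the slide's table.
rols = [
    ("Pixels", 120, 0.5), ("SCT", 92, 1.2), ("TRT", 256, 1.2),
    ("LAr", 768, 0.75), ("Tile", 64, 0.75),
    ("MDT", 192, 0.8), ("CSC", 32, 0.8), ("RPC", 32, 0.38), ("TGC", 16, 0.38),
    ("LVL1", 56, 0.5),
]
total_rols = sum(n for _, n, _ in rols)     # 1628, exactly as quoted
event_kB = sum(n * f for _, n, f in rols)   # ~1327 kB of payload; the quoted
                                            # 1.5 MB presumably includes headers/overhead
print(total_rols, event_kB)
```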

7 Commissioning organization
- The commissioning process is complex and must interleave many competing tasks; it must therefore be carefully conceived and organized
- Experiments are pushed to do so also from outside, e.g. by the LHCC
- It must be based on the principle that the various components have to be commissioned as soon as possible, and as much as possible without beam
- It will be a long learning process
- Commissioning will run in parallel with the detector installation work
- Detector commissioning is a job that involves the entire detector and physics communities; this needs to be coordinated

8 CDF Run II
- CDF kept the solenoid, central calorimeter and part of the muon system
- Everything else (tracker, silicon detectors, electronics, etc.) is new
- Fortran → C++
- Fastbus → VME
- 1.5 ms readout → 132 ns pipelined readout
- New Trigger and DAQ hardware
- New Event Data Model for reconstruction and Level 3 filters

9 Commissioning CDF
- Timeline from 1999 to 2002
- 2 "false starts" to form a commissioning organization
- Started in earnest in late 1999
- Engineering run in Oct 2000: missing sections of electronics and detector systems; only simple trigger capability
- Physics commissioning started spring 2001: missing hardware; hardware problems on the detector; inadequate/missing software and software tools; unreliable DAQ system (software & hardware)
- Major shutdown and detector repairs in Oct 2001: many hardware problems fixed; improved software and tools to basic capability; began commissioning the 2nd-level trigger
- Physics-quality data established Mar 2002; then began a steady grind to improve data quality and data-taking efficiency

10 CDF Commissioning Organization

11 A walk through CDF Run 2 history
- Problems getting the detector ready for the "Engineering Run": fall 1999 – Oct 2000
- Late delivery of readout electronics (some not even making it in time)
- Continual development of the DAQ software
- Unreliable DAQ hardware as more front-end boards and crates were integrated into the overall system
- Unreliable slow controls and monitoring system
- Establishment of basic calibration infrastructure (calibration software, database, evaluation software and presenter)
- Lack of a 2nd-level trigger, and a trigger system interface that was singularly difficult to use
- Ran shifts, took calibrations and cosmic-ray runs routinely, but:
  - didn't have very user-friendly software to know if the detector was performing well
  - could only have a limited plan for what to do once things went awry; since only experts could fix things, when the system failed on the owl shift you were dead in the water
- Burned up people, experts and motivation. But the detector took data in the engineering run, and we learned about all the additional work that would be needed before physics-quality data could be recorded

12 2001: The Odyssey continues…
- Fix noise in parts of the readout (which in one case required designing and building small auxiliary boards and installing them on the detector)
- Add shielding for the muon system
- Install the silicon detector (and think about how to fix the cooling-system problem)
- Test the Level 2 system (in particular, design a new backplane to deal with bus-arbitration problems) and the several interface boards
- Finally got a decent amount of beam, then came to a grinding halt with a single-beam incident in which many power supplies failed simultaneously
- Continue working on all sorts of software: monitoring, database, reconstruction, filter algorithms to handle the fact that we had no Level 2…
- Tried hard to take physics data when there was beam, but it was compromised by holes in the detector readout (especially in the tracking detectors)
- Continued burning up people, experts and motivation…

13 What were the hardest problems?
- A dysfunctional accelerator
- Establishing the PLAN and PRIORITIES for the detector as a WHOLE
- Power-supply reliability and performance (e.g. noise)
- Incremental commissioning as readout is added to a specific system; this causes a lot of unnecessary repeated work
- Basic design or fabrication errors: blocked cooling lines in the silicon system; resonant excitation of silicon traces on the front-end boards
- Optimization of the silicon readout and SVT trigger
- Commissioning the TRIGGER:
  - L2 decision crate with custom backplane
  - L2 interface boards built to a common specification but with many different implementations
  - bandwidth (performance not meeting expectations, also a failure to establish a clear specification)
  - L3 software to respond to unexpected conditions (higher trigger rates, no Level 2, etc.)
- Beam-induced problems: SEUs in control units and FPGAs; potentially dangerous currents in the silicon readout; development of beam-monitoring and detector-protection code

14 Lessons learnt
- Organization and management have been key to realizing successful and efficient CDF detector operations
- Components arriving later than scheduled impact the commissioning of the entire detector system
- The trigger and data acquisition systems are essential. Commissioning the 2nd-level trigger while trying to take physics data was painful and cost a modest amount of physics data (note this is not much different from staging)
- Software is the principal way to interact with the detector readout, and probably the only way when the detector is closed. If one has to develop these tools while commissioning the detector, the work proceeds slowly and painfully

15 Commissioning in steps @ LHC (timeline 1/03 → 03/04 → 08/06 → 11/06)
Phase A: system commissioning to ROD level; system commissioning for LVL1 and DAQ; check cable connections; infrastructure commissioning (refrigerators, water cooling, etc.)
Phase B: ROD – local DAQ connections established; calibration runs on local systems; a skeleton TTC system needs to be available
Phase C: system/trigger/DAQ combined commissioning
Phase D: global commissioning; cosmic-ray runs; planning for initial physics runs; initial offline analysis software available; first collisions

16 ATLAS cavern layout

17 Cosmic Muons in ATLAS
[Figure: ATLAS Geant3 simulation of the initial detector and its overburden: rock modelled as silicon, 600 m × 600 m × 200 m deep (2.33 g/cm³); air, concrete, surface building; PX14/16 shielding (2.5 g/cm³); access shafts PX14 (18.0 m inner Ø) and PX16 (12.6 m inner Ø)]

18 Expected rates of cosmic muons

  Condition                                            | PDG (Hz) | ALEPH (Hz)
  E_surface > 10 GeV, ATLAS UX15                       | 5900     | 4900
  Any G3 digit                                         | 2800     | 2300
  Through-going: RPC Y>0 × RPC Y<0 × ID DIGI           | 28       | 24
  Through-going: RPC Y>0 × RPC Y<0 × PIX DIGI          | 0.6      | 0.4
  Pass by origin: |Z_DIGI| < 300 cm, |R_DIGI| < 60 cm  | 12.2     | 10.2
  Pass by origin: |Z_DIGI| < 100 cm, |R_DIGI| < 30 cm  | 2.3      | 1.9
  Pass by origin: |Z_DIGI| < 60 cm, |R_DIGI| < 20 cm   | 0.6      | 0.5
  EM Cal: E_T(cell) > 5 GeV                            | 0.1      |
  EM Cal: E_T(cluster) > 5 GeV                         | 0.2      |
  EM Cal: E_T(total) > 5 GeV                           | 0.4      |
  Tile Cal: E(total) > 20 GeV                          | 1.4      | 1.2
  HEC: E(total) > 20 GeV                               | 0.1      |
  FCAL: E(total) > 20 GeV                              | 0.02     |

Two generators (nominal B fields): one based on the PDG approximation, the other on ALEPH measurements. Agreement at the ~20% level between the two generators, with the PDG one overestimating the low-energy flux.

19 Muon flux measurements
- Measured with a telescope of ~1000 cm² (MC vs data, PX14 and PX16 shafts)
- Data ~ 1.5 × simulation
- The constant 2.33 g/cm³ rock density used in the simulation might explain the difference
- Using the simulation should give conservative rates

20 Cosmic Muons in ATLAS in 0.01 s…

21 Typical cosmic events…
- One track reconstructed in the muon chambers
- Two tracks reconstructed in the Inner Detector
- Will happen every ~10 s

22 Muon momentum distributions

23 Beam halo vs beam-gas
Beam-halo:
- Simulation of the accelerator background performed by V. Talanov et al., based on MARS; machine optics V6.4; nominal high luminosity (beam current 530 mA)
- Scoring plane at the cavern entrance, before the ATLAS shielding (|z| = 23 m from the IP)
- Particles are then transported by the full ATLAS (G3) simulation
Beam-gas:
- p (7 TeV) on p (at rest)
- vertices uniformly distributed over ±23 m
- σ(pH, pC, pO, …) ≈ σ(pp) × A^0.7 (inelastic only)
- vacuum estimate: ~3×10⁻⁸ torr (~10¹⁵ molecules/m³)

24 Example of beam-halo muons

25 Beam-halo expected rates

  Rates at the cavern entrance:

  Particle         | Standard operation | Single-beam period
  All              | 1750 kHz           | ~10 kHz
  h±               | 1515 kHz           | ~8 kHz
  n                | 130 kHz            | ~0.7 kHz
  μ                | 105 kHz            | ~0.5 kHz
  μ, E ≥ 10 GeV    | 16 kHz             | ~100 Hz
  μ, E ≥ 100 GeV   | 1 kHz              | ~5 Hz
  μ, E ≥ 1 TeV     | 10 Hz              | ~0.5 Hz

26 Beam-halo expected rates (very preliminary)
- Rates for the initial period scaled from the high-luminosity rates by assuming 3×10¹⁰ p per bunch and 43 bunches (~200 times lower current)
- Expected optics and vacuum for the commissioning period not included yet (need input from machine people); these results are very preliminary
- Total rates are for two months of single beam with 30% data-taking efficiency
- Simple definition of "useful tracks": 2-3 segments in the MDT, 3-4 disks in the ID end-cap

  Detector             | Rate (B off) | Total (B off)   | Rate (B on) | Total (B on)
  MDT barrel           | 15 Hz        | 2.5×10⁷         | 72 Hz       | 1.5×10⁸
  MDT end-cap          | 145 Hz       | 2.5×10⁸         | 135 Hz      | 2.5×10⁸
  Pixel/SCT            | 1.8/17 Hz    | 3×10⁶ / 3×10⁷   | 2/19 Hz     | 3×10⁶ / 3×10⁷
  EM, E > 5 GeV        | 2 Hz         | 3.5×10⁶         | 1 Hz        | 1.7×10⁶
  Tile/HEC, E > 20 GeV | 1.7/1.2 Hz   | 2.9/2.1×10⁶     | 1.6/0.9 Hz  | 2.8/1.6×10⁶
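The "Total" column follows from rate × effective live time. A quick sketch of that arithmetic (two months taken as 60 days, which is my assumption; rates and efficiency from the slide):

```python
# Effective live time for "2 months of single beam at 30% data-taking efficiency".
seconds = 2 * 30 * 24 * 3600        # ~5.2e6 s in two months (assuming 30-day months)
live = 0.30 * seconds               # ~1.6e6 s of effective data taking
# Example: MDT end-cap at 145 Hz (B off) over the period:
mdt_endcap_total = 145 * live       # ~2.3e8, matching the quoted 2.5e8 to rounding
print(f"{mdt_endcap_total:.1e}")
```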

27 A typical beam-gas event
Beam-gas collisions are essentially boosted minimum-bias events → low-pT particles.
Rate: ~2500 interactions/m/s

28 Beam-gas rates in ATLAS

  Vertex z-position                 | Rate (Hz) | Total (2 months, ε = 30%)
  ±23 m                             | 1.2×10⁵   | 2.1×10¹¹
  ±3 m                              | 1.6×10⁴   | 2.4×10¹⁰
  ±20 cm                            | 1.1×10³   | 1.6×10⁹
  pT > 1 GeV, inside ±3 m           | 1.0×10³   | 1.5×10⁹
  μ with pT > 1 GeV, inside ±3 m    | 0.3×10³   | 5.6×10⁸

[Figures: E_T of charged particles; E_T spectrum in the ECAL; E spectrum in the FCAL]

29 Trigger issues
How to trigger on cosmics, beam-halo and beam-gas?
- Cosmics: RPC. Beam-halo: TGC. Beam-gas: ???…
- From preliminary full simulations of LVL1:
  - Cosmics: ~100 Hz pass the low-pT RPC LVL1
  - Beam-halo: ~1 Hz passes the low-pT TGC LVL1
  - Small enough not to disturb LHC physics data taking
  - High enough to give useful samples for detector commissioning (e.g. > 10⁸ cosmic events in 3 months if ε = 50%); these triggered muons cross the interaction region
- Beam-gas trigger:
  - soft particles: not obvious…
  - scintillator slabs in front of the FCAL; useful also for beam-halo at low R, and as a minimum-bias trigger during the initial collision period

30 Increase cosmics trigger acceptance
- An exciting last opportunity! After that, one will only be asked to reduce trigger rates…
- Muon system:
  - requirements for cosmics and beam-halo triggers were included in the design, e.g. the trigger ASICs include programmable delays to compensate for the TOF of down-going cosmic-ray muons in the barrel
  - projectivity constraints result from the cabling between planes of trigger chambers
  - lots of flexibility in the system: timing adjustments, open LVL1 roads, relaxed coincidence requests
  - at LVL2, modified trigger algorithms can help in selecting non-pointing muons
- Tile Calorimeter system:
  - RPC commissioned end 2005; December 2004: Tilecal in the pit equipped with electronics, so commissioning with cosmics can start
  - need a self-triggering scheme while waiting for the RPC: consider back-to-back trigger towers (Δη×Δφ = 0.1×0.1, full calorimeter depth) and require E > 1.5 GeV in both towers
  - expected rate from full simulation: ~130 μ/hr for 16 top + 16 bottom modules

31 TileCal-based cosmic trigger
[Figure: η-distribution of muons passing the Tilecal trigger]

32 Spectrometer commissioning
- All sub-detectors will profit from the cosmics data-taking period; sub-detector commissioning starts with installation (Phase A)
- A case study for the Muon Spectrometer:
  - cosmic rate high enough for polar angles up to θ = 75°: ~1 Hz/sterad for muons going through the ID (almost projective) with p_μ > 10 GeV
  - study of all barrel sectors (probably except sectors 1-9 with vertical chambers) and part of the forward chambers (EI/CSC-EM or EM-EO tracks, probably no EI-EM-EO tracks)
  - first test of the full reconstruction (field off / reduced / full field)
  - map dead channels, chase/replace faulty FE cards
  - tube efficiency, r-t relation (autocalibration): 1000 (no field) to 10000 (with field) muons/tube => ~10-100 days
  - check/calibration of the (barrel only?) alignment system with straight tracks (< 30 μm level): 2000 muons/chamber, ~10 hours
  - alignment (Barrel ↔ End-cap; Spectrometer ↔ ID)
- A large part of the muon spectrometer commissioning can be done with cosmics, provided a proper trigger is available

33 Commissioning ECAL
- S(μ)/N ≈ 7 in the barrel middle compartment (test-beam data)
- Rate of cosmic muons (with |z| < 30 cm and giving E_cell > 100 MeV) per cell of the ECAL middle compartment vs η, φ
- Rate needed to collect ~100 μ/cell over 3 months, assuming 50% data-taking efficiency
- 100 muons per cell over |η| <= 1 and 70% of the φ coverage
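The "~100 muons per cell over 3 months at 50% efficiency" requirement translates into a minimum usable rate per cell. A small sketch of that conversion (the 90-day figure for "3 months" is my assumption):

```python
# Minimum cosmic-muon rate per ECAL cell implied by the target statistics.
live_s = 90 * 24 * 3600 * 0.5        # ~3.9e6 s of effective data taking in 3 months
rate_per_cell_hz = 100 / live_s      # ~2.6e-5 Hz, i.e. a few useful muons per cell per day
print(f"{rate_per_cell_hz:.1e}")
```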

34 ECAL with cosmics
With 100 muons/cell/compartment:
- check the calorimeter timing to < 1 ns (input to optimal filtering in the ROD): σ_t = 1.62 ns / E(GeV) ⊕ 19 ps (from calibration); cosmic muons deposit E ~ 300 MeV, so σ_t ~ 6 ns per muon (test-beam data)
- check the calorimeter position in η/φ with respect to the other sub-detectors to < 1 mm
- check the response uniformity vs η: ~0.5% precision could be achieved; 1% precision measured with ~1000 μ; with ~5000 μ, 0.5% precision (~100 μ/cell integrated over φ)
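Reading the parametrisation above as a quadrature sum (my interpretation of the ⊕ on the slide), the ~6 ns single-muon timing resolution follows directly:

```python
# Timing resolution per muon from sigma_t = 1.62 ns / E(GeV) (+) 19 ps,
# with the two terms added in quadrature (assumed combination rule).
import math

def sigma_t_ns(e_gev: float) -> float:
    return math.hypot(1.62 / e_gev, 0.019)

# Cosmic muons deposit ~300 MeV per cell -> ~5.4 ns, consistent with the
# "~6 ns" quoted on the slide; averaging 100 muons brings this below 1 ns.
print(round(sigma_t_ns(0.3), 1))
```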

35 Commissioning Inner Detectors
- Cosmics: O(1 Hz) of tracks in Pixels+SCT+TRT
  - useful statistics for debugging the readout, maps of dead modules, etc.
  - check the relative position of Pixels/SCT/TRT, and of the ID with respect to the ECAL and the Muon Spectrometer
  - first alignment studies: may achieve a statistical precision of ~10 μm in parts of Pixels/SCT
  - first calibration of the R-t relation in the straws
- Beam-gas: ~25 Hz of reconstructed tracks with pT > 1 GeV and |z| < 20 cm
  - > 10⁷ tracks (similar to LHC events) in 2 months
  - enough statistics for alignment in a "relaxed" environment; exceed the initial survey precision of 10-100 μm
[Figures: angular distributions of beam-gas tracks and of reconstructed cosmics]

36 Control Room(s)
- Points for consideration:
  - operation organization inside the experiment
  - CR hardware and software configuration & implementation
  - operations to be exported outside Point 1
- Operation of the detector:
  - control and monitoring of the experiment with and without beam
  - control & monitoring of the experimental area
  - access and general safety
  - interaction with Technical Coordination and other CRs at CERN
- Collaborative aspects: the "heart of the experiment"; to become (after detector commissioning) experiment-centred
- How to organize operations in the context of the control room: continuity of responsibility, authority, availability of people
- Outreach aspects: visitor "gallery", experiment operations simulator

37 For example…

38 The Trigger
- Anything done during commissioning and cosmic periods needs a properly functioning trigger
- However, operation modes differ in the different data-acquisition environments; depending on the signal source, sub-detectors need different timing
- How to set up the trigger itself to be ready for the collision-mode period?
- How will detectors be timed in, to make sure events are properly assembled?
- How can we build a trigger selection that is robust against variable background levels, data corruption and hardware failures?
- How can we have a redundant trigger and data-acquisition structure to protect against data losses and dead time?

39 CMS Architecture

  Collision rate               | 40 MHz
  Level-1 maximum trigger rate | 100 kHz
  Average event size           | ≈ 1 Mbyte
  Event flow control           | ≈ 10⁶ Msg/s
  No. of in-out units          | 512
  Readout network bandwidth    | ≈ 1 Terabit/s
  Event filter computing power | ≈ 10⁶ SI95
  Data production              | ≈ TByte/day
  No. of PC motherboards       | ≈ thousands

40 ATLAS Architecture
Trigger/DAQ dataflow (rates, latencies, bandwidths):
- 40 MHz bunch-crossing rate into the FE pipelines (Calo, MuTrCh, other detectors)
- LVL1: 2.5 μs latency, accept rate 75 kHz; Read-Out Drivers (ROD) → ROD-ROB connections → Read-Out Buffers (ROB) in Read-Out Sub-systems (ROS); ~120 GB/s
- LVL2: RoI Builder (ROIB), LVL2 Supervisor (L2SV), LVL2 Processing Units (L2P) on the LVL2 network (L2N); RoI data = 2% of the event; ~10 ms latency; LVL2 accept = 2.5 kHz (~2 kHz dataflow); 3+3 GB/s over the Event Building network (EBN) via the Dataflow Manager (DFM) to the Sub-Farm Input (SFI)
- Event Filter: EF Processors (EFP) on the EF network (EFN); ~seconds per event; EF accept ~0.2 kHz (~200 Hz); ~300 MB/s to the Sub-Farm Output (SFO)
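The bandwidth figures in the diagram follow from the trigger rates and the 1.5 MB event size quoted earlier in the talk. A minimal sketch of that arithmetic (illustrative only; the slide's 120 GB/s presumably includes overheads):

```python
# Bandwidths implied at each ATLAS trigger level for a 1.5 MB event.
event_MB = 1.5
lvl1_GBps = 75e3 * event_MB / 1e3   # 75 kHz x 1.5 MB ~ 112 GB/s ("120 GB/s" quoted)
lvl2_GBps = 2e3 * event_MB / 1e3    # ~2 kHz into event building ~ 3 GB/s
ef_MBps = 200 * event_MB            # ~200 Hz to mass storage ~ 300 MB/s
print(lvl1_GBps, lvl2_GBps, ef_MBps)
```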

41 Trigger Commissioning
- Commissioning of the trigger involves:
  - LVL1 commissioning (timing-in, data coherence, calibration, …)
  - DAQ commissioning (event-fragment coherence, dataflow robustness, …)
  - HLT commissioning (event-data coherence, algorithms, calibration, …)
- … and spans a broad range of aspects: hardware commissioning, software commissioning, physics (algorithm) commissioning
- Follows closely the detector commissioning steps: cosmic running / single-beam running / collisions
- Analyze as many aspects as early as possible: use test pulses, pre-load simulated data, …
- Time-in LVL1 (CTP) with respect to the LHC machine
- Time-in detectors wrt LVL1 and the LHC machine (L1A, BCID); includes LVL1 calorimeter/muon triggers and a dedicated minimum-bias trigger
- Get the DAQ globally running, combining all detectors; ensure consistency at the digital level
- Get monitoring / event display at the EF working, HLT running
- Debug HLT algorithms, calibrate and tune cuts
- Check coherency of detector mappings (DAQ, LVL1, HLT, offline)

42 Timing-in: a simplified picture
- Many events sit in the FE pipelines during the LVL1 decision process; the event to be accepted is identified by the arrival of the LVL1 accept (L1A)
- The event ID is defined by internal counters, reset by a global signal, e.g. BCR
- Timing-in: proper accounting for / adjusting of all delays, such that the correct event is taken and is assigned the right ID
- After L1A, the (HLT/DAQ) system is ID-based: no more timing-in necessary
- Depending on the signal source, sub-detectors need different timing adjustments:
  - test pulses / cosmic rays: gain experience with the timing system; sufficient for most timing adjustments specific to sub-detectors (sampling time, data alignment)
  - single-beam operation: use the beam-beam set-up predicted from test pulses and from simulation of the time-of-flight of particles; dedicated beam-halo runs
  - collision mode: tune timing for beam-beam; monitoring to make fine adjustments if needed

43 HLT commissioning
- Depends strongly on stable detector behaviour
- Technical aspects (ensure data coherency):
  - LVL2: data flow from LVL1 via the RoI Builder to the LVL2 supervisor; append the LVL2 result to the event record; integrate the selection software
  - EF: integrate the farms and selection software; append the EF result
- Activate the HLT selection:
  - run HLT algorithms in transparent mode: produce a decision (and debugging information), with no rejection enabled
  - offline analysis of HLT performance
  - enable rejection; check that the right events are rejected
- Single-beam running: the HLT selection could be used for clean-up of events; assess functional performance
- Trigger for beam-collision mode: share time between optimization of trigger behaviour and collection of events for detector and "physics" studies

44 Commissioning the Muon trigger
Sub-systems involved: muon trigger; muon barrel off-detector; muon end-cap off-detector; Muon-to-Central-Trigger-Processor Interface (MUCTPI); muon barrel on-detector; muon end-cap on/near-detector.
1. Install Local Trigger Processor, Read-Out Driver Busy and TTC systems for the muon trigger
2. Install Detector Control System infrastructure for the muon trigger
3. Install MUCTPI; repeat stand-alone tests; test together with the CTP
4. Install off-detector electronics and repeat stand-alone tests
5. Test Sector Logic together with MUCTPI
6. Test RODs together with the DAQ
7. Install and test LV power for on/near-detector electronics, in collaboration with the RPC detector group
9. Install and test on-detector electronics (installed with the chambers)
10. Perform timing and threshold set-up using the test-pulse system
11. Check the full electronic chain from the RPC detectors up to the CTP using the available test-pulse / test-pattern facilities
12. Configure the barrel system to provide a cosmic-ray trigger using the free-running 40 MHz clock provided via the CTP

45 Detector Calibration
- ECAL example: in-situ calibration with Z → e⁺e⁻ events
  - rate ~1 Hz, ~no background; allows standalone ECAL calibration
- c_tot = c_L ⊕ c_LR
  - determine c_LR, the long-range response non-uniformity of the 400 regions (module-to-module variations, different upstream material, etc.)
- From full simulation studies, assuming:
  - c_L = 0.5%
  - initial c_LR = 1.5% (a module-to-module average response variation < 0.5% from the test beam implies very poor knowledge of the upstream material, to a factor ~2)
- ~250 e± per region needed to achieve c_LR ≈ 0.4%
  - ~10⁵ Z → ee events needed (~1 day of data taking at 10³³)
- c_tot = 0.5% ⊕ 0.4% ≈ 0.7%
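The final combination above reads as a quadrature sum, which reproduces the quoted ~0.7% (sketch of the arithmetic; constants from the slide, the electron bookkeeping is illustrative):

```python
# Quadrature combination c_tot = c_L (+) c_LR with the slide's numbers.
import math

c_L, c_LR = 0.5, 0.4                 # percent: local term and calibrated long-range term
c_tot = math.hypot(c_L, c_LR)        # sqrt(0.5^2 + 0.4^2) ~ 0.64%, quoted as "~0.7%"

# Statistics: ~250 electrons x 400 regions = 1e5 electrons; the ~1e5 Z->ee
# events quoted on the slide additionally account for acceptance/selection.
n_electrons = 250 * 400
print(round(c_tot, 2), n_electrons)
```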

46 Startup: ATLAS staging
Staged, in part or fully:
- one of the three pixel layers (not the B-layer)
- outermost TRT wheels, half of the CSC layers
- MDT chambers in the transition region (EES, EEL)
- cryostat-gap scintillators, part of the high-luminosity shielding
- reduction of Read-Out Drivers for the LAr calorimeter
- a large part of the HLT/DAQ processors

47 Startup: CMS staging
- Infrastructure: staging of cooling and ventilation
- Muon system:
  - staging of RPC in the endcap (η > 1.6)
  - no CSC in the 4th-station inner ring
  - reduced HV granularity for RPC
- Trigger & DAQ:
  - staging of 4 Tridas slices
  - start with 50% of the L1 accept rate
[Figure legend (muon trigger efficiency, min-bias sample, pT > 6 GeV): "tight", ME4/2 out, ε = 67%; "tight", ME4/2 in, ε = 74%; "loose" (unchanged), ε = 92%; GMT (unchanged); CSC loose, no ME4/2; CSC loose]

48 ATLAS 2004 combined testbeam
… a true ATLAS vertical slice, à la commissioning Stage 3…

49 What next?
- ATLAS and CMS are getting ready for commissioning
- Profit from the experience of previous detectors at hadron colliders
- Extract as much information as possible during the installation and cosmic periods
- Commissioning is a complex activity with many players
- Expect a major breakthrough already from the combined testbeam activity (2004)
- The huge number of readout channels, the constraints coming from the accelerator environment, and the complexity of the apparatus provide an unprecedented challenge for detector operations and trigger strategies
- Enormous work in simulation, data challenges, grid productions…
- Only real data will allow the final implementation of the selection procedures
- Eagerly awaiting that moment…

