Presentation transcript:

Slide 1: Computing and communication at LHC
Outline: Event selection and readout; Online networks and architectures; Online event filter; Technologies and trends

Slide 2: The Large Hadron Collider (LHC)
Two superconducting magnet rings in the LEP tunnel. [Diagram labels: the LEP experiments ALEPH, DELPHI, L3 and OPAL; the PS and SPS injectors; the LEP/LHC ring.]
Experiments at LHC:
- ATLAS (A Toroidal LHC ApparatuS): study of proton-proton collisions
- CMS (Compact Muon Solenoid): study of proton-proton collisions
- ALICE (A Large Ion Collider Experiment): study of ion-ion collisions
- LHCb: study of CP violation in B-meson decays at the LHC collider

Slide 3: The LHC challenges: Summary

Slide 4: Measurement and event selection

Slide 5: Event selection: The trigger system

Slide 6: Trigger levels at LHC (the first second)

Slide 7: Particle identification

Slide 8: CMS Level-1: calorimeters and muons

Slide 9: Level-1 trigger systems

Slide 10: Level-1 trigger summary
- Time needed for a decision: Δt_dec ≈ 2-3 µs; bunch-crossing time: Δt_evt ≈ 25 ns
- Need pipelines to hold the data while the decision is formed (see the sketch below)
- Need fast response (e.g. dedicated detectors)
- Backgrounds are huge: rejection factor ≈ 10,000
- Algorithms run on local, coarse data; only calorimeter and muon information
- Special-purpose hardware (ASICs, etc.)
- Rates are a steep function of thresholds, which ultimately determines the physics
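Those two numbers fix the pipeline depth: at one sample per 25 ns crossing, a 2-3 µs decision latency requires a buffer on the order of 100 cells deep. Below is a minimal, illustrative sketch of such a fixed-depth pipeline; the class name and interface are invented for illustration and are not taken from any experiment's code.

```python
from collections import deque

# Figures assumed from the slide: 25 ns bunch crossing, ~2.5 us Level-1 latency.
BUNCH_CROSSING_NS = 25
TRIGGER_LATENCY_NS = 2500
PIPELINE_DEPTH = TRIGGER_LATENCY_NS // BUNCH_CROSSING_NS  # 100 cells

class PipelineBuffer:
    """Fixed-depth FIFO holding one detector sample per bunch crossing
    while the Level-1 decision is formed (illustrative only)."""

    def __init__(self, depth=PIPELINE_DEPTH):
        self.cells = deque(maxlen=depth)

    def clock(self, sample, l1_accept):
        """Advance one bunch crossing: push the new sample and, if the
        Level-1 trigger fired for the crossing now leaving the pipeline,
        forward that sample to readout; otherwise it is dropped."""
        leaving = self.cells[0] if len(self.cells) == self.cells.maxlen else None
        self.cells.append(sample)
        return leaving if l1_accept else None
```

The point of the structure is that no data ever waits for the trigger: samples march through at the crossing rate, and only the cell emerging after exactly the trigger latency is either read out or discarded.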

Slide 11: Bunch crossing times: LEP, Tevatron & LHC

Slide 12: Trigger and readout structure at LHC (ATLAS and CMS)

Slide 13: CMS front-end readouts

Slide 14: LHC experiments: trigger and DAQ summary

Slide 15: Evolution of DAQ technologies and architectures

Slide 16: LHC trigger and data acquisition systems
LHC DAQ: a computing-and-communication network (diagrams: ALICE, ATLAS, LHCb). A single network cannot satisfy all the LHC requirements at once, so the present LHC DAQ designs are implemented as multiple, specialized networks.

Slide 17: CMS trigger and data acquisition

Slide 18: Structure option: two physical stages (CMS)

Slide 19: Structure option: three physical stages (ATLAS)

Slide 20: Data communication and processing at LHC
[Diagram figures: 40 MHz collision rate; … million detector channels (charge, time, pattern; energy, tracks); 100 kHz Level-1 trigger; 1 Megabyte event data; 500 readout memories; 200 Gigabyte buffers; 3 Gigacell buffers; 1 Terabit/s event-building network (50,000 data channels); Gigabit/s service LAN; 5 TeraIPS event filter; Petabyte archive; computing services.]
Event builder: a large switching network (… ports) with a total throughput of approximately 500 Gbit/s interconnects the sources (Readout Dual-Port Memories, RDPMs) and the destinations (switch-to-farm interfaces). The Event Manager collects the status and requests of the event filters and distributes event-building commands (read/clear) to the RDPMs; a simplified sketch of this assembly step follows below.
Event filter: a set of high-performance commercial processors organized into many farms, convenient for online and offline applications. The farm architecture is such that a single CPU processes one event.
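To make the assembly step concrete, here is a heavily simplified sketch of event building: fragments from the 500 readout memories arrive independently, and a complete event is released to a filter node only when the last fragment is in. All names are hypothetical; a real event builder also handles flow control, time-outs and the Event Manager's read/clear traffic, which are omitted here.

```python
# Simplified event-building sketch (hypothetical names, not experiment code).
N_SOURCES = 500  # readout memories, per the slide

class EventBuilder:
    def __init__(self, n_sources=N_SOURCES):
        self.n_sources = n_sources
        self.partial = {}  # event_number -> {source_id: fragment}

    def add_fragment(self, event_number, source_id, fragment):
        """Store one fragment; return the fully built event (and free its
        buffers, the 'clear' step) once all sources have contributed."""
        frags = self.partial.setdefault(event_number, {})
        frags[source_id] = fragment
        if len(frags) == self.n_sources:
            return self.partial.pop(event_number)
        return None
```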

Slide 21: Data communication at LHC

Slide 22: Event building

Slide 23: Event builder switch technologies

Slide 24: Reconstruction at LHC: CMS central tracking

Slide 25: Parallel processing by farms

Slide 26: CMS trigger levels

Slide 27: Online event selection (rejection)
Rejected: 99.999% of events; accepted: 0.001% (a quick cross-check of the implied rates follows below).
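Taking the figures at face value, with the 40 MHz crossing rate and 100 kHz Level-1 output from slide 20, the arithmetic works out as follows:

```python
# Back-of-the-envelope rates implied by the slides (figures as quoted there).
collision_rate_hz = 40e6      # 40 MHz bunch-crossing rate (slide 20)
level1_rate_hz = 100e3        # 100 kHz Level-1 output (slide 20)
accepted_fraction = 1e-5      # 0.001% of events kept online (this slide)

storage_rate_hz = collision_rate_hz * accepted_fraction
farm_rejection = level1_rate_hz / storage_rate_hz

print(f"events kept for storage: {storage_rate_hz:.0f} Hz")          # ~400 Hz
print(f"rejection still needed in the farm: {farm_rejection:.0f}x")  # ~250x
```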

Slide 28: CMS final system: Summary

Slide 29: Trigger and data acquisition trends

Slide 30: Technology ansatz (Moore's law)
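The "ansatz" amounts to assuming exponential growth in commodity performance, a doubling roughly every 18 months, so a design sized for turn-on can count on hardware that is not yet affordable at proposal time. A toy projection follows; the 1.5-year doubling time is the usual rule of thumb, not a figure from the slide.

```python
# Toy Moore's-law projection: capability doubling every ~18 months (assumed).
def growth_factor(years, doubling_time_years=1.5):
    return 2 ** (years / doubling_time_years)

# e.g. the nine years from a mid-90s technical proposal to 2005 turn-on:
print(f"expected commodity gain over 9 years: ~{growth_factor(9):.0f}x")  # ~64x
```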

Slide 31: Technology trends (Moore's law)

Slide 32: Internet growth and the start of the data era
- … million new users expected online by 2001
- Internet traffic doubled every 100 days
- 5,000 domain names added every day
- 1999: the last year of the voice; data traffic takes over
- Need more bandwidth -> Terabit switches at LHC
[Chart: voice vs. data traffic load over time, with markers for the CMS experiment technical proposal, the CMS Data Acquisition technical proposal and CMS Data Acquisition commissioning.]

Slide 33: Performance required by an LHC experiment (in 2005) versus high-technology expectations (Accelerated Strategic Computing Initiative, ASCI 97)

Slide 34: ASCI PathForward Program Overview

Slide 35: NEC 32 Tera, Compaq HPCC

Slide 36: Processor farms: the supercomputer of the '90s

Slide 37: After commodity farms, what next? Fusion of data communication, data processing and data archiving into global resources: a Grid approach?

Slide 38: CMS data flow and on(off)line computing
- Raw data: 1000 Gbit/s into a 5 TeraIPS online farm
- Events: 10 Gbit/s into 10 TeraIPS of processing
- Controls: 1 Gbit/s, including remote control rooms
- To regional centers: 622 Mbit/s
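The raw-data figure is consistent with slide 20: 1 Megabyte events at the 100 kHz Level-1 rate give about 0.8 Tbit/s, close to the quoted 1000 Gbit/s. A one-line check:

```python
# Consistency check: event size x Level-1 rate vs. the quoted raw-data flow.
event_size_bits = 1e6 * 8   # 1 Megabyte per event (slide 20)
level1_rate_hz = 100e3      # 100 kHz Level-1 accept rate (slide 20)
print(f"{event_size_bits * level1_rate_hz / 1e9:.0f} Gbit/s")  # ~800 Gbit/s
```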

Slide 39: Event selection and computing stages

Slide 40: Computing and communication perspectives at LHC

Slide 41: Short history of computing and new frontiers
- The origin: counting; the abacus
- 1600: numeric techniques, logarithms, the calculator engine
- 1800: punched cards, automata; Babbage's difference engine, driven by a program
- 1900: Hollerith's electromechanical punched-card tabulator; the vacuum tube
- 1940: the first electronic, digital, stored-program computers
- First generation: commercial computers (UNIVAC); FORTRAN; the first transistor
- Second generation: general-purpose computers (IBM, DEC); COBOL, BASIC; integrated circuits
- Third generation: Arpanet; the first microprocessor chip; PASCAL, C and UNIX
- Fourth generation:
  - 1975: minicomputers and microcomputers; window and mouse; the Cray supercomputer
  - 1980: the personal computer (Apple, Microsoft); vector processors
  - 1984: parallel computing; farms; OO/C++
  - Massively parallel computing and massively parallel storage; LANs and WANs; the Internet; the Web
- 1995: commodities and the network/bandwidth explosion; network computing; high-performance computing (the ASCI initiative)
- 2000-present: fusion of computing, communication and archiving; the Grid…