Experimental Area Controls and Data-Acquisition
Gunther Haller, Plenary: X-Ray Controls & Data, LCLS FAC Meeting, 29 October 2007


1 Experimental Area Controls and Data-Acquisition for LCLS and LUSI Instruments
Plenary Session Presentation
Gunther Haller, Research Engineering Group, SLAC Particle Physics and Astrophysics Division
Photon Control and Data Systems (PCDS)
haller@slac.stanford.edu
LCLS FAC Meeting, 29 October 2007

2 Outline
- Responses to last review's comments
- Control and data systems overview
- Controls status
- Data systems status

3 Instruments
- NEH Hutch 2: AMO Instrument
- NEH Hutch 3: X-Ray Pump-Probe (XPP) Instrument
- FEH Hutch 1: X-Ray Photon Correlation Spectroscopy (XCS) Instrument
- FEH Hutch 2: Coherent X-Ray Imaging (CXI) Instrument
[Diagram: x-ray beam direction through the Near Experimental Hall (NEH), tunnel, and Far Experimental Hall (FEH)]
Same data-system design for the CXI, XPP, and XCS instruments, XCS transport, and the AMO instrument, due to commonality in requirements, design, implementation, and integration.

4 Tasks
The 120-Hz per-pulse data collection, high data rate, large data volume, and precise timing-control requirements of LCLS experiments go far beyond those at conventional synchrotron sources. They require a degree of complexity and sophistication in control/DAQ system design, implementation, and integration that is not feasible for individual experimental teams.
The LCLS/LUSI control/data system must:
- Provide controls to all instruments
- Provide timing measurement
- Provide data-acquisition capabilities
- Provide data storage and management capabilities
- Provide certain standard data-analysis capabilities

5 Comments from Last Review
Comment 1: There are many new types of diagnostics for the x-ray beam line which are not present in the e-beam line. What are the plans for implementing those?
- Covered in this and the parallel-session presentations
Comment 2: The DAQ is considerably different from what the accelerator group usually works on. Recommend recruiting people with DAQ and analysis experience from large HEP experiments.
- See next slide

6 Photon Beam-Line Control and Data Systems (PCDS) Team
- G. Haller (e.g. SLD, BaBar, GLAST)
- Bob Sass (Schedule/Budget/Archive Software)
- DAQ: Amedeo Perazzo (Online Manager) (e.g. BaBar); Chris O'Grady, Mike Huffer, Matt Weaver (all either BaBar or GLAST)
- Controls: Dave Nelson (e.g. SLD, BaBar, GLAST); Sheng Peng, Carl Bouldin, Mike Przybylski, Ray Rodriquez
- Offline Data Management: Richard Mount, Steffen Luitz (e.g. BaBar)
- Niels van Bakel (Detector Physicist, LUSI)
- PPS, MPS: tasks are allocated to controls-division engineers where appropriate

7 System Block-Diagram
LCLS-LUSI Control & Data Systems:
- Control subsystem for operation & controls: vacuum, motion, power supplies, etc.
- Data subsystem for acquisition & management and online processing:
  - Science data acquisition: detector readout (high peak rate / large volume)
  - Data management/processing: online data movement, processing, feature extraction, filtering
Interfaces:
- LCLS control system: timing & triggering, pulse-by-pulse information exchange
- SLAC Scientific Computing & Computing Services (SCCS): data farm (PB tape drive), computer cluster (2000 processor nodes), data archiving/retrieval, offline analysis/rendering
- Offline data management (SCCS computing center): off-line processing, mass storage, off-site transfer/access

8 Example: AMO Instrument Controls & DAQ
[Diagram: AMO instrument controls & DAQ layout]
For the XPP, XCS, and CXI beam lines, see the respective LUSI sub-system presentations.

9 Controls – List of Components
- Optics: KB mirrors for focusing, refractive lens for focusing, monochromator, collimator, slits, attenuators, split-delay, pulse picker, compressor
- Sample environment: particle injector, cryostat, cryo-EM stage, precision stages
- Beam diagnostics: intensity monitors, beam-position monitor, wavefront sensor
- Measurement instruments: diffractometer, e- and ion TOF, mass spectrometer, EO timing measurement
- Laser systems: pump laser and diagnostics, EO laser, molecular-alignment laser
- Vacuum systems: turbo pumps, ion pumps
- 2D detectors: Cornell detector for CXI, BNL detector for XPP, BNL detector for XCS

10 Status: Controls
- PCDS (Photon beam-line Control and Data Systems) has a preliminary list of devices to be controlled for the current AMO/XCS/XPP/CXI designs
- Some controls are the same as, or similar to, the e-beam line:
  - EPICS for control, as in the rest of LCLS
  - VME-based IOCs running RTEMS, as for the e-beam line
  - A large part of the motion & vacuum controllers are the same as on the e-beam line
- In general, use as far as possible controllers already in use at SLAC (and for which software is available); the PCDS group is integrating those into test setups to demonstrate and further develop/modify the software
- Additional development work is required for new items; test setups are assembled for controllers which have been selected
- Infrastructure:
  - NEH rack and cable-tray layout have been designed, including cooling and power, for the hutches, control rooms, corridor, and basement server room
  - Fiber paths from sector 20, MCC, and SCCS to the x-ray beam line are close to being established
- Network: many discussions about the security plan; a draft document is in progress
- Femtosecond timing system: fibers are being installed in the tunnel & gallery to prepare for a test of the LBL timing system at SLAC (test ~January)

11 Data Sub-System
Data rate/volume of the CXI experiment (comparable to other LUSI experiments):
- LCLS pulse rep rate: 120 Hz
- Detector size: 1.2 Megapixel
- Intensity depth: 14 bit
- Success rate: 30%
- Ave. data rate: 0.6 Gigabit/s
- Peak data rate: 1.9 Gigabit/s
- Daily duty cycle: 50%
- Accumulation for 1 station: 3.1 TB/day
The challenge is to perform data correction and image processing while keeping up with the continuous incoming data stream:
- Tradeoff between tasks implemented online versus offline
- Important to produce science output without piling up more and more raw images
- Use a common DAQ among experiments, since requirements are similar
- Difference from conventional light-source experiments: high peak rate & large volume, comparable to high-energy-physics experiments such as BaBar at SLAC
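The figures in the table above follow from simple arithmetic. A minimal sanity-check sketch, using only the slide's numbers; it assumes the detector is 1.2e6 pixels and that the slide's "Gigabit" is the binary (2**30-bit) unit, which is what reproduces the quoted 1.9 Gbit/s peak rate:

```python
# Back-of-envelope check of the CXI data-rate table above.
REP_RATE_HZ = 120        # LCLS pulse rep rate
PIXELS = 1.2e6           # detector size (assumed decimal megapixels)
BITS_PER_PIXEL = 14      # intensity depth
SUCCESS_RATE = 0.30      # fraction of pulses yielding a useful image
DUTY_CYCLE = 0.50        # fraction of the day the station takes data

peak_bps = REP_RATE_HZ * PIXELS * BITS_PER_PIXEL        # bits/s
peak_gib = peak_bps / 2**30                             # ~1.9 "Gigabit/s"
avg_gib = peak_gib * SUCCESS_RATE                       # ~0.6
daily_bytes = peak_bps * SUCCESS_RATE * DUTY_CYCLE * 86400 / 8

print(f"peak ~{peak_gib:.1f} Gbit/s, avg ~{avg_gib:.1f} Gbit/s, "
      f"~{daily_bytes / 1e12:.1f} TB/day")
```

The daily accumulation comes out near 3.3 TB/day, close to the table's 3.1 TB/day; the small gap depends on the exact unit conventions, which the slide does not state.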

12 Real-Time Processing – Sorting in CXI
- Diffraction from a single molecule: each single LCLS pulse yields a noisy diffraction pattern of unknown orientation
- Combine 10^5 to 10^7 measurements into a 3D dataset: classify/sort, average, align
- Reconstruct by oversampling phase retrieval (Miao, Hodgson, Sayre, PNAS 98 (2001))
- The highest achievable resolution is limited by the ability to group patterns of similar orientation (Gösta Huldt, Abraham Szöke, Janos Hajdu, J. Struct. Biol. 2003; 02-ERD-047)
- Real-time?

13 Data Acquisition and Processing Overview
- Acquire 120-Hz images (e.g. from pixel detectors):
  - Without beam (dedicated calibration runs, or in between the ~8-ms-spaced beams): images used to calculate calibration constants
  - With beam: images which need to be corrected
- Event-building with 120-Hz accelerator timing & beam-quality data
- Filtering of images (e.g. vetoing)
- Real-time pixel correction using calibration constants
- Processing of corrected images; example: coherent imaging of single molecules:
  - Combine 10^5 to 10^7 images into a 3-D data set
  - Learn / pattern-recognize / classify / sort images into, e.g., 1,000 bins
  - Average bins
  - Align
  - Reconstruct
- ~5-Hz real-time display of images
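The classify/sort and average steps above can be illustrated with a toy sketch. The classifier here is a stand-in (it bins on the brightest-pixel position as a crude orientation proxy); the real system would use pattern recognition, and the bin count (e.g. 1,000) would be far larger:

```python
# Toy sketch of the classify/sort -> average pipeline stage.
def classify(image, n_bins):
    # Placeholder classifier: brightest-pixel index as orientation proxy.
    return image.index(max(image)) % n_bins

def bin_and_average(images, n_bins):
    # Accumulate per-bin pixel sums and counts, then average each bin
    # to beat down the single-shot noise.
    sums = [None] * n_bins
    counts = [0] * n_bins
    for img in images:
        b = classify(img, n_bins)
        if sums[b] is None:
            sums[b] = list(img)
        else:
            sums[b] = [s + v for s, v in zip(sums[b], img)]
        counts[b] += 1
    return [[v / c for v in s] if c else None
            for s, c in zip(sums, counts)]

# Three tiny 4-pixel "images": the first two share a bright pixel at
# index 2, so they land in the same bin and are averaged together.
binned = bin_and_average([[0, 0, 10, 1], [0, 0, 9, 2], [8, 0, 0, 1]],
                         n_bins=4)
print(binned[2])  # -> [0.0, 0.0, 9.5, 1.5]
```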

14 Data Acquisition/Mgmt Architecture
[Block diagram: 2D detector with detector-specific front-end electronics (ADC, FPGA) feeding online processors over up to 4 x 2.5-Gbit/s fibers; online data server, detector control node, and quick-view rendering node on 10-G Ethernet; offline data servers with disk arrays/controllers, tape drives/robots, and volume-rendering node/cluster at SCCS; 120-Hz data-exchange & timing interface to the accelerator; detector-specific vs. experiment-common and SLAC/LCLS vs. LUSI boundaries marked]
- The data system must be able to handle the peak rate as well as sustain 120-Hz throughput (corrections, filtering)
- Highly scalable
- Very high CPU-memory bandwidth
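The 120-Hz sustained-throughput requirement above translates into a simple per-image budget: with N parallel online processors, each image may take roughly N pulse periods of correction and filtering before the system falls behind. A back-of-envelope sketch, reusing the CXI detector numbers from slide 11:

```python
# Per-image processing budget implied by "sustain 120-Hz throughput".
REP_RATE_HZ = 120
PIXELS = 1.2e6                      # CXI-like detector (slide 11)
BYTES_PER_IMAGE = PIXELS * 14 / 8   # 14-bit pixels, ~2.1 MB per image

period_ms = 1000 / REP_RATE_HZ      # ~8.3 ms between pulses
for n_nodes in (1, 4, 16):
    print(f"{n_nodes:2d} node(s): ~{period_ms * n_nodes:5.1f} ms of "
          f"processing allowed per image")
```

This is why the slide calls out scalability and CPU-memory bandwidth: a single node has only ~8 ms per ~2 MB image, so the work must parallelize cleanly.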

15 Data Acquisition/Mgmt Architecture
Detector (bump-bonded or integrated), with detector-specific front-end electronics (FEE):
- Local configuration registers
- State machine to generate low-level detector readout signals
- Digitize analog pixel voltages
- Organize bits into pixel words
- Transmit to the DAQ system: IP core in the FPGA for communication, up to 4 x 2.5-Gb/s fibers
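The "organize bits into pixel words" step can be illustrated with a bit-packing sketch. The 32-bit word width and LSB-first packing order here are assumptions for illustration, not the real FEE format:

```python
# Illustrative sketch: packing 14-bit ADC values into fixed-width
# words, as the FEE FPGA would do before serializing onto the fibers.
BITS_PER_PIXEL = 14

def pack_pixels(pixels, word_bits=32):
    acc, n, words = 0, 0, []          # bit accumulator, bits held, output
    for p in pixels:
        assert 0 <= p < (1 << BITS_PER_PIXEL)
        acc |= p << n
        n += BITS_PER_PIXEL
        while n >= word_bits:         # emit every completed word
            words.append(acc & ((1 << word_bits) - 1))
            acc >>= word_bits
            n -= word_bits
    if n:
        words.append(acc)             # flush the final partial word
    return words

def unpack_pixels(words, count, word_bits=32):
    acc, n, out = 0, 0, []
    it = iter(words)
    while len(out) < count:
        if n < BITS_PER_PIXEL:        # refill the accumulator
            acc |= next(it) << n
            n += word_bits
        out.append(acc & ((1 << BITS_PER_PIXEL) - 1))
        acc >>= BITS_PER_PIXEL
        n -= BITS_PER_PIXEL
    return out

pixels = [0, 1, 16383, 1234, 42]      # 5 x 14 bits = 70 bits -> 3 words
assert unpack_pixels(pack_pixels(pixels), len(pixels)) == pixels
```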

16 Data Acquisition/Mgmt Architecture
Level 1 DAQ nodes are responsible for:
- Controlling FEE parameters
- Receiving machine timing signals
- Sending trigger signals to the FEE
- Acquiring FEE data
- Merging FEE data with beam-line data
- Low-level real-time data processing, e.g.:
  - Filtering of images based on beam-line data
  - Pixel correction using calibration constants
- Sending collected data to Level 2 nodes
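A minimal sketch of the two Level 1 processing steps named above: veto an image on a beam-quality cut, then apply a per-pixel correction from calibration constants. The beam-record field name, the cut, and the pedestal/gain correction model are illustrative assumptions, not the actual LCLS algorithms:

```python
# Level 1 low-level processing sketch: filter, then correct.
def level1_process(image, beam, pedestal, gain, min_charge=0.5):
    # Filtering: veto the event if the beam-line data fails the cut.
    if beam["charge"] < min_charge:
        return None                   # vetoed; never sent to Level 2
    # Pixel correction: subtract pedestal and scale by gain, per pixel.
    return [(v - p) * g for v, p, g in zip(image, pedestal, gain)]

good = level1_process([10, 20], {"charge": 1.0}, [1, 2], [1.0, 0.5])
print(good)                           # -> [9.0, 9.0]
bad = level1_process([10, 20], {"charge": 0.1}, [1, 2], [1.0, 0.5])
print(bad)                            # -> None (vetoed)
```

Dropping vetoed images at Level 1 is what keeps the downstream 120-Hz pipeline from wasting bandwidth and CPU on empty shots.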

17 ATCA-Based DAQ
- ATCA crate (Advanced Telecommunications Computing Architecture): high-speed serial interconnect/switching
- The SLAC system is based on a 10-Gigabit Ethernet backplane serial communication fabric
- 2 SLAC custom boards:
  - Reconfigurable Cluster Element (RCE) module: up to 8 x 2.5-Gbit/s links to detector modules; on-board processors running RTEMS, with an 8-GByte/s CPU-memory interface
  - Cluster Interconnect Module (CIM): managed 24-port 10-G Ethernet switching, using a Fulcrum switch ASIC
- One ATCA crate can hold up to 14 RCEs & 2 CIMs: essentially 480 Gbit/s of switch capacity
- Naturally scalable, switch-based architecture
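The 480-Gbit/s crate figure above follows directly from the CIM numbers on the slide (two CIMs per crate, each a managed 24-port 10-G Ethernet switch):

```python
# Crate-level switch capacity from the per-board CIM specs.
CIMS_PER_CRATE = 2
PORTS_PER_CIM = 24
GBIT_PER_PORT = 10

capacity_gbit = CIMS_PER_CRATE * PORTS_PER_CIM * GBIT_PER_PORT
print(capacity_gbit)  # -> 480
```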

18 Status: Data Systems
- Test setup with an 8-GHz waveform-sampling module (Acqiris) is operational (for TOF)
- Science data-path concept designed
- Prototypes of the ATCA system are in testing:
  - ATCA crate
  - Reconfigurable Cluster Element module
  - Cluster Interconnect module
  - Front-end detector example board, including communication between the FE board and the ATCA system
  - Software
- The plan is to take one system to BNL at the end of next month

19 Summary
- Control subsystem is based on EPICS, the standard at SLAC
- Specific controller candidates selected for several areas; several controllers received; test setups are being assembled
- Moving and processing science data is the key data-acquisition task:
  - AMO data acquisition via 10-bit 8-GHz cPCI waveform-sampling modules
  - 2D pixel-detector acquisition via ATCA SLAC DAQ modules
- Peak data-rate/volume requirements are comparable to HEP experiments, requiring a separate data-acquisition and management system
- Leverage significant expertise at SLAC in data acquisition and management
- Prototypes of the ATCA DAQ modules are in hand

