3 Gpix Camera DAQ/Control System. SLAC Program Review, SLAC, June 7, 2006. T. Schalk, CCS team.


1 3 Gpix Camera DAQ/Control System. SLAC Program Review. T. Schalk, CCS team

2 LSST Control Systems
- Observatory Control System
- Telescope Control System
- Data Mgmt. Control System
- Camera Control System
- Aux. Equip. / Calibration Control System
- Primary Command Bus: target/mode request/ack. to/from the scheduler
- Status/Data Bus: to the Facility Database
- Time/Date Distribution
- Data transport
- Scheduling activities within the camera

3 Camera Assembly
- Filter in stored location
- L1 Lens
- L2 Lens
- Shutter
- L1/L2 Housing
- Camera Base Ring
- Camera Housing
- Cryostat outer cylinder
- Cold Plates
- L3 Lens in Cryostat front-end flange
- Raft Tower (Raft with Sensors + FEE)
- Filter Carousel main bearing
- Utility Trunk
- Filter in light path
- Filter Changer rail paths
- Focal Plane fast actuators
- BEE Module

4 The LSST Focal Plane
- 3.2 Gpixels
- Guider Sensors (yellow)
- Wavefront Sensors (red)
- 3.5 deg FOV illumination limit

5 Full CCD showing segmentation. Science data acquisition begins here: read-out from the CCDs (16 × 2 × 9 CCDs = 288 A-to-Ds).

6 Design strategy for this system
- Control is distributed to the local subsystem level where possible, with time-critical loops closed at the local level.
- Subsystem control is embedded in each subsystem and communicates with the CCS via a master/slave protocol.
- One camera control system (CCS) module (CCM) is the master, responsible for scheduling tasks and for communication with the OCS.
- Coordination is via messages between the CCS and its subsystems; there is no direct subsystem-to-subsystem communication (publish/subscribe model).
- Separate command/control and data buses.
- Extensive logging capabilities.
- Assume the need to support engineering and maintenance modes.
- Accommodations made for test-stand(s) support.
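The coordination model above (one CCM master, publish/subscribe messaging, no direct subsystem-to-subsystem links) can be sketched in a few lines of Python. The `Bus`, `Subsystem`, and `CCM` classes and all topic names here are illustrative only, not the actual CCS software:

```python
# Minimal sketch of the CCS publish/subscribe coordination model.
# All class and topic names are hypothetical, for illustration only.
from collections import defaultdict

class Bus:
    """Message bus: subsystems talk only via the bus, never directly."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

class Subsystem:
    """A camera subsystem: executes commands, publishes status."""
    def __init__(self, name, bus):
        self.name = name
        self.log = []
        self._bus = bus
        bus.subscribe(f"cmd/{name}", self.on_command)

    def on_command(self, cmd):
        self.log.append(cmd)
        self._bus.publish(f"status/{self.name}", f"{self.name}: done {cmd}")

class CCM:
    """Camera control master: schedules tasks, collects subsystem status."""
    def __init__(self, bus):
        self.status = []
        self._bus = bus

    def watch(self, name):
        self._bus.subscribe(f"status/{name}", self.status.append)

    def command(self, name, cmd):
        self._bus.publish(f"cmd/{name}", cmd)
```

A short usage example: the CCM commands a filter change through the bus, and the FCS replies with status on its own topic; neither object holds a reference to the other, which mirrors the "no direct subsystem-to-subsystem communication" rule.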

7 Camera Control Architecture (camera buses)
- Camera Body: Thermal (T5U), Science DAQ (SDS), Guide Analysis (GAS), WF DAQ (WDS), Lens (L2U), Thermal (T3U, T4U), Shutter (SCU), Filters (FCS), Vacuum (VCS), Power/Signal (PSU)
- Cryostat: Thermal (T1U, T2U), FP actuation (FPU), Guide array (GSS), Wave Front (WFS), Science array (SAS), Raft Alignment (RAS)
- Camera Control (CCS): command and status buses to the subsystems; connections to the auxiliary systems, the control room, and the observatory buses

8 Subsystems mapping to Managers
- Every arrow in the diagram has an interface at each end; red means it is a CCS group responsibility.
- Subsystems that produce data, e.g. SAS/SDS (command/response plus data paths to the CCS, OCS, and DM): similar for WFS/WDS and GSS/GAS (see next slide).
- Subsystems that do not produce data (only status info), e.g. FCS: similar for TSS, RAS, SCU, VCS, and L2U.
- Each subsystem maps to a subsystem manager.

9 CCD transport design assumptions
- The camera's data are carried on 25 (21?) optical fibers (one per raft).
- Data are delivered by the camera to the SDS in 2 seconds.
- These fibers carry only data, and data flows only from camera to SDS on them (half duplex).
- The fiber protocol is TBD.
- The data rate from a (fully populated) raft is 281.25 Mbytes/sec (2.25 Gbits/sec).
- Total aggregate data output rate (201 CCDs) is 6.432 Gbytes/sec.
- Data must be carried from the camera and delivered to its client (software) interface with a latency of not more than one (1) second.
- Interfaces define commodity networking as a MAC layer => trade study.
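As a quick consistency check on the per-raft figures above (assuming decimal units, 8 bits per byte, and 9 CCDs per raft as on the earlier readout slide):

```python
# Sanity check of the per-raft transport figures quoted above
# (decimal units: 1 Gbit = 1000 Mbit; 1 byte = 8 bits).
raft_rate_mb = 281.25                  # Mbytes/sec from one fully populated raft
raft_rate_gbit = raft_rate_mb * 8 / 1000
assert raft_rate_gbit == 2.25          # matches the quoted 2.25 Gbits/sec

ccds_per_raft = 9                      # assumption: 3x3 CCDs per science raft
ccd_rate_mb = raft_rate_mb / ccds_per_raft
print(ccd_rate_mb)                     # prints 31.25 (Mbytes/sec per CCD)
```

So each fiber must sustain about 2.25 Gbits/sec, which is why the MAC-layer trade study targets commodity multi-gigabit networking.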

10 CDS Architecture: camera-specific => standard I/O

11 First detailed designs are for the DAQ

12 RNA hardware layout: a "pizza box"

13 Simultaneous DMA to memory for speed

14 Infrastructure Layer
- Mountain Site (Chile): data acquisition from the Camera Subsystem and the Observatory Control System, with read-out in 2 seconds and data transfer to the Base at 10 gigabits/second.
- Base Facility (Chile): Nightly Data Pipelines and Products are hosted here on 25-teraflops-class supercomputers, providing primary data reduction and transient alert generation in under 60 seconds.
- Long-Haul Communications (Base to Archive and Archive to Data Centers): networks are 10 gigabits/second protected clear-channel fiber optics, with protocols optimized for bulk data transfer.
- Archive/Data Access Centers (United States): Nightly Data Pipelines, Data Products, and the Science Data Archive are hosted here. Supercomputers capable of 60 teraflops provide analytical processing, re-processing, and community data access via Virtual Observatory interfaces to an archive growing at roughly 7 petabytes/year.
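For scale, a back-of-envelope estimate of the Mountain-to-Base transfer time for one full focal-plane image over the 10 gigabits/second link; the 2 bytes/pixel figure is an assumption for illustration, not a number from the slides:

```python
# Back-of-envelope: time to ship one focal-plane image to the Base
# over the 10 Gbit/s link. 2 bytes/pixel is an assumed sample size.
pixels = 3.2e9           # 3.2 Gpix focal plane
bytes_per_pixel = 2      # assumption, for illustration only
link_bit_rate = 10e9     # 10 gigabits/second

image_bytes = pixels * bytes_per_pixel       # 6.4e9 bytes per image
seconds = image_bytes * 8 / link_bit_rate    # about 5.1 s per image
print(seconds)
```

Under these assumptions one image occupies the link for roughly 5 seconds, which fits within the 60-second budget for alert generation but shows why the link is engineered for bulk transfer rather than bursty traffic.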

15 Application Layer
- Pipelines and products: Data Acquisition Infrastructure, Image Processing Pipeline, Detection Pipeline, Association Pipeline, Moving Object Pipeline, Classification Pipeline, Calibration Pipeline, Deep Detect Pipeline, Alert Processing, Image Archive, Source Catalog, Object Catalog, Deep Object Catalog, Alert Archive, Eng/Fac Data Archive, Common Pipeline Components, VO Compliant Interface, Middleware, End User Tools.
- 2.5.1.1 Nightly Pipelines and Data Products: Nightly Pipelines are executed, and Data Products produced, within 60 seconds of the second exposure of each visit.
- 2.5.1.2 Science Data Archive: these pipelines are executed on a slower cadence; the corresponding data products are those that require extensive computation and many observations for their production.

16 Data Management Organization
- Team is headquartered at LSST Corporation, Tucson: Project Manager, Project Scientist, Software Engineers.
- R&D Team is creating the MREFC and DOE proposals.
- Construction Team will be a Tucson-based management/functional team, with a small number of single-location outsourced implementation teams (e.g. NCSA, IPAC).
Application Layer:
- Caltech IPAC: Application architecture
- GMU, LLNL: Community Science scenarios
- NOAO: Lensed Supernovae, Non-moving transients, Photometry
- Princeton U: Image Processing, Galaxy Photometry
- U Arizona: Image Processing, Moving Objects, Association, Photometry
- UC Davis: Deep Detection, Shape Parameters
- U Pittsburgh/CMU: Photo Z, Moving Objects
- U Washington: Image Processing, Detection, Classification
- USNO: Astrometry
Middleware Layer:
- SLAC, JHU: Database Schema/Indexing, Provenance, Performance/Scalability (ingest/query)
- LLNL, UCB: Database/Pipeline integration, Pipeline Construction, Alerting
- NCSA: Archive Data Access, Pipeline Control & Management, Security
- NOAO: Community Data Access/Virtual Observatory
- SDSC: Data Product Preservation
Infrastructure Layer:
- SLAC: Data Acquisition, Mountain/Base Communications
- LLNL: Base Pipeline Server, Data Base Server
- NCSA, BNL: Archive Center/Data Center Pipeline Servers, File Servers, Data Access Servers, Storage, Communications
- NOAO: Base to Archive Communications


18 Acronyms
- CCS: camera control system
- CCM: camera control master/module
- OCS: observatory control system
- TCS: telescope control system
- DM: LSST data management system
- SAS: science array system
- SDS: science array DAQ system
- RNA: raft network adapter
- SCU: sample correction unit
- WFS: wave front system
- WDS: wave front data system
- GSS: guide sensor system
- GAS: guide sensor acquisition system
- DSP: digital signal processor
- FPU: focal plane actuation
- TSS: thermal control system
- RAS: raft alignment system
- SCU: shutter control system
- FCS: filter control system
- VCS: vacuum control system
- L2U: L2 actuation system
- UML: Unified Modeling Language
- MAC layer: medium access control layer, which provides a variety of functions supporting the operation of local area networking
- FPGA: field-programmable gate array
- DMA: direct memory access
- MGT: multi-gigabit transceivers
- IBA: InfiniBand Architecture
- SDR: single data rate

