
1 GLAST Large Area Telescope: A Fusion of HEP and Astro Computing
Richard Dubois, Stanford Linear Accelerator Center (richard@slac.stanford.edu)
ADASS, London, September 2007

2 Outline
– Introduction to GLAST & LAT: a HEP detector in space
– Code reuse (beg, borrow, steal)
– Bulk processing: turning around a downlink in an hour
– Data access: catalogues and portals
– Data and service challenges
– Astrophysics analysis
– Summary

3 GLAST Key Features
Two GLAST instruments:
– Large Area Telescope (LAT): 20 MeV to >300 GeV
– GLAST Burst Monitor (GBM): 10 keV to 25 MeV
Huge field of view:
– LAT: 20% of the sky at any instant; in sky-survey mode, all parts of the sky are exposed for ~30 minutes every 3 hours
– GBM: the whole unocculted sky at any time
Huge energy range, including the 10 GeV to 100 GeV band.
Will transform the high-energy gamma-ray catalog:
– by more than an order of magnitude in the number of point sources
– spatially extended sources
– sub-arcminute localizations (source dependent)
Mission: launch April 2008 from Cape Kennedy; 565 km circular orbit; 5-year mission (10-year goal); spacecraft partner: General Dynamics.

4 GLAST Mission Elements
[Block diagram of the mission elements: the GLAST spacecraft (Large Area Telescope and GBM, GPS timing, Delta 7920H launch vehicle); the TDRSS Space Network (S & Ku band) and Ground Network via White Sands; the Mission Operations Center (MOC) at GSFC; the LAT Instrument Science Operations Center; the GBM Instrument Operations Center; the GLAST Science Support Center; the HEASARC; and the GRB Coordinates Network. Telemetry (1 kbps), alerts, data, command loads, and schedules flow between the elements.]

5 Overview of the LAT
The subsystems work together to identify and measure the flux of cosmic gamma rays with energies from 20 MeV to >300 GeV.
– Precision Si-strip Tracker (TKR): 18 x-y tracking planes of single-sided silicon strip detectors (228 µm pitch); measures the photon direction and provides gamma ID.
– Hodoscopic CsI Calorimeter (CAL): an array of 1536 CsI(Tl) crystals in 8 layers; measures the photon energy and images the shower.
– Segmented Anticoincidence Detector (ACD): 89 plastic scintillator tiles surrounding the 4x4 array of TKR towers; rejects the charged cosmic-ray background; segmentation removes self-veto effects at high energy.
– Electronics system: includes a flexible, robust hardware trigger and software filters.

6 Integrated Observatory in Phoenix, AZ

7 Fusion of HEP & Astro Computing
[Event display: full simulation and reconstruction of a 1 GeV gamma. The incident gamma converts to an e+/e- pair; note the energy flow along the incident gamma direction through ~8.5 radiation lengths.]
Event interpretation: the "Science Tools", a collection of tools for the detection and characterization of gamma-ray sources (point sources and extended sources):
– source finding
– maximum-likelihood fitting (binned and unbinned)
– parameterized instrument response
– exposure maps
– comparisons to a model (observation simulation)
– GRBs, periodicity searches, light curves
The Science Tools are FITS/FTOOLS based for dissemination to the astro community; the data are distributed to the public by Goddard.
Plus a full code development environment on Linux and Windows (Mac imminent): code and data distribution, automated code builds, documentation, etc.

8 Instrument Design Considerations
Important design considerations, optimized via simulations and spot-checked in particle beam tests:
– Energy range and energy resolution requirements bound the thickness of the calorimeter.
– Effective area and PSF requirements drive the converter thicknesses and layout; PSF requirements also drive the sensor performance, the layer spacings, and the design of the mechanical supports.
– The field of view sets the aspect ratio (height/width).
– Time accuracy is provided by the electronics and the intrinsic resolution of the sensors.
– Background rejection requirements drive the ACD design (and influence the calorimeter and tracker layouts): segmentation of the ACD, relative importance and size of the side tiles, rejection efficiency lost to gaps and screws.
– Background rejection: filter out 97% of the downlink on the ground using classification trees (see the sketch below).
– Effects of the trigger and onboard filtering: hardware trigger scheme, CPU cycle requirements, throughput, and data volume per event.
Constraints:
– Lateral dimension < 1.8 m: restricts the geometric area and hence the field of view.
– Mass < 3000 kg: primarily restricts the total depth of the calorimeter.
– Power budget 650 W: primarily restricts the number of tracker channels.
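To make the classification-tree idea concrete, here is a minimal sketch, not the LAT's actual trees: a decision tree trained on simulated events to separate gamma candidates from charged-particle background. The feature variables and labels are invented for illustration.

```python
# Minimal sketch of tree-based background rejection (illustrative only, not
# the LAT's actual classification trees or variables).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 10_000
# Hypothetical discriminating variables, e.g. ACD veto energy, tracker/CAL
# direction mismatch, transverse shower size.
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # stand-in truth label (1 = gamma)

tree = DecisionTreeClassifier(max_depth=5).fit(X, y)
keep = tree.predict(X) == 1                      # events kept as gamma candidates
print(f"kept {keep.mean():.1%} of events")
```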

9 Event Processing Flow
– Event-based processing.
– A C++ framework provides the base class definitions and services; it is completely configurable, with code loaded at run time when needed.
– Root provides the object I/O needed for structured data with cross linkages (a minimal I/O sketch follows).
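As a flavor of what Root object I/O looks like, here is a minimal PyROOT sketch that writes and fills a tree. The LAT's actual event model is C++ classes with cross-linked objects; the branch here is a made-up example.

```python
# Minimal Root I/O sketch (PyROOT; requires a ROOT installation). The real
# event data are cross-linked C++ objects; this only shows the basic idea.
from array import array
import ROOT

f = ROOT.TFile("events.root", "RECREATE")
t = ROOT.TTree("Events", "reconstructed events")
energy = array("f", [0.0])                 # hypothetical per-event quantity
t.Branch("energy", energy, "energy/F")
for e in (0.1, 1.0, 10.0):                 # made-up energies in GeV
    energy[0] = e
    t.Fill()
t.Write()
f.Close()
```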

10 Sim/Recon Toolkit
Package             | Description            | Provider
ACD, CAL, TKR Recon | Data reconstruction    | LAT
ACD, CAL, TKR Sim   | Instrument sim         | LAT
GEANT4 v8           | Particle transport sim | G4 worldwide collaboration
xml                 | Parameters             | World standard
Root 5              | C++ object I/O         | HEP standard
Gaudi               | C++ skeleton           | CERN standard
doxygen             | Code doc tool          | World standard
Visual C++ / gnu    | Development envs       | World standards
CMT, SCons          | Package mgmt tools     | HEP standard
ViewCvs             | cvs web viewer         | World standard
cvs                 | File version mgmt      | World standard

11 Data Challenges
A progression of data challenges: full observation, instrument, and data-processing simulation. The team uses the data and tools to find the science, and the "truth" is revealed at the end. The data challenges provided excellent testbeds for the science analysis software.
– DC1 in 2004: 1 simulated week of all-sky survey.
  - find the sources, including GRBs
  - a few physics surprises
– DC2, completed in June 2006: 55 simulated days (one orbit precession period) of all-sky survey.
  - first generation of the LAT source catalogue
  - added source variability (AGN flares, pulsars): light curves and spectral studies
  - correlations with other wavelengths
  - added the GBM
  - studied detection algorithms
  - benchmarked data processing volumes and reliability: 200k batch jobs, with reliability issues worked out (< 0.1% failure rate now)

12 Post Data Challenges: Service Challenge
No longer any need for blind science exercises! Coordinate simulation studies for science and operations:
– a common set of simulations, plus a near-constant stream of simulations to support special studies
– develop capabilities outside SLAC as needed, using collaboration resources
Operations:
– simulate the first 16 orbits of launch and early orbit (L&EO)
– run them through the full LAT ground processing chain
– develop shift procedures and train collaborators
Science:
– full simulation of one orbit-year
– the definitive pre-launch dataset for the working groups
– expected to require 400 CPU-months to create

13 Service Challenge Eye Candy
– Pointing with two targets: LSI +61 303 and a GRB.
– Offline simulation of the onboard GRB filter produces an alert notice:
  GRB trigger time: 0.02089589834
  First RA: 151.2563276
  First Dec: -38.92002236
  First estimated error: 0.5743404438
  nPhot in [0, 100) MeV: 20; in [100, 1000) MeV: 2; in [1, 10) GeV: 0; > 10 GeV: 0
  Trigger window size: 40
  EnergyCut: -1

14 Pipeline Processing
Started with STScI's OPUS, then rolled our own. Features:
– executes independent tasks
– keeps track of state in a database
– web view and administration of jobs
– uses a dataset catalogue (database) to track files; we expect millions of files!
Implemented in Java/Tomcat with JSP; not GLAST specific. (A minimal state-tracking sketch follows.)
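The real pipeline is Java/Tomcat on Oracle; the following is only a minimal sketch, with an assumed schema, of the core idea: independent tasks whose state lives in a database so that a scheduler or web view can query and retry them.

```python
# Toy pipeline state tracking (assumed schema; the production system is
# Java/Tomcat with an Oracle database and a web admin view).
import sqlite3
import subprocess

db = sqlite3.connect("pipeline.db")
db.execute("""CREATE TABLE IF NOT EXISTS jobs
              (id INTEGER PRIMARY KEY, task TEXT, cmd TEXT, state TEXT)""")

def submit(task, cmd):
    db.execute("INSERT INTO jobs (task, cmd, state) VALUES (?, ?, 'WAITING')",
               (task, cmd))
    db.commit()

def run_waiting():
    rows = db.execute("SELECT id, cmd FROM jobs WHERE state='WAITING'").fetchall()
    for jid, cmd in rows:
        rc = subprocess.call(cmd, shell=True)          # run the task
        db.execute("UPDATE jobs SET state=? WHERE id=?",
                   ("DONE" if rc == 0 else "FAILED", jid))
    db.commit()

submit("digitize", "echo digitizing downlink 1001")    # hypothetical task
run_waiting()
```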

15 The Hardest Task: Downlink Processing
Process each downlink before the next one arrives: ~100 cores for 1.5 hours.
– Split the input data into ~100 parallel pieces.
– Run the digitization, reconstruction, merge, register, verify, clean, calibration, and monitoring steps.
– On success: put Humpty back together again (merge the pieces) and do the monitoring.
(A scatter/gather sketch follows.)
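A minimal sketch of the scatter/gather pattern described above, assuming a placeholder reconstruct() step: split the downlink into chunks, process them in parallel, and merge the results.

```python
# Scatter/gather sketch of downlink processing (placeholder processing step;
# the real chain runs digitization, reconstruction, merging, etc. on a farm).
from multiprocessing import Pool

def reconstruct(chunk):
    # stand-in for per-chunk digitization + reconstruction
    return len(chunk)

def process_downlink(events, n_pieces=100):
    size = max(1, len(events) // n_pieces)
    chunks = [events[i:i + size] for i in range(0, len(events), size)]
    with Pool() as pool:
        partial = pool.map(reconstruct, chunks)   # ~100 parallel pieces
    return sum(partial)                           # merge the pieces again

if __name__ == "__main__":
    print(process_downlink(list(range(100_000))))
```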

16 Automated Source Monitoring

17 Usage Plots: Activity Summary
Many details are stored per step in Oracle; web displays track usage and performance.

18 Data Portal/Catalog
– Browsable tree of datasets.
– Event counts, file sizes, and run ranges are set automatically by a "crawler" (see the sketch below).
– Access and authentication are handled by the web front end.
– Metadata are added by the dataset creator.
– Supports mirroring at multiple sites.
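For illustration only, here is a sketch of the kind of metadata such a crawler might harvest when registering files in the catalogue; the paths and fields are hypothetical.

```python
# Toy catalogue "crawler": walk a directory tree and collect per-file
# metadata. Fields and paths are hypothetical; a real crawler would parse
# run ranges and event counts from the file headers.
import os

def crawl(top):
    records = []
    for dirpath, _dirnames, filenames in os.walk(top):
        for name in filenames:
            path = os.path.join(dirpath, name)
            records.append({
                "path": path,
                "bytes": os.path.getsize(path),
                "run_range": None,   # placeholder
                "n_events": None,    # placeholder
            })
    return records

for rec in crawl("."):
    print(rec["path"], rec["bytes"])
```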

19 Skimmer: Data to the User
– Can skim any data from the catalog; the data are available as Root and/or FITS files.
– Skimmer jobs are parallelized using the Pipeline; xrootd is needed to spread the disk load and avoid overloading individual disk servers.
– Output is available for download for 10 days.
– Access to the data will require registration with the GLAST member database.

20 Computing Resource Projections
Providing resources for flight data, reprocessing, simulations, and user analysis.
– Currently: 350 TB of disk and 400 cores.
– Adding 250 TB and 400 cores for 2008.

21 xrootd
Beginning to use xrootd, a system developed at SLAC for BaBar to manage large datasets. A query redirector sits in front of the file servers and an STK tape silo, and the system distributes files across disks. It:
– maximizes throughput
– minimizes manual disk management
– automates archiving datasets to (and restoring from) tape
– provides more reliability and scalability than NFS
– supports access control based on the GLAST collaborator list
(A sketch of client access through the redirector follows.)
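From the client side, xrootd access through the redirector looks like an ordinary file open; a quick PyROOT sketch, with a hypothetical redirector host and file path:

```python
# Opening a file through an xrootd redirector (hypothetical host and path).
# The redirector locates the data server that holds the file and redirects
# the client there; requires ROOT built with xrootd support.
import ROOT

url = "root://glast-redirector.example.org//glast/data/run1001/merit.root"
f = ROOT.TFile.Open(url)
if f and not f.IsZombie():
    f.ls()
    f.Close()
```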

22 Conforming to HEASARC FTOOLS
Agreed with the Mission from the beginning that the science tools would be developed jointly with (and distributed by) the Science Support Center and would adhere to the FTOOLS standard:
– an atomic toolkit, with FITS files as input/output to a string of applications controlled by IRAF parameter files
– a scripting language glues the applications together
– very different from the instrument sim/reconstruction code!
– shared code development environment and languages
– this caused a certain amount of early tension, having to bifurcate coding styles; people are spanning both worlds now
Example analysis chain: select events, create the exposure map, compute the diffuse response, then do the maximum-likelihood fit. (A hedged sketch of this chain follows.)
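A sketch of what gluing that chain together with a script looks like. The tool names follow the GLAST/Fermi science tools (gtselect, gtexpmap, gtdiffrsp, gtlike), but the parameter names and values here are illustrative, not the exact interfaces.

```python
# Illustrative FTOOLS-style chain: each step is an atomic application with
# FITS files in/out, driven by key=value parameters. Tool and parameter
# names are indicative only.
import subprocess

def run(tool, **pars):
    cmd = [tool] + [f"{k}={v}" for k, v in pars.items()]
    subprocess.check_call(cmd)

run("gtselect", infile="events.fits", outfile="selected.fits",
    ra=151.26, dec=-38.92, rad=15, emin=100, emax=300000)
run("gtexpmap", evfile="selected.fits", outfile="expmap.fits")
run("gtdiffrsp", evfile="selected.fits", srcmdl="model.xml")
run("gtlike", evfile="selected.fits", expmap="expmap.fits",
    srcmdl="model.xml", statistic="UNBINNED")
```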

23 Gamma-Ray Analysis: Model Fitting
A scarcity of photons in the GeV range… :-( Must do maximum-likelihood model fitting:
– use parametrized instrument response functions for the energy and angular resolution and the effective area
– use tabulated exposures
– computationally intensive for crowded regions of the sky
The HEP approach would be to perform full simulations of the sky using complete knowledge of the instrument performance, including correlations.
– It remains to be seen in practice whether this approach is needed or feasible.
– Note that a recent full simulation of 2 months in orbit took ~500 CPU-days to perform; BUT that was one elapsed day on the batch farm.
(A toy binned-likelihood fit follows.)
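To make the likelihood step concrete, here is a toy binned Poisson-likelihood fit with made-up counts and a made-up source template (imagine the source shape already folded through the parametrized response and exposure); only the source normalization is fitted.

```python
# Toy binned maximum-likelihood fit (made-up numbers, single free parameter).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

observed   = np.array([12, 30, 55, 40, 18])            # counts per bin
template   = np.array([0.05, 0.20, 0.50, 0.20, 0.05])  # source shape x exposure
background = np.full(5, 10.0)                          # expected background counts

def neg_log_like(flux):
    model = background + flux * template
    return -np.sum(observed * np.log(model) - model - gammaln(observed + 1))

fit = minimize_scalar(neg_log_like, bounds=(0.0, 1000.0), method="bounded")
print(f"best-fit source counts: {fit.x:.1f}")
```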

24 Summary
– The GLAST Observatory is now approaching final testing, with launch in early 2008.
– The LAT uses HEP techniques to handle the science data stream and produce the photon list.
– HEASARC FTOOLS are used for the mainstream astrophysics analysis.
– It remains to be seen whether HEP's extensive use of simulations will extend into the data-taking era: invaluable pre-launch, but will "the error is in the exponent" make the extra analysis precision unnecessary?

