
A distributed, Grid-based analysis system for the MAGIC telescope
H. Kornmayer, IWR, Forschungszentrum Karlsruhe (CHEP 2004, Interlaken)



Presentation transcript:

Slide 1: A distributed Grid-based analysis system for the MAGIC telescope
H. Kornmayer, IWR, Forschungszentrum Karlsruhe; C. Bigongiari, INFN Padua; A. deAngelis, M. Paraccini, A. Forti, University of Udine; M. Delfino, PIC Barcelona; M. Mazzucato, CNAF Bologna
For the MAGIC Collaboration
Forschungszentrum Karlsruhe in der Helmholtz-Gemeinschaft

Slide 2: Outline
- What kind of MAGIC?
- Telescope requirements
- The basic architecture of the distributed system
- First results
- The future

Slide 3: MAGIC - introduction
- The MAGIC Telescope: a ground-based Air Cherenkov Telescope on La Palma, Canary Islands (28° North, 18° West)
- 17 m mirror diameter, in operation since autumn 2003 (still in commissioning)
- Collaborators: IFAE Barcelona, UAB Barcelona, Humboldt U. Berlin, UC Davis, U. Lodz, UC Madrid, MPI München, INFN / U. Padova, U. Potchefstroom, INFN / U. Siena, Tuorla Observatory, INFN / U. Udine, U. Würzburg, Yerevan Physics Inst., ETH Zürich

Slide 4: MAGIC - physics goals
- Active Galactic Nuclei (AGN), e.g. Mkn 501, Mkn 421
- Supernova Remnants
- Unidentified EGRET sources
- Gamma Ray Bursts
- etc.
[Image: the Crab nebula as seen by Chandra]

Slide 5: MAGIC - ground-based γ-ray astronomy
1. The primary particle creates an extensive air shower.
2. The secondary particles produce Cherenkov light.
3. The telescope collects the Cherenkov light onto the focal plane.
4. The camera and DAQ system record the showers.
The gamma ray flux is low → a huge collection area is required → only ground-based observations are possible. Cosmic rays consist mainly of hadronic primaries, so a gamma/hadron separation based on Monte Carlo simulations is needed.

Slide 6: MAGIC - the camera
- 577 photomultiplier tubes, ~3.5° field of view
- read out with a 300 MHz FADC system
- readout based on multi-level triggers
[Image: example camera images of a gamma, a proton, and a muon event]

Slide 7: MAGIC - Monte Carlo simulations
- Based on the air shower simulation program CORSIKA.
- The simulation of the hadronic background is very CPU-intensive: to simulate the background of one night, 70 CPUs (P4, 2 GHz) need to run for 19200 days (see the estimate sketched below).
- Simulating the gamma events of one night for a Crab-like source takes 288 days.
- Detector and atmosphere conditions change over time, so a good production strategy is needed.
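To make the scale of the problem concrete, here is a back-of-the-envelope sketch. The figures are read literally from this slide; the arithmetic and the Python wrapper are ours, not part of the talk:

```python
# Back-of-the-envelope scale of the Monte Carlo production quoted above.
# Figures taken literally from the slide; the arithmetic is illustrative only.

cpus = 70                # P4 2 GHz farm quoted on the slide
background_days = 19200  # days this farm must run for one night of background

total_cpu_days = cpus * background_days
print(f"Hadronic background: {total_cpu_days:,} CPU-days per observation night")

wall_years = background_days / 365.25
print(f"= about {wall_years:.0f} years of continuous running on the 70-CPU farm")
# Far beyond any single farm, hence the Grid-based production strategy.
```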

Slides 8-9: MAGIC - summary
- European collaboration, in operation for one year, but still in the commissioning phase.
- Parameters: field of view 3.5°; current trigger threshold ~50 GeV; current trigger rate ~200 Hz (typical); current data volume ~1.2-2.0 GB per hour.
- Intensive Monte Carlo studies are needed to lower the threshold.
- A second telescope is already funded.
[Slide 9 adds first results: Crab (March 19 + 22) and Mkn 421 (April 2004)]

Slide 10: Requirements
- Storage / amount of data: real data > 7 TB per year, MC data > 10 TB per year; easy access to the data.
- Computation: Monte Carlo production uses 100 CPUs today, > 1000 CPUs are needed; analysis uses 40 CPUs today, > 200 are needed, plus new methods in the future.
- General: distributed (collaborators all over Europe), secure (restricted access), accessible (easy access via a portal, easy access to data), available (24x7), scalable (MAGIC II is coming, collaboration with other Air Cherenkov Telescopes).
A rough estimate of the nightly data volume behind these figures is sketched below.
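For a feel for the storage numbers, a small sketch relating the ~1.2-2.0 GB/hour rate from the summary slide to nightly volumes. The night length is our assumption, not a figure from the talk:

```python
# Nightly raw-data volume implied by the quoted rates, assuming (our
# assumption, not the slide's) an 8-hour observation night.

rate_low, rate_high = 1.2, 2.0   # GB per hour, from the summary slide
hours_per_night = 8              # assumed night length

print(f"Per night: {rate_low * hours_per_night:.1f}-"
      f"{rate_high * hours_per_night:.1f} GB of raw data")
# Summed over a year and combined with > 10 TB of Monte Carlo, this is the
# > 17 TB/year that motivates distributed storage across CNAF, PIC and FZK.
```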

Slide 11: Basic Architecture
- MAGIC backbone: three computing centres, CNAF (Italy), PIC (Spain) and FZK (Germany), plus La Palma.
- The backbone runs the main services: data storage, job scheduling, and the portal.
- Collaborators "plug" their resources into the backbone.

Slide 12: Basic architecture II
- A metadata catalog is needed to classify the observation data.
- Usage of existing components:
  - based on LCG-2: CE/SE/UI, Resource Broker, Replica Location Service
  - CrossGrid portal solution: Migrating Desktop, Roaming Access Server, JSS (Job Submission Service)
  - metadata catalogs: select real data by astronomical source, select MC data by input parameters
- Implementation of the services will start with the developments for Monte Carlo production (a job-submission sketch follows below).
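To illustrate what handing a simulation job to the LCG-2 Resource Broker looks like, a minimal sketch is shown below. The JDL attributes and the edg-job-submit command are standard EDG/LCG-2 ones; the wrapper script, card file, run number, and VO string are hypothetical placeholders, not the actual MAGIC production setup:

```python
# Minimal sketch: describe one CORSIKA run in JDL and hand it to the LCG-2
# Resource Broker. File names and arguments are hypothetical placeholders.
import subprocess
import textwrap

jdl = textwrap.dedent("""\
    Executable          = "run_corsika.sh";
    Arguments           = "--run 4711 --primary proton";
    InputSandbox        = {"run_corsika.sh", "corsika.card"};
    OutputSandbox       = {"stdout.log", "stderr.log"};
    StdOutput           = "stdout.log";
    StdError            = "stderr.log";
    VirtualOrganisation = "magic";
    """)

with open("magic_mc.jdl", "w") as f:
    f.write(jdl)

# edg-job-submit was the EDG/LCG-2 command-line tool for job submission.
subprocess.run(["edg-job-submit", "--vo", "magic", "magic_mc.jdl"], check=True)
```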

Slide 13: MAGIC Grid Simulation Tool
Run and control simulations in the distributed system:
- driven by three use cases: submit jobs, monitor jobs, manage data
- easy-to-use GUI (Java Swing) that hides the LCG commands
- job monitoring and data management based on a dedicated database (sketched below)
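The dedicated database behind the job monitoring could look roughly like the following sketch. The schema, status values, and sample entries are our assumptions, not the actual MAGIC database layout:

```python
# Sketch of a run database for job bookkeeping (assumed schema, not the
# actual MAGIC implementation).
import sqlite3

db = sqlite3.connect("magic_runs.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS runs (
        run_id           INTEGER PRIMARY KEY,
        primary_particle TEXT,                     -- 'gamma' or 'proton'
        grid_job_id      TEXT,                     -- ID from the Resource Broker
        status           TEXT DEFAULT 'submitted'  -- submitted/running/done/failed
    )
""")
db.execute(
    "INSERT OR REPLACE INTO runs (run_id, primary_particle, grid_job_id) "
    "VALUES (?, ?, ?)",
    (4711, "proton", "https://rb.example.org:9000/abc123"),
)
db.commit()
```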

Slide 14: MAGIC @ CrossGrid
- Used for the first implementation steps: an LCG-2 testbed with extensions from CrossGrid, 16 sites in Europe.
- First tests done: 200 jobs submitted; 10% failed due to setup problems (Replica Location Service).
- Easy resubmission thanks to the MAGIC run database (see the sketch below).
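With such a run database, resubmitting the 10% of failed jobs reduces to a query plus a fresh submission. A sketch, reusing the assumed schema and submission command from the previous examples:

```python
# Resubmit all failed runs recorded in the run database (sketch, building on
# the assumed schema above; per-run JDL file names are hypothetical).
import sqlite3
import subprocess

db = sqlite3.connect("magic_runs.db")
failed = db.execute("SELECT run_id FROM runs WHERE status = 'failed'").fetchall()

for (run_id,) in failed:
    # Hand the run's JDL back to the Resource Broker ...
    subprocess.run(["edg-job-submit", "--vo", "magic", f"run_{run_id}.jdl"],
                   check=True)
    # ... and mark the run as submitted again.
    db.execute("UPDATE runs SET status = 'submitted' WHERE run_id = ?", (run_id,))
db.commit()
```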

Slide 15: MAGIC and EGEE
- EGEE is the biggest Grid project in Europe; MAGIC applied as a generic application (NA4).
- The proposal was accepted in June/July; the collaboration with EGEE started in August.
- First results:
  - the Virtual Organisation VO-MAGIC has been set up at NIKHEF/SARA
  - usage of the GILDA testbed (Catania) has been agreed
  - integration of the first site of MAGIC collaborators is planned for the end of the year

Slide 16: Conclusion & the future
The MAGIC Grid:
- is a good example of a distributed simulation and analysis system
- can be used to exploit the existing Grid middleware
- is on the way, and first results can be seen
- is a prototype for astroparticle physics
- can help to build collaborations with other experiments (Air Cherenkov Telescopes, satellites, optical telescopes, ...)

