Cherenkov Telescope Array. Dr. Giovanni Lamanna, CNRS tenured research scientist (CTA-LAPP team leader and CTA Computing Grid project coordinator), LAPP - Laboratoire d'Annecy-le-Vieux de Physique des Particules, Université de Savoie, CNRS/IN2P3, Annecy-le-Vieux, France. EGEE09, for the CTA consortium.

CTA in a nutshell
CTA is a new project for ground-based gamma-ray astronomy, planned to consist of several tens of Imaging Atmospheric Cherenkov Telescopes (IACTs) deployed on two sites (North and South). CTA is an astroparticle research infrastructure intended to operate as an observatory, providing services that make gamma-ray astronomy accessible to the entire community. The CTA international consortium is currently engaged in a Design Study phase. (More details in the poster in the exhibition hall.)

ICT vision
The main sub-systems of the CTA observatory are: the Science Operation Center, in charge of the organization of observations; the Array Operation Center, the on-site service; and the Data Center, the place for software development, data analysis, data reduction, data archiving and data dissemination to observers. Existing ICT-based infrastructures, such as EGEE and GEANT, are potential solutions for providing the CTA observatory with the best use of e-infrastructures. This possibility is studied within a dedicated sub-project of the CTA Design Study: the CTA Computing Grid (CTACG) project. A conceptual sketch of this workflow is given below.
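The three sub-systems and the data path between them can be pictured with a minimal toy model. The sketch below is purely conceptual: all class names, methods and the per-hour raw-data volume are hypothetical, not part of the CTA design.

```python
# Conceptual sketch only: a toy model of the observatory workflow described
# above (Science Operation Center -> Array Operation Center -> Data Center).
# All names and the per-hour raw-data volume are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ArrayOperationCenter:
    """On-site service: runs the observations and buffers raw data."""
    raw_buffer: list = field(default_factory=list)

    def observe(self, target: str, hours: float) -> dict:
        raw = {"target": target, "volume_tb": hours * 0.5}  # assumed 0.5 TB/h
        self.raw_buffer.append(raw)
        return raw

@dataclass
class DataCenter:
    """Off-site: data reduction, archiving and dissemination to observers."""
    archive: list = field(default_factory=list)

    def reduce_and_archive(self, raw: dict) -> dict:
        product = {"target": raw["target"], "format": "FITS"}
        self.archive.append(product)
        return product

if __name__ == "__main__":
    aoc, dc = ArrayOperationCenter(), DataCenter()
    raw = aoc.observe("Crab Nebula", hours=4)   # observation scheduled by the SOC
    print(dc.reduce_and_archive(raw))           # reduced product for observers
```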

ICT vision (EGEE application)
The main issues inherent to the observatory workflow are: Monte Carlo simulations (production and analysis); data flow, data transfer and storage; data reduction, data analysis and open access. CTACG vision: apply the EGEE and GEANT e-infrastructures to these issues through a dedicated global CTA EGEE Virtual Organization. EGEE has already provided positive experience in fulfilling the MC requirements (> 400 TB of storage elements, > 300 CPUs of computing elements, ...). A hedged example of how such an MC job could be submitted is sketched below.
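As an illustration of what MC production on EGEE means in practice, here is a hedged sketch of a job submission through the gLite WMS command-line tools. The JDL contents, the wrapper script run_corsika.sh, the input card and the VO string are illustrative assumptions, not the actual CTACG production setup.

```python
# Hedged sketch: submitting a CTA Monte Carlo job through the gLite WMS
# command-line tools used on EGEE. The JDL attributes shown are standard,
# but the script name, input card and VO string are assumptions.

import subprocess
import textwrap

JDL = textwrap.dedent("""\
    Executable          = "run_corsika.sh";
    StdOutput           = "sim.out";
    StdError            = "sim.err";
    InputSandbox        = {"run_corsika.sh", "inputs.card"};
    OutputSandbox       = {"sim.out", "sim.err"};
    VirtualOrganisation = "vo.cta.in2p3.fr";
""")

def submit_mc_job(jdl_text: str, jdl_path: str = "mc_sim.jdl") -> None:
    """Write the JDL to disk and submit it; '-a' delegates a proxy
    automatically, '-o' stores the returned job identifier in a file."""
    with open(jdl_path, "w") as f:
        f.write(jdl_text)
    subprocess.run(
        ["glite-wms-job-submit", "-a", "-o", "job_ids.txt", jdl_path],
        check=True,
    )

if __name__ == "__main__":
    submit_mc_job(JDL)
```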

ICT requirements and (a possible CTACG) ICT architecture: DATA
The total data volume from CTA is in the range of 1 to 3 PB per year.
- Storage needs: temporary storage on site (Array Operation Center); permanent storage in Europe for the different classes of data and applications (e.g. calibrations, MC, scientific data). The ultimate products are digested scientific data files (e.g. in FITS format) requiring limited storage.
- CTACG solution: distributed storage among the EGEE-CTA-VO work nodes, with data types and applications assigned as a function of the capacity of each node; FITS files are uploaded into the distributed Virtual Observatory astrophysics database (through the EGEE-VO framework). A back-of-the-envelope sketch of this distribution is given after this slide.
- Compute needs: a computing element on site (Array Operation Center) for acquisition, preliminary calibration and monitoring; no particular computing needs for the off-line processing.
- CTACG solution: Tier-1 and Tier-2 LCG clusters (e.g. CCIN2P3, LAPP, CYFRONET, PIC, ...) are already used in support of the EGEE-CTA-VO and are suited to the task, as learned from the experience with H.E.S.S., MAGIC and the CTA-VO. The Array Operation Center will be one more VO work node.
- Networking: the candidate sites under study to host CTA are in southern Africa, Argentina, Chile, the Canary Islands, Mexico and Hawaii. The traditional data-transport approach of current IACTs (e.g. H.E.S.S. and MAGIC), based on magnetic tapes, is not applicable to CTA (aging technology, expensive and impractical). The pan-European end-to-end networking is a critical issue.
- Data access: to be guaranteed to a worldwide community of users with different levels of experience and different requirements. Networking is again an issue; concurrent applications have to be managed; a CTA scientific gateway is required; access must be secure but easy at all levels.
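To make the storage bullet concrete, the snippet below splits the quoted 1-3 PB/year proportionally among a few storage sites. The per-site capacities are invented for illustration only; the site names are simply those mentioned above.

```python
# Back-of-the-envelope sketch of the storage figures quoted above: 1-3 PB of
# data per year, distributed across the storage of the EGEE-CTA-VO sites in
# proportion to their capacity. The capacities below are purely illustrative.

RAW_DATA_PB_PER_YEAR = (1.0, 3.0)   # range quoted on the slide

# Hypothetical per-site capacities (PB) pledged to the VO.
SITE_CAPACITY_PB = {"CCIN2P3": 1.0, "LAPP": 0.3, "CYFRONET": 0.4, "PIC": 0.5}

def share_per_site(total_pb: float) -> dict:
    """Split a yearly data volume among sites proportionally to capacity."""
    total_capacity = sum(SITE_CAPACITY_PB.values())
    return {site: total_pb * cap / total_capacity
            for site, cap in SITE_CAPACITY_PB.items()}

if __name__ == "__main__":
    for total in RAW_DATA_PB_PER_YEAR:
        shares = {site: round(v, 2) for site, v in share_per_site(total).items()}
        print(f"{total} PB/year ->", shares)
```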

ICT requirements and (a possible CTACG) ICT architecture: Networking
- The candidate sites under study to host CTA are in southern Africa, Argentina, Chile, the Canary Islands, Mexico and Hawaii. The pan-European end-to-end networking is a critical issue. The traditional data-transport approach of current IACTs (e.g. H.E.S.S. and MAGIC), based on magnetic tapes, is not applicable to CTA (aging technology, expensive and impractical).
- CTACG solution: the (virtual, distributed) Data Center and the Array Operation Center (VO work nodes) demand a dedicated broadband connection (> Gb/s). The GEANT e-infrastructures should be considered in the choice of the site. (Example: within the South Africa Grid (SAGrid) initiative, the high-speed submarine West African Cable System (WACS), reaching Europe, is due to come into operation between 2009 and ...) (Reminder: some data-transfer delay is acceptable, since IACTs have a limited duty cycle, observing only at night.) A rough bandwidth estimate is sketched after this slide.
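The bandwidth requirement can be cross-checked with simple arithmetic: shipping 1-3 PB per year off-site corresponds to a sustained rate of around a Gb/s or below, which is why a Gb/s-class link combined with tolerance for transfer delays appears adequate. In the sketch below, everything except the 1-3 PB/year range is an assumption.

```python
# Rough consistency check of the "> Gb/s" figure quoted above: the average
# rate needed to ship 1-3 PB/year off-site, assuming the link can be used
# either around the clock (delays are acceptable) or only half of the time.

SECONDS_PER_YEAR = 365 * 24 * 3600

def required_gbps(petabytes_per_year: float, link_usage_fraction: float = 1.0) -> float:
    """Average sustained bandwidth (Gb/s) needed to move the yearly volume."""
    bits_per_year = petabytes_per_year * 1e15 * 8
    return bits_per_year / (SECONDS_PER_YEAR * link_usage_fraction) / 1e9

if __name__ == "__main__":
    for pb in (1, 3):
        print(f"{pb} PB/yr: "
              f"{required_gbps(pb):.2f} Gb/s (24h/day transfer), "
              f"{required_gbps(pb, 0.5):.2f} Gb/s (link usable 50% of the time)")
    # Output is roughly 0.25-0.76 Gb/s (24h/day) and 0.5-1.5 Gb/s (50%),
    # i.e. of the order of a Gb/s, consistent with the slide.
```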

ICT requirements and (a possible CTACG) ICT architecture: Data Access
- Data access has to be guaranteed to a worldwide community of users with different levels of experience and different requirements: networking is again an issue; concurrent applications have to be managed; access must be secure but easy at all levels.
- CTACG solution: pan-European GEANT networking and/or the optimization of the communication among the main work nodes of the EGEE-VO-CTA is straightforward. A dedicated EGEE-VO-CTA Grid Operation Center minimizes costs and optimizes performance through a dedicated grid scientific gateway (e.g. the CTACG MC dashboard) for grid job administration as well as for all levels of data analysis (including a natural interface to the Virtual Observatory for data dissemination). The gateway is secured by means of grid certificates for VO users, but by providing grid jobs on demand (through a generic proxy for non-certified users) it will be open to everybody. A conceptual sketch of this access policy is given after this slide.
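The access policy sketched above (full VO privileges for certificate holders, job-on-demand through a generic proxy for everybody else) could look roughly like this on the gateway side. The proxy path, the decision logic and the exact voms-proxy-info invocation are assumptions for illustration, not part of the CTACG design.

```python
# Conceptual sketch of the gateway access policy described above: certified
# VO members run jobs with their own grid proxy, while non-certified users
# are served through a shared, restricted "generic" proxy held by the gateway.

import subprocess

GENERIC_PROXY = "/var/gateway/generic_cta_proxy.pem"   # hypothetical path

def user_has_valid_proxy() -> bool:
    """Return True if the calling user holds a VOMS proxy valid >= 1 hour."""
    result = subprocess.run(
        ["voms-proxy-info", "--exists", "--valid", "1:00"],
        capture_output=True,
    )
    return result.returncode == 0

def select_credential() -> str:
    """Pick the credential the gateway should use for this request."""
    if user_has_valid_proxy():
        return "user-proxy"        # full VO privileges
    return GENERIC_PROXY           # restricted, job-on-demand access

if __name__ == "__main__":
    print("Submitting with:", select_credential())
```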

Challenges and Suggestions
Technical challenges:
- network and synchronization between the Array Center and the (distributed) Data Center;
- a scientific grid gateway open to and serving all levels of observers.
Political challenges:
- convincing part of the community (not familiar with the grid and with EC e-infrastructure initiatives such as EGI and GEANT) of the validity of the solution proposed with the CTACG approach.
Suggestions (and expected feedback from this meeting):
- The CTACG project needs direct contacts and guidelines from the EC, EGI and GEANT to better plan its feasibility study and to find support to complete it.

CTA Current Status & Timeline
Status:
- The EGEE-CTA-VO is active for MC production and analysis purposes, and development of the gateway ("dashboard") has just started.
- The CTACG architecture feasibility study will last until ... The construction of the CTA Observatory is due to start in 2013 and to be completed in 2018, with operation in a partial configuration already in ...
Timeline phases (from the slide graphic): site exploration, array layout, telescope design, component prototypes, array prototype, array construction, partial operation.