CMB & LSS Virtual Research Community
Marcos López-Caniego, Enrique Martínez, Isabel Campos, Jesús Marco
Instituto de Física de Cantabria (CSIC-UC)
EGI Community Forum, Garching, March

In the last decade the computation infrastructures available to astronomers across Europe have grown exponentially, and most European countries nowadays have modern facilities to support the needs of their researchers. In many cases one could even say that the amount of resources is growing faster than the ability of scientists to use them. The reasons for this are simple:
- The use of HPC, GRID, Cloud, etc. is not always straightforward, and the learning curves can vary significantly depending on the complexity of the workflows.
- Once people get used to one of these infrastructures, in general HPC, it is common that they do not want to dedicate the time and effort to port their applications to another, the GRID for example, even if it is more suitable for the kind of analysis they do.
Part of this problem could be solved with more training activities, as well as more efficient middleware to manage the workflows. Under the umbrella of the EGEE project, many research groups in Europe got involved in GRID-related activities to take advantage of the vast amount of resources that had been deployed.

This was the case of the Observational Cosmology and Instrumentation group at IFCA. We ported several applications to the GRID and used it for various analyses related to ESA's Planck mission. In our case, the success was directly related to the fact that at our institute we had the support of the people in the Advanced Computing and e-Science group, who were, and still are, deeply involved in GRID technology and who run and maintain part of GRID-CSIC, the largest GRID infrastructure in Spain. Things have evolved since the times of EGEE: EGI now coordinates GRID activities at the European level, and the NGIs coordinate and support GRID activities at the national level. But even though the infrastructures are run by national organizations, research groups are increasingly international. In addition, research groups in astronomy do not all use computing infrastructures in the same way: some groups only use HPC, others only use the GRID, and so on. Moreover, the applications and their requirements can be totally different from one field of astronomy to another. It is therefore difficult to run and support a single VRC for the whole of A&A when the use cases and problems are very different from one sub-community to another, even within astronomy.
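In practice, porting one of these analyses to the EGEE infrastructure essentially meant wrapping it as a batch job described in gLite's Job Description Language. The sketch below is a minimal, hypothetical example, not the actual Planck pipeline; the script and file names are placeholders:

    // job.jdl -- minimal gLite job description (hypothetical names)
    Executable    = "run_cmb_analysis.sh";                     // wrapper script shipped with the job
    Arguments     = "input_map.fits";                          // input sky map, staged via the sandbox
    StdOutput     = "std.out";
    StdError      = "std.err";
    InputSandbox  = {"run_cmb_analysis.sh", "input_map.fits"};
    OutputSandbox = {"std.out", "std.err", "results.tar.gz"};
    Requirements  = other.GlueCEPolicyMaxWallClockTime > 720;  // only match CEs that allow long runs

Such a job would be submitted with glite-wms-job-submit -a job.jdl and followed up with glite-wms-job-status; large result files would normally be written to a storage element rather than returned through the output sandbox.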

For this reason we would like to propose the creation of a VRC that could bring together research communities working on the Cosmic Microwave Background and the Large Scale Structure of the Universe across Europe. At IFCA we have the necessary infrastructure, manpower and expertise to support the creation of this CMB/LSS VRC. We are part of the Spanish e-Science Network, are involved in the activities of the National Grid Initiative and, as mentioned above, run and maintain part of the GRID-CSIC cluster, with roughly 1400 cores, 3 TB of RAM and 14 TB of storage at IFCA; these numbers will increase soon as part of a planned and continuous growth of the infrastructure. From the scientific point of view we are part of the Planck and QUIJOTE collaborations working on CMB science and of the J-PAS survey devoted to LSS studies, among others. We are also involved in state-of-the-art cosmological LSS simulations in collaboration with colleagues in Madrid, Potsdam and Sussex.

[Table] Summary of the typical amount of resources (CPU hours, RAM, storage) used by the members of our group for CMB/LSS studies over the last 12 months and estimated for the next 6 months. CMB activities: Component Separation; SZ Cluster detection; PS detection, T and P; PS detection, Bayesian; Non-Gaussianity. LSS activities: Large Scale Structure simulations. Total storage: 50 TB (last 12 months), 48 TB (next 6 months).

Trying to respond to the questions posed by Claudio:
1 – This VRC aggregates observational and theoretical cosmologists involved in CMB and Large Scale Structure studies. At this stage we have not quantified the number of potential users, but this is a very active field that involves complex and extensive computing analyses, and we therefore suspect that the number of people willing to participate could be large. Regarding the resources brought into the VRC, the GRID-CSIC resources available at IFCA were listed above, and part of them could be shared with members of the VRC. If we consider the whole GRID-CSIC infrastructure in Spain, we have access to as much as four times the resources available at IFCA, not to mention other HPC infrastructures such as Altamira and the Barcelona Supercomputing Center.
2 – At this stage we cannot yet list the services needed by our VRC that EGI already offers, or the services that our VRC could contribute.
3 – Current status of setting up the new VRC: we are at a very early stage, identifying research groups across Europe that could be interested in joining. So far we have only had informal contacts with people in Spain and Germany but, based on the feedback obtained at the EGI meeting in Amsterdam and the feedback from this meeting, we plan to start contacting the people who have shown an interest in creating this VRC, in particular our colleagues in Potsdam. At this stage we have not established any direct contact with EGI, only through Claudio.