DIRAC Distributed Computing Services
A. Tsaregorodtsev, CPPM-IN2P3-CNRS
FCPPL Meeting, 29 March 2013, Nanjing

Plan
- DIRAC Project
- Resources available to DIRAC users
- DIRAC in the FCPPL Collaboration
- Conclusions

DIRAC Project
- DIRAC was originally developed for the LHCb experiment with the following goals:
  - Integrate all the heterogeneous computing resources available to the community
  - Provide a complete solution for all tasks, both workload management (WMS) and data management (DMS)
- The experience gained with the production grid system of a large HEP experiment turned out to be very valuable
  - Several new experiments expressed interest in using this software, relying on its utility proven in practice
- In 2009 the core DIRAC development team decided to generalize the software to make it suitable for any user community
  - LHCb-specific functionality was separated into a set of extensions to the generic core services
- DIRAC is now used by multiple projects
  - HEP: LHCb, ILC/CLIC, Belle II, BES III, (SuperB)
  - Astrophysics: CTA, Glast, Fermi-LAT, LSST
  - Other communities: biomed, earth sciences, etc.

[Diagram: the DIRAC Workload Management System. Jobs from the Physicist User and the Production Manager are handled by the central Matcher Service; dedicated Pilot Directors (EGEE, NDG, EELA, CREAM) submit pilot jobs to the EGI/WLCG Grid, the NDG Grid, the GISELA Grid and CREAM CEs.]

WMS: using heterogeneous resources
- Including resources in different grids and standalone clusters is simple with Pilot Jobs
  - Only a specialized Pilot Director per resource type is needed
  - Users just see new sites appearing in the job monitoring (a minimal submission sketch follows below)
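To illustrate what this looks like from the user side, here is a minimal sketch of a job submission with the DIRAC Python client API. The Job and Dirac classes are part of the standard DIRAC client, but exact method names can vary between DIRAC releases, and the job name and executable below are illustrative placeholders, not taken from the slides.

```python
# A minimal sketch (not from the slides) of a resource-agnostic job
# submission with the DIRAC Python client API. No site or grid flavour
# is requested: the WMS matches the job to whatever resource a pilot
# has started on. Executable and job name are illustrative placeholders.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)  # initialise DIRAC configuration and proxy handling

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName('fcppl_demo')
job.setExecutable('/bin/echo', arguments='Hello from DIRAC')

result = Dirac().submitJob(job)
if result['OK']:
    print('Submitted job %s' % result['Value'])
else:
    print('Submission failed: %s' % result['Message'])
```

The same script is unchanged whether the matched resource is a grid site, a standalone cluster or a cloud VM; this is the point of the pilot-based WMS.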

Data Management components
- Various Storage and Catalog services are hidden behind a uniform abstract interface
- For DIRAC users the use of any Storage Element or File Catalog is transparent (a short data-management sketch follows below)
- Depending on the DIRAC Configuration, users see
  - Logical Storage Elements
  - a Logical File Catalog
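As an illustration, the sketch below uses the data-management helpers of the DIRAC client API to upload, replicate and retrieve a file purely by its logical name. The LFN, local file name and Storage Element names are hypothetical placeholders, and method names can differ slightly between DIRAC releases.

```python
# A minimal sketch (not from the slides) of transparent data access by
# logical names with the DIRAC client API. The user handles only the LFN
# and logical Storage Element names; protocols and catalog details stay
# hidden. LFN, local file and SE names are illustrative placeholders.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC.Interfaces.API.Dirac import Dirac

dirac = Dirac()
lfn = '/fcppl/user/a/auser/demo/data.root'  # hypothetical logical file name

# Upload a local file: stored on the named logical SE and registered
# in the File Catalog in one operation.
print(dirac.addFile(lfn, 'data.root', 'DIRAC-USER'))

# Create a second replica on another logical SE, then retrieve the
# file anywhere using only its LFN.
print(dirac.replicateFile(lfn, 'IHEP-USER'))
print(dirac.getFile(lfn))
```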

Interware
- DIRAC has all the necessary components to build ad hoc grid infrastructures interconnecting computing resources of different types. This is why DIRAC can be described as interware.
[Diagram: a User Community connected through DIRAC to Grid A and Grid B.]

DIRAC as a resource manager

Computing resources
- DIRAC fully supports several grid infrastructures
  - gLite: EGI, GISELA, OSG, …
  - ARC: NorduGrid
  - Other types of grids can be supported if there is demand
- Conventional computing centers with various batch systems, not involved in any grid infrastructure
  - LSF, BQS, SGE, PBS/Torque, Condor
  - More to come: OAR, SLURM, LoadLeveler, etc.
- Clouds are an emerging paradigm for distributed computing
  - DIRAC supports multiple cloud managers: OpenStack, OpenNebula, CloudStack, Amazon EC2
(a configuration-lookup sketch follows below)
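All of these resource types surface uniformly through the DIRAC Configuration System. The sketch below lists the sites known to an installation, grouped by grid flavour; the /Resources/Sites layout is the standard DIRAC configuration tree, and the grid and site names printed depend entirely on the actual installation.

```python
# A minimal sketch (not from the slides): list the sites known to a
# DIRAC installation, grouped by grid flavour (gLite, ARC, standalone
# clusters, clouds, ...), using the standard /Resources/Sites layout of
# the Configuration System.
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC import gConfig

grids = gConfig.getSections('/Resources/Sites')
if grids['OK']:
    for grid in grids['Value']:
        sites = gConfig.getSections('/Resources/Sites/%s' % grid)
        print('%s: %s' % (grid, ', '.join(sites.get('Value', []))))
```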

DIRAC in FCPPL

Enhancing basic tools
- Explore the possible use of distributed “NoSQL” databases for the DIRAC services
  - Accounting, Monitoring, etc.
  - This has the potential to enable distributed systems of a new scale
- Make more computing resources available to DIRAC users
  - Cloud storage resources based on the Amazon S3 standard
  - Virtualized volunteer desktop grids

BOINC Desktop Grids
- Work done by Jie Wu and now continued by Wenxiao Kan at CPPM (PhD students)
- On the client PC the following third-party components are installed:
  - VirtualBox hypervisor
  - Standard BOINC client
  - A special BOINC application that
    - starts the requested Virtual Machine
    - executes the Pilot Job in the VM
- A prototype system is running in the DIRAC installation at CC/Lyon

Experiment support
- Building the distributed computing system for the BES III experiment
  - Continue supporting the already deployed system
  - Add more functionality as needed
    - e.g. support for Condor sites, data management operations, Data Catalogs
- Setting up the MC production system for the TREND experiment
  - Perform the case study
  - Set up the available computing resources with the DIRAC service at IHEP
  - Create the MC production workflow

DIRAC as a Service

DIRAC as a service
- The DIRAC client is easy to install
  - Part of a usual tutorial
- DIRAC services are easy to install, but
  - they need dedicated hardware for hosting
  - configuration and maintenance need expert manpower
  - as does monitoring of the computing resources
- Small user communities cannot afford to maintain dedicated DIRAC services
  - yet they still need easy grid access
- Large grid infrastructures can provide DIRAC services for their users
- Distributed computing resources should be available to users as easily as, for example, the Google search engine

France-Grilles DIRAC service
- Started as support for user grid tutorials
- The FG-DIRAC joint project
  - Hosted by the CC/IN2P3 T1 center in Lyon
  - Distributed team of service administrators: 5 participating universities
- France-Grilles users
  - 15 Virtual Organizations, 88 registered users
    - astro, biomed, esr, euasia, vo.france-asia.org, etc.
  - More VOs and users can be added as necessary
- In production since May 2012
  - The first ~3 million jobs have gone through the system, mostly biomed applications

“CAS-DIRAC” service
- Proposal to build a similar service at IHEP
  - Leverage the experience of the FG-DIRAC project and of DIRAC usage at IHEP
- Computing resources
  - Conventional grids: France-Asia grid sites
  - CNGrid: CC/IHEP (Beijing), National Supercomputing Centre (Shenzhen)
  - IHEP campus desktop grid based on the BOINC technology
  - Computing clouds (e.g. OpenStack)
- Storage resources
  - Conventional, e.g. gLite/SRM SE, DIRAC SE
  - S3 cloud storage
  - iRODS storage

“CAS-DIRAC” service (cont’d)
- Target user communities
  - Collaborations involving users from China
  - Users from the Asia-Pacific region
- Use the service to provide assistance to actual and potential users of the system
  - Tutorials
  - Assistance in porting applications to the grid

Conclusions
- DIRAC is a versatile tool for building distributed computing systems, used by multiple user communities in different scientific domains
- Within the FCPPL Collaboration, DIRAC will be further enhanced to use novel technologies and computing resources
- Setting up a general-purpose DIRAC service at IHEP will allow the HEP experience in distributed computing to be shared with other user communities