Experiences with using UNICORE in Production Grid Infrastructures DEISA and D-Grid
Michael Rambadt

Table of contents
– The European DEISA project
– The German Grid initiative D-Grid
– Grid middleware UNICORE
– Experiences with UNICORE in production in DEISA and D-Grid
– Lessons learned
– Conclusion

DEISA – general aspects
Consortium of leading national supercomputing centers in the EU
Deploy and operate an innovative, distributed, terascale, Grid-empowered infrastructure
– to enhance and reinforce High Performance Computing in Europe
– to be used by scientists and industry in a coherent and comfortable way
– with production quality: stable, secure, reliable, persistent, ... "no-complaints-from-users" services
Deep integration of the middleware with the OS / batch system layer

DEISA partners

DEISA Service Activities
SA1 – Network Operation and Support (FZJ)
– Deployment and operation of a gigabit-per-second network infrastructure for a European distributed supercomputing platform; network operation and optimization during the project
SA2 – Data Management with Global File Systems (RZG)
– Deployment and operation of global distributed file systems, as basic building blocks of the "inner" super-cluster and as a way of implementing global data management in a heterogeneous Grid
SA3 – Resource Management (CINECA)
– Deployment and operation of global scheduling services for the European super-cluster as well as for its heterogeneous Grid extension
SA4 – Applications and User Support (IDRIS)
– Enabling the adoption by the scientific community of the distributed supercomputing infrastructure as an efficient instrument for the production of leading computational science
SA5 – Security (SARA)
– Providing administration, authorization, and authentication for a heterogeneous cluster of HPC systems, with special emphasis on single sign-on

D-Grid – general aspects
The German Grid initiative
– builds up and operates a sustainable Grid infrastructure
– establishes methods of e-science in the German scientific community
More than 100 partners
Started in 2005 with 100 million euros of funding from the German Federal Ministry of Education and Research
The initiative comprises the following projects:
– DGI – D-Grid Integration project
– AstroGrid-D for astronomy
– C3-Grid for climate research
– HEP-Grid for high-energy physics
– InGrid for engineering research
– MediGrid for medical research
– TextGrid for the humanities

Motivation: Why?
Scientists have to use huge computational and storage resources.

Motivation: Why?
Supercomputers are managed by Resource Management Systems (RMSs) that handle the scheduling.
But: there are many RMSs available
– many proprietary ways of job submission: IBM LoadLeveler uses llsubmit ..., the Torque Resource Manager uses qsub ...
– different job description languages (number of nodes, memory requirements, ...)
(A sketch of this portability problem follows below.)
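
To make the portability problem concrete, here is a minimal Python sketch (not from the original slides; the directive syntax is heavily simplified) that renders the same resource request as a LoadLeveler-style and a Torque/PBS-style job script:

```python
# Illustrative only: one resource request rendered as two different,
# deliberately simplified batch scripts. Real LoadLeveler/Torque jobs
# need more directives than shown here.

job = {"name": "my_simulation", "nodes": 4, "wall_time": "01:00:00",
       "command": "./simulate -n 1000"}

def to_loadleveler(job):
    # IBM LoadLeveler: directives start with '# @', submitted with 'llsubmit'
    return "\n".join([
        f"# @ job_name = {job['name']}",
        f"# @ node = {job['nodes']}",
        f"# @ wall_clock_limit = {job['wall_time']}",
        "# @ queue",
        job["command"],
    ])

def to_torque(job):
    # Torque Resource Manager: directives start with '#PBS', submitted with 'qsub'
    return "\n".join([
        f"#PBS -N {job['name']}",
        f"#PBS -l nodes={job['nodes']}",
        f"#PBS -l walltime={job['wall_time']}",
        job["command"],
    ])

print(to_loadleveler(job))
print("---")
print(to_torque(job))
```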

Motivation: Why?
Solution: the Grid system UNICORE
– define job workflows in an abstract manner
– immediate portability of job definitions to other systems with other architectures
– no learning overhead when a new RMS is used
Both DEISA and D-Grid use UNICORE as Grid middleware.
(A sketch of the abstract-job idea follows below.)
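
As a rough illustration of the abstract-job idea (a hypothetical Python sketch, not UNICORE's actual job description format), the resource and data-staging needs are stated once and a site-specific translator turns them into whatever the local RMS expects:

```python
# Hypothetical abstract job description: resources and data staging are stated
# once, independently of any particular RMS. This is NOT UNICORE's real job
# format, only an illustration of the idea.
abstract_job = {
    "application": "simulate",       # logical application name, resolved per site
    "nodes": 4,
    "wall_time": "01:00:00",
    "stage_in": ["input.dat"],       # files copied to the target site before the run
    "stage_out": ["result.dat"],     # files copied back after the run
}

def torque_translator(job):
    # One possible site-side translation to Torque/PBS directives
    return (f"#PBS -l nodes={job['nodes']}\n"
            f"#PBS -l walltime={job['wall_time']}\n"
            f"./{job['application']}")

# Each site registers its own translator; the user-facing job stays unchanged.
SITE_TRANSLATORS = {"site_A": torque_translator}

def submit(job, site):
    script = SITE_TRANSLATORS[site](job)   # only the translator differs per site
    print(f"Submitting to {site}:\n{script}")

submit(abstract_job, "site_A")
```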

The UNICORE Grid
UNICORE: UNiform Interface to COmputing REsources
– started as a German funded project in 1997
– enhanced in many (European) projects
– seamless and secure access to distributed resources and data via an intuitive GUI
– workflow engine for complex multi-site, multi-step workflows and for job monitoring (sketched below)
– easy installation and configuration of client and server components
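
The multi-site, multi-step workflow support can be pictured with the following generic Python sketch; it is not the UNICORE workflow engine's API, merely an illustration of dependency-ordered submission across sites (the site names are placeholders):

```python
# Generic illustration of a multi-site, multi-step workflow: each step is an
# abstract job bound to a target site and may depend on earlier steps. A real
# workflow engine (like UNICORE's) would also handle data staging, monitoring
# and failure recovery; site names here are just examples.
workflow = [
    {"id": "prepare",  "site": "site_A", "application": "preprocess", "depends_on": []},
    {"id": "simulate", "site": "site_B", "application": "solver",     "depends_on": ["prepare"]},
    {"id": "analyse",  "site": "site_C", "application": "postproc",   "depends_on": ["simulate"]},
]

def run(workflow):
    done, pending = set(), list(workflow)
    while pending:
        progressed = False
        for step in list(pending):
            if set(step["depends_on"]) <= done:    # all prerequisites finished
                print(f"Submitting {step['id']} ({step['application']}) to {step['site']}")
                done.add(step["id"])
                pending.remove(step)
                progressed = True
        if not progressed:
            raise RuntimeError("Unsatisfiable dependencies in workflow")

run(workflow)
```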

UNICORE development
Projects contributing to UNICORE development (timeline): EUROGRID, UNICORE Plus, GRIP, GRIDSTART, UniGrids, NextGRID, CoreGRID, EGEE-II, Chemomentum, Phosphorus, A-WARE, eDEISA, OMII-Europe, DGI, VIOLA, OpenMolGRID, DEISA

UNICORE in production
UNICORE is in production use on JUMP at Research Centre Jülich (IBM p690 eSeries cluster, 1312 CPUs, 8.9 TFlops).
Almost 1/3 of all jobs are UNICORE jobs.

Lessons learned ...
Often "only" initial hurdles:
– adapting applications
– managing certificates
How easy is it to access the Grid middleware?
– certificate handling
– automatic user management in DEISA and in D-Grid
Users have to be stimulated and encouraged to
– use Grid technology for applications, computations, data transfer and access to resources
– adapt/integrate their applications to/into Grids
– once convinced, they are likely to keep using it
Operation of production environments is costly
– certification authority, administrative tools, integration into site management, licenses, ...

More lessons learned
Fulfillment of functional requirements is not enough. Users want:
– help to overcome initial hurdles
– 24/7 availability of the Grid infrastructure: the monitoring tool SIMON watches the DEISA and D-Grid UNICORE components
– 24/7 availability of the Grid experts: support hotline, help desk, mailing lists, ...
– long-term commitment to continuous development and support
– workshops, hands-on training, ...
Agreement on what the users want and what the developers implement is crucial.
Scientists sometimes don't like GUIs
– DEISA developed DESHL as a command line interface to UNICORE to address this

Conclusions
Production Grids are possible.
But: users only use Grid middleware if the deployment of new production software offers added value
– easy usage, increased effectiveness, decreased cost, ...
– integration of legacy applications
Success of the Grid middleware depends on successful interaction with other components, working groups, colleagues, ...
– network, file system, underlying batch system, applications, security, ...
Functionality is important, but so is support, support, support ...
Open-source distribution is the right way
– source of bug reports, requirements, ...
– higher visibility and community building

Getting UNICORE
Download UNICORE at

Questions?

Thanks ...
To all partners who contributed to the UNICORE development in DEISA and D-Grid

Central Institute for Applied Mathematics, Forschungszentrum Jülich GmbH, Jülich, Germany
CINECA, Via Magnanelli 6/ Casalecchio di Reno, Italy
Rechenzentrum Garching, Max Planck Institute for Plasma Physics, Garching, Germany
R. Breu, L. Clementi, Th. Fieseler, A. Giesler, P. Malfetti, R. Menday, J. Reetz, A. Streit, P. Wieder