A European Grid Middleware
Achim Streit, Forschungszentrum Jülich in der Helmholtz-Gemeinschaft
December 2006

Slide 2: History lesson
UNICORE: UNiform Interface to COmputing REsources, seamless, secure, and intuitive.
Initial development started in two German projects funded by the German Ministry of Education and Research (BMBF):
08/1997 – 12/1999: UNICORE project. Results: a well-defined security architecture with X.509 certificates, an intuitive graphical interface, and a central job supervisor based on Codine from Genias.
01/2000 – 12/2002: UNICORE Plus project. Results: implementation enhancements (e.g. replacement of Codine by a custom NJS), extended job control (workflows), and application-specific interfaces (plugins).
Continuous development since 2002 in several European projects.
Core developers today from Europe: CINECA, ICM, Intel, FLE, FZJ.

Slide 3: Key features
A vertically integrated Grid middleware system since 1997.
Provides seamless, secure, and intuitive access to distributed resources and data.
Used in production and in projects worldwide.
Open source under the BSD license.
Intuitive GUI with single sign-on.
X.509 certificates for authentication/authorization (AA) and for signing jobs and data (see the client-side sketch below).
Workflow engine for complex workflows.
Extensible application support with plug-ins.
Interactive access with UNICORE-SSH.
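
Single sign-on in UNICORE is built on the user's X.509 credential: the client keeps the certificate and private key in a local keystore and uses them to authenticate at every site and to sign jobs and data. The Java sketch below only illustrates the client-side credential handling with the standard java.security API; the keystore file name, password, and type are placeholder assumptions, not UNICORE defaults, and the actual signing logic is omitted.

    import java.io.FileInputStream;
    import java.security.KeyStore;
    import java.security.cert.X509Certificate;

    public class ShowCredential {
        public static void main(String[] args) throws Exception {
            // Placeholder keystore path, type, and password (not UNICORE defaults).
            KeyStore ks = KeyStore.getInstance("JKS");
            try (FileInputStream in = new FileInputStream("user-keystore.jks")) {
                ks.load(in, "changeit".toCharArray());
            }
            // Read the user certificate behind the first alias.
            String alias = ks.aliases().nextElement();
            X509Certificate cert = (X509Certificate) ks.getCertificate(alias);
            cert.checkValidity();  // throws if the certificate is expired or not yet valid
            System.out.println("Subject DN: " + cert.getSubjectX500Principal());
            System.out.println("Issuer DN:  " + cert.getIssuerX500Principal());
        }
    }

On the server side, UNICORE maps this certificate to a local account, so no further password entry is needed once the keystore has been unlocked.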

Slide 4: Projects
More than a decade of German and European research & development and infrastructure projects:
UNICORE, UNICORE Plus, EUROGRID, GRIP, GRIDSTART, OpenMolGRID, VIOLA, UniGrids, NextGRID, CoreGRID, D-Grid IP, OMII-Europe, EGEE-II, A-WARE, Chemomentum, DEISA, eDEISA, PHOSPHORUS

Slide 5: Recent Developments
Interactive access (UNICORE-SSH)
Improved workflow capabilities (MetaPlugin for workflows)
High-level API for programming Grids (Roctopus)
DRMAA-based TSI (see the submission sketch below)
Collaborative Online Visualization and Steering (COVS)
Comfortable configuration tool
Site Functionality Monitoring Tool (SIMON)
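
The DRMAA-based TSI mentioned above lets the target system interface talk to batch systems through the standardized DRMAA API instead of scheduler-specific commands. Purely as an illustration of what submission through DRMAA looks like, here is a minimal sketch using the DRMAA 1.0 Java binding (org.ggf.drmaa); the command and arguments are placeholders, and the real TSI integration inside UNICORE is more involved.

    import java.util.Collections;
    import org.ggf.drmaa.JobInfo;
    import org.ggf.drmaa.JobTemplate;
    import org.ggf.drmaa.Session;
    import org.ggf.drmaa.SessionFactory;

    public class DrmaaSubmit {
        public static void main(String[] args) throws Exception {
            Session session = SessionFactory.getFactory().getSession();
            session.init(null);                        // attach to the local batch system
            JobTemplate jt = session.createJobTemplate();
            jt.setRemoteCommand("/bin/echo");          // placeholder command
            jt.setArgs(Collections.singletonList("hello from DRMAA"));
            String jobId = session.runJob(jt);         // submit
            JobInfo info = session.wait(jobId, Session.TIMEOUT_WAIT_FOREVER);  // block until done
            System.out.println("Job " + jobId + " exited with status " + info.getExitStatus());
            session.deleteJobTemplate(jt);
            session.exit();
        }
    }

Because DRMAA hides the differences between LoadLeveler, PBS, Sun Grid Engine, and similar schedulers, a single TSI built on it can serve many batch systems without per-scheduler submission scripts.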

Slide 6: Website:

Slide 7: Usage in the National German HPC center NIC
About 450 users in 200 research projects; a quarter of them use UNICORE.
Access via UNICORE to:
IBM p690 eSeries cluster (1312 CPUs, 8.9 TFlops)
SoftComp cluster (264 CPUs, 1 TFlop)
Cray XD1 (120 CPUs + FPGAs, 528 GFlops)

Slide 8: DEISA (Distributed European Infrastructure for Supercomputing Applications)
Consortium of leading national HPC centers in the EU.
Deploys and operates a persistent, production-quality, distributed, heterogeneous HPC environment.
Partners: IDRIS – CNRS, France; FZJ, Jülich, Germany; RZG, Garching, Germany; CINECA, Bologna, Italy; EPCC, Edinburgh, UK; CSC, Helsinki, Finland; SARA, Amsterdam, NL; HLRS, Stuttgart, Germany; BSC, Barcelona, Spain; LRZ, Munich, Germany; ECMWF, Reading, UK.

Slide 9: Services
Dedicated 1 Gb/s network as a basis.
High-performance data grid via GPFS, extended to non-AIX Linux systems such as SGI Altix and MareNostrum.
Common Production Environment on all sites.
Job migration across sites, used to load-balance the global workflow when a huge partition is allocated to a DEISA project at one site.
UNICORE as Grid middleware for workflow applications.
Co-allocation for applications running on multiple sites at the same time.
Global data management, including tertiary storage and hierarchical data management systems.
Science Gateways and Portals to facilitate access for new, non-traditional user communities.

Slide 10: Usage in DEISA
Fully meshed UNICORE infrastructure.
Complex multi-site workflows are easily possible.
Heavily used by DECI (DEISA Extreme Computing Initiative) projects and jobs.

Slide 11: Usage in D-Grid
Core D-Grid sites commit parts of their existing resources to D-Grid: approx. 700 CPUs and approx. 1 PByte of storage. UNICORE is installed and used.
Additional sites receive extra money from the BMBF for buying compute clusters and data storage: approx. CPUs and approx. 2 PByte of storage. UNICORE (as well as Globus and gLite) will be installed as soon as the systems are in place.
LRZ, DLR-DFD

Slide 12: Usage in Industry and Commercial Support

Slide 13: Roadmap to UNICORE 6.0
New infrastructure based on web services (see the sketch below).
Preserved traditional user-level features:
Atomic: simple tasks, such as "Execute script"
Client: workstation GUI
Workflow: edit, run, and monitor graphs of atomic tasks
Additional user-level features:
Portal: web-based portal client
Streaming: client-server streaming support (for visualization or media applications)
Application development features:
Software license management
Simplified application deployment
Deployment features:
User and virtual organization (VO) management
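
To give a feel for the web-services style of interaction behind this roadmap, the sketch below sends a single SOAP request with plain JAX-WS. The endpoint URL, XML namespace, and message layout are invented for illustration only; the actual UNICORE 6 interfaces are WS-RF based, use standard job description formats, and are secured with the X.509 infrastructure described earlier.

    import javax.xml.namespace.QName;
    import javax.xml.soap.MessageFactory;
    import javax.xml.soap.SOAPMessage;
    import javax.xml.ws.Dispatch;
    import javax.xml.ws.Service;
    import javax.xml.ws.soap.SOAPBinding;

    public class SubmitScriptTask {
        public static void main(String[] args) throws Exception {
            // All names below (endpoint, namespace, elements) are hypothetical placeholders.
            String endpoint = "https://gateway.example.org:8080/services/JobSubmission";
            QName serviceName = new QName("http://example.org/unicore-demo", "JobService");
            QName portName = new QName("http://example.org/unicore-demo", "JobPort");

            Service service = Service.create(serviceName);
            service.addPort(portName, SOAPBinding.SOAP11HTTP_BINDING, endpoint);
            Dispatch<SOAPMessage> dispatch =
                    service.createDispatch(portName, SOAPMessage.class, Service.Mode.MESSAGE);

            // Build a small SOAP request carrying an "Execute script" style task.
            SOAPMessage request = MessageFactory.newInstance().createMessage();
            request.getSOAPBody()
                   .addChildElement("SubmitScript", "demo", "http://example.org/unicore-demo")
                   .addTextNode("#!/bin/sh\necho hello");

            SOAPMessage response = dispatch.invoke(request);  // synchronous request/response
            response.writeTo(System.out);
        }
    }

The point of the move to web services is this kind of interoperability: any SOAP-capable client or portal can drive the atomic and workflow services listed above without linking against UNICORE-specific client libraries.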

Slide 14: Architecture of Version 6.0

Slide 15: A use case of UNICORE 6
Collaborative Online Visualization and Steering (COVS):
Implemented as a higher-level service of UNICORE.
WS-RF compliant session management.
ssh-based data transfer, with visualization on the client.
Collaboration server plus multiplexer for geographically dispersed clients.
Uses the UNICORE security infrastructure for single sign-on.
COVS is a real application of WS-RF-based UNICORE: the collaboration server and multiplexer are the resources, controlled through a UNICORE service.
COVS is a framework for scientific simulations and visualizations: in addition to the usual offline post-processing techniques, it lets users view the actual status of parallel simulations online.
Based on the communication library VISIT; works with all VISIT-enabled scientific visualizations.

Slide 16: A use case of UNICORE 6
GridBean for UNICORE clients.
Manages the collaborative visualization and steering sessions (participants, collaboration server, and multiplexer): who is or is not participating, who is able to steer the simulation, and who is just watching.
Monitors the performance of connections (detection of bottlenecks).
Successfully demonstrated at OGF18, Europar06, SC06, …