Grid Computing, Reinhard Bischof, ECFA Meeting, March 26th 2004, Innsbruck

Grid: from local to global resource management
[Slide diagram] A top-level Information System answers "Which resources are available?" across many, many clusters, and a Replica Location Service answers "Where are the files I need?". Each cluster keeps its own local resource management (e.g. PBS on cluster 1, Condor on cluster 2) behind a front-end PC running a Gatekeeper, an Information System and GridFTP, identified by a host certificate (/C=AT/O=UIBK/OU=HEPG/CN=host/grid.uibk.ac.at). The user's certificate (/C=AT/O=UIBK/OU=HEPG/CN=Nemo) is mapped via the mapfile to the atlas accounts, so the job is executed e.g. as local user atlas003.
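The mapping shown on this slide corresponds to an entry in the Globus grid-mapfile. A minimal sketch, assuming the usual /etc/grid-security/grid-mapfile location and the pool-account convention; the actual file on grid.uibk.ac.at may differ:

    # /etc/grid-security/grid-mapfile (typical location; path may differ per site)
    # The gatekeeper maps the certificate DN to local accounts of the atlas pool
    # (leading dot = pool account, e.g. atlas001, atlas002, atlas003, ...):
    "/C=AT/O=UIBK/OU=HEPG/CN=Nemo" .atlas
    # Alternatively a DN can be mapped to one fixed local account (illustrative):
    # "/C=AT/O=UIBK/OU=HEPG/CN=SomeOtherUser" localuser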

Atlas Data Challenge 2
Three Grid flavours will be used:
– NorduGrid: Sweden (15), Denmark (10), Norway (3), Finland (3), Slovakia (1), Estonia (1). NorduGrid develops Grid middleware and assists in software deployment; it is not dedicated to physics and not limited to the Nordic countries.
– LHC Computing Grid (LCG): based on the middleware developed by the DataGrid project; resources in Europe, the USA, Taiwan and Japan; goal: the first truly worldwide grid with 24h/7d operation (at least two Grid Operations Centres in two different time zones).
– Grid 2003 (US project): 25 sites across the USA and South Korea.
The High Energy Physics group in Innsbruck participates in LCG as a Tier-2.

The LCG Tier-1 GridKa Karlsruhe, as primary site, supports HEPHY Innsbruck in installing and testing the LCG cluster.
LCG cluster UIBK-HEPHY, Tier-2 (under evaluation):
[Slide diagram] The cluster consists of the machines grid and grid02 to grid06, providing the standard LCG node types: storage element (SE), grid gate (gatekeeper), worker nodes (WN), LCFG installation server and user interface (UI).
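To illustrate how such a cluster is used once deployed, here is a hedged sketch of job submission from the UI node with the EDG/LCG workload management tools; the JDL file name, job content and the VO name atlas are illustrative, and the exact command set depends on the LCG release installed:

    # hello.jdl - minimal job description (JDL), illustrative
    Executable    = "/bin/echo";
    Arguments     = "Hello Grid";
    StdOutput     = "hello.out";
    StdError      = "hello.err";
    OutputSandbox = {"hello.out", "hello.err"};

    # On the UI node: create a proxy from the user certificate, then submit
    grid-proxy-init
    edg-job-submit --vo atlas hello.jdl   # prints a job identifier (jobId)
    edg-job-status <jobId>                # follow the job through its states
    edg-job-get-output <jobId>            # retrieve hello.out / hello.err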

Grid CPUs
– NorduGrid: 1499 CPUs
– LCG core sites (CNAF, CERN, FNAL, FZK, NIKHEF, PIC Barcelona, RAL, Taipei): 1844 CPUs (~1300 kSI2k); planned for Q2: 3009 kSI2k

Access for grid users to students' PC labs in Innsbruck during weekends, holidays, and whenever they are not needed for teaching.
Cooperation between:
– the High Energy Physics group (part of the pilot grid project 1)): installation, providing scripts, testing, documentation for users
– the Central Information Technology Service (ZID)
– the Institute of Computer Science
1) Innsbrucker Hochenergie-Grid-Projekt (Federal Ministry for Education, Science and Culture)

ZID-Grid (details)
Strategy: be as flexible as possible
– one cluster of adequate size for each group (virtual organization); the cluster size can be changed easily (default: one lab = one cluster)
– the resource management system (OpenPBS, Condor or Sun Grid Engine) can be changed easily; one configuration file per cluster (add other services)
– one timetable per cluster defines the grid opening hours
One wake-on-LAN server per location starts the machines of a cluster if the timetable allows it (see the sketch below).
Status information of all clusters (Ganglia and Globus MDS) is collected by a server (agrid).
Users: High Energy Physics, DPS (Distributed and Parallel Systems, Computer Science), the Institute of Structural Analysis (starting next week), ... and members of the AustrianGrid project.
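A minimal sketch of what the wake-on-LAN step could look like, assuming a hypothetical plain-text timetable and MAC-address list per cluster and the common wakeonlan utility; this is illustrative only, not the actual ZID-Grid script:

    #!/bin/sh
    # wake_cluster.sh <cluster> - illustrative sketch, not the actual ZID-Grid script.
    # Assumed inputs (hypothetical formats):
    #   timetable.<cluster>: one opening slot per line, e.g. "Sat 00-24"
    #   macs.<cluster>:      one worker-node MAC address per line
    CLUSTER="$1"
    DAY=$(date +%a)      # e.g. Sat
    HOUR=$(date +%H)     # e.g. 14

    # Wake the cluster only if the current time falls inside an opening slot
    grep "^$DAY " "timetable.$CLUSTER" | while read day slot; do
        START=${slot%-*}
        END=${slot#*-}
        if [ "$HOUR" -ge "$START" ] && [ "$HOUR" -lt "$END" ]; then
            while read mac; do
                wakeonlan "$mac"   # send a magic packet (alternatively: ether-wake)
            done < "macs.$CLUSTER"
        fi
    done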

ZID-Grid architecture
[Slide diagram] Students' labs at the Technik, Innrain and Sowi locations (e.g. lab arch_14) and machines of DPS (Distributed and Parallel Systems) form master/slave grid clusters. One wake-on-LAN server per location starts the machines; the central agrid server handles administration, information collection and job submission, exchanging status and jobs with the clusters.

AustrianGrid
The AustrianGrid project starts April 1st 2004; it was initiated by High Energy Physics, with partners in Vienna, Linz and Innsbruck.
1st technical meeting, December 4th 2003: several resources are already available via the grid:
– Linz: SCI cluster (9 nodes / 18 CPUs), Myrinet cluster (8 nodes / 16 CPUs), SGI Altix, SGI Origin
– Vienna: cluster Bridge (16*4 CPUs), cluster Gescher (16 CPUs), other clusters
– Innsbruck: ZID-Grid clusters (~192 CPUs, 14 clusters)
The AustrianGrid certification authority is in Vienna (L. Lifka, VCPC) and needs to be accepted by partners outside Austria (CPS, Certificate Practice Statement, prepared in Innsbruck and reviewed by the partners).
Upcoming activities:
– an information system collecting the status of all resources (see the query sketch below)
– adding grid gates to more resources
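One possible way such an information system can be queried, sketched here against the Globus MDS 2.x LDAP interface already used for ZID-Grid; the host name agrid.uibk.ac.at is an assumption based on the agrid server named earlier, and the base DN and port are the MDS defaults:

    # Query a Globus MDS 2.x GRIS/GIIS via LDAP (2135 is the default MDS port);
    # the host name below is an illustrative assumption, not a confirmed endpoint.
    ldapsearch -x -h agrid.uibk.ac.at -p 2135 \
        -b "mds-vo-name=local,o=grid" "(objectClass=*)"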