Site Report HEPHY-UIBK: Austrian federated Tier 2 meeting, 03.03.2008


Site Report HEPHY-UIBK, Austrian federated Tier 2 meeting, 03.03.2008

Content
- Overview HEPHY-UIBK
- Site Services
- Hardware
- Power and Cooling
- Planned Projects

Overview HEPHY-UIBK
[diagram: Rack 1 and Rack 2 holding the EGEE-Cluster, the Testbed, and the WMS/LB and MON services]

Facts
- WNs: 26 (76 cores), not counting the testbed
- CEs: 3
- UIs: 2
- SEs: 2x DPM and 1 classic SE
- Storage: ~16 TB on two RAID systems
- WMS/LB: 1
- MON: 1
- VOs: atlas, ops, dteam, bphys, uibktest
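A quick way to cross-check these figures against what the site actually publishes is to query the site BDII. A minimal Python sketch, assuming a hypothetical BDII hostname (the standard site-BDII port is 2170) and the Glue 1.x schema that gLite publishes:

import subprocess

BDII_HOST = "bdii.example.uibk.ac.at"   # hypothetical hostname
SITE_NAME = "HEPHY-UIBK"                # site name as published in Glue

# List the computing elements the site publishes, with their CPU counts.
result = subprocess.run(
    ["ldapsearch", "-x", "-LLL",
     "-H", "ldap://%s:2170" % BDII_HOST,
     "-b", "mds-vo-name=%s,o=grid" % SITE_NAME,
     "objectClass=GlueCE",
     "GlueCEUniqueID", "GlueCEInfoTotalCPUs"],
    capture_output=True, text=True, check=True)
print(result.stdout)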

Rack 1
[diagram: NAT firewall; Gigabit switches (8-port and 24-port); CE, UI, DPM; TopRaid (2x 1.7 TB, SCSI); Testbed; 16 worker nodes (32 cores)]

Rack 2
[diagram: NAT firewall; Gigabit switches (24-port); CE / classic SE; EasyRaid 13 TB fileserver; 10 worker nodes (44 cores)]


Site Services

gLite 3.0 on Scientific Linux 3:
- User Interface (UI)
- Computing Element (lcg-CE)
- Storage Element (classic SE)
- Storage Element (DPM)
- Information Service (BDII)
- Resource Broker (WMS/LB)
- Monitoring (MON)

gLite 3.1 on Scientific Linux 4.5:
- User Interface (UI)
- Computing Element (lcg-CE)
- Storage Element (DPM)
- Worker Node (WN)
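To illustrate how a user exercises these services: work enters through the UI, and the WMS matches it to a CE. A minimal sketch using the standard gLite 3.1 commands; the JDL content is illustrative, and a valid VOMS proxy (e.g. from voms-proxy-init --voms atlas) is assumed to already exist:

import subprocess
import textwrap

# A trivial JDL test job; filename and contents are illustrative.
jdl = textwrap.dedent("""\
    Executable    = "/bin/hostname";
    StdOutput     = "std.out";
    StdError      = "std.err";
    OutputSandbox = {"std.out", "std.err"};
    """)
with open("hello.jdl", "w") as f:
    f.write(jdl)

# -a requests automatic proxy delegation to the WMS.
subprocess.run(["glite-wms-job-submit", "-a", "hello.jdl"], check=True)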


Hardware

WN:
- 16x Intel Xeon (32 cores), 3.0 GHz, 1 GB RAM per core, 80 GB SATA
- 9x Intel Xeon (36 cores), 3.0 GHz, 1.5 GB RAM per core, 80 GB SATA
- 1x Intel Xeon (8 cores), 2.33 GHz, 1.5 GB RAM per core, 80 GB SATA

CE:
- Intel Xeon (2 CPUs), 2.66 GHz, 2 GB RAM, 80 GB IDE
- AMD Opteron (4 cores), 2.6 GHz, 4 GB RAM, 80 GB SATA

Hardware (2)

SE:
- Intel Xeon (2 CPUs), 2.66 GHz, 2 GB RAM
- AMD Opteron (4 cores), 2.6 GHz, 4 GB RAM

Storage:
- TopRaid: 2x 1.7 TB, RAID 5
- EasyRaid: 13 TB (16x 1 TB), RAID 6
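As a sanity check, the per-model worker-node figures from the Hardware slide add up exactly to the "26 (76 cores)" quoted under Facts:

# Cross-check the WN totals against the Facts slide.
worker_nodes = [
    # (machines, cores per machine, GB RAM per core)
    (16, 2, 1.0),   # 16x Intel Xeon 3.0 GHz  -> 32 cores
    (9,  4, 1.5),   #  9x Intel Xeon 3.0 GHz  -> 36 cores
    (1,  8, 1.5),   #  1x Intel Xeon 2.33 GHz ->  8 cores
]
machines = sum(n for n, _, _ in worker_nodes)
cores    = sum(n * c for n, c, _ in worker_nodes)
ram_gb   = sum(n * c * r for n, c, r in worker_nodes)
print(machines, "WNs,", cores, "cores,", ram_gb, "GB RAM")  # 26 WNs, 76 cores, 98.0 GB RAM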


Power and Cooling

Wattage EGEE-Cluster:
- Rack 1: max. 6.5 kW
- Rack 2: max. 4 kW

Cooling methods in use:
- air cooling
- water cooling (CoolAdd)

CoolAdd unit:
- cooling capacity: 8 kW
- wattage: max. 725 W
- weight: ~45 kg
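A back-of-the-envelope comparison of the numbers above, assuming a single CoolAdd unit (the slide does not state how many are installed):

# Compare peak rack load with the CoolAdd water-cooling capacity.
rack_load_kw = 6.5 + 4.0            # max. wattage of Rack 1 + Rack 2
cooladd_capacity_kw = 8.0           # one CoolAdd unit
residual_kw = rack_load_kw - cooladd_capacity_kw
print("left for air cooling at full load: %.1f kW" % residual_kw)  # 2.5 kW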


Planned Projects
- Installation of MON and WMS/LB on SL4
- Migration of the classic SE to DPM
- Setting up UI, MON and WMS/LB as virtual machines with Xen (see the sketch below)
- Migration of DPM from SL3 to SL4
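Since Xen "xm" domain configuration files are evaluated as Python, a config for one of the planned service VMs might look like the following hypothetical sketch; every name, path and size is illustrative, not a real site setting:

# Hypothetical Xen domU config for a UI VM on SL4.
name    = "ui-sl4"
memory  = 1024                             # MB of guest RAM
vcpus   = 1
kernel  = "/boot/vmlinuz-2.6-xen"          # paravirtualized guest kernel
ramdisk = "/boot/initrd-2.6-xen.img"
disk    = ["phy:/dev/vg0/ui-sl4,xvda,w"]   # LVM volume as the root disk
vif     = ["bridge=xenbr0"]                # bridged onto the site LAN
on_reboot = "restart"
on_crash  = "restart"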