The II SAS Testbed Site
Jan Astalos, Institute of Informatics, Slovak Academy of Sciences

Motivation and Objectives
Motivation for the involvement in Grid testbeds: II SAS participates in CrossGrid application development (Task 1.2, Virtual Organization for Flood Forecasting).
Objectives:
- Obtain know-how on Grid technologies
- Prepare for national Grid projects
- Provide support and infrastructure for:
  - development and testing of applications related to flood forecasting
  - establishment of the VO for Flood Forecasting

Funding
II SAS testbed site funding:
- Equipment: CrossGrid, II SAS
- Networking: national project SANET 2 (1 Gbps backbone, connected to GÉANT at 155 Mbps)
- Manpower: CrossGrid, II SAS; national projects are being prepared

Current Grid Activities
- Evaluation of the Globus Toolkit and DataGrid middleware (a quick functional check is sketched below)
- Setup of a Certification Authority for national Grid activities
- Implementation of the CrossGrid testbed site at II SAS
- An HTC Condor pool of 25 workstations at the Technical University of Kosice will be made available for CrossGrid applications developed at II SAS
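As a rough illustration of what such an evaluation involves, the Globus Toolkit 2.x ships command-line tools for creating a proxy credential and running a test job through a remote gatekeeper. This is a minimal sketch, not the site's documented test procedure; the hostname is hypothetical:

    # Create a short-lived proxy credential from the user's certificate.
    grid-proxy-init

    # Run a trivial test job on a remote gatekeeper (hostname is illustrative).
    globus-job-run testbed.example.org /bin/date

If the second command prints the remote machine's date, authentication and job submission through GRAM are working end to end.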

Networking in Slovakia
- The Slovak academic network provider is SANET (Slovak Academic NETwork)
- Project SANET 2: 1 Gbps backbone connecting academic organisations in Slovakia
- GÉANT connectivity: 155 Mbps out of a 2.5 Gbps link; current usage about 20%
- Total international connectivity: 465 Mbps

II SAS Networking
- 100 Mbps optical connection to the Slovak Academy of Sciences Computing Center (SANET node)
- Switched 100 Mbps link to the CrossGrid testbed site equipment
- No bandwidth allocation at the moment; allocation will depend on the specific requirements of the applications

Computing Infrastructure
Beowulf-class HPC cluster, serving as the Computing Element (CE) for the CrossGrid testbed site:
- Front-end: dual PIII 550 MHz, 512 MB RAM, 40 GB SCSI disk
- Computing nodes: 16 diskless Pentium 4 PCs, 1.8 GHz, 256 MB RAM
- Interconnect: 24-port 100 Mbps switch with a 1 Gbps uplink to the front-end
- Operating system: Mandrake Linux 8.1
- Resource management: PBS Pro (a sample job script is sketched below)
- Coming soon: an HTC Condor pool built from idle workstations
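For context, jobs on a PBS Pro cluster like this are submitted with qsub. The following is a minimal sketch of a submit script; the job name, node count, and executable (run_model) are illustrative assumptions, not taken from the actual setup:

    #!/bin/sh
    # Minimal PBS job script (names and resource requests are illustrative).
    #PBS -N floodmodel
    #PBS -l nodes=4
    #PBS -l walltime=01:00:00

    # PBS starts jobs in the home directory; move to the submission directory.
    cd $PBS_O_WORKDIR
    ./run_model

The script would be submitted from the front-end with: qsub flood.sh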

Additional Grid Resources
- Storage Element and web portal of the VO for Flood Forecasting: Pentium 233 MHz, 128 MB RAM, 40 GB IDE disk
- Certification Authority signing machine: Pentium 233 MHz, 64 MB RAM, 3.2 GB disk (certificate signing is sketched below)
- Web server for the CA's homepage: Sun SPARCstation 20, Solaris 2.5.1, Apache
- UPS
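Grid CAs of this period were commonly operated with OpenSSL on an offline signing machine. The commands below are a hedged sketch of that general workflow, assuming an OpenSSL-based CA; the configuration and file names (ca.cnf, userreq.pem, usercert.pem) are assumptions, not the site's documented procedure:

    # On the user's machine: generate a key pair and a certificate request.
    openssl req -new -newkey rsa:1024 -keyout userkey.pem -out userreq.pem

    # On the offline CA signing machine: sign the request with the CA key.
    openssl ca -config ca.cnf -in userreq.pem -out usercert.pem

Keeping the signing machine off the network limits exposure of the CA's private key.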

II SAS Testbed Setup
[Diagram: II SAS testbed setup, showing the HPC cluster (front-end, computing nodes, Ethernet switch, UPS), the web server, the Storage Element and VO portal, and the CA machine, connected through II SAS Ethernet switches and a firewall to the SANET router at the Slovak Academy of Sciences Computing Center and on to the Slovak Academic Network.]

Human Resources
People involved in the CrossGrid project: 3 computer scientists and 2 PhD students
- Mainly in WP1 "CrossGrid Application Development", Task 1.2
- In WP4, Task 4.1: CrossGrid testbed site setup and maintenance
- Providing help with Grid technologies to the other team members and VO participants