Computer Cluster at UTFSM Yuri Ivanov, Jorge Valencia
Initiative: Informatics and Physics Departments
- ATLAS Collaboration
- LHC Computing Grid
- EELA (EELA Grid Node at UTFSM)
Hardware and services:
- CE, SE, WN1..11: DELL PowerEdge 2950, 2 x Xeon 1.6 GHz, 2 GB RAM, 140 GB SAS HD
- Services: AFS, LDAP, Kerberos 5 (torque, psr GSI)
- OS: SL3 i386; SL3 x86_64; SL4 x86_64
- Users: local UTFSM (Informatics, Physics); ATLAS Collaboration (June: CERN – Chile); EELA Grid
- Storage: 140 GB, growing to 1 TB
- Network speed: still an open question
Cluster outlook:
- UTFSM: CE (torque, maui), AFS, LDAP, Kerberos
- Physics: fast WNs, large SE
- Informatics: multi-CPU WNs (MPI)
- Other departments: WNs? SEs?
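On a Torque/Maui CE like the one above, work is submitted as batch scripts with PBS directives. A minimal sketch follows; the job name, queue name, walltime, and payload executable are illustrative assumptions, not values taken from the slides:

```shell
#!/bin/bash
#PBS -N atlas_sim          # job name (hypothetical)
#PBS -l nodes=1:ppn=2      # one node, both Xeon cores of a WN
#PBS -l walltime=02:00:00  # assumed limit; adjust to site policy
#PBS -q batch              # queue name is an assumption

cd "$PBS_O_WORKDIR"        # Torque starts jobs in $HOME by default
./run_simulation           # hypothetical payload executable
```

The script would be submitted with `qsub job.sh` and monitored with `qstat`; Maui sits on top of the Torque queue and decides scheduling order (priorities, fair-share) among the departments sharing the cluster.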