Southwest Tier 2 Center Status Report U.S. ATLAS Tier 2 Workshop - Harvard Mark Sosebee for the SWT2 Center August 17, 2006.
Southwest Tier 2 Center Status Report U.S. ATLAS Tier 2 Workshop - Harvard Mark Sosebee for the SWT2 Center August 17, 2006

Overview
The Southwest Tier 2 Center is a collaboration between the University of Texas at Arlington (UTA) and the University of Oklahoma (OU)
Personnel:
- UTA: Kaushik De, Patrick McGuigan, Victor Reece, Mark Sosebee
- OU: Karthik Arunachalam, Horst Severini, Pat Skubic

UTA CC Hardware Configuration
- 160 compute nodes:
  - Dual Xeon EM64T, 3.2 GHz, 4 GB RAM, 160 GB disk
- 8 front-end nodes:
  - Dual Xeon EM64T, 8 GB RAM, 73 GB SCSI RAID 1
- 16 TB SAN storage (IBRIX):
  - 80 x 250 GB SATA disks
  - 6 I/O servers, 1 management server
- 16 TB potential in compute nodes
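As a sanity check on the SAN figures above, a back-of-the-envelope sketch in Python. The disk count and per-disk size are from the slide; the implied ~20% RAID/filesystem overhead is inferred from the raw-vs-usable gap, not stated on the slide.

```python
# Raw vs. usable capacity of the UTA CC IBRIX SAN, per the slide's numbers.
RAW_DISKS = 80
DISK_GB = 250                        # GB per SATA disk (decimal, as marketed)

raw_tb = RAW_DISKS * DISK_GB / 1000  # total raw capacity in TB
usable_tb = 16.0                     # usable capacity quoted on the slide
overhead = 1 - usable_tb / raw_tb    # fraction lost to RAID/filesystem (inferred)

print(f"raw: {raw_tb:.1f} TB, usable: {usable_tb:.1f} TB, overhead: {overhead:.0%}")
```

The same arithmetic applies to the OCHEP SAN below (20 x 250 GB disks, 4 TB usable).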

UTA DPCC Hardware Configuration
- Shared resource with the CSE department
- 75 compute nodes:
  - Dual Xeon, GHz
  - 2 GB RAM
  - GB local disks
- 45 TB among 10 NFS servers (IDE RAID)
- Typically ~100 ATLAS production queue slots

OCHEP Hardware Configuration
- 40 compute nodes:
  - Dual Xeon EM64T, 3.2 GHz, 4 GB RAM, 160 GB disk
- 2 front-end nodes:
  - Dual Xeon EM64T, 8 GB RAM, 73 GB SCSI RAID 1
- 4 TB SAN storage (IBRIX):
  - 20 x 250 GB SATA disks
  - 2 I/O servers, 1 management server
- 4 TB potential in compute nodes

Additional OU Resources
- Current OSCER cluster, boomer:
  - 135 dual Xeon nodes, 2 GHz
  - 5 TB storage
  - Used for D-Zero MC production & data processing
- New OSCER cluster, topdawg:
  - 512 dual Xeon EM64T nodes, 3.2 GHz
  - 10 TB storage
  - Will be used for ATLAS Tier 2 & D-Zero computing a.s.a.p.

Network Connectivity
- UTA:
  - Gigabit link to the North Texas Gigapop
  - OC12 from NTG to the Houston peering site (I2)
  - Future: LEARN / NLR
- OU:
  - 10 Gbps campus backbone
  - Connection to NLR via OneNet
  - Current connection is OC12 to the Gigapop in Kansas City
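To put the OC12 links above in perspective, a rough transfer-time estimate. The OC12 line rate (~622 Mbps) is standard; the 70% achievable efficiency and the 1 TB dataset size are illustrative assumptions, not figures from the slide.

```python
# Rough wide-area transfer-time estimate over an OC12 link.
OC12_MBPS = 622          # OC12 line rate in megabits per second
EFFICIENCY = 0.7         # assumed achievable fraction of line rate

def hours_to_transfer(dataset_tb: float) -> float:
    """Hours to move dataset_tb terabytes at the assumed effective rate."""
    bits = dataset_tb * 8e12                           # TB -> bits (decimal)
    seconds = bits / (OC12_MBPS * 1e6 * EFFICIENCY)
    return seconds / 3600

print(f"{hours_to_transfer(1.0):.1f} h per TB")        # roughly 5 hours per TB
```

Numbers like this motivated the planned moves to LEARN/NLR and the 10 Gbps paths mentioned above.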

Activities Since UC Meeting (May)
- New cluster at UTA CC now online (though we are not yet running at full capacity)
- Large effort devoted to the DQ2 upgrade
- SC4 dCache setup: it works, with a couple of caveats; we learned much about the system for potential future deployments

IBRIX Issues / Status at UTA CC
- Scaling issues observed with IBRIX when the number of running jobs exceeds ~150:
  - Lost files
  - One segment server becomes a "hot spot"
- IBRIX tech support recommends:
  - Upgrade software to v2
  - Reconfigure storage: one large filesystem rather than the current two
- Software upgrade scheduled for end of August

Analysis Activities
- Two workshops have been held at UTA (March & May) to promote physics analysis for ATLAS groups in the SWT2 region
- Participants: UTA, OU, UNM, Langston, SMU, UT Dallas, LTU
- Bi-weekly video / phone meetings
- See: