LCG Deployment in Japan

Hiroshi Sakamoto, ICEPP, Univ. of Tokyo

Contents
- Present status of LCG deployment
  - LCG Tier2
  - Certification authority
  - Implementation
- Recent topics
  - KEK-ICEPP joint R&D program
  - Network
  - Upgrade of resources
- Future plan

LCG in Japan
- Tier2 center at ICEPP, U. Tokyo
  - Decision made in October 2004
- Manpower considerations
  - A few dedicated people, including engineers and outsourcing
- Contribution to LHC/ATLAS
  - The size of the Japanese community is ~4% of ATLAS
  - We want to contribute more

Japanese CA for HENP
- KEK-CA is ready for operation
- The Japanese HENP community roughly coincides with the KEK user community:
  - LHC ATLAS
  - KEK-B Belle
  - J-PARC (50 GeV PS at Tokai)
  - RHIC PHENIX (RIKEN)
- CP/CPS prepared
- Under discussion between KEK and ICEPP:
  - To be submitted to the EU Grid PMA, or to the AP Grid PMA?
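Once KEK-CA is in operation, a user would obtain a certificate and create short-lived proxies with the standard Globus tools. A minimal sketch of that workflow, assuming the usual Globus command-line tools (these commands are not named on the slides):

    # Generate a key pair and a certificate request to send to the CA
    # (writes userkey.pem and usercert_request.pem under ~/.globus)
    grid-cert-request

    # After the CA returns the signed certificate, install it as
    # ~/.globus/usercert.pem, then create a session proxy for grid work
    grid-proxy-init
    grid-proxy-info    # check the proxy subject and remaining lifetime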

TOKYO-LCG2 cluster
- LCG-2 cluster@u-tokyo
- 52 Worker Nodes
- Upgraded from LCG-2_1_1 to LCG-2_3_1 last week, using YAIM
- Red Hat 7.3 (to be replaced by Scientific Linux)
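YAIM drives such an upgrade from a single site-wide configuration file. A minimal sketch of what this might have looked like on an LCG-2_3-era release; the variable names and script paths are typical of that release series but are assumptions here, and the hostnames are those shown on the cluster diagram below:

    # Excerpt of a hypothetical /opt/lcg/yaim/etc/site-info.def
    SITE_NAME=TOKYO-LCG2
    CE_HOST=dgce0.icepp.jp
    SE_HOST=dgse0.icepp.jp
    RB_HOST=dgrb0.icepp.jp
    BDII_HOST=dgbdii0.icepp.jp
    WN_LIST=/opt/lcg/yaim/etc/wn-list.conf    # the 52 worker nodes

    # Install and (re)configure a node type from that single file
    /opt/lcg/yaim/scripts/install_node   site-info.def lcg-CE
    /opt/lcg/yaim/scripts/configure_node site-info.def CE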

PC Farm
- HP ProLiant BL20p
  - Xeon 2.8 GHz, 2 CPUs/node
  - 1 GB memory
  - SCSI 36 GB x2, hardware RAID1
  - 3 GbE NICs
  - iLO remote administration tool
- 8 blades/enclosure (6U)
- Total 108 blades (216 CPUs) in 3 racks

[Slide: TOKYO-LCG2 cluster layout diagram]
- LCG service nodes: CE (dgce0.icepp.jp), SE (dgse0.icepp.jp), RB (dgrb0.icepp.jp), BDII (dgbdii0.icepp.jp), Proxy (dgpxy0.icepp.jp), UI (dgui0.icepp.jp), Gateway (dggw0.icepp.jp)
- 52 WNs (hpbwn7-1 ... hpbwn13-8): HP Blade BL20P G2, dual Xeon 2.8 GHz, 1 GB memory (plan to go to 2 GB), GbE NIC
- Networks: campus network 133.11.24.0/23; private network 172.17.0.0/24
- NFS server (dgnas0.icepp.jp): DELL 1750, dual Xeon 2.8 GHz, 2 GB memory; IDE-FC RAID with Infortrend controllers, 250 GB HDD x16 x10, attached via FC switch
- Storage: /home and /storage volumes of 1.75 TB each; 1.75 TB x 20 = 35 TB in total
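The worker nodes reach the /home and /storage volumes over the private network via NFS. A minimal sketch of how one such volume might be exported and mounted; the export options are illustrative assumptions, while the host, paths, and network come from the diagram:

    # On dgnas0.icepp.jp: /etc/exports, restricted to the private WN network
    /storage  172.17.0.0/255.255.255.0(rw,sync,no_root_squash)
    /home     172.17.0.0/255.255.255.0(rw,sync,no_root_squash)

    # On each worker node: mount the shared volumes
    mount -t nfs dgnas0.icepp.jp:/storage /storage
    mount -t nfs dgnas0.icepp.jp:/home    /home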

KEK-ICEPP joint R&D
- Testbed cluster@u-tokyo
  - 1 Worker Node
  - LCG-2_4_0 with VOMS
  - Simple CA for testbed users
  - Scientific Linux with autorpm
- Testbed cluster@KEK
  - Computing Research Center

[Slide: KEK LCG2 testbed layout diagram]
- Service nodes: UI, Proxy, LCFGng, BDII-LCG2, CE (SiteGIIS), RB, Classic SE
- Worker nodes:
  - IBM eServer 326, Opteron 2.4 GHz, 4096 MB RAM, 20 nodes (AMD Opteron-based Linux system, under integration), managed by LSF
  - IBM eServer xSeries, Pentium III 1.3 GHz, 256 MB RAM (test WNs), managed by PBS
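Behind the CE, the two worker-node pools are driven by different batch systems (LSF and PBS). A minimal sketch of how the same payload might be submitted to each; the queue name and job script are hypothetical:

    # LSF (Opteron pool): submit to a hypothetical "atlas" queue
    bsub -q atlas -o job.out -e job.err ./run_payload.sh

    # PBS (Pentium III test pool): the equivalent submission
    qsub -q atlas -o job.out -e job.err run_payload.sh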

R&D Menu
- Stand-alone grid connecting the two clusters
  - Special interest: the 1 Gbps dedicated connection between KEK and ICEPP (SuperSINET)
- Exercises for understanding LCG middleware
  - Special interests: SRB, Grid Datafarm (Osamu Tatebe, AIST)
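A typical exercise at this level is end-to-end job submission through the Resource Broker with the LCG-2 command-line tools. A minimal sketch; the JDL attributes and edg-job-* commands are standard for that release series, while the job itself is hypothetical:

    # hello.jdl -- a minimal LCG-2 job description
    Executable          = "/bin/hostname";
    StdOutput           = "hello.out";
    StdError            = "hello.err";
    OutputSandbox       = {"hello.out", "hello.err"};
    VirtualOrganisation = "atlas";

    # Submit via the RB, poll the status, and retrieve the output sandbox
    edg-job-submit hello.jdl
    edg-job-status <jobId>
    edg-job-get-output <jobId>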

Network
- Peer-to-peer 1 Gbps between CERN and ICEPP
  - Sustained data transfer study
- 10 Gbps to the US and EU
- Connectivity among Asia/Pacific countries is still thin, but improving
  - JP-TW to 1 Gbps very soon; JP-HK, JP-CN
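A sustained-transfer study on such a link would typically measure both raw TCP throughput and GridFTP performance. A minimal sketch; the tools (iperf, globus-url-copy), test hosts, and tuning values are illustrative assumptions, not from the slides:

    # Raw TCP throughput: 4 parallel streams, 4 MB windows, 60 s run
    iperf -s                                      # on testhost.icepp.jp
    iperf -c testhost.icepp.jp -P 4 -w 4M -t 60   # from the CERN end

    # GridFTP transfer with parallel streams and a tuned TCP buffer
    globus-url-copy -p 4 -tcp-bs 4194304 \
        gsiftp://dgse0.icepp.jp/storage/testfile \
        gsiftp://se.example.cern.ch/data/testfile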

PC Farm Upgrade
- IBM BladeCenter HS20
  - Xeon 3.6 GHz, 2 CPUs/node, EM64T
  - 2 GB memory
  - SCSI 36 GB x2, hardware RAID1
  - 2 GbE NICs
  - Integrated System Management Processor
- 14 blades/enclosure (7U)
- Total 150 blades (300 CPUs) in 2 racks + 1 rack for console and network switches

Foundry BigIron MG8
- two 4x10 GbE modules
- four 60x GbE modules
Disk arrays
- 16x 250 GB SATA HDD
- 2 FibreChannel interfaces
- 27 cabinets in total

Future plan
- LCG Memorandum of Understanding
  - To be signed in JFY2005
  - University of Tokyo as the funding body
- LCG Tier2 resources
  - More resources added to our testbed in JFY2005 (approved)
  - LCG SC4 + ATLAS DC3 in 2006
- Production system
  - Budget request submitted for JFY2006
  - Expected to become operational in Jan. 2007