CD-doc-650 Fermilab Computing Division Physical Infrastructure Requirements for Computers FY04 – 07 (1/4/05)

Mission
– CDO/Operations provides requirements and management of computer infrastructure, i.e., space, power & cooling
– Computer rooms are located in FCC, LCC and GCC
– Stakeholders: CDF, D0, CMS, SDSS, Minos, BTeV, BSS, FESS, PPD, among others

What is happening?
– Commodity computers – more computing bang for the buck
– More computers are purchased (~1,000/yr)
– Power required per computer is increasing yearly
– Rapidly increasing power and cooling requirements (+45%/FY02, +25%/FY03, +40%/FY04)
– More computers per sq. ft. of floor space (40 computers in 6 sq. ft., 10–13 KW)
– High power & cooling density (HDCF)
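
To see what these rates compound to, here is a minimal sketch in Python; the per-FY growth rates are the slide's figures, while the FY01 baseline load is a purely illustrative assumption:

```python
# Compound the quoted per-FY growth rates into a load projection.
# The +45%/+25%/+40% rates are from the slide; the 300 KVA FY01
# baseline is an illustrative assumption, not a source figure.
growth = {"FY02": 0.45, "FY03": 0.25, "FY04": 0.40}

load_kva = 300.0  # assumed FY01 baseline (hypothetical)
for fy, rate in growth.items():
    load_kva *= 1.0 + rate
    print(f"{fy}: {load_kva:.0f} KVA")

# 1.45 * 1.25 * 1.40 ≈ 2.54: load roughly 2.5x in three fiscal
# years, independent of the starting point.
print(f"Cumulative multiplier: {1.45 * 1.25 * 1.40:.2f}x")
```

Whatever the baseline, the quoted rates multiply out to roughly 2.5x growth over FY02–FY04, which is what drives the facility planning discussed in the following slides.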

Historical Power Growth + Projection

Additional Power based on Experiment Projections in FY03

Total Power Projection ($2M/yr Computer Purchases)
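
The projection model itself is not preserved in the transcript, but the title implies a constant-budget purchase model. Below is a hedged sketch of how such a projection could be computed; apart from the $2M/yr budget (and the ~1,000 nodes/yr it implies at an assumed $2k/node, matching the earlier slide), every number is an illustrative assumption:

```python
# Hypothetical constant-budget power projection.
# Only the $2M/yr budget comes from the slide title; cost per node,
# power per node, retirement rate and starting load are assumptions.
BUDGET_PER_YEAR = 2_000_000   # $/yr, from the slide title
COST_PER_NODE = 2_000         # assumed $/node
KVA_PER_NODE = 0.30           # assumed KVA drawn by each new node
RETIRED_FRACTION = 0.10       # assumed share of old load retired yearly

load_kva = 800.0              # assumed starting load (hypothetical)
for fy in ("FY05", "FY06", "FY07"):
    new_nodes = BUDGET_PER_YEAR // COST_PER_NODE   # ~1,000 nodes/yr
    load_kva = load_kva * (1 - RETIRED_FRACTION) + new_nodes * KVA_PER_NODE
    print(f"{fy}: +{new_nodes} nodes, projected load {load_kva:.0f} KVA")
```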

FY03 Total Power Projections Summary

FCC Facts
– FCC reached its maximum power and cooling capacity in FY03
  – FCC did not have sufficient utility space for added power and cooling equipment
  – Would have required a building addition, new feeder, upsizing of ICW, cooling pipes, …
  – Would have caused major service outages

Challenge
– Had to look outside FCC
– Needed to develop short-term (4 yr) & long-term (10 yr) plans
– Plans needed to be phased; a complete computer facility can no longer be built for a 10-year life
– Plans had to be adaptable to rapidly changing technology
– Concentrated on the next 4 years

4-Year Response
– 4,000 computers, 4,000 sq. ft., 1600 KVA
– Wide Band offered the best footprint for FY04/05 and electric power for FY04–07
– Developed a plan for adaptive reuse
– Directorate granted GPP funds for FY04 construction of 2,000 sq. ft., 800 KVA
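
These three targets fix the per-node planning factors; a quick derivation using only the slide's own numbers:

```python
# Implied planning factors from the 4-year response targets.
computers, floor_sqft, power_kva = 4_000, 4_000, 1_600

print(f"{power_kva / computers:.2f} KVA per computer")       # 0.40
print(f"{floor_sqft / computers:.1f} sq. ft. per computer")  # 1.0
print(f"{power_kva / floor_sqft * 1000:.0f} VA per sq. ft.") # 400
```

That is, the plan budgets roughly 0.4 KVA and one square foot per commodity node, or about 400 VA per square foot of computer room.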

FY04/05
– GCC (Wide Band) computer room #1 completed Sept. (2,000 sq. ft. computer room, 800 KVA)
– Already running at 1/3 of FCC's KVA
– Planning to move ~500 systems (125 KVA) from FCC to GCC to accommodate new CDF, D0, CMS servers
– CDF, D0, CMS estimate a total of 1,000 new computers in FY05
– Expecting 80% GCC utilization by end of FY05
– Also adding the 1st half of a 2nd computer room at LCC for Lattice Gauge (500 sq. ft., 250 KVA – CD funded); QCD is adding 512 computers in FY05 for a total of 768
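
A rough consistency check on the 80% estimate, taking room #1's 800 KVA capacity from above and the ~0.4 KVA/node factor implied by the 4-year response targets (the per-node figure is an assumption here):

```python
# Rough check of "80% GCC utilization by end of FY05".
CAPACITY_KVA = 800.0     # GCC computer room #1 (slide figure)
moved_kva = 125.0        # ~500 systems relocated from FCC (slide figure)
new_nodes = 1_000        # CDF/D0/CMS FY05 estimate (slide figure)
kva_per_node = 0.40      # assumed, per the 4-year response factors

projected = moved_kva + new_nodes * kva_per_node
print(f"Projected load: {projected:.0f} KVA "
      f"= {projected / CAPACITY_KVA:.0%} of capacity")
# 125 + 400 = 525 KVA ≈ 66%; adding the load already running in
# the room makes ~80% by end of FY05 plausible.
```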

FY05/06
– GCC north room: tape robots for distributed experiment data
– Finish out computer room #1 (2 ACs)
– South building addition for computer room #2 (same size & power as room #1)

Planned Capacities by FY07
– FCC – … KVA
– GCC – … KVA
– LCC – … KVA
– Total – 2560 KVA
– Projections for FY07: … KVA

Relative Capacities

FCC

GCC

LCC

All Facilities

Historical & Projections Comparison

Growth vs Projections

FY08 & Beyond ???

FY05/FY06 Plans
– $4.530M FY05/FY06 GPP Project
– Robotic Tape Storage Room (NCR)
  – 20 kVA UPS
  – 30 tons of cooling
  – 60 tons of cooling to SCR
  – 30 tons of cooling to UPS room
– Substation Upgrades
  – (2) 1500 kVA transformers
  – (4) air switches
  – Replace ~1,100 LF of feeder cable
– 2,400 SF Computer Room Addition
  – 1000 kVA/900 kW UPS system
  – 300 tons of cooling
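
The cooling figures can be cross-checked against the UPS rating with the standard conversion of one refrigeration ton ≈ 3.517 kW; a minimal sketch (tonnage and UPS rating are the slide's figures, the headroom reading is mine):

```python
# Cross-check computer room cooling against the UPS rating.
KW_PER_TON = 3.517    # standard ton-of-refrigeration to kW conversion

ups_kw = 900.0        # 1000 kVA / 900 kW UPS system (slide figure)
cooling_tons = 300.0  # computer room addition (slide figure)

cooling_kw = cooling_tons * KW_PER_TON
print(f"{cooling_tons:.0f} tons = {cooling_kw:.0f} kW of cooling")
print(f"Headroom over full UPS load: {cooling_kw - ups_kw:.0f} kW")
# ~1055 kW of cooling against a 900 kW UPS leaves ~155 kW for
# distribution losses and non-UPS loads: a sensible margin.
```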

FY05/FY06

FY05/FY06 Plan

FY05/FY06 Schedule

Future Facilities Options
– High Bay Expansion – 3 Floors
– High Bay Expansion – 1 Floor
– New Building
– Adaptive Reuse of Existing Building

High Bay Expansion – 3 Floors

High Bay Expansion – 1 Floor

High Bay Expansion – 1 Floor

New Building

New Building

Adaptive Reuse
– January 2005 Project Definition Report
– Conversion of an existing building:
  – Computer room (70 racks × 40 nodes/rack = 2,800 nodes)
  – 1000 kVA/900 kW of UPS power
  – 300 tons of cooling
– Candidate buildings:
  – Near existing utilities (electrical, water, ICW)
  – 3,900 SF of 14′-high space
– ~$2.7M
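
The node count and UPS rating imply the per-node power budget for the converted room; a quick check using only the slide's figures:

```python
# Implied per-node power budget for the adaptive-reuse room.
racks, nodes_per_rack = 70, 40
ups_kw = 900.0   # 1000 kVA / 900 kW of UPS power (slide figure)

nodes = racks * nodes_per_rack   # = 2,800 nodes, as the slide states
print(f"{nodes} nodes -> {ups_kw * 1000 / nodes:.0f} W per node")
# ~321 W per node, consistent with commodity servers of that era.
```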

Summary