1 PCE 2.1: The Co-Relationship of Containment and CFDs Gordon Johnson Senior CFD Manager at Subzero Engineering CDCDP (Certified Data Center Design Professional)


1 PCE 2.1: The Co-Relationship of Containment and CFDs Gordon Johnson Senior CFD Manager at Subzero Engineering CDCDP (Certified Data Center Design Professional) DCEP (Data Center Energy Practitioner) BSEE from New Jersey Institute of Technology

2 Data Center World – Certified Vendor Neutral Each presenter is required to certify that their presentation will be vendor-neutral. As an attendee you have a right to enforce this policy of having no sales pitch within a session by alerting the speaker if you feel the session is not being presented in a vendor neutral fashion. If the issue continues to be a problem, please alert Data Center World staff after the session is complete.

3 The Co-Relationship of Containment & CFDs This session will address a range of airflow management solutions to a variety of data center floor plan and equipment layout challenges. Before-and-after energy consumption data will give data center operations managers a better idea of total cost of ownership based on a variety of real-life scenarios. Most importantly, the relationship between effective CFD analysis and innovative containment designs will be presented.

4 The Co-Relationship of Containment & CFDs Computational Fluid Dynamics (CFD) provides a comprehensive approach to modeling data center airflow, giving the Data Center Energy Practitioner (DCEP) the ability to determine which best practices should be employed to ensure safe yet economical thermal conditions for computer equipment. What do CFDs teach us about containment?

5 Separation of Supply and Return Airflow The most important development in data center cooling in the last 10 years has been the value of full separation of supply and return airflow.

6 Data Center Containment Data center containment is the physical separation of supply and return airflow by employing hot aisle containment, cold aisle containment, and/or a combination of both.

7 CFDs and Containment What are the key ways CFD engineering works with containment to increase cooling reliability & efficiency? Baseline CFD; cold aisle containment CFD; hot aisle containment CFD; containment w/ CRAC failure (N+1); containment w/ reduced airflow (VFD fans); containment ROI; containment thermal report.

8 Goals of CFD Analysis Reduce/eliminate "hot spots". Provide (N + 1) cooling redundancy. Lower operational costs of the data center (save $$$). Allow for future growth (IT load). Use CFD to predict savings (ROI) from adding containment (cold or hot aisle) to the data center.

9 Power Usage Effectiveness (PUE) is a metric used to determine the energy efficiency of a data center.

10 Power Usage Effectiveness (PUE) is a metric used to determine the energy efficiency of a data center. PUE = Total Facility Energy / Total IT Energy [Charts: PUE Breakdown; Data Center Power Utilization]
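
The PUE arithmetic on this slide is a one-line ratio; a minimal Python sketch (function name is mine, not from the deck), using the 480 kW IT load figure that appears later in this presentation:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy."""
    return total_facility_kw / it_kw

# Deck's later example: 480 kW of IT load inside a 960 kW facility.
print(pue(960, 480))  # → 2.0
```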

11 RTI is a measure of net By-Pass or net Recirculation Air. It is the ratio of total equipment airflow to total air-handler airflow expressed as a %. RTI

12 RTI is a measure of net By-Pass or net Recirculation Air. It is the ratio of total equipment airflow to total air-handler airflow, expressed as a %. Return Temperature Index (RTI) = (Rack Flow Rate/Air Handler Flow Rate) x 100. 100% means balanced airflow; <100% indicates net airflow By-Pass; >100% indicates net airflow Recirculation. [Chart: Net By-Pass Air / Target Airflow / Net Recirculation Air]

13 RCI is a measure of compliance with ASHRAE air intake temperature guidelines and is expressed as a percentage with the maximum value being 100%. RCI (Rack Cooling Index)

14 RCI is a measure of compliance with ASHRAE air intake temperature guidelines and is expressed as a percentage with the maximum value being 100%. RCI (Rack Cooling Index) RCI HI is a measure of the absence of over-temperatures; 100% means no temperature is above the maximum recommended. RCI LO is a measure of the absence of under-temperatures; 100% means no temperature is below the minimum recommended.
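
The slide gives RCI's meaning but not its formula. One common formulation (Herrlin's; an assumption here, since the deck does not spell it out) sums how far each rack inlet strays past the ASHRAE recommended maximum, normalized by the recommended-to-allowable band:

```python
def rci_hi(inlet_temps_f, max_recommended=80.6, max_allowable=89.6):
    """RCI(HI): 100% when no inlet exceeds the recommended maximum.

    Temperatures in °F; defaults are the ASHRAE class A1 recommended and
    allowable maxima (80.6 °F / 89.6 °F). Formula per Herrlin (assumed).
    """
    over = sum(max(t - max_recommended, 0) for t in inlet_temps_f)
    span = (max_allowable - max_recommended) * len(inlet_temps_f)
    return (1 - over / span) * 100

# All inlets at or below 80.6 °F → full compliance.
print(rci_hi([72, 75, 80]))  # → 100.0
```

RCI(LO) is the mirror image, penalizing inlets below the recommended minimum.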

15 CFD Models 1. Baseline model "as is". 2. Baseline model with CRAC failure (N + 1). 3. Add cold aisle containment to model. 4. Raise SAT (Supply Air Temperature). 5. Lower VFD fan speed. 6. Analyze effect of CRAC failure (N + 1).

16 CFD MODELS

17 Data Center Description 4,976 sq. ft. raised floor. 480 kW of IT load; heat load density 96.5 W/sq. ft. 8 Liebert CW084DCS CRAC units, 12,100 CFM per unit, nominal sensible cooling of 24.1 tons (84.7 kW) each. CRACs equipped with VFD/VSD (Variable Frequency/Speed Drives); SATSP (Supply Air Temperature Set Point) of 62 °F. Total cooling airflow with fans at full speed is 96,800 CFM. Required cooling airflow is 74,349 CFM (based on airflow giving a 20 °F temperature rise through the servers, expressed by the following formula): Cooling airflow in CFM = 154 x (Heat load in kW)
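
The airflow rule of thumb on this slide (CFM = 154 × kW, the airflow yielding roughly a 20 °F rise through the servers) is easy to check in Python (function name is mine):

```python
def required_cfm(heat_load_kw: float, factor: float = 154) -> float:
    """Cooling airflow (CFM) for a given IT heat load, per the deck's rule of thumb."""
    return factor * heat_load_kw

print(required_cfm(480))  # → 73920
```

Note the deck quotes 74,349 CFM for the 480 kW load, which corresponds to a slightly larger factor (about 154.9); the difference is within the rounding of the 20 °F assumption.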

18 Data Center Operating Cost "Estimate" IT Heat Load = 480 kW. Assume PUE = 2 (typical). PUE = Total Facility Energy / Total IT Energy → 2 = Total Facility Energy / 480 kW → Total Facility Energy = 960 kW. Assumed cost of electricity = $0.10/kWh; 8,760 hours per year. Annual cost = 960 x 0.10 x 8760 = $840,960
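
The cost estimate above is a straightforward chain of multiplications; a small sketch (helper name and parameter names are mine):

```python
HOURS_PER_YEAR = 8760

def annual_cost(it_kw: float, pue: float, usd_per_kwh: float = 0.10) -> float:
    """Annual electricity cost for the whole facility, given IT load and PUE."""
    total_facility_kw = it_kw * pue
    return total_facility_kw * usd_per_kwh * HOURS_PER_YEAR

print(round(annual_cost(480, 2.0)))  # → 840960
```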

19 Baseline CFD Model [Figures: floor plan; subfloor] Heat load 480 kW

20 Return Temperature Index (RTI) = (Rack Flow Rate/Air Handler Flow Rate) x 100 RTI = (74,349/96,800) x 100 = 76.8% Baseline CFD Model (Results)

21 Note: 2 Racks with Inlet Temps above the ASHRAE Allowable Range! 14 Racks with Inlet Temps above the ASHRAE Recommended Range Baseline CFD Model (Inlet Temps)

22 Baseline CFD Model (3D Inlet Temps)

23 (6’ Elevation) Baseline Model

24 91 Degree Hotspot! Baseline Model (3D Airflow)

25 Note: 9 Racks with Inlet Temps above the ASHRAE Allowable Range! 30 Racks with Inlet Temps above the ASHRAE Recommended Range Baseline CFD Model (CRAC Failure)

26 SHOULD I INSTALL CAC OR HAC?

27 Cold Aisle Containment A Cold Aisle Containment system (CAC) encloses the cold aisle, ensuring that only cold air flows into the air intakes of IT racks. By containing the cold aisle, the hot and cold air streams are separated.

28 Hot Aisle Containment A Hot Aisle Containment system (HAC) encloses the hot aisle to collect the IT equipment’s hot exhaust air, ensuring that the CRAC units only receive hot air from the aisles. By containing the hot aisle, the hot and cold air streams are separated.

29 Benefits of CAC or HAC: Reduced energy consumption; increased cooling capacity; increased rack population; consistent, acceptable supply temperature across IT intakes; more power available for IT equipment; increased equipment up-time; longer hardware life (MTBF).

30 More Benefits Containment allows for lower cooling unit fan speeds, higher chilled water temperature, decommissioning of redundant cooling units, and increased use of free cooling. According to the U.S. EPA, a robust containment solution can reduce fan energy consumption by up to 25% and deliver 20% savings at the cold water chiller. According to Data Center Knowledge’s Energy Efficiency Guide, containment can save a data center approximately 30% of its annual utility bill without additional CapEx (Capital Expenditures).

31 CAC Model (Results) Return Temperature Index (RTI) = (Rack Flow Rate/Air Handler Flow Rate) x 100. RTI = (74,349/96,800) x 100 = 76.8%. No Hot Spots!

32 CAC Model (Inlet Temps)

33 CAC Model CRAC OFF (Results) No Hot Spots!

34 CAC Model CRAC OFF (Inlet Temps)

35 HAC Model (Results) No Hot Spots! Note: Same results as CAC, all hot spots eliminated from Data Center

36 HAC CFD Model (6’ Elevation)(Inlet Temps) (3D Floor Plan) (Drop Ceiling Airflow)

37 CFD Model Results (CAC and HAC) Elimination of hot spots. Cooling system can be set to a higher temperature (saving energy and increasing cooling capacity) and still supply the heat load with safe operating temperatures. Economizer hours (if used) are increased. Humidification/dehumidification costs are reduced. Allows for future IT growth in data center. Provides CRAC redundancy (N + 1).

38 CAC Model SATSP 72ºF (Results) No Hot Spots!

39 CAC Model SATSP 72ºF (Inlet Temps) (6’ Elevation)

40 Data Center Operating Savings IT Heat Load = 480 kW, Assume Baseline PUE = 2. Total Facility Energy = 960 kW. Annual cost = 960 x 0.10 x 8760 = $840,960. Increase SATSP by 10 °F (2-4% overall savings per degree increase). $840,960 x 10 x 0.02 = $168,192

41 Data Center Operating Savings IT Heat Load = 480 kW, Assume Baseline PUE = 2. Total Facility Energy = 960 kW. Annual cost = 960 x 0.10 x 8760 = $840,960. Increase SATSP by 10 °F (2% savings per degree): $840,960 x 10 x 0.02 = $168,192. New Annual cost = $840,960 - $168,192 = $672,768. $672,768 = new Total Facility Energy x 0.10 x 8760 → New Total Facility Energy = 768 kW. New PUE = 768/480 = 1.6
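
The savings chain on this slide can be verified in a few lines of Python (variable names are mine; the 2%-per-°F figure is the conservative end of the deck's 2-4% range):

```python
HOURS_PER_YEAR = 8760
RATE = 0.10  # $/kWh, the deck's assumed electricity cost

baseline_cost = 960 * RATE * HOURS_PER_YEAR   # $840,960 per year
sat_savings = baseline_cost * 10 * 0.02       # raise SATSP 10 °F at 2% per °F
new_cost = baseline_cost - sat_savings
new_facility_kw = new_cost / (RATE * HOURS_PER_YEAR)
new_pue = new_facility_kw / 480               # 480 kW of IT load

print(round(sat_savings), round(new_cost), round(new_facility_kw), round(new_pue, 1))
# → 168192 672768 768 1.6
```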

42 CAC Model SATSP 72ºF, 80% Fan Speed (Results) Return Temperature Index (RTI) = (Rack Flow Rate/Air Handler Flow Rate) x 100. RTI = (74,349/77,400) x 100 = 96.1%. No Hot Spots!

43 CAC Model SATSP 72ºF, 80% (Inlet Temps) (6’ Elevation)

44 CRAC Savings From Reducing Fan Speed Fan Affinity Law: (kW1/kW2) = (CFM1/CFM2)^3. 7.5 kW/kW2 = (12,100/9,680)^3 → kW2 = 3.84 kW = fan power at 80% speed
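
The cube-law relationship on this slide is worth seeing as code, since the cubic term is what makes modest fan-speed reductions so valuable (function name is mine):

```python
def fan_power_at(cfm_new: float, cfm_rated: float, kw_rated: float) -> float:
    """Fan affinity law: fan power scales with the cube of airflow."""
    return kw_rated * (cfm_new / cfm_rated) ** 3

# One CRAC: 7.5 kW fan at 12,100 CFM, throttled to 9,680 CFM (80% speed).
kw_at_80pct = fan_power_at(9_680, 12_100, 7.5)
print(round(kw_at_80pct, 2))  # → 3.84
```

Running at 80% airflow costs only 0.8³ ≈ 51% of rated fan power, which is why containment-enabled fan-speed reductions pay off so quickly.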

45 CRAC Savings kW2 = 3.84 kW = fan power at 80% speed. 7.5 kW - 3.84 kW = 3.66 kW saved per CRAC. 3.66 kW x 8 CRACs x 0.10 x 8760 = $25,649. Additional yearly savings = $25,649. Note: A reduction in fan speed will also affect the energy consumed in producing the chilled water.

46 Updated Data Center Operating Savings New Annual cost = $840,960 - $168,192 - $25,649 = $647,119. $647,119 = new Total Facility Energy x 0.10 x 8760 → New Total Facility Energy = 739 kW. New PUE = 739/480 = 1.5
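
Pulling the whole chain together (SATSP raise plus fan-speed reduction across 8 CRACs), a quick end-to-end check in Python (variable names are mine):

```python
HOURS_PER_YEAR = 8760
RATE = 0.10  # $/kWh

baseline_cost = 960 * RATE * HOURS_PER_YEAR              # $840,960
sat_savings = baseline_cost * 10 * 0.02                  # raise SATSP 10 °F
fan_savings = (7.5 - 3.84) * 8 * RATE * HOURS_PER_YEAR   # 8 CRACs at 80% fan speed
new_cost = baseline_cost - sat_savings - fan_savings
new_facility_kw = new_cost / (RATE * HOURS_PER_YEAR)

print(round(fan_savings), round(new_cost), round(new_facility_kw), round(new_facility_kw / 480, 2))
# → 25649 647119 739 1.54
```

This reproduces the slide's $647,119 and 739 kW; the deck rounds the resulting PUE of 1.54 to 1.5.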

47 CASE STUDY

48 Data Center Description Data Center is a raised floor design of 9,042 sq ft, 405 kW of IT load, and no drop ceiling. Customer wanted to increase IT load to 602 kW. Current heat load density 44.8 W/sq ft (rising to 66.6 W/sq ft). 11 Liebert DH240G CRAC units, each with an airflow rate of 11,000 CFM and nominal sensible cooling of 19.8 tons (69.7 kW). CRACs are not equipped with VFD/VSD fans. Total air-handler airflow 121,000 CFM; total equipment airflow 63,808 CFM (RTI = 53%). 65 racks with inlet temps > 80.6 °F.

49 Hot Spots! Baseline CFD Model (Results)

50 Baseline CFD Model (3D Floor Plan) (Inlet Temps) (6’ Elev.)

51 No Hot Spots! CAC Results

52 Case Study #1 CAC (3D Floor Plan) (Inlet Temps) (6’ Elev.)

53 CAC Results Data Center raised IT load 33%. 0 racks with inlet temps > 80.6 °F. Reduced airflow (turned OFF unnecessary CRAC units) and increased temperature set points in the data center (RTI = 86%). NYSERDA (New York State Energy Research and Development Authority) covered a substantial portion of the containment installation cost. CFD models estimated yearly savings of $150,409 and an ROI of 1.56 years (excluding energy rebates).

54 Customer Comments

55 CFD ENGINEERING CONTAINMENT Innovative thermal solutions for the data center

56 3 Key Things You Have Learned During this Session 1. CFDs quantify the benefits of containment. 2. Return on investment calculations are better made with a CFD study. 3. Containment benefits are both diverse and all-encompassing in any cooling solution.

57 Q & A

58 Thank you Gordon Johnson Senior CFD Manager at Subzero Engineering CDCDP (Certified Data Center Design Professional) DCEP (Data Center Energy Practitioner) BSEE from New Jersey Institute of Technology x 559