Energy Efficient Data Centers: Strategies from the Save Energy Now Program Federal Environmental Symposium June 4, 2008 Dale Sartor Lawrence Berkeley National Laboratory

High Tech Buildings are Energy Hogs:

LBNL Feels the Pain!

LBNL Super Computer Systems Power:

Typical Data Center Energy End Use (diagram): of 100 units of source energy, about 35 units of electricity are generated and 33 units are delivered to the data center, where they are split among the server load / computing operations, cooling equipment, and power conversions & distribution.

Overall Electrical Power Use (chart, courtesy of Michael Patterson, Intel Corporation)

Lifetime electrical cost will soon exceed the cost of the IT equipment, but IT equipment load can be controlled:
Consolidation
Server efficiency
–Flops per watt
–Efficient power supplies
Software efficiency (virtualization, MAID, etc.)
Power management
–Low power modes
Redundant power supplies
Reducing IT load has a multiplier effect
–Roughly equivalent savings in infrastructure
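The multiplier effect can be put in numbers: every watt removed from the IT load also avoids the cooling and power-conversion overhead that rides on top of it. A minimal sketch; the 1.83 overhead ratio matches the LBNL benchmarking average cited later in this deck, and the 100 kW load reduction is an invented example:

```python
def site_savings_kw(it_savings_kw, overhead_ratio):
    """Total facility savings from an IT-load reduction.

    overhead_ratio is total facility power / IT power (the PUE),
    so each kW trimmed from the servers avoids roughly that many
    kW at the utility meter."""
    return it_savings_kw * overhead_ratio

# Illustrative: trimming 100 kW of server load at a PUE of 1.83
# avoids about 183 kW of total facility load.
print(site_savings_kw(100, 1.83))
```

This is why consolidation and virtualization pay twice: once in the racks and again in the infrastructure.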

Potential Benefits of Improved Data Center Energy Efficiency:
20-40% savings are typically possible
Aggressive strategies can yield better than 50% savings
Potentially defer the need to build new generating capacity and avoid millions of metric tons of carbon emissions
Extend the life and capacity of existing data center infrastructures
But is my center good or bad?

Benchmarking for Energy Performance Improvement: Energy benchmarking can identify best practices. As new strategies are implemented (e.g. liquid cooling), benchmarking enables comparison of performance.

With funding from PG&E and CEC, LBNL conducted benchmark studies of 22 data centers:
–Found wide variation in performance
–Identified best practices

Your Mileage Will Vary: the relative percentage of energy actually doing computing varied considerably across the benchmarked centers.

Data Center Performance Varies in Cooling and Power Conversion
DCiE (Data Center Infrastructure Efficiency) = Energy for IT Equipment / Total Energy for Data Center
Typical practice: DCiE < 0.5
–Power and cooling systems are far from optimized
–Currently, power conversion and cooling systems consume half or more of the electricity used in a data center: less than half of the power is for the servers
Better practice: DCiE = 0.7
Best practice: DCiE = 0.85

High-Level Metric: Data Center Infrastructure Efficiency (DCiE)
Ratio of electricity delivered to IT equipment to total data center electricity
Average: 0.57 (higher is better)
Source: LBNL benchmarking

Alternative High-Level Metric: PUE (Data Center Total / IT Equipment)
Average: 1.83 (lower is better)
Source: LBNL benchmarking
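DCiE and PUE are reciprocals, so the two benchmarked averages above are mutually consistent (1/1.83 ≈ 0.55). A small sketch computing both from metered energies; the kWh figures are invented for illustration:

```python
def dcie(it_kwh, total_kwh):
    # Data Center Infrastructure Efficiency: IT energy / total energy.
    return it_kwh / total_kwh

def pue(it_kwh, total_kwh):
    # Power Usage Effectiveness: total energy / IT energy = 1 / DCiE.
    return total_kwh / it_kwh

# A hypothetical month of metered data:
it_energy, total_energy = 330_000, 600_000  # kWh
print(round(dcie(it_energy, total_energy), 2))  # 0.55
print(round(pue(it_energy, total_energy), 2))   # 1.82
```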

Save Energy Now on-line profiling tool: "Data Center Pro"
Inputs:
–Description
–Utility bill data
–System information: IT, cooling, power, on-site generation
Outputs:
–Overall picture of energy use and efficiency
–End-use breakout
–Potential areas for energy efficiency improvement
–Overall energy use reduction potential

Other Data Center Metrics:
Watts per square foot
Power distribution: UPS efficiency, IT power supply efficiency
–Uptime: IT Hardware Power Overhead Multiplier (ITac/ITdc)
HVAC
–IT total / HVAC total
–Fan watts/cfm
–Pump watts/gpm
–Chiller plant (or chiller, or overall HVAC) kW/ton
Lighting watts per square foot
Rack Cooling Index (fraction of IT equipment within the recommended temperature range)
Return Temperature Index: (RAT - SAT) / ITΔT
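The Return Temperature Index in the last item is simple to compute from three temperature readings. A sketch with made-up readings (a value near 1.0 means CRAC airflow is well matched to the IT equipment; lower values indicate bypass air, higher values recirculation):

```python
def return_temperature_index(rat_f, sat_f, it_delta_t_f):
    """RTI = (return air temp - supply air temp) / IT equipment delta-T."""
    return (rat_f - sat_f) / it_delta_t_f

# Hypothetical readings: 75 F supply, 90 F return, servers with a 20 F rise.
print(return_temperature_index(rat_f=90, sat_f=75, it_delta_t_f=20))  # 0.75
```

An RTI of 0.75 here would suggest about a quarter of the supply air is bypassing the racks.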

DOE Assessment Tools (under development):
Identify and prioritize key performance metrics
–IT equipment and software
–Cooling (air management, controls, CRACs, air handlers, chiller plant)
–Power systems (UPS, distribution, on-site generation)
Action-oriented benchmarking
–Tool will identify retrofit opportunities based on a questionnaire and the results of benchmarking
–First-order assessment to feed into a subsequent engineering feasibility study

Energy Efficiency Opportunities Are Everywhere
Server load / computing operations: load management, server innovation
Cooling equipment: better air management, better environmental conditions, move to liquid cooling, optimized chilled-water plants, use of free cooling
Power conversion & distribution: high-voltage distribution, use of DC power, highly efficient UPS systems, efficient redundancy strategies
Alternative power generation: on-site generation, waste heat for cooling, use of renewable energy / fuel cells

Using benchmark results to find best practices:
Examination of individual systems and components in the centers that performed well helped to identify best practices:
Air management
Right-sizing
Central plant optimization
Efficient air handling
Free cooling
Humidity control
Liquid cooling
Improving the power chain
UPSs and equipment power supplies
On-site generation
Design and M&O processes

Air Management:
Typically, much more air is circulated through computer room air conditioners than is required, due to mixing and short-circuiting of air
Computer manufacturers now provide ASHRAE data sheets that specify airflow and environmental requirements
Evaluate airflow from computer room air conditioners compared to server needs
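That evaluation follows from the standard air-side heat relation Q[BTU/h] ≈ 1.08 · cfm · ΔT[°F]. A sketch with illustrative numbers (the rack load and temperature rise are assumptions, not figures from the presentation):

```python
def required_cfm(heat_load_w, delta_t_f):
    """Airflow needed to remove heat_load_w at a delta_t_f air-side rise,
    from the standard relation Q[BTU/h] = 1.08 * cfm * delta_T."""
    btu_per_hr = heat_load_w * 3.412  # watts to BTU/h
    return btu_per_hr / (1.08 * delta_t_f)

# A 5 kW rack with a 20 F temperature rise across the servers:
print(round(required_cfm(5000, 20)))  # about 790 cfm
```

If the CRACs are moving far more air than this per rack, mixing or short-circuiting is the likely culprit.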

Isolating Hot and Cold:
Energy-intensive IT equipment needs good isolation of "cold" inlet and "hot" discharge
Computer room air conditioner airflow can be reduced if no mixing occurs
Overall temperature can be raised in the data center if air is delivered to equipment without mixing
Coils and chillers are more efficient with higher temperature differences

Optimize Air Management:
Enforce hot aisle/cold aisle arrangement
Eliminate bypasses and short circuits
Reduce airflow restrictions
Proper floor tile arrangement
Proper locations of air handlers

Data Center Layout (figure): underfloor supply, with cold aisle / hot aisle arrangement. Only one zone is possible with an underfloor arrangement! © 2004, American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. Reprinted by permission from ASHRAE Thermal Guidelines for Data Processing Environments. This material may not be copied nor distributed in either paper or digital form without ASHRAE's permission.

Data Center Layout (figure): overhead supply, with cold aisle / hot aisle arrangement. VAV can be included on each branch. © 2004 ASHRAE, reprinted by permission from ASHRAE Thermal Guidelines for Data Processing Environments.

Aisle Air Containment (figures): cold aisle caps, end caps, and hot aisle lids separating cold and hot aisles. © 2004 ASHRAE, reprinted by permission; containment image © APC, reprinted with permission.

Best Scenario: Isolate Cold and Hot (diagram: cold air supplied to equipment inlets at 70-75°F, with the hot discharge isolated from the supply)

Fan Energy Savings of 75%: if mixing of cold supply air with hot return air can be eliminated, fan speed can be reduced.
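The 75% figure is a consequence of the fan affinity laws: fan power varies roughly with the cube of speed, so even a modest speed reduction cuts power dramatically. An idealized sketch (ignoring motor and drive losses):

```python
def fan_power_fraction(speed_fraction):
    # Fan affinity law: power scales with the cube of fan speed.
    return speed_fraction ** 3

# Slowing fans to about 63% of full speed cuts fan power by ~75%.
savings = 1 - fan_power_fraction(0.63)
print(round(savings, 2))  # 0.75
```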

Better temperature control can allow raising the temperature in the entire center (chart: ASHRAE recommended range vs. the ranges measured during the demonstration).

Environmental Conditions:
ASHRAE: consensus from all major IT manufacturers on temperature and humidity conditions
Recommended and allowable ranges of temperature and humidity
Airflow required

Temperature Guidelines at the Inlet to IT Equipment (chart: ASHRAE allowable maximum and minimum, and ASHRAE recommended maximum and minimum)

Best air management practices:
Arrange racks in hot aisle/cold aisle configuration
Try to match or exceed server airflow by aisle
–Get thermal report data from IT if possible
–Plan for worst case
Get variable-speed or two-speed fans on servers if possible
Provide variable-airflow fans for AC unit supply
–Also consider using air handlers rather than CRACs for improved performance
Use overhead supply where possible
Provide isolation of hot and cold spaces
Plug floor leaks and provide blank-off plates in racks
Draw return air from as high as possible
Use CFD to inform design and operation

Right-Size the Design:
Data center HVAC is often under-loaded
Ultimate load is uncertain
Design for efficient part-load operation
–Modularity
–Variable-speed fans, pumps, and compressors
Upsize fixed elements (pipes, ducts)
Upsize cooling towers

Optimize the Central Plant:
Have one central plant (vs. distributed cooling)
Medium-temperature chilled water
Aggressive temperature resets
Primary-only CHW with variable flow
Thermal storage
Monitor plant efficiency

Design for Efficient Central Air Handling:
Fewer, larger fans and motors
VAV is easier
Central controls eliminate fighting
Outside-air economizers are easier

Use Free Cooling:
Outside-air economizers
–Can be very effective (24/7 load)
–Controversial re: contamination
–Must consider humidity
Water-side economizers
–No contamination question
–Can be in series with the chiller

Improve Humidity Control:
Eliminate inadvertent dehumidification
–Computer load is sensible only
–Medium-temperature chilled water
–Humidity control at the make-up air handler only
Use ASHRAE allowable RH and temperature
Eliminate equipment fighting
–Coordinated controls on distributed AHUs

Use Liquid Cooling of Racks and Computers:
Water is 3500x more effective than air on a volume basis
Cooling distribution is more energy efficient
Water-cooled racks are available now; liquid-cooled computers are coming
Heat rejection at a higher temperature
–Chiller plant more efficient
–Water-side economizer more effective
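The 3500x claim is just the ratio of volumetric heat capacities (density times specific heat), which is easy to check:

```python
# Volumetric heat capacity = density * specific heat, in J/(m^3*K).
water_j_per_m3k = 998 * 4186   # water near room temperature
air_j_per_m3k = 1.2 * 1005     # air at roughly sea-level room conditions
print(round(water_j_per_m3k / air_j_per_m3k))  # about 3460
```

A litre of water therefore carries away roughly 3,500 times the heat of a litre of air for the same temperature rise, which is why liquid cooling shrinks distribution energy.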

Improving the Power Chain:
Increase distribution voltage
DC distribution
Improve equipment power supplies
Improve UPS

Specify Efficient Power Supplies and UPSs:
Power supplies in IT equipment generate much of the heat; highly efficient supplies can reduce IT equipment load by 15% or more
UPS efficiency also varies a lot
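The leverage of power-supply efficiency is easy to see with a small comparison; the wattages and efficiency values below are illustrative, not measurements from the benchmarking:

```python
def ac_input_w(dc_load_w, psu_efficiency):
    # AC power drawn from the distribution system for a given DC load.
    return dc_load_w / psu_efficiency

# A server needing 300 W of DC power, with a 70% vs. a 90% efficient supply:
low_eff = ac_input_w(300, 0.70)
high_eff = ac_input_w(300, 0.90)
print(round((low_eff - high_eff) / low_eff, 2))  # 0.22
```

And every watt of supply loss is heat the cooling plant must also remove, so the facility-level saving is larger still.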

Redundancy:
Understand what redundancy costs – is it worth it?
Different strategies have different energy penalties (e.g., 2N vs. N+1)
Redundancy in electrical distribution puts you down the efficiency curve

Measured UPS Efficiency (chart: measured efficiency under redundant operation)

Consider On-Site Generation:
Can use waste heat for cooling
–Sorption cycles
–Typically required for cost effectiveness
Swaps roles with the utility for back-up

Improve Design and Operations Processes:
Get IT and Facilities people to work together
Use life-cycle total cost of ownership analysis
Document design intent
Introduce energy optimization early
Benchmark existing facilities
Re-commission as a regular part of maintenance

Top best practices identified through benchmarking

Design Guidelines Are Available:
Design guides were developed based upon the observed best practices
Guides are available through the PG&E and LBNL websites
A self-benchmarking protocol is also available

Federal Energy Management Program:
–Best practices showcased at Federal data centers
–Pilot adoption of Best-in-Class guidelines at Federal data centers
–Adoption of to-be-developed industry standard for Best-in-Class at newly constructed Federal data centers
EPA:
–Metrics
–Server performance rating & ENERGY STAR label
–Data center performance benchmarking
Industrial Technologies Program:
–Tool suite & metrics
–Energy baselining
–Training
–Qualified specialists
–Case studies
–Certification of continual improvement
–Recognition of high energy savers
–Best practice information
–Best-in-Class guidelines
Industry:
–Tools
–Metrics
–Training
–Best practice information
–Best-in-Class guidelines
–IT work productivity standard

Links to Get Started:
DOE website: sign up to stay up to date on new developments
Lawrence Berkeley National Laboratory (LBNL) Best Practices Guidelines (cooling, power, IT systems)
ASHRAE Data Center technical guidebooks
The Green Grid Association: white papers on metrics
ENERGY STAR® program
Uptime Institute white papers

Contact Information: Dale Sartor, P.E. Lawrence Berkeley National Laboratory Applications Team MS University of California Berkeley, CA (510)