[Eco 2 System] A Fundamental Twist to Data Centre Cooling: Ecologically Friendly & Economical
Presented by Paul Almond – Datacentre UK

Presentation transcript:

[Eco 2 System] A Fundamental Twist to Data Centre Cooling: Ecologically Friendly & Economical

CURRENT INDUSTRY FACT: For every watt needed to power a server, 1 watt is required for cooling.
[Eco 2 System] FACT: For every watt needed to power a server, only 0.05 watt is required for cooling.
HOW? Simple. We do away with CRAC units, CRAH units, chillers and fans. Air is a poor heat conductor, so it must be kept at 21°C (using 7°C water at the chiller) to keep servers cool.
WHAT!!? Our system features cutting-edge submersion cooling, whose coolant holds roughly 1,200 times more heat than air. The [Eco 2 System] at 38°C is as effective as 21°C air.
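The two facts above can be sanity-checked with simple overhead arithmetic. A minimal Python sketch, assuming an illustrative 10 kW IT load (the load figure is not from the slides):

# Total facility power is the IT load plus a proportional cooling load,
# per the two "facts" above. The 10 kW IT load is an assumed example.

def total_power_w(it_load_w, cooling_watts_per_it_watt):
    """IT load plus proportional cooling load, in watts."""
    return it_load_w * (1 + cooling_watts_per_it_watt)

it_load_w = 10_000  # assumed 10 kW of servers

traditional_w = total_power_w(it_load_w, 1.0)   # 1 W cooling per IT watt
eco2_w = total_power_w(it_load_w, 0.05)         # 0.05 W cooling per IT watt

print(f"Traditional total: {traditional_w / 1000:.1f} kW")  # 20.0 kW
print(f"Eco2 total:        {eco2_w / 1000:.2f} kW")         # 10.50 kW
print(f"Cooling energy saved: {1 - 0.05 / 1.0:.0%}")        # 95%

Cutting the cooling ratio from 1.0 to 0.05 removes 95% of the cooling energy, which is consistent with the 90% cooling-cost-reduction target stated later in the deck.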

You Probably Wondered…

From This…

To This: An Eco 2 Facility

                               AIR COOLING   Eco 2
CRAC/CRAH                      Required      Not required
Chiller                        Required      Not required
Large backup generator         Full size     Half size
Air flow engineering           Required      Not required
Hot/cold aisle                 Required      Not required
Raised floor / special floor   Required      Not required
Rack rails                     Required      Not required

How Do We Do It?
– Computers can function at 45°C. Hard drives are the limiting factor, reliable up to 45°C, while Intel CPUs are reliable at 75°C.
– Air is a poor heat conductor: it must be kept at 25°C (using ~7°C chilled water) to cool servers.
– The [Eco 2 System] at 38°C is just as effective as 25°C air, because the coolant transfers more heat with greater effectiveness (see the rough calculation after this list).
– It is easy to keep the [Eco 2 System] at 38°C; it is energy intensive to cool water to ~7°C.
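The coolant's advantage can be checked against the "1,200 times" figure from the earlier slide using volumetric heat capacity (density × specific heat). A rough Python sketch with approximate textbook properties for air and a mineral-oil-like dielectric coolant; the property values are assumptions, not figures from the slides:

# Back-of-envelope check of the "1200x" claim via volumetric heat
# capacity rho * cp. Fluid properties are approximate textbook values
# (assumed), not data from the slides.

air_rho, air_cp = 1.2, 1005   # kg/m^3, J/(kg*K), air at ~25 C
oil_rho, oil_cp = 850, 1670   # typical mineral-oil-like coolant values

air_vhc = air_rho * air_cp    # ~1.2 kJ/(m^3*K)
oil_vhc = oil_rho * oil_cp    # ~1420 kJ/(m^3*K)

print(f"Coolant holds ~{oil_vhc / air_vhc:.0f}x more heat per unit volume")
# -> ~1177x, consistent with the "1,200 times" figure quoted earlier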

Install any standard rack server
– CPU and GPU compatible
– Fiber and/or copper
Submerge into coolant liquid
– Captures 100% of heat
– No air cooling
Intelligent control system (a hypothetical alerting sketch follows below)
– Heat expelled externally
– Alerts/monitoring software
[Rack diagram: vertically mounted, any-vendor servers in a 42U rack with a PDU mount, Ethernet and power cable guides, and a liquid fill line; heat flows from the rack servers to a pump module circulating 30–53°C water from inside the facility to outside.]
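The slides do not specify how the intelligent control system is implemented. The following is a purely hypothetical Python sketch of the alerting logic, with the sensor function, names and thresholds invented for illustration; only the 38°C operating point and the 30–53°C pump-loop range come from the slides:

# Hypothetical coolant-temperature check. read_coolant_temp_c() is a
# stand-in for a real sensor read; the alarm threshold reuses the top
# of the pump-loop range shown in the diagram.

COOLANT_TARGET_C = 38.0  # operating point quoted in the slides
COOLANT_ALARM_C = 53.0   # upper end of the pump-loop range shown

def read_coolant_temp_c() -> float:
    """Stand-in for a real sensor read (invented for this sketch)."""
    return 38.2  # dummy value

def check_coolant_once() -> None:
    temp = read_coolant_temp_c()
    if temp >= COOLANT_ALARM_C:
        print(f"ALERT: coolant at {temp:.1f} C, limit {COOLANT_ALARM_C} C")
    else:
        print(f"OK: coolant at {temp:.1f} C (target {COOLANT_TARGET_C} C)")

if __name__ == "__main__":
    check_coolant_once()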

Project Objective
To implement an [Eco 2 System] data centre cooling solution that:
– Reduces cooling (energy) costs by 90%
– Reduces carbon footprint by 50%
– Reduces TCO by 35%
– Reduces CAPEX by 40%
– Functions well in a tropical climate

Project Scope
The MTSFB-GICT Working Group project will encompass a live production 6U [Eco 2 System] deployed within SKMM, Cyberjaya, hosting:
– MTSFB-GICT WG web services (Internet)
– SKMM web applications (Intranet/Internet)
– Live public monitoring and observation
– Continuous research & development activities

Benefits
The [Eco 2 System] in SKMM will feature a production, revenue-generating 6U rack of OEM servers with fully functional IT services deployed, exhibiting:
– 50% reduction in energy consumption (over traditional)
– 50% reduction in carbon footprint (over traditional)
– 35% reduction in Total Cost of Ownership
– 40% reduction in CAPEX
It demonstrates SKMM's CSR and sustainability efforts, and prepares SKMM for eventual regulations.

Environmental Impact
Comparing a traditional 6U rack system and the [Eco 2 System]:
– 62.9% reduction in CO₂ emissions (illustrated below)
– No hazardous materials
– Fully recyclable
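Since grid-electricity emissions scale linearly with energy drawn, the CO₂ saving follows directly from the energy saving. A Python sketch with assumed figures; the annual rack energy and the grid emission factor are illustrative values, not from the slides:

# CO2 avoided = energy avoided * grid emission factor. Only the 62.9%
# reduction is from the slides; the other two numbers are assumptions.

GRID_FACTOR_KG_CO2_PER_KWH = 0.7   # assumed grid emission factor
traditional_kwh_per_year = 50_000  # assumed 6U rack + cooling energy

eco2_kwh_per_year = traditional_kwh_per_year * (1 - 0.629)
saved_kg = (traditional_kwh_per_year - eco2_kwh_per_year) * GRID_FACTOR_KG_CO2_PER_KWH

print(f"CO2 avoided: {saved_kg:,.0f} kg/year")  # ~22,015 kg/year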

Payback & CAPEX 10 Year Simple Savings: – Energy Bill: RM – Maintenance: RM – Carbon Taxes (from 2014): RM – Total: RM CAPEX: – RM173, Simple Payback: 4.9 Years

Origin of Materials & Services
The GCI FiveR™ model is used:
– Reuse: the [Eco 2 System] reuses unused IT equipment
– Recycle: the [Eco 2 System] uses recycled materials
– Reduce: the [Eco 2 System] reduces excessive materials/resources
– Rethink: the [Eco 2 System] redefines traditional thinking
– Reengineer: the [Eco 2 System] redesigns DC cooling solutions
The [Eco 2 System] sources 90% of its materials from within a 50 km radius.

Timeline
– November 2012: Start of mobilisation; design & development
– January 2013: Alpha testing; beta testing
– February 2013: Live services; research & development