1
Computer Room Experiences – A medium-sized tier-2 site view. Pete Gronbech, GridPP Project Manager, HEPIX April 2012
2
Come A Long Way: from the days of converted offices to large rooms in the basement.
3
Oxford Physics Computer Rooms
Oxford University Physics Department has two computer rooms, built during 2007. Both are traditional false-floor rooms, with underfloor cooling delivered through grille tiles in front of the computer racks. The RACUs are mounted on the walls in the hot aisle. External roof-mounted chillers provide the chilled-water feed. Mixed occupancy precluded dedicated water-chilled racks.
4
Begbroke
5
Begbroke IAT Computer Room
6
Begbroke
Hot air can accumulate at the ceiling and get sucked into the front of the nodes at the top of the rack, so node temperatures are markedly higher at the top of the rack. One solution is to use a panel to restrict the air flow; the other is to increase the minimum fan speed on the RACU to ensure good air flow, thus avoiding the accumulation of hot air near the ceiling. Blanking panels are essential to prevent hot air leaking to the front of the racks.
7
Standard Hot Aisle / Cold Aisle layout (diagram showing RACU and rack positions)
8
Blanking Panels
9
Panels and screens
10
11
RACU / CRAC Fan Speeds
The units have variable-speed fans which kick in as required. Originally these were set high, which caused high air speed through the grilles but not necessarily effective cooling; having the fans on full all the time is also very expensive. The fans were then set to a minimum speed of 10%. This can lead to a cycling effect: the slow air flow lets hot air build up at the ceiling, followed by a cooling phase, repeatedly. A compromise solution is to increase the minimum fan speed to 20-25% on the RACUs behind the hottest racks. This ensures they are blowing cool air but, more importantly, sucking in the hot air from the ceiling area, which should reduce the cycling effect.
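The minimum-fan-speed policy above can be sketched in a few lines. This is a hypothetical illustration, not the actual RACU firmware: the proportional gain and the sensor interface are assumptions; only the 10% floor, the raised 20-25% floor behind hot racks, and the 25 °C setpoint come from the slides.

```python
# Hypothetical sketch of the minimum-fan-speed policy described above.
# The proportional response is an assumption for illustration; the raised
# floor for units behind the hottest racks is the compromise from the slide.

def choose_fan_speed(return_air_temp_c, setpoint_c, behind_hot_rack):
    """Return a fan speed (% of max) with a raised minimum to avoid cycling."""
    min_speed = 25 if behind_hot_rack else 10  # raised floor keeps air moving
    # Simple proportional response above the setpoint (illustrative only)
    error = max(0.0, return_air_temp_c - setpoint_c)
    return min(100, min_speed + 10 * error)

print(choose_fan_speed(25.0, 25.0, behind_hot_rack=True))   # floor applies: 25
print(choose_fan_speed(29.0, 25.0, behind_hot_rack=False))  # 10 + 10*4 = 50
```

The point of the raised floor is that the fan never drops low enough for hot air to pool at the ceiling, so the build-up/cool-down cycle never starts.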
12
Begbroke Oddity
The temperature behind the rack is very hot (of course), but the RACU was reading a cooler temperature. We noticed cool air leaking up behind the RACU: a sealing strip had become loose and was flapping in the breeze!
13
Standard Hot Aisle / Cold Aisle layout (diagram showing RACU and rack positions)
14
Jacarta PDUs – Monitoring
Currently the target temperature is set to 25 °C, but we intend to raise this shortly, in a controlled way, to monitor the change in PUE.
15
Bus Bars
Power sockets are 32 A Commando, mounted on bus bars; sometimes not as close to the rack as we would like. In principle more can be added, but that needs an electrician.
16
Particle Physics Building
17
Each rack position has two 32 A Commando power sockets and four CAT6 network sockets.
18
Cold Aisle Containment (diagram showing RACU and rack positions)
19
Cold aisle containment
20
Cold aisle containment
Easier to retrofit to an existing traditional computer room. It prevents hot air leaking into the cold aisle, so the temperature remains more consistent throughout the aisle, and the set-point temperature can be raised, which saves money. The area outside the enclosure runs hotter: less comfortable for personnel, but the A/C runs more efficiently with a larger temperature differential. We were hoping to be able to switch off one of the two units, but the current heat load is just too high to allow that. (The savings would be much larger if one unit could be switched off, because the units have fixed fan speeds.) An added benefit is that with one unit off, the cold aisle takes much longer to reach too high a temperature.
21
22
Custom-made enclosure
The seal does not have to be perfect: just enough to provide positive pressure in the cold aisle. Our departmental mechanical workshops constructed the frame, using stainless steel, Perspex and tough plastic flaps.
23
DWB Temperature Monitoring
LogTag data loggers: cost ~£20 each; store 8000 data points (e.g. temperature every ~5 minutes). Approx. 12 units placed around the room; the reader costs ~£50.
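The logger figures above imply a download cycle of roughly a month, which is worth checking:

```python
# Capacity of a LogTag-style logger using the figures from the slide:
# 8000 data points, one temperature reading every ~5 minutes.
points = 8000
interval_min = 5
total_days = points * interval_min / (60 * 24)
print(round(total_days, 1))  # 27.8 days of logging per download cycle
```

So each of the ~12 units needs reading out only about once every four weeks.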
24
PUE
Power usage effectiveness (PUE) is a measure of how efficiently a computer data centre uses its power; specifically, how much of the power is actually used by the computing equipment, in contrast to cooling and other overhead. PUE is the ratio of the total power used by a data centre facility to the power delivered to the computing equipment. PUE was developed by a consortium called The Green Grid, and is the inverse of data centre infrastructure efficiency (DCiE). An ideal PUE is 1.0. Anything that isn't a computing device (e.g. lighting, cooling) falls into the category of facility power usage.
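The two definitions above translate directly into code. The input numbers below are purely illustrative, not measurements from the Oxford rooms:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw, it_equipment_kw):
    """Data centre infrastructure efficiency: inverse of PUE, as a percentage."""
    return 100 * it_equipment_kw / total_facility_kw

# Illustrative example: 90 kW total facility draw, 60 kW reaching the IT kit.
print(pue(90, 60))             # 1.5
print(round(dcie(90, 60), 1))  # 66.7 (%)
```

A PUE of 1.5 means that for every watt delivered to the computing equipment, half a watt goes on cooling and other overhead.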
25
Mechanical Services
RACU = Room Air Conditioning Unit. The chillers are on the roof.
26
Cost saving calculation
Roughly a 10% drop in A/C power consumption, which equates to a drop of ~6.7% of total power (~90 kW). Very roughly, 100 kW costs £100K per year, so the saving is around £6K per year, compared with an installation cost of under £5,000.
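The rough arithmetic above can be reproduced step by step; all figures are the "very rough" ones from the slide:

```python
# Rough cost-saving estimate using the slide's figures.
total_kw = 90                      # total site load, ~90 kW
cost_per_100kw_per_year = 100_000  # £, "very roughly 100 kW costs £100K/year"
saving_fraction = 0.067            # 10% A/C drop ~= 6.7% of total power

kw_saved = total_kw * saving_fraction                       # ~6 kW
saving_per_year = kw_saved / 100 * cost_per_100kw_per_year  # ~£6K
print(round(kw_saved, 1), round(saving_per_year))  # 6.0 kW, £6030/year
```

Against an installation cost of under £5,000, the containment pays for itself in under a year.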
27
References
GridPP http://www.gridpp.ac.uk/
WLCG http://lcg.web.cern.ch/lcg/
OSC http://www.osc.ox.ac.uk/
Oxford Physics http://www2.physics.ox.ac.uk/
Jacarta http://www.jacarta.co.uk/
LogTag http://www.logtagrecorders.com/
PUE http://en.wikipedia.org/wiki/Power_usage_effectiveness
APC White Papers http://www.apc.com/prod_docs/results.cfm?DocType=White%20Paper&Query_Type=10