Presentation transcript: "Future O2 Installation @ LHC P2", U. FUCHS, P. VANDE VYVRE

1 Future O2 Installation @ LHC P2
U. FUCHS, P. VANDE VYVRE

2 ALICE O2 Project
- Data of all interactions shipped from the detector to the online farm in triggerless continuous mode; no event is discarded.
- Data volume reduction by the cluster finder on the FLPs: average factor 2.2 (factor 2.5 for the TPC data).
- Data volume reduction by tracking on the EPNs: average factor 5.5 (factor 8 for the TPC data); all events go to data storage.
- Asynchronous event reconstruction with final calibration, with a delay of a few hours.
- Data flow (HI run): detectors -> 1.1 TByte/s -> FLP -> 500 GByte/s -> EPN -> 90 GByte/s -> Data Storage -> 20 GByte/s -> Tier 0 (Computing Centre Meyrin) -> Tiers 1 and Analysis Facilities.
- Data Storage (DS): 1 year of compressed data; bandwidth: write 90 GB/s, read 90 GB/s; capacity: 60 PB.
- Global computing needs: ~100k CPU cores, ~5000 GPUs and ~500 FPGAs as accelerators, ~60 PB of storage.
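The quoted rates can be cross-checked against the reduction factors; below is a minimal back-of-the-envelope sketch (not part of the original slides), assuming the HI-run input rate and the average factors given above.

    # Back-of-the-envelope check of the O2 data-rate chain (numbers from the slide).
    detector_rate_gbs = 1100.0       # HI run: 1.1 TByte/s from the detectors
    cluster_finder_factor = 2.2      # average reduction on the FLPs
    tracking_factor = 5.5            # average reduction on the EPNs

    after_flp = detector_rate_gbs / cluster_finder_factor   # ~500 GByte/s
    to_storage = after_flp / tracking_factor                # ~90 GByte/s

    print(f"After cluster finder: {after_flp:.0f} GByte/s")
    print(f"Into data storage:    {to_storage:.0f} GByte/s")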

3 O2 Facility @ Pt2
The O2 facility will consist of 3 major parts:
1. Read-Out – First-Level Processors (FLP)
   - Connected to the detector front-end.
   - Will stay in CR1 (the present computing room located in the LHC access shaft in the SX hall, building 2285) and will use the racks, power and cooling upgraded/installed during LS1.
2. Event Processing Nodes (EPN), Quality Control (QC), Data Storage (DS)
   - Connected to the Read-Out via the IT network (fibers).
   - Large installation (~2 MW), needs a new computing room.
   - The present CRs (CR1 and CR2) were built for LEP/L3 and do not provide an adequate environment for the whole O2 facility: lack of rack space, power and cooling, plus a weight issue (the CRs are suspended in the access shaft).
3. Services, Network

4 Computing room CR0
A new computing room ("CR0") is needed:
- Space requirement: min. 2000 U, possible in 45 large racks.
- Power requirement: IT: 115 kW UPS + 1937 kW normal power; total (IT + infrastructure): 240 kW UPS + 2062 kW normal power.
- Cooling requirement: min. 2.5 MW cooling capacity.
- Foreseen placement: immediate vicinity of hall 2285, to re-use as many of the services already present as possible and to reduce cabling costs.
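A rough sketch (my own arithmetic, not from the slide) of what these requirements imply per rack, assuming the 45-rack layout and the IT power figures above:

    # Per-rack figures implied by the CR0 requirements (assuming 45 racks).
    racks = 45
    rack_units_needed = 2000           # min. 2000 U of equipment
    it_power_kw = 115 + 1937           # IT load: UPS + normal power
    total_power_kw = 240 + 2062        # IT + infrastructure
    cooling_capacity_kw = 2500         # min. 2.5 MW cooling

    print(f"Rack units per rack:    {rack_units_needed / racks:.0f} U")   # ~44 U
    print(f"Average IT power/rack:  {it_power_kw / racks:.0f} kW")        # ~46 kW
    print(f"Cooling margin:         {cooling_capacity_kw - total_power_kw} kW")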

5 CR0 Tentative Location

6 CR0 Tentative Location
(site drawing; indicated dimensions: 24 m x 27 m)

7 Power Needs

8 Power Needs Evolution

                                  Present DAQ + HLT systems   Future O2 system
                                  (2015 - 2018)               (2019 - )
Servers (box count)               600                         ~2000
Server max. power consumption     200-400 W                   1.1 kW
Max. # of servers per rack        25                          54
Max. power per rack               15-20 kW                    32-50 kW
Normal power (avail/used) [kW]    540 / 0                     x / 2062
UPS power (avail/used) [kW]       500 / 215                   500 / 453
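The table's power figures can be reproduced approximately from the server counts; a small sketch (my own, treating the quoted wattages as nameplate maxima rather than typical draw):

    # Approximate farm power from server counts (nameplate maxima, not typical draw).
    present_servers, present_max_w = 600, (200, 400)
    future_servers, future_max_w = 2000, 1100

    present_kw = (present_servers * present_max_w[0] / 1000,
                  present_servers * present_max_w[1] / 1000)      # 120-240 kW
    future_kw = future_servers * future_max_w / 1000              # ~2200 kW

    print(f"Present farm, max: {present_kw[0]:.0f}-{present_kw[1]:.0f} kW")
    print(f"Future farm, max:  {future_kw:.0f} kW")
    # 54 servers at 1.1 kW would be ~59 kW nameplate per rack; the quoted 32-50 kW
    # per rack presumably reflects real (non-simultaneous-peak) consumption.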

9 O2 Facility @ Pt2, Power Needs

System     Location   Power UPS    Power Normal   Power UPS    Power Normal
                      CR1 [kW]     CR1 [kW]       CR0 [kW]     CR0 [kW]
FLP        CR1        213          -              -            -
EPN        CR0        -            -              -            1650
QC         CR0        -            -              -            15
DS         CR0        -            -              40           272
Services   CR1+CR0    5            -              15           -
Network    CR1+CR0    10           -              45           -
IT TOTAL              228          0              100          1937
Cooling    CR0        -            -              125          125
G TOTAL               228          0              225          2062
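As a quick consistency check (my own, not on the slide), the per-system rows reproduce the IT and grand totals, assuming the 125 kW cooling load counts towards both the UPS and normal CR0 columns, which is the only way the quoted grand totals add up:

    # Sum the per-system power figures (kW) to reproduce the table totals.
    ups_cr1    = {"FLP": 213, "Services": 5, "Network": 10}
    ups_cr0    = {"DS": 40, "Services": 15, "Network": 45}
    normal_cr0 = {"EPN": 1650, "QC": 15, "DS": 272}
    cooling_cr0_kw = 125    # assumed to apply to both CR0 columns

    print("IT TOTAL, UPS CR1:   ", sum(ups_cr1.values()))                      # 228
    print("IT TOTAL, UPS CR0:   ", sum(ups_cr0.values()))                      # 100
    print("IT TOTAL, normal CR0:", sum(normal_cr0.values()))                   # 1937
    print("G TOTAL, UPS CR0:    ", sum(ups_cr0.values()) + cooling_cr0_kw)     # 225
    print("G TOTAL, normal CR0: ", sum(normal_cr0.values()) + cooling_cr0_kw)  # 2062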

10 O2 Facility @ Pt2, Power Needs
The power needs of the O2 system:
- UPS power (ALICE: 500 kW available for O2):
  - 228 kW needed in the present CR1 (all in place)
  - 225 kW needed outside for CR0 (TBD)
- Normal power:
  - No need for normal power in CR1
  - 2 MW of additional normal power needed outside for CR0 (TBD)
  - Possibility of adding 500 kW? (hardware evolution)
- The installation will start in ~Q2 2018, so the power should be available ~1 year before.

11 Schedule
Q2 2018: start of installation in CR0

12 Computing Room 0 (CR0) Options

13 CR0: building options
Option (I): Container-based
- A foundation (concrete slab) and services (electricity and water) are needed.
- A container provides different noise screening than a building; additional noise screens will probably be needed.
- Containers are fast to deploy (6-8 months including site preparation) and to extend (4 months).
Option (II): A new building
- Civil engineering work for a new building (can be a simple "sandwich wall" structure).
- Electricity distribution in the building and to the racks (e.g. Canalis).
- Engineering of a cooling system, depending on the cooling solution chosen.

14 Examples of container computing rooms

15 Cooling Options

16 Operating temperatures of IT equipment
- IT equipment is quite efficient at exhausting its own heat; the operating temperature that matters is the inlet temperature.
- IT fans generally bottom out at around 24 °C.
- Dell recommended temperature in 2009: 24-27 °C (1).
- Most of today's IT equipment can run at > 30 °C inlet temperature, some even up to 43 °C.
(1) Data Center Operating Temperature: What Does Dell Recommend?, Dell, 2009
(2) Data Center Operating Temperature: The Sweet Spot, Dell, 2011
(3) 2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance, ASHRAE TC9.9, 2011

17 CR0: cooling options
- CR0 shall have a Power Usage Effectiveness (PUE) value better than (i.e. lower than) 1.15.
- The IT power needs are the same for all solutions.
- Several options are being studied, based on cooling technology:
  I. A container-based solution
     1. Chilled-water cooling
     2. Primary-water cooling
     3. Free-air cooling
  II. A new building
     1. Rack cooling with chilled water
     2. Rack cooling with primary water
     3. Free-air cooling
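For reference (not on the slide): PUE is the ratio of total facility power to IT power, so the 1.15 target bounds the cooling and distribution overhead. A minimal sketch using the CR0 IT load quoted earlier:

    # Overhead power allowed by the PUE target (PUE = total facility / IT power).
    it_power_kw = 115 + 1937        # CR0 IT load from the requirements slide
    pue_target = 1.15

    total_power_kw = it_power_kw * pue_target
    overhead_kw = total_power_kw - it_power_kw
    print(f"Total facility power at PUE {pue_target}: {total_power_kw:.0f} kW")
    print(f"Allowed cooling/distribution overhead:   {overhead_kw:.0f} kW")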

18 O2 Facility @ Pt2, Cooling Options
Cooling Option (1): Chilled Water
In general:
- Needs a chilled-water computing room that can accept a load of 2.5 MW.
- Running chillers is not a "Green IT" practice; the PUE will be in the range [1.4 .. 2].
- Needs additional power to be installed for the chillers.
In a building:
- Racks with a water-cooled back door exist up to 32 kW only, e.g. KNURR CoolTherm: 38U usable, 80 cm wide, 130 cm deep (74/84 usable – not deep enough, min. 92 cm needed).
- Not very space efficient; the footprint of the computing room will be big (see the sketch after this slide).
In a container:
- Different rack/back-door or in-row cooling options exist.
- Same constraints (kW/rack, usable U, ...) as for a building.
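The sketch below illustrates the footprint concern (my own estimate, not from the slide): with back-door racks limited to 32 kW, the same IT load needs noticeably more racks than at the ~50 kW/rack assumed for CR0.

    import math

    # Racks needed for the CR0 IT load at different per-rack power limits.
    it_power_kw = 115 + 1937               # CR0 IT load
    for kw_per_rack in (32, 50):
        racks = math.ceil(it_power_kw / kw_per_rack)
        print(f"{kw_per_rack} kW/rack -> {racks} racks")
    # 32 kW/rack -> 65 racks; 50 kW/rack -> 42 racks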

19 O2 Facility @ Pt2, Cooling Options
Cooling Option (2): Primary Water (24 °C, max 28 °C)
In general:
- Can primary water be used directly to cool the equipment? If not, an additional heat exchanger is needed (+3 °C).
- Direct Liquid Cooling (DLC) is excluded due to the type of equipment (no solution for the main heat contributors: disks, GPUs).
In a building:
- No rack-cooling solution found for racks > 32 kW using 22/24/27 °C water.
In a container:
- A solution exists for water at ~22 °C, so additional chillers would be needed to cool down from the primary-water temperature.
- Better than chilled water, but still not Green IT-friendly.

20 O2 Facility @ Pt2, Cooling Options
Cooling Option (3): Free Air Cooling (FAC)
In general:
- State-of-the-art solution, used by many large computing centers (eBay, Google (1), Microsoft, Yahoo (2), ...).
- The O2 power density is probably higher than in those centers.
In a building:
- Needs a special building geometry to optimize the airflow: hot-aisle separation and immediate heat evacuation (roof).
- Challenge: temperature/humidity control.
(1) Google's Chiller-Less Data Center, Data Center Knowledge, July 15, 2009
(2) Yahoo Computing Coop: Shape of Things to Come?, Data Center Knowledge, April 26, 2010

21 O2 Facility @ Pt2, Cooling Options
Cooling Option (3): Free Air Cooling (FAC)
In a container:
- Two major suppliers identified, DELL and HP, following different philosophies:
  - DELL: outside airflow with recycling to control temperature and humidity; chillers can be added but are not an integral part.
  - HP: closed air volume for the IT, cooled by chillers running in free-cooling mode.

22 O2 Facility @ Pt2, Cooling Options
Cooling Option (3): Free Air Cooling (FAC), in a container (cont'd)
- The PUE value depends on the IT installed in the container:
  - The DELL solution can achieve 1.05-1.07.
  - HP runs at 1.15 year-average (due to the chillers).
- Additional cooling is available if it is too hot outside:
  - DELL: evaporative cooling (default), DX chillers if needed.
  - HP: air-air chillers.
- The racks are vendor-neutral vis-à-vis the installed IT equipment.
- A power density of 50 kW/rack is achievable.
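A small sketch (my own, not from the slide) of what the difference between the two vendors' quoted PUE values means in overhead power and energy, assuming the full CR0 IT load runs year-round:

    # Overhead power and annual overhead energy for the quoted container PUE values.
    it_power_kw = 115 + 1937          # CR0 IT load, kW
    hours_per_year = 8760

    for vendor, pue in (("DELL", 1.07), ("HP", 1.15)):
        overhead_kw = it_power_kw * (pue - 1.0)
        print(f"{vendor}: PUE {pue} -> {overhead_kw:.0f} kW overhead, "
              f"{overhead_kw * hours_per_year / 1000:.0f} MWh/year")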

23 Free cooling map: Geneva
Free air in Geneva: <= 27 °C for 85% of the year, <= 40 °C for 100% of the year.
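The quoted percentages can be derived from hourly weather data; a minimal sketch with made-up temperature samples (a real analysis would use a full year of hourly Geneva readings):

    # Fraction of samples with outside air at or below the free-cooling thresholds.
    hourly_temps_c = [2.0, 11.5, 18.0, 24.5, 26.0, 29.0, 33.0, 21.0]  # made-up data

    for threshold_c in (27.0, 40.0):
        fraction = sum(t <= threshold_c for t in hourly_temps_c) / len(hourly_temps_c)
        print(f"<= {threshold_c:.0f} degC for {fraction:.0%} of the samples")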

24 Summary

25 O2 Facility @ Pt2
- Power: the needs are well established.
- Cooling: preference for free-air cooling due to its simplicity and running costs; CERN has a commitment to green technologies. The available solutions for chilled/primary-water cooling would require more racks.
- Computing room: both solutions (container and new building) need to be evaluated; the choice depends on features, price and delivery lead time.

