
1 BayCare Health System Designs Next Generation Data Center
August 8, 2007
Doug Lauterbach, Data Center Director, BayCare Health System
Matt Schuster, Principal Consultant, BT INS

2 In today's presentation…
– About BayCare
– Challenges that necessitated a new data center
– Planning the new facility
– A look at the new data center
– Your questions

3 About BayCare
– 9 hospitals in the Tampa Bay area
– Largest full-service community-based health system in the region
– 17,000 team members
– 2,707 beds
– 355,523 ER visits
– 6.3 million lab tests

4 Challenges that necessitated a new facility

5 Business needs driving IT
– In the midst of a 7-year electronic medical record project (moving toward paperless)
– Constant growth of digital imaging
– Increasing compliance regulations; JCAHO concerns about information management and availability
– Increased clinical reliance on IT systems requires high availability for quality patient care: Pharmacy, Lab, Radiology, Food and Nutrition, Bed tracking, Scheduling

6 Business Drivers of the New Facility
– Best Practices
– Business Continuity
– Business Expansion
– Capacity Planning or Performance Analysis
– Change in Application Architecture, Providers or Vendors
– Compliance
– Disaster Recovery
– Expense Management
– Increased Capacity
– Operational Optimization
– Regulatory
– Reliability
– Risk Management and Mitigation
– Technology Growth or Consolidation
– Technology Refresh

7 Existing facility challenge: space
– Short on space: 8 separate spaces for IT equipment, with the 4,700 sq. ft. data center split into 4 spaces
– 30% of data center space used for circulation
– Large command center occupying a large portion of the raised-floor space
– Building's physical resiliency not up to hurricane standards
– Original design rated at only 40 watts/sq. ft.
– Shallow cabinets limit the ability to expand and distribute power and low-voltage cabling
– Structured cabling unable to support blades, VoIP or anticipated SAN growth

8 Existing facility challenge: power load
– Design didn't support the heat and power load
– Downtime related to power design and capacity: circuit capacity lacking (see the sketch after this list); distribution not dual-bus all the way to the equipment
– Lack of available power threatening to limit new technology adoption
– No standard power configurations
– No means or method to manage or measure power at the equipment racks
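To make "circuit capacity lacking" concrete, here is a minimal branch-circuit sketch. It assumes the conventional practice of limiting continuous loads to 80% of a breaker's rating; the 20 A / 208 V figures are illustrative assumptions, not values from the deck.

```python
# Branch-circuit capacity sketch (illustrative values; only the concern
# is from the slide). Continuous IT loads are conventionally limited
# to 80% of the breaker's nameplate rating.

def usable_watts(breaker_amps: float, volts: float, derate: float = 0.8) -> float:
    """Continuous power available on one branch circuit."""
    return breaker_amps * derate * volts

# A 20 A, 208 V circuit yields roughly 3.3 kW of continuous capacity,
# which a few modern dual-supply servers can exhaust.
print(f"{usable_watts(20, 208):.0f} W")  # -> 3328 W
```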

9 Existing facility challenge: cooling
– Cooling was inefficient: no hot-aisle/cold-aisle design, uneven temperatures through the space, and hot spots creeping up
– Equipment lifespan likely shortened due to heat
– 13-year-old design, not optimized for blade configurations
– Not easily expandable
– Low 12-inch raised floor and under-floor low-voltage cabling restricting air supply
– No available physical space for additional cooling
– Legacy CRACs were not communicating with one another to cool the space efficiently

10 Existing facility challenge: visibility & control
– Monitoring focused on circuits that had already tripped (reactive); not designed for proactive data
– Temperature monitoring handled at the return air of the precision air conditioner (not looking at hotspots)
– Little remote control of servers; no lights-out management (iLO)
– Existing Liebert SiteScan equipment was not managing non-Liebert equipment
– No temperature sensors at equipment locations or around the room
– No branch circuit monitoring
– 100% of lighting on 24x7

11 Existing facility challenge: managing change
– Provisioning is not standardized, automated or managed
– System deployments are based on: looking through racks, personal knowledge, gut feeling, and where uneducated users want placement
– Not managing capacity of: space, circuits, power, weight, heat loads, network switch ports

12 Planning the New Facility

13 Objectives
– Provision for planned growth: the addition of 100 servers per year (see the projection sketch after this list)
– Gain complete visibility and control, and improve administration
– Improve system availability through full redundancy of the power and cooling systems
– Improve fire detection and suppression
– Implement hot-aisle/cold-aisle design for maximum capacity and energy efficiency
– Improve planning and provisioning
– Improve cable management
– Increase mechanical and electrical efficiencies
– Reduce operational costs
– Operate as a nearly lights-out facility
– Remote access into all systems
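The 100-servers-per-year target lends itself to a simple projection. A minimal sketch, assuming a 2U average server and 36 usable U per 44U cabinet (both assumptions; only the 100/year figure and the 44U cabinets appear in the deck):

```python
# Rack-space projection from the stated growth target. Only the
# 100 servers/year figure is from the deck; U sizes are assumptions.

SERVERS_PER_YEAR = 100
U_PER_SERVER = 2            # assumed average form factor
USABLE_U_PER_CABINET = 36   # assumed: 44U minus PDU/KVM/patching space

for year in range(1, 6):
    total_u = year * SERVERS_PER_YEAR * U_PER_SERVER
    cabinets = -(-total_u // USABLE_U_PER_CABINET)  # ceiling division
    print(f"Year {year}: {total_u} U cumulative -> ~{cabinets} cabinets")
```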

14 Building a Team
– Doug Lauterbach, Imaging Services Director (serving as Data Center Project Director), BayCare Health System
– Chris Jenkins, Director of Technology, BayCare Health System
– Matt Schuster, Principal Consultant, BT INS
– Thomas Gooch, Local Liebert Representative, Emerson Network Power
– Paul Carastro, Carastro & Assoc., MEP Engineers
– Mike Montecalvo, General Contractor, Solutions, Inc.
– Paul Schnitzlein, Architect, Harvard Jolly

15 Documenting Existing Equipment
– Document existing devices, power supplies and port counts per device at the rack level
– Determine and graph device averages per rack to establish cabinet profiles
– Each piece of equipment was assigned a number, and quantities and loads were considered
– Classifications included: Legacy, Low, Medium, High, Blade, IBM RISC, Network (see the classification sketch after this list)
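A sketch of the kind of roll-up this slide describes: bucket each device into a load class and tally a per-rack profile. The wattage thresholds and device names are invented for illustration; only the class names come from the slide (the IBM RISC and Network classes, keyed on device type rather than wattage, are omitted for brevity).

```python
# Inventory roll-up sketch: classify devices and tally a rack profile.
# Class names are from the slide; thresholds/devices are assumptions.

from collections import Counter

def classify(watts: float, is_blade: bool = False) -> str:
    if is_blade:
        return "Blade"
    if watts < 200:
        return "Legacy"
    if watts < 400:
        return "Low"
    if watts < 800:
        return "Medium"
    return "High"

rack_inventory = [("web01", 350), ("db01", 900), ("old-app", 150)]
profile = Counter(classify(w) for _, w in rack_inventory)
print(profile)  # Counter({'Low': 1, 'High': 1, 'Legacy': 1})
```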

16 Planning Process: Modeling
– Mapped watts per square foot via CFD several times, adjusting each run to determine placement of the precision air conditioners
– Room designed for 800-1,000 kW; CFD confirms the capacity (see the density sketch after this list)
– 24-inch raised floor, multiple precision air conditioning units
– White space to grow into: a 40 to 50 percent reserve allows for a doubling of capacity
– Rerun CFD every three to six months as changes are required, using actual values collected from thermistors, branch circuit monitoring and other tools
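A back-of-envelope density check behind the CFD runs. The 800-1,000 kW design load is from the slide; the floor area is an assumed placeholder:

```python
# Power-density arithmetic (the kW range is from the deck; the floor
# area is an assumption for illustration).

design_kw = (800, 1000)
raised_floor_sqft = 10_000  # assumed

for kw in design_kw:
    density = kw * 1000 / raised_floor_sqft
    print(f"{kw} kW over {raised_floor_sqft:,} sq ft = {density:.0f} W/sq ft")
# Holding 40-50% white space in reserve means the built-out density
# roughly doubles as that space is populated.
```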

17 Existing facility: Servers per Cabinet

18 Planning Process: Cabinet Profiles

19 Best Practice: Technology Infrastructure
– 44U, 45-inch-deep cabinets to allow for distribution of power and low-voltage cabling
– Metered power strips: every cabinet has at least two strips, one per bus; some cabinets have an additional strip behind a static transfer switch; each strip is color-taped to quickly identify its bus (see the headroom sketch after this list)
– KVM over IP in each cabinet, both centralized and decentralized
– iLO switch in each cabinet ($38/Ethernet port)
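The metered strips make a simple failover rule enforceable: either bus alone must be able to carry the cabinet's whole load. A minimal sketch, with an assumed per-bus capacity:

```python
# Dual-bus headroom check (the rule is implied by the dual-bus design;
# the capacity and readings here are assumptions).

BUS_CAPACITY_W = 3300  # assumed usable capacity of one strip/bus

def failover_safe(bus_a_w: float, bus_b_w: float) -> bool:
    """True if either bus alone could carry the cabinet's total load."""
    return (bus_a_w + bus_b_w) <= BUS_CAPACITY_W

print(failover_safe(1200, 1400))  # True: 2600 W fits on one bus
print(failover_safe(2000, 2000))  # False: 4000 W would overload the survivor
```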

20 Best Practice: Power System Design

21 Best Practice: Dual-Bus EPO Systems

22 Best Practice: Fire Suppression
– Very Early Smoke Detection Apparatus (VESDA)
– Clean agent suppression system: people-friendly, environmentally friendly, and requiring a lower quantity of agent than FM-200; over 3,000 pounds of FE-25 agent
– Four zones
– More than 150 detection points (above and below the floor)
– Sprinklers were required, so a pre-action, double-interlock system was installed

23 Best Practice: Remote Management
– Integrated Lights-Out (iLO) allows remote control of servers from anywhere in the world (see the sketch after this list)
– People-free facility benefits: removes the NOC command center from the raised floor, minimizes disruptions from human error, and allows managers to monitor and control the environment remotely
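As a hedged illustration of lights-out control: iLO-class management processors commonly expose the generic IPMI interface, so a script can query or cycle power remotely. This sketch shells out to the standard ipmitool CLI; the host and credentials are placeholders, and using IPMI (rather than iLO's own interface) is an assumption here, not BayCare's documented tooling.

```python
# Remote power query via IPMI (an assumed stand-in for iLO tooling).
# Requires ipmitool to be installed; host/credentials are placeholders.

import subprocess

def chassis_power_status(host: str, user: str, password: str) -> str:
    out = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host,
         "-U", user, "-P", password, "chassis", "power", "status"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()  # e.g. "Chassis Power is on"

# print(chassis_power_status("10.0.0.42", "admin", "secret"))
```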

24 Best Practice: Proactive Monitoring
– Followed some of ASHRAE's TC9.9 recommendations for monitoring temperature in racks/cabinets (see the sketch after this list)
– Temperature and relative humidity
– Branch circuit monitoring
– SiteScan I/O-32 and additional units
– Leak detection
– VESDA
– Fire suppression and fire detection
– Liebert environmental control systems
– iCOM control network
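A minimal sketch of the proactive check this slide implies: compare rack-inlet readings against an ASHRAE TC9.9-style recommended window rather than watching only CRAC return air. The 68-77 °F band reflects the recommended envelope of that era; the sensor names and readings are invented.

```python
# Rack-inlet temperature check (band per the era's ASHRAE TC9.9
# recommended envelope; sensors and readings are assumptions).

RECOMMENDED_F = (68.0, 77.0)

readings = {"rack-A01-inlet": 72.4, "rack-B07-inlet": 81.2}

for sensor, temp_f in readings.items():
    lo, hi = RECOMMENDED_F
    if not lo <= temp_f <= hi:
        print(f"ALERT {sensor}: {temp_f:.1f} F outside {lo}-{hi} F")
```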

25 Best Practice: Labeling
– Mechanical equipment
– Electrical equipment
– Electrical receptacles
– Grid coordinates
– Racks and cabinets
– Structured cabling

26 Tools: Rackwise/DCM
– Used the tool to document existing inventories
– Tracking relocation from one facility to the other
– Manage asset location by rack unit (see the data-model sketch after this list)
– Provisioning limitations per cabinet
– Reporting capabilities: space and assets; power and cooling; cost and capacity planning; rack assembly with cabling; service and warranty contracts; department and customer information
– Long-term objective: integrate a change/configuration management tool to track and manage assets
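A generic sketch of rack-unit-level asset tracking of the kind the slide attributes to Rackwise/DCM; this is an illustrative data model, not Rackwise's actual schema:

```python
# Minimal asset-by-rack-unit model (illustrative; not Rackwise's schema).

from dataclasses import dataclass
from typing import List

@dataclass
class Asset:
    name: str
    rack: str
    u_position: int  # lowest U occupied
    u_height: int
    watts: float

def rack_utilization(assets: List[Asset], rack: str, rack_units: int = 44) -> float:
    used = sum(a.u_height for a in assets if a.rack == rack)
    return used / rack_units

inventory = [Asset("db01", "A01", 10, 2, 900), Asset("web01", "A01", 12, 1, 350)]
print(f"A01 space used: {rack_utilization(inventory, 'A01'):.0%}")
```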

27 Tools: Rackwise/DCM

28 Tools: Computational Fluid Dynamics Allowed modeling of the complete room, both with the initial 6 CRACs and at the full deployment of 12

29 Tools: Computational Fluid Dynamics
The initial assessment indicates potential problems related to the placement of the precision cooling units.
[CFD plot: low-pressure zone, with CRACs 1-6 labeled]

30 Tools: Computational Fluid Dynamics
The CFD program allows for virtual change, validating the proposed solution prior to installation.
[CFD plot: CRAC moved to a new location; low-pressure zone removed]

31 Tools: Computational Fluid Dynamics
Follow-on modeling provides proof of operational success assuming full load (735 kW).
[CFD plot: temperature clouds per Fahrenheit degree]
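The 735 kW full-load figure invites a quick sanity check against the standard sensible-heat relation for air, CFM ≈ 3.16 × watts / ΔT(°F). The temperature rise and per-CRAC airflow below are assumptions; only the 735 kW comes from the deck.

```python
# Airflow sanity check behind the CFD sizing. Only 735 kW is from the
# slide; the delta-T and per-unit CFM are assumed.

FULL_LOAD_W = 735_000
DELTA_T_F = 20.0    # assumed supply-to-return temperature rise
CRAC_CFM = 12_000   # assumed airflow per precision cooling unit

required_cfm = 3.16 * FULL_LOAD_W / DELTA_T_F
print(f"Required airflow: {required_cfm:,.0f} CFM "
      f"(~{required_cfm / CRAC_CFM:.1f} units at {CRAC_CFM:,} CFM each)")
```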

32 Tools: Computational Fluid Dynamics
Modeling allows for proof of the design concept at full load, prior to implementation.
[CFD plots: pathlines from CRACs; temperature (°F) in 1-foot increments]

33 CRAC Network Communications

34 New Data Center – Construction

36 Current Data Center – Raised Floor

39 New Data Center – Raised Floor

40 Current Data Center – Operations

41 New Data Center – Operations

42 Current Data Center – Generator

43 Current Data Center – Network Rack

44 Current Data Center – Server Cabinets

47 Current Data Center – UPS Room

48 New Data Center – UPS Room

49 Current Data Center – Yard

50 New Data Center – Yard Construction

51 New Data Center – Yard Outside

52 Critical Success Factors
– Teamwork between BayCare, INS and Emerson Network Power/Liebert enabled success
– Making full use of the modeling tools allowed a solid design before the first nail was hammered
– Completed a virtualization study and partially executed its recommendations ahead of time to make the move easier
– Not everything can be converted to dual-corded equipment, so a Liebert STS was installed to provide redundancy
– Space for a second generator is provisioned and an ATS is in place to accommodate expansion
– Move in progress!

53 Questions and Answers

