
Data Centers


Presentation Transcript


  1. Data Centers Chapter 17

  2. Definition • A data center is anywhere you keep machines that are shared resources and to which your customers do not need physical access in the normal course of events. • It is more than just a room that servers live in: it has cooling, humidity control, power, and fire suppression systems. • Building a data center is expensive, and doing it right is even more expensive, so expect company management to balk at the cost and ask for justification.

  3. The Basics • Building a data center may seem fairly easy, but building a good, reliable center that enables SAs to work efficiently is a lot more complicated. • You need good racks, network wiring, and conditioned power for your equipment, plenty of cooling, and fire suppression. • Plan for the room to survive natural disasters. • Organize the room.

  4. 17.1.1 Location • The first thing to do is decide on a location for the data center: pick a town and a building within that town. • Once the building has been chosen, a suitable place within the building must be selected. • Selection of the town and building is typically out of the SA staff's hands. • No matter how immune your site seems, be prepared for someone with a backhoe to accidentally dig up the power and communication lines.

  5. Location • When it comes to selecting the location within the building, the SA team should have some influence. • Provide the facilities department with requirements that will help them select an appropriate location. • Make sure the floor will be strong enough to take the weight of the equipment. • Other factors to consider: if the location is prone to flooding, avoid putting the data center in a basement or even at ground level.

  6. Location cont. • Consider how this affects the location of the support infrastructure for the center, such as UPS systems, the automatic transfer switch (ATS), generators, and cooling systems. • If these support systems have to be shut down, so will the data center. • In an earthquake-prone zone, make sure that racks can withstand a reasonable amount of shaking and that equipment is secured so it will not fall out during an earthquake. • Areas exposed to lightning require lightning protection; architects can offer advice on that.

  7. 17.1.2 Access • Local laws will determine to some degree who has access to your data center. • Consider how equipment will be moved in and out of the location. • Consider extra-wide doors; if you have double doors, make sure they don't have a post in the middle. • Look at the spacing between racks for getting equipment into place. • A raised floor will need either a ramp for wheeling equipment up or elevator access. • Strengthen certain areas of the floor, and the path to them, to support extra-heavy equipment.

  8. 17.1.3 Security • Physical security: only SAs, managers, and the appropriate people from the security and safety departments should have access. • Fire safety wardens (emergency search teams) should be drawn from people who already have access. • Restricting data center access to SAs increases the reliability and availability of the equipment and increases the chance that wiring and rack-mounting standards will be followed.

  9. Security • Access to the data center should not rely on simple key access. • Keys are easy to copy, and it is impossible to audit who has access or to check who entered the room in a given period if there is a problem. • Proximity badges (badges that unlock the door when brought close to the badge reader) work well for data centers. • Some data centers have higher security requirements for legal or insurance reasons; a data center with machines that contain individuals' medical records might require biometric locks in addition to proximity badges.

  10. Security cont. • Data centers that contain banking systems may require at least two people to badge in and out together, leaving no one in the room alone. • Motion detectors can be activated when the room is believed to be empty, based on badge-in, badge-out records. • Biometric locks have raised new ethical issues: is it ethical to install a security system that can be bypassed by cutting off an authorized person's finger or removing an eyeball?
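The badge-record logic above can be illustrated with a small sketch. This is a hypothetical example, not from the chapter: it assumes badge events arrive as (timestamp, person, direction) tuples and shows how occupancy derived from badge-in/badge-out records could drive motion-detector arming and flag a lone occupant under a two-person rule.

```python
# Hypothetical sketch: derive room occupancy from badge-in/badge-out records.
# Events are assumed to be (timestamp, person_id, direction) tuples, direction "in"/"out".
from itertools import groupby

def occupancy_timeline(events):
    """Yield (timestamp, people_inside) after each group of simultaneous badge events."""
    inside = set()
    for ts, group in groupby(sorted(events), key=lambda e: e[0]):
        for _ts, person, direction in group:
            if direction == "in":
                inside.add(person)
            else:
                inside.discard(person)
        yield ts, set(inside)

def room_believed_empty(events):
    """True when the badge records say nobody is left inside (safe to arm motion detectors)."""
    states = list(occupancy_timeline(events))
    return not states or not states[-1][1]

def lone_occupant_moments(events):
    """Timestamps at which exactly one person was left alone, a two-person-rule violation."""
    return [ts for ts, inside in occupancy_timeline(events) if len(inside) == 1]

events = [(1, "alice", "in"), (1, "bob", "in"), (2, "bob", "out")]
print(lone_occupant_moments(events))   # -> [2]: alice is alone after bob leaves
print(room_believed_empty(events))     # -> False
```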

  11. Security • Newer biometric systems also check for life, by looking for a pulse or body heat from the finger or eye as well as scanning the pattern. • Others require a PIN or use voice recognition in addition to a fingerprint or retina scan. • We recommend selecting a biometric system that checks that the person is still alive.

  12. 17.1.4 Power and Air • When deciding how much power and cooling a data center needs, aim to reach capacity on the power systems at the same time you reach capacity on the cooling systems, which should be the same time you run out of space. • Humidity control is a component of the air conditioning. • It is important to regulate humidity in the center: if it is too low, static discharge may damage equipment. • The ideal level is between 45% and 55%.
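To make the idea of balancing power, cooling, and space concrete, here is a back-of-the-envelope calculation. The per-rack figures are assumptions chosen only to show the arithmetic; the watts-to-BTU conversion (1 W ≈ 3.412 BTU/hr) is standard.

```python
# Back-of-the-envelope sizing with assumed figures, only to show the arithmetic
# of balancing power, cooling, and floor space.
WATTS_TO_BTU_PER_HR = 3.412     # 1 watt of IT load becomes about 3.412 BTU/hr of heat

racks = 40                      # assumed floor space: room laid out for 40 racks
avg_kw_per_rack = 4.0           # assumed average electrical load per rack

it_load_kw = racks * avg_kw_per_rack
cooling_btu_per_hr = it_load_kw * 1000 * WATTS_TO_BTU_PER_HR
cooling_tons = cooling_btu_per_hr / 12_000      # 1 ton of cooling = 12,000 BTU/hr

print(f"IT load:        {it_load_kw:.0f} kW")
print(f"Cooling needed: {cooling_btu_per_hr:,.0f} BTU/hr (~{cooling_tons:.0f} tons)")

# If the UPS were sized for 500 kW but the HVAC for only 100 tons (~350 kW of heat),
# cooling would run out long before power or space: the imbalance to avoid.
```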

  13. Power and Air • Plan for these systems to last at least eight to ten years. • Power must be conditioned to protect against spikes, brownouts, and power cuts. • Have at least one UPS that provides sufficient power at a constant voltage to the whole data center. • A normal UPS supplies power from battery banks that it constantly recharges from its inbound power feed. • Generators: data centers have these as backup power if utility power fails.

  14. Power and Air • The generator is connected through an ATS, which turns on the generator a configurable amount of time after the utility power goes out and then switches the power feed for the UPS over to generator power. • When utility power returns to tolerance, the ATS switches the UPS's power feed back to utility power after a configurable amount of time and turns the generator off.
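The ATS sequencing described above can be sketched as a simple control loop. This is a conceptual illustration with made-up delay values and placeholder callables; a real ATS implements this logic in hardware or firmware.

```python
# Conceptual sketch of the ATS sequencing; delays and the callables are placeholders.
import time

GENERATOR_START_DELAY_S = 10    # assumed: how long utility must be out before the generator starts
RETRANSFER_DELAY_S = 300        # assumed: how long utility must be back in tolerance before switching back

def run_ats(utility_ok, start_generator, stop_generator, feed_ups_from):
    """Simplified transfer-switch loop; all four arguments are caller-supplied callables."""
    on_generator = False
    while True:
        if not on_generator and not utility_ok():
            time.sleep(GENERATOR_START_DELAY_S)
            if not utility_ok():                  # utility is still out after the delay
                start_generator()
                feed_ups_from("generator")
                on_generator = True
        elif on_generator and utility_ok():
            time.sleep(RETRANSFER_DELAY_S)
            if utility_ok():                      # utility stayed within tolerance
                feed_ups_from("utility")
                stop_generator()
                on_generator = False
        time.sleep(1)
```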

  15. Power and Air • Always install a switch that allows you to bypass the UPS if it fails. This switch must be external to the UPS and a reasonable distance away from it. • A UPS is full of batteries; if it catches fire, you do not want to go into the room to bypass it. • The UPS will probably come with a bypass switch that is integral to the unit. • Studies show that outages tend to be either extremely short (seconds) or extremely long (half a day or longer). • The majority of power outages last less than 3 seconds.

  16. Power and Air • If an outage lasts more than 10 minutes, it is likely to last the rest of the day; consider sending staff home. • There are two approaches to purchasing a UPS. • One is to purchase a UPS that can last for an hour, which covers the extremely short outages and gives you enough time to power down all the systems if it looks like the outage will last the rest of the day. • Alternatively, you can purchase a UPS that lasts about 15 minutes and combine it with a generator and ATS so that you can survive multihour outages.
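A rough runtime estimate helps compare the two purchasing strategies. The battery capacity, efficiency, and load figures below are assumptions for illustration; real UPS runtime varies nonlinearly with load, so consult the vendor's runtime tables.

```python
# Rough UPS runtime estimate from assumed figures; real runtime curves are nonlinear,
# so check the vendor's runtime-vs-load tables before purchasing.
battery_energy_wh = 40_000      # assumed usable battery energy (40 kWh)
inverter_efficiency = 0.90      # assumed
it_load_w = 30_000              # assumed 30 kW data center load

runtime_minutes = battery_energy_wh * inverter_efficiency / it_load_w * 60
print(f"Approximate runtime: {runtime_minutes:.0f} minutes")   # about 72 minutes
```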

  17. Power and Air • Purchasing a UPS that can last more than an hour without a generator is expensive, and statistically, it is unlikely that you will have an outage of that length. • When purchasing a UPS, consider its maintenance and environmental requirements. • The UPS may need periodic maintenance, with batteries replaced about every 3 years. • It requires cooling and humidity control, which may dictate its location within the building. • Consider whether the UPS can be forced to trickle-charge its batteries, rather than charging them as fast as it can.

  18. Power and Air • When the UPS fast-charges its batteries, it puts a huge load on the rest of the power system, which may bring power down. • Trickle-charging keeps the additional load much smaller. • Generators have to be carefully maintained, tested weekly, and periodically refueled, or they will not work on the few occasions when they are needed, and you will have wasted your money.

  19. Power and Air • The HVAC (heating, ventilation, air-conditioning) system must also be on protected power, or at least have a feed directly from the ATS if it does not need very clean power. • Heat sensors are useful tools for detecting hot spots. • A cheap alternative is to use digital thermometers that record the high and low temperatures and to move them around the room. • Hot spots with no airflow are problematic; they will only get hotter.

  20. Power and Air • When there are known hot spots, the equipment can be redistributed to address the problem, or the HVAC can be altered to provide better airflow to those areas. • HVAC systems often fail silently and sometimes return to service without notice. • HVAC failures cause hardware to fail more quickly, so it is important to notice when the HVAC system fails. • Monitor heat sensors yourself if the HVAC system does not provide a monitoring mechanism that can be plugged into the help desk.
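Because HVAC failures are often silent, a small monitoring loop that watches the heat sensors and files a help-desk alert is worth having. The sketch below is hypothetical: read_temperatures() and open_ticket() stand in for whatever sensor API and ticketing system a site actually uses, and the threshold is an assumed value.

```python
# Hypothetical monitoring loop: poll temperature sensors and open a help-desk ticket
# when a reading crosses a threshold. read_temperatures() and open_ticket() are
# placeholders for whatever sensor API and ticketing system a site actually has.
import time

HIGH_TEMP_C = 30.0          # assumed alert threshold
POLL_INTERVAL_S = 60

def monitor(read_temperatures, open_ticket):
    """read_temperatures() -> {sensor_name: celsius}; open_ticket(msg) files an alert."""
    already_alerted = set()
    while True:
        for sensor, temp_c in read_temperatures().items():
            if temp_c >= HIGH_TEMP_C and sensor not in already_alerted:
                open_ticket(f"Hot spot / possible HVAC failure: {sensor} at {temp_c:.1f} C")
                already_alerted.add(sensor)
            elif temp_c < HIGH_TEMP_C:
                already_alerted.discard(sensor)
        time.sleep(POLL_INTERVAL_S)
```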

  21. Power and Air • Once you have the appropriate amount of conditioned power in the center, you need to distribute it to the racks. • An overhead power bus is a good way to do that, giving you the option of bringing different voltages into each rack in case you have equipment that requires nonstandard power, as some high-end equipment does. • Power outlets should be located away from anything that might drip on them and protected with something to deflect dripping water. • Sites with raised floors must install water sensors under the floor. • The builder should be able to help in locating the low spots where water will accumulate first; sensors should also be placed under the air-conditioning units.

  22. Power and Air • Overhead power provides some flexibility in how much power can be brought into a rack, since some racks may need more than others, but avoid running power cords between racks. • If equipment in one rack takes power from another rack, it may be inadvertently deprived of power by someone working in the next rack who is unaware of the interrack dependency. • Keep everything within the rack as far as possible. • A power distribution unit (PDU) may look like a power strip, but it has internal wiring that connects different sockets to different circuits. • PDUs don't suffer from the overloading that simple power strips can.

  23. Power and Air • Many kinds of PDUs are available, including vertical and horizontal rack-mount options. • In all cases, look at where the power switch is on the PDU and how easy it is to turn off accidentally. • Some PDUs have switches that are protected inside a small box that must be lifted before the switch can be tripped. • HVAC and UPS systems should be able to notify staff in case of failure; it is also a good idea to have a network-attached thermometer that can alert you in case of HVAC failure or high heat.

  24. 17.1.5 Fire Suppression • Have a fire suppression system in your data center, even if local laws do not require one. • Consider the dangers to the people working in the room, the environmental hazards of the system, the damage it might do to equipment that is not on fire, and how well the system deals with electrical fires. • Consider whether to link activation of the fire suppression system with a switch that turns off power in the computer room. • If water is to be used on the equipment, you will need to cut the power to the equipment first. • Such a harsh method of turning off the equipment may cause some hardware fatalities, but not as many as dumping water on live equipment.

  25. Fire Suppression • Find out whether your choice of suppression system will allow other equipment to continue operating. • If not, is there a way to localize the fire suppression to a small set of racks? • Some systems have a preactivation facility that enables on-site staff to check on a small amount of localized smoke before the suppression system activates. • There are also important procedural components to put in place:

  26. Fire Suppression • If your fire suppression system is linked to your operations center, you need to train the operations staff on what to do when there is an alert. • If the people who are on-site 24 hours a day are not computer professionals, train them on the process they should follow in response to a fire alert. • If the suppression system activates, you will be without fire suppression until the system is recharged; if the fire reignites after the system has activated, you may lose the whole building. • Have a procedure that minimizes the chance of the fire reigniting and provides for monitoring it and dealing with it effectively.

  27. 17.1.6 Racks • The selection and layout of racks will influence the amount and type of space you need, so consider them before consulting with the facilities department on your space requirements. • Data centers should have proper racks for equipment. • Otherwise, machines get stacked on top of each other, making it difficult to work on the lower machines without bringing down those on top. • Two-post racks are cheaper than four-post racks, so many sites use them.

  28. Racks • Four-post racks are nicer to work with; two-post racks are often used for networking and telecommunications equipment. • It is easier and safer to mount some of the heavier networking and telecommunications equipment in a four-post rack. • Four-post racks provide more protection against accidental knocks that may loosen or damage cables, and better horizontal cable-management options.

  29. Racks • Most servers are only front-mountable, although some have options for center or rear mounting. • Equipment front-mounted in a two-post rack sticks out in back, and different-depth equipment sticks out different distances, which can be hazardous for people walking behind the racks. • Full-depth shelves for two-post racks are center mounted. • If your site decides to get two-post racks, make sure that you leave lots of space between rows. • The aisle must be wide enough to accommodate the depth of one and a half pieces of equipment plus the width of the aisle where people will walk.

  30. Racks • Height -the height of a rack may affect reliability if the rack is very tall and an SA has to stretch across other equipment to access a machine. -taller racks also may not fit beneath anything attached to the ceiling. -it may not be safe to pull roll-out shelves from the high part of the rack; on the other hand, taller racks use data center floor space more efficiently. • Width -there are a couple of standard rack widths; most equipment fits into 19-inch racks, but telecommunications equipment is usually in NEBS-compliant racks, which are 21 inches between poles. NEBS equipment tends to come in its own rack, so you only need to allocate space for that rack.

  31. Racks • Depth -for four-post racks, several depths are available because there are several machine depths. -have racks deep enough for equipment to fit completely inside, so cables are better protected from accidental knocks. -having machines protrude into the aisles is a safety hazard and may be in contravention of local safety laws. -it also looks neater and more professional to have everything contained within the rack.

  32. Racks • Air circulation- Some racks have fans built in for this purpose. Consider how air will reach the racks; they may require raised perforated floors with air pushed into the racks from below. • Racks with doors- Tom prefers racks with doors so he can institute an "if the door doesn't close, you're not done" policy, which keeps SAs from leaving dangling wires after they make changes.

  33. 17.1.7 Wiring • There are several ways to make it easier for SAs to keep the wiring neat. • Choose a section of your data center that will house only network equipment, and clearly label the top panel of each rack. • Racks should be labeled based on their row and position within the row. Some companies put these labels high on the walls so that they can be seen from anywhere, making racks easy to locate. • Wire each rack's patch panel to a patch panel in your network row that has corresponding labels and is clearly labeled with the rack number.
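A tiny label generator can illustrate the row-and-position scheme and the matching patch-panel labels. The exact format (row letter plus two-digit position, e.g. "B-03") is an assumption for the example, not a standard from the chapter.

```python
# Example label generator for a row-and-position naming scheme; the "B-03" style
# format is an assumption for illustration.
from string import ascii_uppercase

def rack_labels(num_rows, racks_per_row):
    """Yield labels like 'A-01', 'A-02', ..., 'B-01', ... for every rack position."""
    for row in ascii_uppercase[:num_rows]:
        for position in range(1, racks_per_row + 1):
            yield f"{row}-{position:02d}"

def patch_panel_labels(rack_label, num_ports):
    """Patch-panel port labels in the network row that point back to the rack they serve."""
    return [f"{rack_label}/P{port:02d}" for port in range(1, num_ports + 1)]

for rack in rack_labels(num_rows=2, racks_per_row=3):
    print(rack, patch_panel_labels(rack, num_ports=4))
```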

  34. Wiring • Some sites choose to color code their network cables. • Category-3, Category-5, and cables with different wiring (straight-through, crossover) should be different colors. • All network and console wiring for servers should stay within the rack. • Make sure there is adequate cable management within the rack for the intra-rack cabling. • Get cables in a variety of lengths, so you can always find a cable that is the correct length. • A cable should not have so much slack that it leaves a trailing loop, but it should have enough slack that machines can keep functioning when shelves are completely extended.

  35. Wiring • Cables should never run diagonally across the rack; they will get in the way of someone working in the rack later. • Make it easy for people to do the right thing by having a full selection of cable lengths in stock. • If you know that a certain percentage of connections from a rack will go to a certain destination, have all of those connections prewired, live, and ready to go, which reduces entropy in your cabling.
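Picking the right cable from stock comes down to a simple rule of thumb: the shortest stocked length that covers the run plus a little slack for extended shelves. The sketch below illustrates it; the stock lengths and slack allowance are assumed values.

```python
# "Stock many lengths, pick the shortest that fits." Stock lengths and the slack
# allowance are assumed values for the example.
STOCK_LENGTHS_M = [0.5, 1, 2, 3, 5, 7, 10]   # assumed cable lengths kept in stock
SHELF_SLACK_M = 0.5                          # assumed slack so an extended shelf stays connected

def pick_cable(run_length_m):
    """Return the shortest stocked cable covering the run plus slack, or None if out of range."""
    needed = run_length_m + SHELF_SLACK_M
    for length in sorted(STOCK_LENGTHS_M):
        if length >= needed:
            return length
    return None

print(pick_cable(1.2))   # -> 2: enough slack for the shelf, no trailing loop
```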

  36. Wiring • A word of caution: be able to deal gracefully with hardware failures, which may require rapidly moving a lot of connections to a different piece of hardware while you wait for replacement parts. • Power and data cables should be kept separated.

  37. 17.1.8 Labeling • Good labeling is essential to a smoothly running data center. • All equipment should be labeled on both the front and the back with its full name as it appears in the corporate namespace and in the console server system. • Color coding the network cables can help, perhaps using a different color for each security domain. -ex. a firewall that has three network interfaces: one for the internal, protected network; one for the external, unprotected network; and one for a service network that is accessed from untrusted networks through the firewall.

  38. Labeling • In that example, the interfaces should have "int," "ext," and "serv" next to them, and the cables should have labels with corresponding tags attached. • For network equipment connected to WANs, the name of the other end of the connection and the link vendor's identity number for the link should be on the label. • This label should be on the piece of equipment that has the error lights for that link. -ex. a CSU/DSU for a T1 would have a label that reads "T1 to San Diego office" or "512K link to WAN Frame Relay cloud," along with the T1 provider's circuit ID and telephone number.

  39. Labeling • Policy for Enforcing Label Standards -Eircom has a very strict labeling policy. *Servers must be labeled front and back. *Every power cord must be labeled at the far end with the name of the machine it is attached to. *Network cables are color coded rather than labeled. *Periodic sweeps are made to check labels; if any server or power cord is not labeled, it will be removed. *All power plugs must be labeled.
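A periodic sweep like Eircom's can be assisted by a simple audit pass over an inventory list. The record format below is an assumption made up for the example; the point is to report anything that would fail the labeling policy.

```python
# Hypothetical audit pass for a labeling policy like the one above; the inventory
# record format is made up for the example.
def audit_labels(inventory):
    """Return a list of labeling-policy violations found in the inventory records."""
    violations = []
    for item in inventory:
        name = item["name"]
        if not item.get("front_label") or not item.get("back_label"):
            violations.append(f"{name}: missing front and/or back label")
        for cord in item.get("power_cords", []):
            if not cord.get("labeled_far_end"):
                violations.append(f"{name}: power cord to {cord.get('outlet', '?')} unlabeled at far end")
    return violations

example = [{"name": "web01", "front_label": True, "back_label": False,
            "power_cords": [{"outlet": "PDU-A3", "labeled_far_end": True}]}]
print(audit_labels(example))   # -> ['web01: missing front and/or back label']
```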

  40. 17.1.9 Communication • SAs need to communicate with customers or other SAs outside the data center. • They may need someone else to test whether a problem has been fixed, to monitor service availability, or to find information, equipment, or another person. • Many SAs carry radios because they are rarely at their desks. • Radios may not work well in data centers because of RF shielding; some telephone extensions may work better.

  41. 17.1.10 Console Servers • Console servers allow you to maintain console access to all of the equipment in the data center without the overhead of attaching a monitor and keyboard to every system. • Having a lot of "heads" in the center is an inefficient use of the valuable resources of data center floor space and special power, air, and fire suppression systems. • Console servers come in two flavors. • Switch boxes allow you to attach the monitor, keyboard, and mouse ports of many machines through the switch box to a single "head."

  42. Console Servers • The other flavor is console servers that support serial consoles. • The serial port of each machine is connected to a serial device such as a terminal server, and these terminal servers are on the network. • Software on a central server controls them all and makes the consoles of the machines available by name, with authentication and some level of access control.
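The console-by-name idea can be sketched as a mapping from machine name to terminal-server host and TCP port. The hostnames and ports below are hypothetical, and real deployments typically rely on dedicated console-server software that adds authentication, session logging, and access control on top of such a mapping.

```python
# Conceptual console-by-name lookup; hostnames and ports are made up. Real sites
# typically run dedicated console-server software that adds authentication,
# session logging, and access control on top of a mapping like this.
CONSOLES = {
    "db01":  ("ts-row3.example.com", 7001),   # hypothetical terminal server and port
    "web02": ("ts-row3.example.com", 7002),
    "fw01":  ("ts-row1.example.com", 7010),
}

def console_command(machine):
    """Build the command an SA would run to reach a machine's serial console."""
    try:
        host, port = CONSOLES[machine]
    except KeyError:
        raise ValueError(f"no console registered for {machine!r}")
    return f"telnet {host} {port}"

print(console_command("db01"))   # -> telnet ts-row3.example.com 7001
```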

  43. 17.1.11 Workbench • Another key feature is easy access to a workbench. • The workbench should be in a room attached to the data center so that the SAs do not have to go far. • These work spaces generate a lot of dust, so keep the dust outside the data center.

  44. 17.1.12 Tools and Supplies • The data center should be fully stocked with all the different cables, tools, and spares you need. • If an SA notices that the data center is running low on something, or that she is about to use a significant quantity of anything, she should inform the person responsible for tracking spares and supplies so that more can be ordered. • Tools should be kept in a cart with drawers. • A large machine room should have multiple carts; each cart should have screwdrivers of different sizes, a couple of electric screwdrivers, Torx drivers, hex wrenches, chip pullers, needle-nose pliers, wire cutters, knives, static straps, a label maker or two, and anything else you need.
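Tracking supplies against reorder points is easy to sketch. The items, quantities, and thresholds below are placeholders, and notify() stands in for email, chat, or a ticket to whoever orders spares.

```python
# Reorder check for data-center consumables; items, quantities, and thresholds are
# placeholders, and notify() stands in for email, chat, or a ticket.
REORDER_THRESHOLDS = {"cat5-2m": 20, "velcro-ties": 50, "rack-screws": 100}   # assumed minimums

def check_supplies(stock, notify):
    """stock maps item name -> quantity on hand; notify(msg) tells whoever orders spares."""
    for item, minimum in REORDER_THRESHOLDS.items():
        on_hand = stock.get(item, 0)
        if on_hand < minimum:
            notify(f"Low stock: {item} ({on_hand} on hand, reorder point {minimum})")

check_supplies({"cat5-2m": 5, "velcro-ties": 200}, notify=print)
```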

  45. 17.1.13 Parking Spaces • Tools live in carts, and carts have parking spaces where they are returned when no longer in use. • If you have a raised floor, pick a spot for your tile pullers to be stored when not in use. • The chargers for battery-operated tools should have their own secure area. • Mobile items should be labeled with the location to which they should be returned.

  46. 17.2 The Icing • You can improve your data center above and beyond the facilities described so far. • Equipping a data center properly is expensive, and these improvements add substantially to the cost. • But if you are able to, or your business requires it, you can improve your data center by having much wider aisles than necessary and greater redundancy in your power and HVAC systems.

  47. Conclusion • A data center takes a lot of planning to get right, but whatever you build, you will be stuck with for a long time, so it is worth doing right. • A badly designed, underpowered, or undercooled data center can be a source of reliability problems, whereas a well-designed center should see you safely through many problems.
