Practical Data Center Management: A Few Tips for A Cleaner, More Efficient Data Center

September 12

When I made the jump to a career in technology, it was the early 90s and the internet was beginning to take off. I had recently graduated from college and moved from Salt Lake City to Seattle to work for one of the area’s first and largest ISPs. I believe we were the first ISP in the area to secure a T3 line from Sprint, from which we sold fractional T1 services to local businesses.

To give you an idea of the nascent state of things, one of my first jobs at the ISP was to travel around to potential customers and deliver a presentation on the benefits of providing email service to your employees. It was an exciting time, and we had just begun to experiment with offering colocation services to customers who benefited most by being close to the big pipe.

Basically, we were building out mini data centers for our customers, separate from the systems we used to run our own business. The three major areas we had to have in place were power, cooling, and space. All three had to be locked down before we could even begin to offer value-add services on top of basic bandwidth.

So this week, I want to take a look at some of the practical management aspects of data centers. This includes practical cable management, temperature considerations, and finally, a few tips on how to make your data center more energy efficient.

Cable Management Best Practices

Cable management is influenced by the physical placement of your racks. But there are a number of best practices that can help you reduce cable clutter.

  1. Document your existing infrastructure – This is a crucial first step, and one you hopefully already have on hand. If you don’t, this is a good time to document your current topology along with the various types of cables you’re using. Don’t forget to document cable counts and distances too (see the inventory sketch after this list). For a look at how to document equipment for disaster recovery, check out this great ebook.
  2. Use color to identify cables – This provides quick visual identification and saves a lot of time when tracing cables.
  3. Keep patch cables short – Measure twice, cut once. It’s easy to slap a three-foot cable onto a server, but if it only needs a one-foot cable, find a one-foot cable or punch one out. And don’t forget to test it!
  4. Plan for tomorrow – Today’s conduit might be plenty, but if you’re adding servers each month, it may not be enough to handle your cabling needs down the road. Planning for increased capacity is well advised.
  5. Planning is key – It’s been said before, but it’s true. Don’t place a rack in a location that’s going to make getting cables to it a challenge. Once it’s running, it’s going to be difficult to relocate.  By making cabling a priority from the start, you’ll have a path that can guide you as you grow.
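
To make the first step concrete, here’s a minimal sketch of a cable inventory kept as a CSV you can check into version control. The field names, sample records, and color scheme are illustrative assumptions on my part, not a standard; adapt them to whatever your shop already tracks.

```python
# Minimal cable-inventory sketch. Field names and the color scheme are
# illustrative assumptions, not a standard.
import csv

# Each record captures what's worth writing down per steps 1 and 2:
# endpoints, cable type, length, color code, and purpose.
CABLES = [
    ("C-0001", "R1/U40 sw1:gi0/1", "R1/U12 web01:eth0", "Cat6", 1, "blue", "production LAN"),
    ("C-0002", "R1/U40 sw1:gi0/2", "R1/U11 web02:eth0", "Cat6", 1, "blue", "production LAN"),
    ("C-0003", "R1/U41 sw2:gi0/1", "R1/U12 web01:eth1", "Cat6", 1, "yellow", "management"),
]

def write_inventory(path="cable_inventory.csv"):
    """Dump the inventory to CSV so it can live alongside your DR docs."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "from", "to", "type", "length_ft", "color", "purpose"])
        writer.writerows(CABLES)

if __name__ == "__main__":
    write_inventory()
    print(f"Documented {len(CABLES)} cables.")
```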

Temperature Considerations

“What is the correct temperature for a server room?” is a popular question found on many IT related forums. I’ve always assumed that the proper temperature was just below what I could handle without a jacket, or around 60-65F.

But that’s chilly by today’s standards. While you’ll find experts who believe a data center should be kept in the mid-50s, Google is taking the opposite approach and advising data center operators to raise temperatures.

Google uses plastic curtains to keep its servers running cool

“The guidance we give to data center operators is to raise the thermostat,” said Erik Teetzel, an Energy Program Manager at Google. “Many data centers operate at 70 degrees or below. We’d recommend looking at going to 80 degrees.”

Most of today’s data centers operate at temperatures between 68 and 72F. The baseline temperature of a data center is called the set point, and data centers can save about 4% in energy costs for every degree they increase the set point, according to Mark Monroe, who ran Sustainable Computing for Sun Microsystems.
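
To put that 4% figure in perspective, here’s a quick back-of-the-envelope calculation. The annual cost below is a made-up example, and treating the savings as compounding per degree is my own reading of the claim; a simple linear reading lands in the same ballpark for small changes.

```python
# Back-of-the-envelope savings from raising the set point, using the
# oft-quoted 4% per degree. The baseline cost is a hypothetical figure,
# and the compounding model is an assumption.
SAVINGS_PER_DEGREE = 0.04

def cost_after_raise(current_cost, degrees_raised):
    """Estimated annual energy cost after raising the set point."""
    return current_cost * (1 - SAVINGS_PER_DEGREE) ** degrees_raised

baseline = 100_000  # hypothetical annual energy spend, in dollars
for delta in (1, 5, 10):  # e.g., moving from 70F toward Google's 80F
    saved = baseline - cost_after_raise(baseline, delta)
    print(f"Raise set point by {delta}F: ~${saved:,.0f} saved per year")
```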

Proper venting and air flow also factor into the optimal data center temperature, which is one reason it’s difficult to house even a small rack of equipment in a confined space such as a closet.

Energy Efficiency

If you were to ask data center operators what their top priority was, most would put reliability at or near the top. Keeping all that equipment running is a tall task that goes unnoticed until something goes down. But reliability often comes at the expense of efficiency. Here are a few tips on how to improve the energy efficiency of your data center, as collected by Clemens Pfeiffer, CTO of Power Assure Inc.

  1. Set a higher set point – We’ve already discussed this, but raising the set point temperature is probably the quickest and simplest way to save energy and decrease operating costs.
  2. Virtualize servers – It makes sense that fewer physical servers means lower energy consumption. It can also reduce rack space and cabling needs.
  3. Match server capacity to load – During times of low application demand, power consumption can be reduced by up to 50% by matching online capacity to actual load in real time (see the sketch after this list).
  4. Load-balance by following the moon – This works well if you have multiple, strategically located data centers that can shift load based on demand. Power is most abundant and least expensive at night, thus the “follow the moon” reference.
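
As a rough illustration of the capacity-matching tip, here’s a toy calculation of how many servers need to be online for a given load. Every number in it (per-server capacity, headroom, the load figures) is hypothetical; a real setup would pull load from your monitoring system and drive power state through your management tooling.

```python
# Toy sketch of matching online capacity to load. All numbers are hypothetical.
import math

CAPACITY_PER_SERVER = 500  # requests/sec one server can handle (assumed)
HEADROOM = 1.25            # keep 25% spare capacity for spikes
TOTAL_SERVERS = 40

def servers_needed(load_rps):
    """How many servers should be powered on for the given load."""
    needed = math.ceil(load_rps * HEADROOM / CAPACITY_PER_SERVER)
    return min(max(needed, 1), TOTAL_SERVERS)  # always keep at least one up

# Overnight load vs. daytime peak -- the gap is where the ~50% savings live.
for label, load in (("overnight", 3_000), ("daytime peak", 15_000)):
    n = servers_needed(load)
    print(f"{label}: {load} rps -> {n}/{TOTAL_SERVERS} servers online")
```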

Google provides a number of resources about its own data center management techniques that you may find useful, even at a much smaller scale. Notice how color is an important indicator not only for cables but also for pipes and structure.

Photo credit: Everton Pastana via Pixabay