Four tips for keeping your data centre cool this summer

Now that summer is in full swing, it’s natural for IT shops to be concerned about keeping their data centres cool. But with limited budgets, keeping the data centre running efficiently without letting the utility bill balloon can be a challenge.

It turns out there are some simple tools and best practices that IT departments can employ to keep data centre temperatures — and budgets — under control.

Mind the Gaps

First and foremost, say industry professionals, the journey to data centre cooling efficiency starts with an assessment of the physical integrity of the facility. While it seems like an obvious fix, all too often data centres are riddled with gaps that let conditioned air escape and outside air seep in.

“I’ve been in data centres that have folding, garage-like doors with 1-inch gaps under them,” says Joe Capes, business development director for cooling at Schneider Electric, a global provider of energy management products and services.

In addition to leaky windows and doors, another common drain on cooling efficiency is ventilation or ceiling tiles that have been removed and not replaced. “One of the simplest things to do is properly seal the room,” Capes adds, as a means of “preventing any type of penetration of outside air through doorways, windows, and so forth.” Brush strips, blanking panels and even curtains can be deployed to provide what is essentially weather-stripping. Pay particular attention to cable cutouts and the space under computer room air conditioner (CRAC) units.

For data centres in locations where humidity is of particular concern, it’s critical to ensure that the vapor barrier — the plastic or metal coating for the walls, ceiling and floors that keeps moisture at bay — remains intact. Many data centres were designed and built with vapor barriers; over time, as equipment is brought in and moved around, the holes that are punched throughout the facility to accommodate conduits compromise the vapor barrier, thereby allowing humidity in and out.

While a vapor barrier provides some protection against air leakage, that is not the primary benefit of keeping it intact. “Depending on how precisely you want to maintain the humidity in your space and not waste a lot of energy to humidify or dehumidify it, the vapor barrier becomes critical,” says Dave Kelly, director of application engineering for Liebert precision cooling products, a business unit of Emerson Network Power.

Monitor and Measure

It is tough to make efficiency improvements in data centre operations without knowing how equipment is performing and where heat is generated. “In many data centres, there is a lot of legacy equipment that is doing nothing,” asserts Don Beaty, president of DLB Associates, an engineering firm in Eatontown, N.J., that designs and builds data centres. As new equipment is installed, Beaty says, it’s fairly typical for the old equipment to remain in place, still powered on.

This problem has its roots in the IT/facilities disconnect: IT people tend to keep equipment running for fear of disrupting mission-critical operations, while staff on the facilities side focus on energy issues. By installing monitoring equipment on racks and power distribution units (PDUs) to measure load and power consumption, a data centre can identify equipment that is running well below capacity, or even running applications that are no longer needed at all.

“Let’s say five racks that are measured all have utilization down under 50 percent,” Beaty says. “This would increase the possibility of considering virtualizing those five racks, or consolidating in another way to reduce power consumption.”
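For illustration, here is a minimal sketch, in Python, of the kind of check that rack- and PDU-level monitoring enables. The rack names, the 8 kW rated capacity and the power readings are all hypothetical; only the 50 percent threshold comes from Beaty’s example.

# Minimal sketch: flag under-utilized racks as consolidation candidates.
# All names and readings below are illustrative assumptions, not data
# from any real monitoring product.

RATED_KW = 8.0     # assumed rated power capacity per rack, in kW
THRESHOLD = 0.50   # Beaty's example: utilization under 50 percent

# Hypothetical average draw per rack, in kW, as PDU monitoring might report
rack_readings_kw = {
    "rack-01": 2.1,
    "rack-02": 3.4,
    "rack-03": 7.6,
    "rack-04": 1.8,
    "rack-05": 3.9,
}

candidates = {
    rack: kw / RATED_KW
    for rack, kw in rack_readings_kw.items()
    if kw / RATED_KW < THRESHOLD
}

for rack, utilization in sorted(candidates.items()):
    print(f"{rack}: {utilization:.0%} utilized; candidate for consolidation")

Racks that consistently fall below the threshold become exactly the virtualization or consolidation candidates Beaty describes.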

Optimize the Supply Air Temperature

For years, data centres have operated under the premise that the cooler, the better. Today that’s not always the case, even in the summer. Yet because the solar load on the building rises during the summer months, data centre operators still tend to lower the set points on their AC units.

“The reason many do this is that they want to have a supply air temperature of 70 degrees, so as the temperature drifts up towards 75 degrees, they turn the AC up,” Capes says. The most recent guidelines from ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) recommend a supply air temperature of up to 81 degrees Fahrenheit, yet many data centres continue to operate at much cooler temperatures.

“All too often, engineers or facilities operators run their data centres based on the return air temperature of the CRAC unit,” says Capes. And while there might be plausible reasons to do that, “all server manufacturers and ASHRAE care about is the supply air temperature to the inlet of the server.”
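To make the distinction concrete, here is a minimal sketch in Python, assuming hypothetical sensor readings; only the 81-degree figure comes from the ASHRAE guidance cited above.

# Minimal sketch of the point above: what matters is the supply air
# temperature at the server inlet, not the CRAC return air temperature.
# Both sensor readings here are made-up illustrations.

ASHRAE_MAX_INLET_F = 81   # recommended upper bound cited above

inlet_f = 68    # hypothetical supply air reading at a server inlet
return_f = 78   # hypothetical reading at the CRAC return

print(f"CRAC return air: {return_f} F; server inlet air: {inlet_f} F")

# Judge headroom from the inlet reading, not the return reading:
headroom_f = ASHRAE_MAX_INLET_F - inlet_f
if headroom_f > 0:
    print(f"Inlet air is {headroom_f} F below the ASHRAE limit; "
          "the set point could likely be raised.")

A facility watching only the 78-degree return reading might conclude it is running warm, while the servers themselves are being fed air far colder than they need.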

In addition to raising the temperature on your CRAC units, Capes says it may be possible, and certainly simpler, to turn some of them off altogether. A data centre that has 300 kW of cooling installed alongside a 400 kW UPS system running at 25 percent of capacity has three times the cooling it needs. “In some cases, you can put units on standby and have no effect on the environment at all,” Capes says.
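The arithmetic works out as follows; this short sketch simply restates the figures quoted above (the function itself is illustrative).

# Worked version of the arithmetic above, using the quoted figures:
# a 400 kW UPS at 25 percent of capacity implies roughly a 100 kW
# heat load, so 300 kW of installed cooling is three times the need.

def cooling_ratio(installed_cooling_kw, ups_capacity_kw, ups_utilization):
    """Ratio of installed cooling capacity to the actual IT heat load."""
    it_load_kw = ups_capacity_kw * ups_utilization
    return installed_cooling_kw / it_load_kw

ratio = cooling_ratio(installed_cooling_kw=300,
                      ups_capacity_kw=400,
                      ups_utilization=0.25)
print(f"Installed cooling is {ratio:.0f}x the current heat load")  # prints 3x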

General Housekeeping

Maximizing cooling efficiency also requires regular maintenance and perhaps a few interior changes.

If a data centre has a raised floor, make sure the space under the floor is as clear as possible. Often, when IT staff disconnect cables, they drop them below the floor, where they can inhibit air flow to the point that fans have to work extra hard. And to reduce the overall work an AC unit has to do, it makes sense to locate cooling as close to the workloads as possible, whether that means moving perimeter units to the end of a rack row or adding supplemental localized cooling.

Air conditioning units themselves also won’t function efficiently when dirty, so make sure to clean the outdoor heat exchangers and the filters on indoor units. And if a data centre has windows, drawing blinds or installing a room-darkening film can reduce solar load. Lighting typically contributes about 4% of the total heat load of a data centre, Capes says; adopting LED lighting is an easy way to achieve a quick payback.
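As a rough illustration of that payback, consider this back-of-envelope sketch; only the 4% lighting share comes from the article, and every other figure is an assumption chosen for illustration.

# Back-of-envelope LED payback sketch. Only the ~4% lighting share
# comes from the article; every other figure is an assumption.

total_heat_load_kw = 100                 # assumed facility heat load
lighting_kw = 0.04 * total_heat_load_kw  # ~4% of heat load, per Capes
led_savings_fraction = 0.5               # assume LEDs halve lighting power
electricity_rate = 0.12                  # assumed rate, dollars per kWh
retrofit_cost = 5000                     # assumed cost of the retrofit

saved_kw = lighting_kw * led_savings_fraction
annual_savings = saved_kw * 24 * 365 * electricity_rate
print(f"Estimated payback: {retrofit_cost / annual_savings:.1f} years")
# Note: this ignores the extra savings from the cooling load LEDs remove.

Under these assumptions the retrofit pays for itself in a bit under two and a half years, and the real figure would be better still once the reduced cooling load is counted.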

“It is astonishing how many data centres I go into that haven’t adopted some really simple best practices to improve cooling efficiency,” Capes says. By following just a few of these tips now, data centres can achieve noticeable cooling improvements before summer’s end.
