Eight radical ways to cut power costs

Today’s data centre managers are struggling to juggle the business demands of a more competitive marketplace with budget limitations imposed by a soft economy. They seek ways to reduce opex (operating expenses), and one of the fastest-growing (and often biggest) data centre operating expenses is power, consumed largely by servers and the systems that cool them.

Alas, some of the most effective energy-saving techniques require considerable upfront investment, with paybacks measured in years. But some oft-overlooked techniques cost next to nothing — they’re bypassed because they seem impractical or too radical. The eight power savings approaches here have all been tried and tested in actual data centre environments, with demonstrated effectiveness. Some you can put to work immediately with little investment; others may require capital expenditures but offer faster payback than traditional IT capex (capital expenses) ROI.

You won’t find solar, wind, or hydrogen power in the bag of tricks presented here. These alternative energy sources require considerable investment in advanced technologies, which delays cost savings too much for the current financial crisis. By contrast, none of the following eight techniques requires any technology more complex than fans, ducts, and tubing.

The eight methods are:

1. Crank up the heat
2. Power down servers that aren’t in use
3. Use “free” outside-air cooling
4. Use data centre heat to warm office spaces
5. Use SSDs for highly active read-only data sets
6. Use direct current in the data centre
7. Bury heat in the earth
8. Move heat to the sea via pipes

Radical energy savings method 1: Crank up the heat

The simplest path to power savings is one you can implement this afternoon: Turn up the data centre thermostat. Conventional wisdom calls for data centre temperatures of 68 degrees Fahrenheit or below, the logic being that these temperatures extend equipment life and give you more time to react in the event of a cooling system failure.

Experience does show that server component failures, particularly for hard disks, do increase with higher operating temperatures. But in recent years, IT economics crossed an important threshold: Server operating costs now generally exceed acquisition costs. This may make hardware preservation a lower priority than cutting operating costs.
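To get a feel for the stakes, here is a minimal sketch of the savings from a higher setpoint. The roughly 4 per cent chiller-energy reduction per degree Fahrenheit is a commonly cited industry rule of thumb, not a measured figure, and the annual cooling load below is a hypothetical example:

```python
# Rough estimate of cooling savings from raising the data centre setpoint.
# The ~4% chiller-energy reduction per degree Fahrenheit is a commonly cited
# industry rule of thumb; all other figures are hypothetical placeholders.

def cooling_savings(annual_cooling_kwh, degrees_raised, pct_per_degree=0.04):
    """Return estimated kWh saved per year from a higher setpoint."""
    saved_fraction = min(degrees_raised * pct_per_degree, 1.0)
    return annual_cooling_kwh * saved_fraction

# Example: 500,000 kWh/year of cooling, thermostat raised from 68F to 75F
saved_kwh = cooling_savings(500_000, degrees_raised=7)
print(f"Estimated savings: {saved_kwh:,.0f} kWh/year")
```

Substitute your own metered cooling consumption and utility rate to turn the kWh figure into dollars.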

Radical energy savings method 2: Power down servers that aren’t in use

Virtualization has revealed the energy saving advantages of spinning down unused processors, disks, and memory. So why not power off entire servers? Is the increased “business agility” of keeping servers ever ready worth the cost of the excess power they consume? If you can find instances where servers can be powered down, you can achieve the lowest power usage of all — zero — at least for those servers. But you’ll have to counter the objections of naysayers first.

For one, it’s commonly believed that power cycling lowers servers’ life expectancy, due to stress placed on non-field-swappable components such as motherboard capacitors. That turns out to be a myth: In reality, servers are constructed from the same components used in devices that routinely undergo frequent power cycling, such as automobiles and medical equipment. No evidence points to any decreased MTBF (mean time between failures) as a result of the kind of power cycling servers would endure.
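The payoff is easy to estimate before you fight that battle. This back-of-the-envelope sketch uses hypothetical wattage, schedule, and electricity-rate figures; plug in your own measurements:

```python
# Back-of-the-envelope savings from powering servers off during idle windows.
# Wattage, electricity rate, and schedule are hypothetical placeholders;
# substitute measured values for your own environment.

def powerdown_savings(n_servers, idle_watts, off_hours_per_day,
                      rate_per_kwh=0.10, days=365):
    """Annual dollars saved by fully powering down idle servers."""
    kwh_saved = n_servers * idle_watts / 1000 * off_hours_per_day * days
    return kwh_saved * rate_per_kwh

# Example: 40 dev/test servers drawing 250 W each, off 12 hours a night
print(f"${powerdown_savings(40, 250, 12):,.2f} per year")
```

Note that this counts only the servers’ own draw; every watt not dissipated in the rack is also a watt your cooling plant no longer has to remove, so real savings run higher.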

Radical energy savings method 3: Use “free” outside-air cooling

Higher data centre temperatures help you more readily exploit this power-saving technique: so-called free-air cooling, which uses lower outside air temperatures as a cool-air source, bypassing expensive chillers, as Microsoft does in Ireland. If you’re trying to maintain 80 degrees Fahrenheit and the outside air hits 70, you can get all the cooling you need by blowing that air into your data centre.

The effort required to implement this is a bit more laborious than method 1’s expedient cranking up of the thermostat: You must reroute ducts to bring in outside air and install rudimentary safety measures, such as air filters, moisture traps, fire dampers, and temperature sensors, to ensure the great outdoors doesn’t damage sensitive electronic gear.
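The control logic behind those dampers is simple in principle. Here is a minimal decision-rule sketch; the thresholds are hypothetical, and a production economizer controller would also check humidity and air quality before admitting outside air:

```python
# Minimal air-side economizer decision logic, as a sketch. Thresholds are
# hypothetical; a real controller would also check humidity and air quality
# before admitting outside air.

def cooling_mode(outside_temp_f, supply_setpoint_f, margin_f=5.0):
    """Choose between free outside-air cooling and mechanical chilling."""
    if outside_temp_f <= supply_setpoint_f - margin_f:
        return "free-air"   # open dampers, bypass chillers
    return "chiller"        # outside air too warm; close dampers

print(cooling_mode(70, 80))   # free-air
print(cooling_mode(78, 80))   # chiller
```

Notice how the margin interacts with method 1: every degree you raise the setpoint widens the window of hours per year in which free-air mode qualifies.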

Radical energy savings method 4: Use data centre heat to warm office spaces

You can double your energy savings by using data centre BTUs to heat office spaces; put another way, you’ll use relatively cool office air to chill the data centre. In cold climes, you could conceivably get all the heat you need to keep people warm and manage any additional cooling requirements with pure outside air.

Unlike free-air cooling, which helps only part of the year, this approach may let you retire your existing heating system entirely: by definition, when it’s warm out you won’t require a people-space furnace. And forget worries about chemical contamination from fumes emanating from server room electronics. Modern Restriction of Hazardous Substances (RoHS)-compliant servers have eliminated environmentally unfriendly contaminants, such as cadmium, lead, mercury, and polybrominated flame retardants, from their construction.

Radical energy savings method 5: Use SSDs for highly active read-only data sets

SSDs have been popular in netbooks, tablets, and laptops due to their speedy access times, low power consumption, and very low heat emissions. They’re used in servers, too, but until recently their costs and reliability have been a barrier to adoption. Fortunately, SSDs have dropped in price considerably in the last two years, making them candidates for quick energy savings in the data centre — provided you use them for the right application. When employed correctly, SSDs can knock a fair chunk off the price of powering and cooling disk arrays, with 50 per cent lower electrical consumption and near-zero heat output.
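The article’s roughly 50 per cent figure is easy to sanity-check for your own arrays. The per-drive wattages below are illustrative round numbers chosen to match that claim, not vendor specifications:

```python
# Comparing array power draw for an HDD tier versus an SSD replacement.
# Per-device wattages are illustrative round numbers, not vendor specs.

def array_watts(n_drives, watts_per_drive):
    """Total electrical draw of a drive shelf, ignoring controller overhead."""
    return n_drives * watts_per_drive

hdd = array_watts(24, 6.0)   # a 24-drive HDD shelf at ~6 W per active drive
ssd = array_watts(24, 3.0)   # the same tier on SSDs at ~3 W per drive

reduction = (hdd - ssd) / hdd
print(f"HDD: {hdd} W, SSD: {ssd} W, reduction: {reduction:.0%}")
```

As with powered-down servers, the watts an SSD array doesn’t draw are also watts the cooling plant doesn’t have to remove, so the combined saving exceeds the raw electrical difference.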

Radical energy savings method 6: Use direct current in the data centre

Yes, direct current is back. This seemingly fickle power distribution scheme enjoys periodic resurgences as electrical technologies ebb and flow. The lure is a simple one: Servers use direct current internally, so feeding DC power to them directly should reap savings by eliminating the AC-to-DC conversion performed by a server’s internal power supply.
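The arithmetic behind the lure is just multiplied conversion efficiencies. This sketch compares a conventional AC chain (a double-conversion UPS followed by the server’s internal supply) with DC distribution through a single rectifier stage; the efficiency percentages are hypothetical round numbers, not measured values:

```python
# Illustrative comparison of conversion losses: a conventional AC chain
# (double-conversion UPS, then the server's internal AC-to-DC supply)
# versus DC distribution with one conversion stage. Efficiency figures
# are hypothetical round numbers, not measured values.

def delivered_fraction(*stage_efficiencies):
    """Fraction of input power that survives a chain of conversion stages."""
    result = 1.0
    for eff in stage_efficiencies:
        result *= eff
    return result

ac_chain = delivered_fraction(0.92, 0.90)  # UPS at 92%, server PSU at 90%
dc_chain = delivered_fraction(0.95)        # single rectifier at 95%

print(f"AC chain delivers {ac_chain:.1%}, DC chain delivers {dc_chain:.1%}")
```

Every percentage point recovered is paid twice, since the lost power becomes heat the cooling plant must also remove.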

Radical energy savings method 7: Bury heat in the earth

In warmer regions, free cooling may not be practical all year long. Iowa, for example, has moderate winters but blistering summers, with air temperatures in the 90- to 100-degree Fahrenheit range, unsuitable for air-side economization.

But dig down a few feet and the ground often holds steady, relatively low temperatures. The subsurface earth is also less affected by outdoor weather conditions, such as rain or heat, that can overload traditional equipment. By running pipes into the earth, you can circulate water carrying server-generated heat to depths where the surrounding ground draws the heat away by conduction.
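Sizing such a loop starts from the basic heat-transport relation Q = ṁ · c_p · ΔT. This sketch computes the water flow needed to carry a given server heat load; the 50 kW load and 10 K loop temperature drop are hypothetical example values:

```python
# How much water flow must a ground loop move to carry a given server heat
# load? Uses Q = m_dot * c_p * delta_T. The heat load and temperature drop
# below are hypothetical example values.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), c_p for liquid water

def required_flow_kg_s(heat_load_w, delta_t_k):
    """Mass flow of water needed to carry heat_load_w at a delta_t_k drop."""
    return heat_load_w / (WATER_SPECIFIC_HEAT * delta_t_k)

# Example: reject 50 kW of server heat with a 10 K loop temperature drop
flow = required_flow_kg_s(50_000, 10)
print(f"{flow:.2f} kg/s of water")
```

Since a kilogram of water is about a litre, the answer translates directly into pump sizing, the only powered component this technique requires.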

Radical energy savings method 8: Move heat to the sea via pipes

Unlike geothermal heat sinks, the ocean is effectively an infinite heat sink for data centre purposes. The trick is being near one, but that is more likely than you might think: Any sufficiently large body of water, such as the Great Lakes between the United States and Canada, can serve as a coolant reservoir.

The ultimate seawater cooling scenario is a data centre island, which could use the surrounding ocean to cool the facility via sea-to-freshwater heat exchangers.

How much money do you want to save?

The value of these techniques is that none is mutually exclusive: You can mix and match cost-saving measures to meet your short-term budget and long-term objectives. You can start with the simple expedient of raising data centre temperatures, then assess the value of other techniques in light of the savings you achieve with that first step.

