Sunday, October 11, 2009

Seeking Energy Savings at the Heart of the Internet

Digital-era icons like Google and Twitter have made life more efficient — and fun. But they also guzzle vast amounts of energy.

Scattered around the world are scores of data centers that sift through the endless streams of information that keep the Internet and office computers running. In the United States alone, those data centers accounted for 1.5 percent of the country’s electricity use in 2006 — more than the entire state of Massachusetts. And their power use could nearly double over five years, according to government reports.

Experts say that data centers present an obvious opportunity to improve efficiency.

“It’s becoming a big deal,” said Dale Sartor, an energy efficiency expert at Lawrence Berkeley National Laboratory near San Francisco. He noted that in some cases, the energy costs of a server over its useful life of three or four years exceeded the initial cost of the server itself.

Some of the largest opportunities lie in the way data centers are kept cool. The buildings — many of which are enormous — must typically be kept below 80 degrees Fahrenheit (26.7 Celsius), so that the chips work at maximum efficiency. And that requires a great deal of energy.

The cooling equipment alone can consume 25 percent of the power that goes into a data center, said Christian Belady, an efficiency specialist at Microsoft. “So if there’s anything we can do to eliminate that, right there we use 25 percent less power.”

Companies are innovating in this area, not least by using a tool that is ancient and free: the weather. Last month, Microsoft opened a data storage center in Dublin, which it said would take advantage of the Irish chill to achieve greater efficiencies. The system brings in air via large, high-up ducts that are controlled by valves, so it works somewhat like an attic fan, Mr. Belady said.

Nonetheless, he said, the company has backup systems in case the temperature spikes or the air is smoky.
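
As a rough illustration of the free-cooling approach Mr. Belady describes, the decision comes down to a simple rule: use outside air when the weather permits, and fall back to mechanical cooling when it does not. The sketch below is illustrative only; the thresholds are assumptions, not Microsoft's actual control logic.

```python
# Illustrative sketch of an air-side economizer decision.
# Thresholds are assumed for illustration, not taken from any real facility.

MAX_OUTSIDE_TEMP_F = 80.0   # the article cites roughly 80 F as a typical ceiling
MAX_SMOKE_PPM = 5.0         # assumed air-quality limit

def choose_cooling_mode(outside_temp_f: float, smoke_ppm: float) -> str:
    """Open the outside-air ducts when conditions allow; otherwise use backup cooling."""
    if outside_temp_f <= MAX_OUTSIDE_TEMP_F and smoke_ppm <= MAX_SMOKE_PPM:
        return "outside-air"   # valves open, working somewhat like an attic fan
    return "mechanical"        # backup systems for temperature spikes or smoky air

print(choose_cooling_mode(62.0, 0.5))   # -> outside-air
print(choose_cooling_mode(88.0, 0.5))   # -> mechanical
```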

Other Internet giants are making similar moves. In June, Yahoo announced that it would locate a data center in Buffalo, New York, to take advantage of the “micro-climate” to cool the servers entirely with outside air. And Google has a data center in Belgium where, according to Niki Fenwick, a spokeswoman, “the local climate allows us to efficiently cool the data center without needing to use electricity to power chillers.”

She noted, however, that “not all Google data centers can be located in cold climates, because we want our tools to be as fast as possible.” (In other words, the transmission of data can slow down over long distances.)

A number of companies, including Microsoft, Yahoo and Deutsche Telekom’s T-Systems, are also locating their data centers near hydroelectric plants, allowing them to play up the virtues of renewable power (though hydropower is often less expensive than conventional power, at least in the United States, so there is a bottom-line reason too).

Traditionally, many data centers have been designed “like a vault,” according to Andres Carvallo, the chief information officer for Austin Energy, a utility in the heart of Texas’s high-tech “Silicon Hills” that runs a rebate program to encourage companies to buy more efficient data center equipment. In other words, he explained, they had no access to the outside air.

That is changing. “There’s certainly a renaissance around designing a data center,” Mr. Carvallo said.

Companies are indeed innovating. In Uitikon, Switzerland, I.B.M. is using the waste heat from a data center to keep a swimming pool warm.

Mr. Belady of Microsoft said that his company was pushing its suppliers to build servers that could work in higher temperatures — up to 95 degrees Fahrenheit (35 Celsius) — allowing Microsoft to build systems that use the outside air closer to the Equator.

Mr. Belady also emphasized the importance of pushing companies to measure the effectiveness of their power or energy usage, so that they could understand how much power or energy actually makes it to the number-crunching equipment, rather than going toward cooling or other auxiliary uses. Today, only about 10 percent of data center operators make such measurements, he estimated.
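
The article does not name the metric, but the best-known version of that measurement is power usage effectiveness (PUE), promoted by the Green Grid consortium mentioned below: total facility power divided by the power that reaches the IT equipment. A minimal sketch, with made-up numbers:

```python
# Power usage effectiveness (PUE): total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to the number-crunching gear.
# The figures below are illustrative, not measurements from any real facility.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

it_load_kw = 1000.0                  # servers, storage, network
cooling_and_overhead_kw = 600.0      # chillers, fans, power-distribution losses
print(pue(it_load_kw + cooling_and_overhead_kw, it_load_kw))   # -> 1.6
```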

There is also innovation surrounding the management of the power supply to the chips, which goes through a number of transformations, said Mr. Sartor of Lawrence Berkeley National Laboratory. For example, he said, uninterruptible power supplies can often be bypassed, thus avoiding the losses associated with converting power from alternating current to direct current and back again. In this regard, “Europeans, like so many areas of efficiency, are typically ahead” of the United States, he said.
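
To see why the double conversion matters, multiply the efficiencies of the two stages. The per-stage figures below are assumptions for illustration; real equipment varies by design and load.

```python
# Illustrative only: assumed per-stage efficiencies for a double-conversion UPS path.
rectifier_eff = 0.94   # AC -> DC (assumed)
inverter_eff = 0.94    # DC -> AC (assumed)

double_conversion_eff = rectifier_eff * inverter_eff
print(f"Through the UPS: {double_conversion_eff:.1%} of the power survives")        # ~88.4%
print(f"Bypassing it avoids roughly {1 - double_conversion_eff:.1%} in losses")     # ~11.6%
```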

Meanwhile, the need for more computations continues to grow. Mr. Sartor cited an example in his backyard: Whereas earlier this decade a supercomputer at Lawrence Berkeley National Laboratory used a few hundred kilowatts of power, its needs are projected to grow to 17 megawatts over the coming years.

“We’re talking about tens of millions of dollars to power our new supercomputer facility, and that starts catching management’s concern,” he said.

“We are dramatically improving the efficiency of computation. The situation is that our appetite for computation is going up way faster than the efficiency is going up.”
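
The rough arithmetic behind those cost figures is straightforward. The electricity rate below is an assumption for illustration; the article does not give one.

```python
# Rough annual energy cost of a 17-megawatt facility, at an assumed
# illustrative rate of $0.07 per kilowatt-hour (not a figure from the article).
power_mw = 17.0
hours_per_year = 24 * 365
rate_per_kwh = 0.07

annual_kwh = power_mw * 1000 * hours_per_year
annual_cost = annual_kwh * rate_per_kwh
print(f"${annual_cost / 1e6:.1f} million per year")   # roughly $10 million per year
```

Over just a few years of operation, that reaches the tens of millions of dollars Mr. Sartor mentions.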

One piece of good news is that cooperation has increased in recent years among companies eager to tackle the data center efficiency problem. A number of cross-company consortiums, like the Green Grid, have sprung up (a symposium is being held this week in Silicon Valley to discuss data center efficiency, with participation from several large multinational companies).

“Everybody recognizes that we have to drive efficiency as an industry, not just as individuals,” said Mr. Belady of Microsoft.

Source: The New York Times, October 11, 2009


