[Beowulf] running hot?

Eugen Leitl eugen at leitl.org
Thu Mar 19 10:36:08 PDT 2009

On Thu, Mar 19, 2009 at 12:34:39PM -0400, Mark Hahn wrote:
> are you running your machinerooms warm to save power on cooling?

Speaking about warm: http://www.datacenterknowledge.com/archives/2009/03/19/rackable-cloudrack-turns-up-the-heat/ 

Rackable CloudRack Turns Up The Heat

March 19th, 2009 : Rich Miller

The server trays in the CloudRack C2 enclosure from Rackable have no on-board fans or power supplies.

Are you ready for the 100-degree data center? Rackable Systems has introduced a new version of its CloudRack enclosure that it says can operate in environments as hot as 104 degrees, offering customers the option of saving energy costs by raising the temperature in their data center. The new CloudRack C2 is Rackable’s latest effort to combine higher density and lower power usage by shifting components out of the server tray and into the enclosure.

The C2 introduces cabinet-level power distribution technology, using rectifiers to convert AC power to 12V DC power. This innovation, combined with the cabinet-level fans introduced in the initial CloudRack, means that the server trays contain no fans or power supplies. Rackable says the CloudRack fans and rectifiers provide N+1 redundancy.

Rackable says the design innovations will allow data center operators to safely run server-packed CloudRacks at temperatures up to 40 degrees C, or 104 degrees Fahrenheit. Most data centers operate in a temperature range between 68 and 74 degrees, and some are as cold as 55 degrees.

“The CloudRack C2 is a landmark achievement,” said Mark Barrenechea, president and CEO of Rackable Systems (RACK). “Most notably, it solves the problem of stranded power. Data centers can now also reduce power consumption by simply turning up the thermostat while using CloudRack C2. It is the most energy-efficient and thermally-intelligent cabinet technology Rackable has ever offered.”

The first CloudRack design introduced last fall featured two to four large fans in the rear of the enclosure. The C2 goes with a denser configuration of 18 smaller fans in the rear of the 23U half-rack, with 42 fans cooling the 46U full rack. Rackable says this can support up to 1,280 cores per cabinet using the company’s MicroSlice servers.

Raising the baseline temperature inside the data center (known as the set point) can save money spent on air conditioning. Data center managers can save 4 percent in energy costs for every degree the set point is raised.
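The 4-percent-per-degree rule of thumb is easy to turn into a back-of-the-envelope estimate. The sketch below assumes the savings compound per degree and uses an illustrative baseline cooling bill; neither the compounding model nor the dollar figure comes from the article.

```python
# Rough estimate of annual cooling-cost savings from raising the set point,
# applying the article's ~4%-per-degree rule of thumb multiplicatively.
# Baseline cost and the compounding assumption are illustrative only.

def cooling_savings(baseline_cost, degrees_raised, pct_per_degree=0.04):
    """Return (new_cost, savings) if each degree cuts cost by pct_per_degree."""
    new_cost = baseline_cost * (1 - pct_per_degree) ** degrees_raised
    return new_cost, baseline_cost - new_cost

# Example: nudging a 70 F room up to 80 F against a $100,000/yr cooling bill.
new_cost, saved = cooling_savings(100_000.0, 10)
print(f"new cost: ${new_cost:,.0f}, saved: ${saved:,.0f}")
```

Whether the rule holds linearly or compounds over a 10-degree swing is exactly the kind of thing the monitoring discussed below should verify before anyone banks the savings.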

Google and Intel have encouraged data center engineers to consider raising their set point as a way to improve energy efficiency, while HP and Sun Microsystems have made higher temperatures a focus of their data center efficiency services.

In January the American Society for Heating, Refrigerating and Air-conditioning Engineers (ASHRAE) expanded its recommendations for ambient data center temperatures, raising its recommended upper limit from 77 degrees to 80.6 degrees.

Some data center managers warn that running equipment near the high end of the manufacturers’ suggested temperature range could void warranties with equipment vendors. Another major concern is what happens in the event of a cooling failure, when a lower set point could buy a few additional minutes of recovery time before the room heat reaches unacceptable levels.

Running your data center warmer also raises the potential for “hot spots” to form in areas where cooling airflow doesn’t reach an entire rack. That’s why it’s a good idea to implement advanced monitoring of rack temperatures and data center airflow before nudging the set point higher. But the focus on temperature and energy efficiency is unlikely to abate.

“Energy has become a central design point for the data center,” said Jed Scaramella, senior research analyst, Datacenters, IDC. “The density, power and thermal efficiencies Rackable achieves with CloudRack C2 enable customers to drive meaningful performance gains, while at the same time helping to reduce overall data center operating expenses.”

Eugen* Leitl leitl http://leitl.org
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
