IT, Facilities Remain Strangers in the Night

Submitted by Esther Schindler, who writes: "Despite best efforts, the IT and facilities departments still don't talk, leading to technology conflicts in the data center and unreasonable electric bills. Andy Patrizio asks: how much is a lack of communication costing your business?"

This isn't just a "can't we all get along?" story. Patrizio also interviews eBay to learn what the company does differently from most, such as questioning the assumptions about data center cooling:

How did eBay do it? For starters, the company didn't insist on turning its server containers into meat lockers. The conventional wisdom has been that a data center must be cold enough to store meat, even though Intel rates its CPUs at max temperatures of around 150 degrees Fahrenheit.

"We baby these systems too much," says Nelson. "The perception is it has to be cold. And who does that? IT thinks it can't be hot in here. There are two variables you need to design to: the surface temperature of the chip and the outside maximum worst case temperature in the environment of where you are."

So instead of keeping the place frigid, eBay uses outside air and unchilled water. To a CPU running at 150 degrees Fahrenheit, 87-degree water is downright cold, even if that's a typical pool temperature in summer.
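The arithmetic behind that claim is easy to check. Here's a minimal sketch using only the temperatures quoted in the article (the Fahrenheit-to-Celsius conversion is the standard formula; the variable names are mine):

```python
def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

cpu_max_f = 150   # Intel's rated max CPU temperature, per the article
water_f = 87      # unchilled cooling-water temperature, per the article

# Headroom between the coolant and the chip's rated limit
headroom_f = cpu_max_f - water_f

print(f"CPU max:  {f_to_c(cpu_max_f):.1f} C")
print(f"Water:    {f_to_c(water_f):.1f} C")
print(f"Headroom: {headroom_f} F ({headroom_f * 5 / 9:.1f} C)")
```

Even "warm" 87-degree water still sits some 63 degrees Fahrenheit (about 35 degrees Celsius) below the chip's rated limit, which is why chilling it further buys little beyond a bigger electric bill.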

eBay also has a different way of buying servers (it has 1,920 of them in its Phoenix data center alone). What startled me most, though, is that the losing bidders on an RFP are told why: eBay went back to each losing vendor and explained why it lost:

One vendor put 36 engineers on the phone for a lesson in their hardware's inefficiency. They went back and tuned their servers: low-voltage DIMMs, a different heat sink, changed firmware settings, and other tweaks. Four weeks later, they won the next bid.


Link to Original Source