A week after it was shown to a group of industry engineers during a company event in Mountain View, Google has publicly released a video detailing its once top-secret data center design.
As has been rumored for years - and as The Reg confirmed this fall - Google pieces together its data centers using intermodal shipping containers pre-packed with servers and cooling equipment. The video - posted to YouTube - provides a six-minute tour of the company's inaugural containerized data center, which went live sometime in 2005. Google calls it, well, Data Center A.
It's unclear where the facility is located. But presumably, it's one of three data centers Google has built in The Dalles, Oregon. According to Google, it holds 45 shipping containers, and each of these data center pods houses around 1,160 servers. The entire facility has a power capacity of 10 megawatts and a trailing twelve-month average Power Usage Effectiveness (PUE) value of 1.25.
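Taken together, those figures allow some rough back-of-the-envelope arithmetic. The sketch below assumes the 10 megawatts is total facility draw (the video doesn't say), and uses the standard PUE definition of total facility power divided by IT equipment power:

```python
# Back-of-the-envelope figures for Data Center A.
# Assumption: the 10 MW capacity is total facility power, not IT load.
total_power_mw = 10.0
pue = 1.25  # PUE = total facility power / IT equipment power

it_power_mw = total_power_mw / pue          # power actually reaching the servers
overhead_mw = total_power_mw - it_power_mw  # cooling, power distribution, etc.

servers = 45 * 1160  # 45 containers, ~1,160 servers each
watts_per_server = it_power_mw * 1e6 / servers

print(f"IT power: {it_power_mw:.1f} MW, overhead: {overhead_mw:.1f} MW")
print(f"roughly {watts_per_server:.0f} W per server")
```

On those assumptions, about 8 MW reaches the computing gear - around 150 watts per server, with the remaining 2 MW lost to cooling and distribution.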
The video tour begins in the outdoor equipment yard, which houses the cooling towers, the power-distribution centers, and the generator farm. Google says its transformer exhibits a better than 99.5 per cent efficiency, and boasts about the relatively small size of its cooling radiators.
After a shot of the generator farm, the camera follows the low-voltage cables that carry power from the distribution center into an indoor pod hanger, where they eventually connect to switch panels for each container. The video then jump-cuts to the facility's cooling plant, where a portion of the water streaming from the cooling towers is redirected to a sidestream filtration system.
Google admits to its fair share of leaks in the cooling plants. Apparently, the pipes aren't sealed as tightly as the mouths of company employees.
Because it uses the most power, Google calls the water chiller the company's "biggest target." An important part of reducing chiller hours, the company says, is the use of plate and frame heat exchangers, which exhibit low-approach temperature characteristics.
Pipes then carry the chilled water from the cooling plant into Google's container hanger. Fifteen containers are lined up on one side of the hanger, and thirty more are slotted into two stories on the opposite side.
Google's idea of nightlife
As the video shows a technician riding a foot scooter towards one of the containers, the company refers to the scooter - in recitative monotone - as a "Google-provided personal transportation device." Naturally, the Chocolate Factory would never force its Oompa Loompas to pay for their own scooters. After entering the container, the technician is seen swapping out one of Google's previously top-secret servers, which includes its own battery-powered UPS. Again, Google claims 99.5 per cent efficiency.
The aisle inside the container is cooled to around 27 degrees Celsius (81 degrees Fahrenheit). According to Sun Microsystems, which has built its own data center containers, a data center can save four per cent in energy costs for every extra degree (Fahrenheit) on the thermostat. If a data center can run at a higher temperature, it can use less power and spend less money on cooling equipment.
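Reading Sun's rule of thumb literally - a flat four per cent per degree - the gap between a typical setpoint and Google's cold aisle adds up quickly. A hypothetical sketch, with an invented baseline cooling bill for illustration:

```python
# Illustrative sketch of Sun's rule of thumb: ~4% energy-cost savings
# per extra degree Fahrenheit, applied linearly.
baseline_temp_f = 70   # typical data center setpoint
google_temp_f = 81     # Google's reported cold-aisle temperature
annual_cooling_cost = 1_000_000  # hypothetical baseline, in dollars

degrees_raised = google_temp_f - baseline_temp_f
savings = annual_cooling_cost * 0.04 * degrees_raised

print(f"Raising the thermostat {degrees_raised}F saves ${savings:,.0f} per year")
```

Eleven extra degrees works out to a 44 per cent cut in cooling costs on that reading - which is presumably why Google is willing to put up with the warm aisles.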
Most data centers tend to operate between 68 and 70 degrees Fahrenheit, according to the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE). But following the lead of operations like Google, ASHRAE recently raised its recommended data center temperature range to between 68 and 77 degrees Fahrenheit (20 to 25 degrees Celsius).
In the video, Google says the high cold-aisle temperatures are "enabled by attention to good airflow management." But according to one former employee speaking with The Reg, Google has insisted on an Intel guarantee that the processors it purchases from the chip giant will operate at temperatures five degrees centigrade higher than their standard qualification.
Before showing off the container's underfloor, you're treated to another ill-advised attempt at Google humor. The lights turn off in the container, and our monotone narrator calls it "Google's idea of nightlife."
The video is otherwise short on detail. And it's worth remembering that this was Google's first containerized data center. In all likelihood, its newer data centers - Google runs at least 36 sites across the globe, and several more are under construction - are even more efficient than its 2005 model. Even so, the specs for Data Center A match today's cutting-edge designs at other operations.
We asked Google to discuss its data center designs, but it declined.
Brewster Kahle and the Internet Archive publicly pitched the containerized data center idea in the fall of 2003, and that December, Google filed for a patent describing a data center pod of its own. According to Kahle and a 2005 exposé from Robert X. Cringely, Google co-founder Larry Page witnessed one Internet Archive pitch a little more than a month before the patent filing. The patent was granted in October 2007. The company has also patented containerized data centers that float on water.
According to the aforementioned former Google employee, the company's containerized data center operation is known internally as Project Will Power. Today, countless other operations are following suit, including arch-rival Microsoft. The idea is that you can manufacture these standardized data centers at a central location and ship them across the globe where needed. This is cheaper - and quicker - than building centers from scratch. ®
You can view the whole video here: