
With a mean annual temperature of around 1°C, and huge hydroelectric generation capacity, Luleå in northern Sweden is the ideal location for Facebook’s first data centre outside the USA.
1. In the beginning, the site ran on a single server in founder Mark Zuckerberg’s dorm room; now, just ten years on, Facebook requires an extensive infrastructure of servers and storage systems to support its rapid growth. The Luleå data centre in northern Sweden is part of the global infrastructure that brings Facebook to almost 1.5 billion people around the world.
2. Tom Furlong shows aspects of the cooling system in place at the centre. In the Arctic climate, where winter temperatures average -20°C, the facility can cut its dependence on mechanical air conditioning, reducing both costs and carbon emissions. Freezing air from outside is pumped into the building, acting as a natural coolant, while walls of axial fans keep temperatures constant by expelling the hot air generated by the servers.
3. Inside the centre, long, cavernous hallways give way to data rooms filled with networked computing equipment, their walls lined with flashing lights and giant fans. All the components, including power supplies, servers and storage systems, are built to Open Compute Project (OCP) specifications, ensuring high standards of cost and energy efficiency.
4. Villa, one of the head technicians at the Luleå data centre, repairs a faulty memory card in a server inside one of the data storage areas. The ‘vanity-free’ design of the equipment allows for optimal cooling, as well as quick and easy repairs. Very few technicians are needed to maintain the centre, with each one responsible for between 25,000 and 45,000 servers.
5. One of the data storage areas at the Facebook Luleå Data Centre. An enormous fleet of servers and storage devices working together is required to produce each Facebook page and to house the vast quantity of data generated every day. Every time a photo is uploaded, a message sent, or a page liked, the data is stored in and served from one of the company’s global data centres.
6. The centre’s many fan rooms keep the servers cool by drawing in cold Arctic air, which serves as free, environmentally friendly air conditioning. The 27,000-square-metre facility also benefits from the region’s huge hydroelectric capacity and runs entirely on power generated on the nearby Lule River.
7. The filter rooms act as mixing chambers that regulate the air temperature by blending cool outside air with the hot return air emitted by the servers. This method of air cooling has improved the data centre’s power usage effectiveness (PUE), which is calculated as the total power consumption of the facility divided by the power consumption of the IT equipment; a value closer to 1.0 means less energy is spent on overheads such as cooling. Facebook estimates that the total PUE of the centre is 1.08 or better.
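As a rough illustration of the two ideas above, the short Python sketch below works through a simple linear blend of outside and return air to hit a target supply temperature, and then the PUE calculation. It is a minimal sketch under assumed figures: the helper names (mix_outside_air_fraction, pue) and all the numbers are invented for the example and are not taken from the article or from Facebook’s systems.

```python
# Illustrative sketch of the caption above, with made-up figures:
# (1) how a mixing chamber could blend cold outside air with hot server
#     return air to reach a target supply temperature, and
# (2) how power usage effectiveness (PUE) is calculated.

def mix_outside_air_fraction(t_outside, t_return, t_target):
    """Fraction of outside air needed so the blended supply air reaches
    t_target, assuming a simple linear mix of the two air streams."""
    if t_return == t_outside:
        return 1.0
    fraction = (t_return - t_target) / (t_return - t_outside)
    # Clamp to the physically meaningful range 0..1.
    return max(0.0, min(1.0, fraction))

def pue(total_facility_power_kw, it_power_kw):
    """PUE = total facility power / IT power; closer to 1.0 is better."""
    return total_facility_power_kw / it_power_kw

if __name__ == "__main__":
    # Example: -20°C outside, 35°C server return air, 20°C target supply air.
    frac = mix_outside_air_fraction(t_outside=-20.0, t_return=35.0, t_target=20.0)
    print(f"Outside-air fraction: {frac:.0%}")  # roughly 27% outside air

    # Example: 10.8 MW total facility draw for 10 MW of IT load gives PUE 1.08.
    print(f"PUE: {pue(10_800, 10_000):.2f}")
```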