In a growing digital world where everything is online, one of the biggest challenges for companies is finding the right locations and space for their data centers. Not only are data centers needed around the world to reduce latency for users in different countries, but the physical space they occupy, along with the need for built-in redundancy, makes them a challenge for the companies that provide these services. So when a Microsoft project team recently concluded a two-year experiment with an underwater data center and asserted its high reliability and sustainability, the research prompted discussion about the widespread applicability (and commercial sense) of the concept.
Back in 2018, Microsoft leveraged technology from submarines to sink a data center off the coast of Orkney, Scotland, claiming it could provide internet connectivity for years. Now the vessel has been retrieved, and the firm’s researchers are examining how well it performed.
“The retrieval launched the final phase of a years-long effort that proved the concept of underwater datacenters is feasible, as well as logistically, environmentally and economically practical,” Microsoft’s John Roach wrote in a blog post.
One of the first things the researchers noticed was that the sea-based data center had a lower failure rate than a conventional one: only eight of the 855 servers on board had failed. Microsoft speculates that the greater reliability may stem from the absence of humans and from the vessel being filled with nitrogen rather than oxygen.
The firm is now trying to figure out why the servers in the underwater data center were eight times more reliable than those on land. Project Natick – the name given to this wild underwater experiment – also proved that the data center could be recovered in rather pristine condition, and researchers believe such underwater pods can reduce latency by moving cloud services closer to customers.
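The latency benefit of proximity is ultimately a matter of physics: light in optical fiber travels at roughly two-thirds the vacuum speed of light, so distance alone sets a hard floor on round-trip time. The back-of-the-envelope sketch below illustrates this; the distance categories are hypothetical examples, and real-world latency adds routing and processing overhead on top of this floor.

```python
# Rough estimate of the minimum network round-trip time imposed by distance.
# Assumes signals travel through fiber at ~200,000 km/s (about 2/3 of c).

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed in km per millisecond

def min_round_trip_ms(distance_km: float) -> float:
    """Minimum possible round-trip time over fiber for a given distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

for label, km in [("same city", 50), ("same continent", 2000), ("transoceanic", 10000)]:
    print(f"{label:>15}: {min_round_trip_ms(km):.1f} ms minimum RTT")
# same city: 0.5 ms, same continent: 20.0 ms, transoceanic: 100.0 ms
```

No engineering can beat this floor, which is why placing pods offshore near coastal population centers is attractive: roughly half the world’s population lives within about 120 miles of a coast, as Microsoft has noted.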
Underwater data centers might sound like an outlandish idea, but David Ross, a longtime consultant to the data center industry, told BBC News the project has great potential. He believes organizations facing a natural disaster or a terrorist attack might find it attractive: “You could effectively move something to a more secure location without having all the huge infrastructure costs of constructing a building. It’s flexible and cost effective.”
Spencer Fowers, a principal member of technical staff for Microsoft’s Special Projects research group, noted that perhaps the biggest lesson of Project Natick is how to use energy more sustainably.
There is little doubt about the data center boom of recent years. Companies such as Facebook, Microsoft, IBM, Apple and Google are spending billions to build data centers. With global demand for data skyrocketing, data centers are much sought after and under strain, and large enterprises are partnering with tech giants to build creative data center installations in places most people would never imagine.
Norway, for instance, installed a data center inside an old mine, where a 120,000-square-meter complex is cooled by water from nearby fjords. The Lefdal Mine Data Center is known as the world’s most efficient data center and tackles the challenge of data center cooling head-on. Created jointly by IBM, Rittal, and Lefdal, the mine can house up to 1,500 containers, and 100% of the energy it uses comes from renewable sources.
Mine data centers are not confined to Scandinavia. Several exist in the US, including the Bluebird Underground Data Center in Springfield, Missouri.
To address the issue of cooling, Facebook has also set up a data center in the Swedish city of Luleå, near the Arctic Circle. The facility uses the low temperature of the outside air to cool the interior, reducing its reliance on additional cooling equipment.
Many also believe outer space is likely to be data’s next frontier. In 2016, NASA took a step toward a “Solar System Internet” by installing Delay/Disruption Tolerant Networking (DTN) on the International Space Station (ISS), technology designed for communicating with distant spacecraft such as Mars rovers. The ISS is also the first space-based data center able to transmit and receive data between operations centers and their payloads aboard the station.
DE-CIX North America vice president and general manager Ed d’Agostino believes that the primary goal of novel data centers is to ensure operational efficiency and maintain high-quality connections. “While they are all very interesting and tell great stories, we have to always keep in mind proximity to users who demand low-latency connectivity for a better experience,” he said.
According to research by CB Insights, with over 175 zettabytes of data expected by 2025, data centers will continue to play a vital role in the ingestion, computation, storage, and management of information – suggesting that future data centers may need to be equally unconventional to accommodate this huge trove of data.