Two years ago, Microsoft sank a data centre off the Orkney coast in the Northern Isles of Scotland as part of an experiment on the energy efficiency of data servers. Orkney was chosen as the site because it lies in a temperate zone and is a centre of research on renewable energy. The data centre, a cylinder packed with servers, has now been retrieved from the seabed, and some important observations have been made.

The submerged data centre showed a lower failure rate than conventional facilities: only 8 of around 860 servers failed, which observers put at one-eighth of the failure rate on land, making the underwater setup eight times more reliable than land-based ones. Experts suggested that the nitrogen pumped into the sealed vessel contributed to the lower failure rate, since nitrogen is linked to a reduced incidence of corrosion. Many other factors contribute to the malfunction and failure of data centres, including temperature, humidity and oxygen levels. Placing data centres underwater has also been found to be a cost-effective way of cooling computing systems. This development is relevant at a time when ever larger amounts of data are being uploaded to the cloud, increasing the energy demands and cooling needs of data centres.
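The reliability comparison above can be unpacked with some simple arithmetic. The figures below are the article's rounded numbers (8 failures out of around 860 servers, described as one-eighth of the land-based failure rate); the exact rates are illustrative, not official Microsoft statistics.

```python
# Sketch of the reliability arithmetic reported for the trial.
# Inputs are the article's rounded figures, not exact published data.
servers = 860
failures = 8

# Observed failure rate over the two-year underwater deployment.
underwater_rate = failures / servers

# The article states this is one-eighth of the rate on land,
# which implies the land-based rate and the "eight times more
# reliable" framing.
implied_land_rate = underwater_rate * 8

print(f"underwater failure rate: {underwater_rate:.2%}")
print(f"implied land-based rate: {implied_land_rate:.2%}")
```

With these figures, the underwater rate comes out just under 1%, against an implied land-based rate of a little over 7%.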
Project Natick is building such underwater data centres to address energy demand, reduce failures and improve data speeds. The project aims to establish whether taking cloud servers underwater is feasible from the standpoints of cost and energy budget, and it has been developing clusters of small underwater data centres that could eventually be offered as a commercial product. Orkney's entire electricity supply comes from wind and solar, sources generally considered too unreliable for grid power, yet supplying power to the submerged data centre has posed no major issues. David Ross, a prominent consultant in the data centre industry, said that organisations susceptible to terrorist attacks and other calamities might find submerged data centres secure and effective. It is a cost-effective and flexible option with little need for construction, and hence an attractive proposition. Experts also suggest that the question is no longer whether this should be done, but how big or small the ocean-submerged data centres should be.
Project Natick began its first phase in 2015, with a deployment of over 105 days in the Pacific Ocean. The development in Scotland was part of Phase II, intended to demonstrate that such data centres could be built and deployed. Placing submerged data centres near densely populated coastal cities could improve data speeds by reducing the distance the data must travel. It also shields the hardware from airborne pollutants such as hydrogen sulphide, bromine, oxides of nitrogen, and the dioxide and trioxide of sulphur.

The project is certainly important in the context of the ever-increasing energy required to maintain the growing volume of digital data. Calculations suggest that within the coming two to three hundred years the number of digital bytes in cyberspace could equal or exceed the number of atoms on the planet, a scenario increasingly referred to as the information catastrophe of the coming age. That would put enormous strain on the energy required to keep servers and computers cool, and developments under Project Natick would help in tackling the energy budget of digital information.
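The bytes-versus-atoms projection is a compound-growth calculation, and a rough sketch shows how such a timescale arises. Every figure below is an assumption for illustration, not from the article: the current global data volume (taken here as on the order of 10^22 bytes), the annual growth rate (20%), and the commonly cited estimate of about 1.3 × 10^50 atoms on Earth.

```python
import math

def years_until(target, start, annual_growth):
    """Years of compound growth needed for `start` to reach `target`."""
    return math.log(target / start) / math.log(1 + annual_growth)

# Illustrative assumptions, not figures from the article:
current_bytes = 10**22     # rough order of today's stored digital data
atoms_on_earth = 1.3e50    # commonly cited estimate
growth_rate = 0.20         # assumed 20% annual growth in stored data

years = years_until(atoms_on_earth, current_bytes, growth_rate)
print(f"~{years:.0f} years at {growth_rate:.0%} annual growth")
```

Under these assumptions the crossover lands in the range of a few hundred years; a faster assumed growth rate pulls it much closer, which is why published estimates vary so widely.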