Data centres under water? Project Natick goes deep

Published on 13/06/2018 | Written by Pat Pilcher


Data centres are the unsung heroes of our age. Every time we use cloud apps or utter “OK Google”, a data centre is doing all the legwork…

The trouble is this – setting up and running data centres is an expensive and complex undertaking.

Location plays a big role. Data centres need access to high-speed data networks. They also need a supply of reliable and affordable electricity. Because of this, many are near large population centres. With the property boom, acquiring land in metropolitan centres is costly.

Another factor figuring in the location of data centres is latency – the measure of how long it takes data to travel between its source and destination. It can be the difference between a usable cloud app and a laggy one.

There’s also another, less obvious issue – heat. Servers generate lots of it, which sees data centres consuming vast amounts of electricity for air conditioning.

With a strong focus on all things cloud-based, Microsoft has long been aware of these challenges. Now it says it has a solution.

“The first Natick data centre near Scotland’s Orkney Islands already contains 12 racks of 864 servers and 27.6 petabytes of storage.”

It takes the form of Project Natick, which involves underwater data centres. This might sound like sci-fi and more than a little crazy, but there’s method to Microsoft’s madness.

Most of us live within 200km of a coast, so getting high-speed data connections to offshore data centres isn’t difficult. Land costs also don’t figure offshore. Another plus is lower cooling costs, as Natick data centres sit under water.

This, plus the fact that energy comes from renewable sources, results in a low total cost of ownership (TCO). Microsoft says that in the future, Natick data centres could use offshore wind and tidal power, making them near self-sufficient.

Being close to onshore cities also reduces latency, which translates into more responsiveness. Data travels at around 200 km per millisecond. For populations 200 km from a data centre, a round trip takes about 2 milliseconds; for cities 4,000 km away, it takes about 40 milliseconds, adding noticeable lag to cloud applications.
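
As a back-of-the-envelope illustration of that arithmetic (a minimal sketch using the 200 km-per-millisecond rule of thumb quoted above, not anything published by Microsoft), the round-trip figures fall out as follows:

```python
# Rough round-trip latency from distance, assuming data travels ~200 km per millisecond.
# Real-world latency would add routing, switching and server processing time on top.

def round_trip_ms(distance_km: float, speed_km_per_ms: float = 200.0) -> float:
    """Approximate round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km / speed_km_per_ms

if __name__ == "__main__":
    for distance in (200, 4000):
        print(f"{distance} km away: ~{round_trip_ms(distance):.0f} ms round trip")
    # 200 km away: ~2 ms round trip
    # 4000 km away: ~40 ms round trip
```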

Natick data centres are also modular. This, says Microsoft, translates into a 90-day deployment – a quicker and smoother process than with land-based data centres.

The first Natick data centre was submerged near Scotland’s Orkney Islands. It’s already processing data and contains 12 racks of 864 servers and 27.6 petabytes of storage. Microsoft says its compute power is akin to several thousand high-end PCs. The shipping-container-sized system consumes around 240 kW of power.

Natick’s performance so far has been good, says Project Natick manager Ben Cutler:

“There were no hardware failures in the initial three-month deployment. The cooling system worked well, surpassing some of the Natick team’s efficiency goals. That gives us the confidence to go to environments that are deeper and cooler.”

Microsoft will test the data centre’s performance over the next 12 months. It is also planning a next-generation version, which will use a submersible vessel four times larger than the current Natick container, house 20 times the compute power, and may make use of tidal energy generation.
