Data centres are a business running red-hot at the moment.
And it’s not just the business that’s hot - the areas that large data centres are going into are on the warm side too. Like Malaysia, where operator NextDC is plotting facilities, and Darwin in Australia’s Northern Territory.
Vocus has been laying sub-sea internet cables in the area, hooking up Darwin with Indonesia and Singapore (and Perth). The idea is to tap into the massive Asian market from a geopolitically stable region not too far away. Which is Australia, in case you wondered.
The cloud offers economies of scale and flexibility, but it performs better the closer the facilities with servers and storage are to customers. “Data sovereignty” means keeping stored information within specific jurisdictions for a variety of reasons - technical, political and legal - including controlling who has access to the data, which isn’t necessarily just the customers it belongs to.
Even New Zealand’s getting a bunch of them, big ones to boot. There are bound to be many more over the next few years, and that means thinking about how to feed them with clean power - or whether it’s even possible to do so.
Last year, Microsoft ran into that very problem when Ireland’s at-capacity mains grid was deemed too feeble to supply the tech giant’s $1.6 billion data centre campus with a steady flow of energy.
Apparently, the 21 existing diesel generators at the site will be supplemented by a 170-megawatt gas power plant, with Microsoft having sought an industrial emissions exemption from Ireland’s Environmental Protection Agency.
Ireland appears to have maxed out its data centre capacity for now, with EirGrid not accepting any more applications for Dublin data centres until 2028.
Generally speaking, everything points to cloud computing being more energy-efficient than having smaller, decentralised computing instances all over the place. Big hyperscalers can put pressure on electricity generators to invest in large-scale renewable energy sources to power facilities, for example.
There are also less obvious reasons, like cloud computing enabling hybrid and remote work so people don’t have to commute. Organisations can also scale their IT systems according to need, so they don’t have to factor in spare capacity to cover growth and can shrink their solutions if business slows.
Dropping megawatts of electricity into single locations isn’t ever going to be easy, however. What’s more, powerful computer systems run hot. Packing lots of them into massive buildings means even more heat that has to be removed.
Over the years, many creative solutions have been mooted to keep data centres cool. Microsoft (again) has tested sinking data centres into the sea floor off Orkney with Project Natick.
That strategy apparently works perfectly well in the North Sea, but maybe not so much in the quake- and tsunami-prone Pacific.
A few years ago, Kolos, a US-Norwegian start-up, hatched a plan to build a giant data centre with a 1000-megawatt capacity in the Arctic Circle. It was supposed to be used for Bitcoin cryptocurrency mining, but it doesn’t seem to have panned out despite massive publicity. The Kolos domain now displays a spammy Malaysian gaming site.
Exotic cooling solutions apart, running water over hot electronics sounds a bit old-fashioned, but it’s still popular.
Especially so with large artificial intelligence systems, which use tens of thousands of graphics cards that consume huge amounts of power.
Google’s Bard wasn’t able to tell me how much water it uses, but pointed to a non-peer-reviewed paper by American researchers from April this year that suggests OpenAI’s GPT-3 large language model (LLM), which powers ChatGPT, goes through vast amounts of H2O - like half a litre per 20 to 50 questions.
Training the giant artificial intelligence systems - feeding them the data that produces the answers the LLMs’ coders think you want to see - consumes colossal amounts of water too, with the researchers suggesting a figure of over 700,000 litres.
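To get a sense of what those per-question figures could mean at scale, here’s a rough back-of-envelope sketch in Python. The half-litre-per-20-to-50-questions rate comes from the researchers; the daily question volume below is purely an illustrative assumption, not a number from the paper.

```python
# Back-of-envelope estimate of chatbot water use, based on the
# researchers' figure of ~0.5 litres per 20 to 50 questions.

WATER_PER_BATCH_L = 0.5          # litres, per the paper's estimate
QUESTIONS_PER_BATCH = (20, 50)   # the paper's range

# Hypothetical daily question volume, for illustration only.
DAILY_QUESTIONS = 100_000_000

low = DAILY_QUESTIONS / QUESTIONS_PER_BATCH[1] * WATER_PER_BATCH_L
high = DAILY_QUESTIONS / QUESTIONS_PER_BATCH[0] * WATER_PER_BATCH_L

print(f"~{low:,.0f} to {high:,.0f} litres per day")
# At that assumed volume: ~1,000,000 to 2,500,000 litres per day
```

Even with generous error bars on both inputs, the litres pile up into the millions per day very quickly.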
In fact, data centre water usage is mind-boggling if the researchers are correct.
“For example, even excluding water consumption in leased third-party colocation facilities, Google’s self-owned data centres in the US alone consumed 12.7 billion litres of freshwater for on-site cooling in 2021, roughly 90 per cent of which was potable water,” the researchers found.
Multiply that number many, many times and you’re looking at a worrying situation in which a finite resource is being placed under increasing strain.
And no, climate-emergency Aotearoa, which was almost washed away by huge rains this year, isn’t ideally placed to take advantage of lots more water falling from the skies. There’s no guarantee of a constant local supply of fresh water.
It’s at this stage you start to wonder if we’re headed for a near future in which people compete for power and water with resource-hungry tech giants that have deeper pockets than small to medium-sized countries. If so, it’s probably not how the second IT revolution was meant to shape up.