Our all-important communications infrastructure is holding up, but it has been seriously dented. In many cases the damage comes not from the storm directly, but from power being cut and back-up supplies (which are often very expensive to deploy) not lasting long enough.
As of writing, industry umbrella organisation the New Zealand Telecommunications Forum says that hundreds of cell sites are offline, mostly due to power cuts. All three telcos have suffered damage, as has the Rural Connectivity Group network that Spark, Vodafone and 2degrees jointly operate, with sites dropping offline.
Chorus reports that its core network is largely intact, bar two fibre cuts (one on the Taupō to Napier line, the other on a line running northwards in the region), but that copper broadband access has proved more vulnerable.
More than 7000 services had been hit by storm damage by Monday, and Chorus warned that many in the North Island are yet to feel the full impact of the bad weather.
Smaller wireless network operators say their networks are still running, but sites are slowly dropping off as their back-up batteries run flat.
Field techies from lines companies, telcos and broadband providers are doing their level best to restore service to customers. In many cases, however, it's simply too dangerous for them to work outside in gale-force winds, flooding and torrential rain.
We need our communications to stay up, though, especially broadband and mobile, as there are no real alternatives for most people nowadays. Alerts, warnings, information sharing, work, service delivery, checking on friends and family, and education: all of it travels as data packets.
Gabrielle is a traumatic catastrophe for all of us, and the clean-up will be expensive and take a long time.
Hard as that work will be, we should also be thinking about how to make our communications infrastructure more resilient, with distributed power supplies and fewer single points of failure.
This will inevitably run into the "who's going to pay for it?" roadblock, the killer of many a sensible preparedness effort.
Some years back, I talked to a medium-sized company that stored valuable data on servers in its office and relied on a single internet connection to communicate with the outside world.
That company was baulking at the cost of disaster recovery, let alone at the cost of moving from that basic worst-case preparation to genuine resilience. Was it really necessary to spend heaps on off-premises and cloud back-ups, and on one or two failover network connections?
What did the trick was working out how much it would cost if the company lost data and/or connectivity and had to fall back on ad-hoc solutions, or even couriers. Even just a few hours of outage would be more expensive than a solid disaster recovery and resilience solution.
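As a purely illustrative back-of-envelope sketch (the figures below are hypothetical assumptions, not the company's actual numbers), the comparison looks something like this:

```python
# Hypothetical back-of-envelope comparison: the cost of a single outage
# versus a year of disaster recovery and a failover connection.
# All figures are illustrative assumptions, not real quotes.

staff = 50                      # employees idled by an outage
hourly_cost_per_person = 60     # NZD: wages plus overheads per person-hour
lost_revenue_per_hour = 2_000   # NZD: sales and billable work that stops
outage_hours = 4                # one afternoon-long outage

# Cost of people standing around plus revenue that never arrives
outage_cost = outage_hours * (staff * hourly_cost_per_person + lost_revenue_per_hour)

dr_annual_cost = (
    6_000    # off-premises/cloud back-ups per year
    + 3_600  # second (failover) internet connection per year
    + 4_000  # testing and maintenance time per year
)

print(f"One {outage_hours}-hour outage: ~${outage_cost:,}")
print(f"A year of DR and failover:   ~${dr_annual_cost:,}")
```

Swap in your own numbers; the point is that the downtime side of the ledger grows with every hour, while the resilience side is a fixed, known cost.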
Technology has moved on, and adding local power generation such as solar, paired with on-site batteries, could make a huge difference, and could in fact save everyone money.
Australia has a national battery strategy to help reach the country's net-zero goal by 2050, and New Zealand is investigating energy storage too, to solve the "dry year" problem. Given how Gabrielle has ravaged the country, maybe it'd be worth adding disaster resilience to that mix at a national level?
Rolling out even more fibre optic cabling everywhere and making sure networks are interconnected with others around the country is a relatively cheap way to keep comms alive in a disaster, provided that there's power to run the equipment, of course.
Unfortunately, it's time to start thinking along those lines on a national scale. The climate emergency isn't going away any time soon.
That kind of planning and work is of course a less-than-ideal solution to a monumental and very dangerous threat.
The best solution would have been to listen to the scientists who have been warning us for decades about what was coming. Humanity needs to kick itself hard for not doing that.