Former HSBC boss: Warning lights flashing for Big Tech as they did for banks

My 30 years as a banker required me to look into the future and anticipate risks. I can confidently say that for tech companies, the warning signs are showing. We must act now.

When bankers got too clever and our businesses too complex, we all suffered the consequences. The 2008 financial crisis touched so many of us because the banks were woven into all our lives. Society was exposed to risks it didn't understand, and we all paid the price via government-backed bailouts.
The pandemic has emphasised our reliance on technology. It's not just the interminable video calls or the technology that underpins deliveries to our locked-down doors. It is the mountains of data that dictate the ads we see as we scroll through coverage of the latest coronavirus briefing. Digital transformation is sweeping through my industry, too. Consumers have squeezed years' worth of adoption of apps and online banking into just a few months.
Yet the risks go much deeper than those of hastily managed change. The algorithms that determine how much work delivery drivers get and what we see when we go online are no better understood than the structured credit products that brought the banking system to its knees in the financial crisis.
We need to manage the risks of disinformation and learn how to moderate content. We need to understand how bias is confirmed and then cultivated by algorithm. And when we understand how this works, we need to be clear who is accountable for it all. You can't sack an algorithm.
None of this is to say a crisis on the scale of 2008 lies around the corner. But it does add up to a significant risk, and one that we should not ignore. The sheer complexity of the risk presented by Big Tech makes the challenge of tackling it daunting, but we can get off to a good start if we use the work done in financial services as a model.
The financial crisis taught us that careful oversight is needed when the public interest is dependent on businesses that exist to meet the needs of private capital providers. Before 2008, regulators' approach to conduct risk in banking was what they called "principles based" — deliberately light touch. It relied too much on banks' abilities to govern themselves and it failed. The similarities with our current approach to Big Tech are striking.
In the years after the crisis, regulators and politicians in the UK did not sit back. Instead, they created the Financial Conduct Authority, which has established itself as a top conduct regulator for financial services.
The FCA has made a significant impact in two key areas that are relevant to the tech firms driving the new economy. It forced banks to communicate in a clearer way, particularly about their charges. This allowed consumers to make informed decisions about the exchange of value between themselves and their bank. It also made it easier to identify who was accountable if things went wrong. This had a positive impact on companies' diligence and appetite for risk, which improved the outcomes for their customers. Not an easy journey, but the FCA showed that it can be done.
We need the same ambition to address the risks posed by technology now. In other words, a new, world-leading Digital Conduct Authority. This would strip away a complex mesh of interlocking institutions, and become a powerful, reliable regulator that could hold individuals to account. Its purpose would be, quite simply, to ensure good outcomes for customers, and a fair exchange of value for those who use technology platforms.
That would be good for consumers, and — ultimately — for Big Tech too.
• John Flint is a former chief executive of HSBC
- Financial Times