Our lives are now ruled by algorithms and apps, and our smartphones are as addictive as slot machines. That's the bad news, says industry whizzkid Tristan Harris. The good news is that Big Tech bosses are taking note. Ben Hoyle reports.
Reporting can be a scary job. I have had nervous moments with warlords, gangsters and neo-Nazis. I have been shot at and threatened. Once, long ago, I had to endure, without displaying any outward sign of panic, the whole of Tonight's the Night, the Rod Stewart musical. But if Tristan Harris is right about what he is telling me, then the presentation playing now on his phone is the most frightening thing I've seen in my life.
It's a road map for the erosion of civilisation as we know it.
Harris, 35, is a former Google insider who has been called "the closest thing Silicon Valley has to a conscience" and a "Silicon Valley apostate".
Harris believes that we are in the midst of an "invisible climate change of culture" caused by technology companies that view the world's 2.7 billion smartphone users as a resource whose attention they can mine for profit. The resulting competition has a very unfortunate side-effect: "attention capitalism" is making us nastier, stupider and much less likely to find common ground with our fellow humans.
We can try to resist, but it is not a fair fight. Whenever you open Facebook, Instagram or YouTube, you switch on what Harris has called "a voodoo doll-like version of you in a supercomputer". This consists of nearly everything you've ever clicked on, liked or watched. That's how the companies keep you ensnared: they know you better than you know yourself.
Harris's conclusions are controversial, but his influence is unmistakable. He has briefed world leaders and is a confidant of some of the most powerful figures in the technology industry. He has testified to the US Congress. His two TED Talks have been viewed more than 4 million times.
More is at stake here than just children spending too much time staring at screens, companies selling our data or Russian hackers interfering in elections, Harris argues. Those seemingly separate problems are real, but they're also diversions from tackling the bigger picture. "To even make the debate about addiction or screen time is like talking about climate change in terms of whether the number of polar bears has gone up or down. It's the wrong currency to talk about a systemic catastrophe."
What is actually happening is a fundamental rewiring of the human brain, leading to behaviour "that is tearing apart our social fabric", he says.
We're sitting in a small room in San Francisco's business district, upstairs from the rented offices of the Centre for Humane Technology (CHT), the non-profitmaking organisation that Harris cofounded last year. He is slightly built, with vigilant eyes that are almost the same copper-brown colour as his neat hair and beard. He wears dark jeans, a grey shirt and grey fleece with an old-school digital watch on his left wrist. The watch is both a means of freeing himself from checking his phone and a sign to other refuseniks that he is on their side.
Saving the world sounds exhausting. Harris's Shortwhale page (a service for winnowing email overload) explains that for his "health and sanity" he has minimised his email time, focusing on opportunities "where I can have powerful, transformative impact". Potential contacts should bear in mind that every week he gets "10+ major media interview requests" and "10+ major speaking engagement inquiries". Every month there are "10+ film documentary interview requests" and "10+ major inquiries from major governments".
As soon as Harris starts his pitch, though, he gleams with evangelical purpose.
Ten minutes in, he leaps to his feet and sketches a graph on a whiteboard to show the moment in the future when technology will overwhelm humankind's strengths: when artificial intelligence can do everything better than we can and the machines take our jobs. It looks reassuringly far off. But then Harris opens up the presentation on his phone (which is set to greyscale to make it less addictive). He homes in on a much earlier watershed. This, he says, is when the algorithms that churn away in the background of our everyday lives achieve a form of stealth supremacy by hacking our all too human weaknesses. These are vulnerabilities such as vanity, social insecurity and our susceptibility to information that affirms our existing prejudices rather than contradicting them. Technology doesn't have to be nearly so advanced to penetrate this soft underbelly. We're there already.
"The first crossing point was when it overloaded our mental limits, which we feel as information overload," Harris says. That probably happened in the early Noughties, he says, around the time that the tabbed browser was invented, so we could keep multiple pages open more easily on our computers. Then smartphones arrived and became a portal through which apps such as Facebook and LinkedIn could reach "and grab the puppet string of your self-image and social validations". You know the kind of thing, he says: "Oh, these three people endorsed you on LinkedIn. Don't you feel like you should endorse them back?"
Since then, our relationship with technology has had profound real-world effects. These appear like outriders from the book of Revelation in Harris's presentation. "You get shortening of attention spans, addiction, disinformation, narcissism, outrage, polarisation." This is measurable. Half of teenagers and more than a quarter of parents feel "addicted" to their mobile devices, a 2016 study for the charity Common Sense Media found. Research from Yale University indicates that each word of moral outrage added to a tweet increases the number of retweets by 17 per cent. A 2018 study by Massachusetts Institute of Technology (MIT) showed that fake news spreads six times faster than accurate news.
Last summer, a program called FaceApp went viral by offering users a chance to generate plausibly aged images of themselves and share them with their friends. Thus did its Russian-based designers persuade 150 million people to hand over private images of their faces, paired with their names, simply by exploiting their vanity.
The most damaging development is the most recent, what Harris calls "the checkmate". This is when technology "attacks the foundation of what we trust" via fake news, bots and deep fake videos. You can't opt out: even if you boycott the internet, you are still living in a world where people around you might be radicalised by YouTube videos or choose not to vaccinate their children because of misinformation spread on Twitter. Tech-influenced crises are erupting everywhere, but taking a whack-a-mole stick to each problem misses the point.
"This is a self-reinforcing system that gets worse as [the problems] feed each other. We call it 'human downgrading' because we need a name. This isn't the privacy problem. This isn't the data problem. It's not the 'tech is not a blockchain yet' problem. This is the diagnosis for why all this shit is going wrong at the same time."
Harris often quotes Edward Wilson, the Harvard professor and evolutionary biologist, who said, "The real problem of humanity is the following: we have Paleolithic emotions, medieval institutions and godlike technology."
Essentially, Harris says, "We're chimps with nukes," and the chimps are becoming more primitive just as the nukes get more sophisticated and deadly.
Belief in truth and facts is slipping away at a time "when it has never been more urgent to know how many years we have until the permafrost melts", in an era when "we need the whole world to see our world's problems the same way very quickly".
I can feel myself recoiling, pressing into the back of my chair to get away from the dystopian forecast on the phone. Harris grins. "I try to stay lighthearted, but that's why we lose sleep. That's why we work so hard."
The goal is to change how technology is built. This summer, CHT launched a podcast called Your Undivided Attention in which Harris and his cofounder, Aza Raskin, interview experts who can help demystify human downgrading, including authorities on cults, casino design, addiction, election hacking and methods of persuasion. In one episode, Gloria Mark, a professor at the department of informatics at the University of California, talked about the "science of interruptions". She has found that when people are working on computers, their attention breaks every 40 seconds. Less than two decades ago it was every three minutes – and people were shocked then. "We are still in the Wild West of tech development," she told the hosts. Tech "is being developed without really thinking about how it fits with human beings". That's what Harris and Raskin want to change. Their approach is a pincer movement: they lobby tech leaders discreetly and hold workshops inside technology companies, while also mounting a public campaign to increase external pressure.
In April, Harris and Raskin gathered several hundred tech heavy-hitters in a San Francisco amphitheatre to introduce the concept of human downgrading. The audience included cofounders of Apple, Craigslist and Pinterest, vice-presidents at Facebook and Google, venture capitalists and the actor and technology entrepreneur Joseph Gordon-Levitt. At a dinner for a select group of attendees, Harris flagged up a silver lining to the very dark clouds he had spent the day depicting. "Unlike climate change, it only takes about 1,000 people to reverse human downgrading," he said. "In this room, right now, are many of those people."
To convince them to act, Harris expects that he must reframe the way that thousands, if not millions, of people think about technology.
The main offenders are obvious. "It's hard not to look at this and say, essentially, we have at least two of the biggest companies – Facebook (including Instagram and WhatsApp) and Google (including YouTube) – incentivised to create this digital Dark Age, where disinformation outcompetes information." YouTube has 2 billion unique monthly users, giving it a psychological footprint roughly the same size as Christianity. Facebook is bigger.
The founders of these companies did not set out to build a system to undermine humanity, Harris says, but they are now trapped by the way their businesses are configured.
"I found that it's only been external pressure – from policymakers, shareholders and media – that has changed companies' behaviour," he told a US Senate hearing in June.
Apple is often praised, rightly, for its privacy standards, but it is also "the company that can change all this, because it's not bound by the incentives of maximising attention". It makes its money through sales of devices and through a cut of fees paid for services bought or subscribed to via its App Store. Apple could simply lock out of the App Store all companies that have a business model based on engagement, he suggests. That would incentivise change pretty quickly.
Of course, Harris recognises that when a business model is the problem and that model has "one and a half trillion dollars of market value", companies won't switch course overnight.
But he believes that most technologists are idealists. "You don't have engineers in Silicon Valley saying, 'You know what I want to do when I graduate from Stanford? I want to work for a company that's going to destroy democracy.'" Harris went to Stanford. Two of his friends there, Kevin Systrom and Mike Krieger, later started Instagram together, a company that feeds on vanity and which they sold to Facebook for cash and stock worth $1 billion. At Stanford, Harris, Systrom and Krieger were in a group imagining ways to improve the world through technology. He doubts that either of them has lost that aspiration. A lot of technology leaders that he knows feel the same way. One boss he is reportedly close to is the chief executive of Twitter, Jack Dorsey, whose company, perhaps not coincidentally, introduced a ban on political adverts in November. Because discreet influence only works if it stays discreet, Harris won't confirm exactly which Silicon Valley executives he has the ears of. But he does say that watching the leaders of the attention-capitalism companies give presentations "is like watching a hostage in a hostage video. You're like: why are they saying those things? And then you see off stage, there's a guy with a gun to their head. That gun is their business model."
Harris grew up in San Francisco's Bay Area and was raised by his mother, who worked as a lawyer for injured workers. As a shy seven-year-old, he learnt about the power of persuasion when he became obsessed with magic. It never ceased to amaze him how he could manipulate his audience with tricks that he thought seemed obvious. The thing about magic, though, is that, "It doesn't work or not work based on how intelligent you are as a consumer. Magic is about a universal set of vulnerabilities and biases, such as how our minds construct cause and effect." The magician, in other words, is exploiting his understanding of the audience's psychology to create a powerful emotional response, much as Facebook draws on its virtual voodoo dolls to outmanoeuvre consumers no matter how clever or educated they are.
Harris studied computer science at Stanford, where as well as the two Instagram cofounders his contemporaries included Chris Cox, the future head of product at Facebook, and Ed Baker, later head of growth at both Facebook and Uber. Evan Spiegel, chief executive of Snap, was just behind them. During his master's, he joined the Persuasive Technology Lab run by the behavioural psychologist BJ Fogg. The class has since attained quasi-mythical status for training a generation of entrepreneurs to use psychological insights to influence users' actions. Harris cofounded a company of his own, Apture, which was bought by Google in 2011, and he joined the search engine's email service. Anxieties about attention capitalism boiled inside him there for almost a year, he says, "Because I saw the situation getting worse and I didn't see the key product, Gmail, sufficiently attack the problem."
In 2013, after a trip to Burning Man and a walk in the Santa Cruz woods with Raskin, Harris wrote a slide presentation setting out his thoughts on the "enormous responsibility" borne by designers like him at Google, Facebook and Apple for how "millions of people around the world spend their attention". He sent the manifesto to ten friends. It went viral within Google and Harris ended up discussing the slide deck with Larry Page, then the chief executive. Google gave him a new role, "design ethicist", but ultimately he became frustrated with the company's failure to reform and in 2016 he left to run a nonprofit advocacy group focused on addressing his concerns. He called it Time Well Spent, to crystallise what he thought a user's experience of technology should be.
In April 2017, the technology investor Roger McNamee, an early backer of Facebook and former mentor of Mark Zuckerberg, heard Harris on the news programme 60 Minutes. Harris was talking about how app design features made smartphones addictive in the same way that slot machines are. McNamee, who suspected that Facebook was "a clear and present danger to democracy", was intrigued. The two men joined forces and added Jim Steyer, the founder of Common Sense Media, the largest charity in the United States focused on children and media. They briefed members of Congress investigating possible Russian interference in the 2016 election. They discussed privacy violations with lawyers and politicians, and within months nearly 40 states had opened investigations into Facebook. A Wall Street Journal profile depicted the trio in cowboy garb as "the New Tech Avengers".
Other aspects of Harris's message were gaining traction too. In January 2018, Mark Zuckerberg outlined his company's goals for the year in a Facebook post that began, "One of our big focus areas is making sure the time we all spend on Facebook is time well spent."
A few months later, both Google and Apple announced initiatives to help users monitor and reduce their screen time.
Harris was encouraged but also wary of his ideas being diluted as the companies co-opted his language, so he retreated from the spotlight to brainstorm with Raskin and update his vision. The theory of human downgrading and the presentation in April were the first fruits of that process.
Not everyone has been won over. Andrew Przybylski, an experimental psychologist and director of research at the Oxford Internet Institute at the University of Oxford, believes that Harris means well but lacks scientific evidence for his claims, which he compares to previous moral panics over video games and comic books in the Nineties and Fifties. Dean Eckles, a social scientist and professor in communications and technology at MIT, also questions the evidence for the theory of human downgrading. He is "not sure" that there is proof of "a general erosion in our faith in facts", or that society is significantly more polarised by social media today than it was by partisan, sensationalist journalism a century ago.
However, he stresses that "Tristan has done good" by increasing scrutiny on how technology companies' business models affect society.
That scrutiny is paying off.
Fogg, the Stanford behavioural psychologist, recently renamed the Persuasive Technology Lab. It is now the Behaviour Design Lab, with a focus on fostering "good habits". A few months ago, he made a forecast for the new year. "A movement to be 'post-digital' will emerge in 2020," he tweeted. "We will start to realise that being chained to your mobile phone is a low-status behaviour, similar to smoking."
In May, Chris Hughes, a co-founder of Facebook and an adviser to CHT, called for Facebook to be broken up. His chief concern was Zuckerberg's historically unprecedented power "to monitor, organise and even censor the conversations of two billion people".
In July, the Federal Trade Commission fined Facebook $5 billion for violating users' privacy. In September, it gave Google a $170 million penalty for collecting children's personal information without parental consent via YouTube.
In October, Zuckerberg was hauled before Congress for a second time in 18 months, a development that would have seemed "crazy" only a few years ago, Harris says.
Soon afterwards, it emerged that more than 250 Facebook employees had written a letter to the company's top team protesting at its refusal to fact-check political adverts.
Harris sees these developments as proof that he is on the right path. The challenge is to press on. "We're something like eight people in an office in San Francisco with every government, thousands of engineers and media knocking at our door," he says, looking suddenly weary. "You can imagine how overwhelmed we are and how little of a personal life any of us has, because of how much is at stake and how quickly it needs to change."
How to make your phone less addictive
Simple settings and helpful apps to wean you off your device.
Turn off all notifications except those from people
Most notifications are generated by machines, not actual people. They keep our phones vibrating to lure us back into apps we don't really need to be in.
Visit settings > notifications and turn off all notifications, banners and badges, apart from apps where real people want your attention, such as messaging apps like WhatsApp, Messenger, Signal or WeChat. Or, better still, turn off all your notifications altogether.
Go greyscale
Colourful icons give our brains shiny rewards every time we look at our phone. Set your phone to greyscale to remove those positive reinforcements. It helps many people check their phone less.
In iOS, go to settings > general > accessibility > accessibility shortcut (bottom) > colour filters. This allows you to triple-tap the home button to toggle greyscale on and off quickly, so you keep colour when you need it. For Android, go to settings > digital wellbeing & parental controls > wind down.
Charge your device outside the bedroom
Get a separate alarm clock and charge your phone in a different room (or on the other side of the bedroom). This way, you can wake up without getting sucked into your phone before you even get out of bed.
Keep your home screen for tools only
Do you open apps mindlessly because they are the first thing you see when you unlock your phone? Limit your first page of apps just to tools – the apps you use for quick in-and-out tasks such as Maps, Camera, Calendar, Notes. Move the rest of your apps, especially mindless choices, off the first page and into folders.
Launch apps by typing their names
Swipe down and type the app you want to open instead of leaving easily accessible bad habits on the home screen. Typing takes just enough effort to make us pause and ask, "Do I really want to do this?"
On Android devices, you can use the search box on your home screen. With iOS, for best results turn off Siri suggestions (settings > Siri & search > Siri suggestions to off).
Better still, remove social media from your phone altogether
If you really want to use your phone less, remove all the major social media apps. It's the easiest way to cut back, because these apps can easily gobble up so much of our time. Train yourself to use them from your computer only (if at all). Note: you can delete the Facebook app and still get some specific features, ie Messenger for messages and Local for events.
Send audio notes or call instead of texting
It's common for people to misinterpret text messages, whereas the voice is rich in tone and far less vulnerable to misinterpretation. Recording a quick voice message is often faster and less stressful than typing out each letter. Plus, it doesn't require your full visual attention.
Use texting shortcuts
On iOS, press and hold on a text message and you'll see a menu of quick reactions. It's faster than crafting a response and can also add some context, giving a taste of the emotion that's often lost in a text.
Night Shift (iOS)
Blue light from screens late at night tricks our bodies into believing it's still daytime, which disrupts our natural ability to sleep. The Night Shift function uses the clock and geolocation of your device to determine when it's sunset in your location. Then it automatically shifts the colours of your display to warmer shades.
Go to settings > display & brightness > night shift > toggle schedule a time > warm or less warm slider.
Useful apps
Flipd
This company boasts that users have spent more than 100 million minutes distraction-free. The app protects you by temporarily locking you out of distracting games, social media and other apps. (Apps temporarily disappear from your phone.)
Thrive Away (Android)
Turns your device into a dumb phone by turning off all notifications, calls and texts except for those from people you've specified on a "VIP list". Works for set periods and automatically lets others know when you're taking a break.
Freedom (iOS, Android)
Temporarily blocks specific websites or apps on your desktop, tablet and phone for set periods of time.
Moment (iOS)
Displays in bar-graph form how much time you spend on your phone each day and which apps you use most. It also includes a coach, which encourages you to change.
Calm
Meditation app that offers programmes for all levels to help you take a break in your day. You can also select music tracks engineered to help you focus, relax or sleep, and calming tales narrated by celebrities such as Stephen Fry and Matthew McConaughey.
Siempo (Android)
Lets you replace your home screen with an interface that allows you to batch notifications so that you receive them all in one go at an allotted time. It will also unbrand icons, randomise their location and set restrictions on how much you use apps.
On your desktop
News Feed Eradicator for Facebook
Removes your Facebook news feed, allowing you to use some of the more utilitarian features of the site without getting sucked in. Also displays an inspirational quote of your choice in place of the news feed.
Distraction Free for YouTube (Chrome)
Removes recommended videos from the YouTube side bar, making you less likely to get sucked into unintentional content holes.
Send & archive (Gmail)
This archives the email right after you send it. It will reappear in the inbox when the person replies.
Open Gmail > click the top-right cog > settings > "send and archive" > show "send and archive" button in reply > save changes.
F.lux (Mac, Windows)
Adjusts a display's colour temperature according to location and time of day, offering respite for the eyes. Helps counter eye strain during night-time use and helps to reduce disruption of sleep patterns by cutting the blue light from screens.
Inbox When Ready (Gmail)
A Chrome extension that focuses your attention by only showing messages when you click "show inbox" instead of distracting you as new emails arrive.
Source: humanetech.com
Written by: Ben Hoyle
© The Times of London