If you joined Facebook at any time over the past decade, Alex Schultz probably had something to do with it. The 36-year-old from south London, a Cambridge physics graduate and self-taught specialist in online marketing, moved to Silicon Valley in 2004. After three years at eBay, he was appointed to Facebook's newly formed "growth team" in 2007.
Schultz's mission — shared with seven other team members — was to pioneer innovative techniques to lure in new users and keep them coming back for more. Mark Zuckerberg, Facebook's founder and chief executive, would later describe the growth team as the platform's "most important product feature". Their quasi-religious verve for using data to grow at virtually any cost was so successful that companies across the industry — and around the world — have copied their tactics.
When Schultz celebrated his 10th anniversary at the company, Zuckerberg posted a message on his Facebook page. "Alex is one of a small handful of people that I can say without his work, our community would not have connected more than 2 billion people around the world." Now Zuckerberg has trusted Schultz with something even more important: helping to fix Facebook.
The company calls this work "integrity". Others might call it the world's biggest clean-up. When I meet Schultz in a conference room at Facebook's campus in Menlo Park, California, his manner is cheerful and confident, even when I ask about the sustained and growing public criticism of the company.
"We absolutely missed things and have said that we missed things and we are changing," he admits. But he denies that the company used data and behavioural science to addict its users. "Creating a valuable and useful experience has always been the main focus — we have never focused on or designed anything for 'addiction'."
As politicians and regulators discuss imposing new rules on social networks, Schultz — perhaps predictably — says Facebook has the tools to fix its own problems. In fact, he sees the platform's vast size and expertise in data as key to the solution. "Having an international company that has the resources that we have, being able to apply the machine-learning tactics that we get in all of the data around the world — and then apply it to a new language, when it comes up, or a new problem area, when it comes up, is really, really powerful," he says.
But critics worry that Schultz and his growth team are the last people who should be in charge of solving the social network's problems. As one former Facebook executive told the FT: "It is perfectly reasonable to ask the question: isn't that putting the foxes in the hen house?"
To understand the role of the growth team and its new mandate, the Financial Times interviewed over a dozen current and former Facebook employees. Many would only speak anonymously, for fear of the impact on their careers. They describe the growth team as a dominant force within the company that was aggressively focused on engagement, speed and seeing off rivals from the start, and say this culture helped shape Facebook into a platform that was ripe for manipulation.
"The priority at Facebook for the last decade has been growth. Period. End of story," says David Kirkpatrick, who was granted rare levels of access to the company for his 2010 book The Facebook Effect. "The pursuit of growth has blinded Mark and his team to some of the risks of too rapid expansion, which are so obvious to many of us on the outside now."
Even people who praise the team's work worry that its intense focus on metrics may have caused it to miss looming problems. Not long after Schultz joined, the group worked with Danny Ferrante, one of Facebook's first data scientists, to develop a practice called "growth accounting".
Instead of just trying to lure new users to join the platform, they became obsessed with "monthly active users" — how often people returned to the site and how long they spent there. This so-called "North Star" metric of engagement guided the company for more than a decade.
In a lecture Schultz gave to start-up founders in 2014, he explained: "What you really need to think about, is what is the North Star of your company: what is that one metric, where . . . everyone in your company is thinking about it and driving their product towards that metric and their actions towards moving that metric up? . . . Monthly active people . . . was the number [Zuckerberg] made the whole world hold Facebook to."
Mike Hoefflinger, former head of global business marketing at Facebook, says: "A North Star metric also means by definition that there is something else you are paying less attention to. Sometimes it's hard to realise how important those things are."
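The mechanics of growth accounting are simple to sketch. The snippet below is a minimal illustration in Python, not Facebook's actual pipeline; the user sets, the function and the split into new, retained, resurrected and churned users are assumptions based on how the practice is commonly described.

```python
# A minimal sketch of "growth accounting", assuming we have sets of user IDs
# active in each month. The function name and data are illustrative only.

def growth_accounting(active_prev, active_curr, ever_seen_before):
    """Decompose month-over-month change in active users.

    active_prev      -- set of user IDs active last month
    active_curr      -- set of user IDs active this month
    ever_seen_before -- set of user IDs active in ANY earlier month
    """
    new = active_curr - ever_seen_before                           # first-ever activity
    retained = active_curr & active_prev                           # active both months
    resurrected = (active_curr & ever_seen_before) - active_prev   # came back after a gap
    churned = active_prev - active_curr                            # went quiet this month
    return {
        "new": len(new),
        "retained": len(retained),
        "resurrected": len(resurrected),
        "churned": len(churned),
        # The identity a growth team watches: MAU change = new + resurrected - churned
        "mau_delta": len(new) + len(resurrected) - len(churned),
    }

# Toy example
prev = {1, 2, 3, 4}
curr = {2, 3, 5, 6}
history = {1, 2, 3, 4, 5}   # user 5 was active long ago, user 6 never was
print(growth_accounting(prev, curr, history))
# {'new': 1, 'retained': 2, 'resurrected': 1, 'churned': 2, 'mau_delta': 0}
```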
How today's growth team responds to Facebook's challenges could shape lives, elections and conflicts around the world. Schultz is not an expert in preventing privacy violations such as the massive Cambridge Analytica data breach, stopping the spread of Russian disinformation aimed at warping US elections, or quelling the online hate speech that has incited real-world violence and even genocide in Myanmar. But he is an expert in measuring. "We do look at data hard, we can measure how many pieces of content we are taking down, we can measure false positives, how many times we make mistakes, we can measure the prevalence and measure things the right way and report on them publicly," he says.
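In practice, the kind of measurement Schultz describes boils down to a couple of ratios. The sketch below is illustrative only: the sampling scheme, field names and numbers are assumptions rather than Facebook's published methodology, but it shows roughly what "prevalence" and a false-positive rate mean.

```python
# A hedged sketch of measuring content-moderation performance, assuming a
# human-labelled random sample of content views and a log of takedowns with
# appeal outcomes. Field names and figures are invented for illustration.

def prevalence(sampled_views):
    """Estimated share of content views that violate policy.

    sampled_views -- list of dicts like {"violating": bool}, drawn at random
                     from all views and labelled by human reviewers.
    """
    if not sampled_views:
        return 0.0
    violating = sum(1 for v in sampled_views if v["violating"])
    return violating / len(sampled_views)

def false_positive_rate(takedowns):
    """Share of removals that were mistakes (e.g. restored on appeal)."""
    if not takedowns:
        return 0.0
    overturned = sum(1 for t in takedowns if t["restored_on_appeal"])
    return overturned / len(takedowns)

# Toy numbers
views = [{"violating": False}] * 9_950 + [{"violating": True}] * 50
removals = [{"restored_on_appeal": False}] * 970 + [{"restored_on_appeal": True}] * 30

print(f"prevalence = {prevalence(views):.2%}")                     # 0.50%
print(f"false positive rate = {false_positive_rate(removals):.2%}")  # 3.00%
```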
Ultimately, he still has faith in one key number: the Facebook userbase. I meet Schultz the day after Facebook announced its most recent user numbers: the platform continues to expand, with daily active users up 9 per cent year-on-year. After stagnation and a slight dip in some western markets last year, they are growing again. Schultz sees this as a sign that Facebook's fixes are working. "I'm so proud of doing this for the last two years," he tells me. "It's been really hard but in the long term I really believe that if you trick users, give users bad experiences, they won't stick around. And I believe the numbers show that they are sticking around and we are doing the right things."
Others might argue that it's exactly this logic — which rests on the belief that users always act in their own self-interest, and manage their social media use entirely rationally — that got Facebook into trouble in the first place.
Domination!
In Facebook's early days, Zuckerberg is widely reported to have ended meetings by shouting "domination!". But by 2007, he was worried that at just three years old, the platform's growth was already slowing. Schultz admits now that it seems "ridiculous" how scared the company was that Facebook would stall in 2007.
"No social service had ever got to 100 million [users]. We were smaller than MySpace, Bebo, HighFive," he laughs.
Zuckerberg, then just 23, responded with an innovation that spawned many others. He created the growth team to use data analysis to fuel engagement. At other companies, growth was the job of marketing and PR departments. Zuckerberg prioritised data and engineering above all else. He charged the standalone team with developing a deep understanding of user behaviour to re-engineer the site. They had a simple aim: more users and more of their time.
Zuckerberg put Chamath Palihapitiya, a brash executive who joined from AOL, in charge of the growth team. At company-wide meetings, Palihapitiya would stand on a table and yell at employees about the competition, according to former Facebookers. "They were four-letter-filled invectives about how we were soft and the whole company wasn't committed to growth and everybody had to triple their efforts," one former Facebooker told the FT. The fear was that competition from companies such as MySpace and Google would "crush little baby Facebook in its crib", the former employee says. Palihapitiya, who left in 2011 to launch a venture capital fund called Social Capital, did not respond to the FT's request for comment.
In 2009, Zuckerberg told Business Insider's Henry Blodget that "move fast" was a "core value" at Facebook. "We used to write this down by saying, 'Move fast and break things.' And the idea was, unless you are breaking some stuff you are not moving fast enough," he explained, adding that in practical terms this meant "everything from . . . nightly code pushes [to] hiring the best people who have a bias towards just pushing things very quickly."
Schultz and two other key executives from the growth team — Javier Olivan and Naomi Gleit — are all still at Facebook and remain firmly in Zuckerberg's inner circle. Gleit, Facebook's second-longest-tenured employee, joined in 2005 after she became so fascinated with the site that she made it the subject of her senior thesis at Stanford. She has said that when she joined as employee number 29, she had an "almost spiritual belief" that Zuckerberg was going to become an important person in the world, and in 2009 she told Newsweek: "My job isn't done until literally everyone in the world is on the site."
The most senior team member was Olivan — now a vice-president of central services who reports directly to Zuckerberg. A Spanish MBA graduate, Olivan attracted Zuckerberg's attention when he created a Spanish version of the social network in his spare time. He joined in 2007 and turbocharged international growth by crowdsourcing translation of the site, saving Facebook the time and expense of employing professional translators around the world. Users could suddenly access the social network in their native language, opening it up to people who did not speak English.
The growth team quickly saw success: within months, Facebook had overtaken MySpace. By carefully tracking users' interaction with the platform and using that to inform the design of the product, they were able to drive the "monthly active users" metric ever higher. As Zuckerberg told an audience of entrepreneurs in 2016: "We learned a huge amount about our users and their patterns on the site and what helps them stay connected and engaged."
A social experiment
The growth team's key decisions fuelled incredible growth — but they also made Facebook into a vast social experiment that neither the team, nor the world, was prepared for. In those early years, there was little discussion about the wider impact of social media, or whether it was a good idea to nudge millions of users across the world to spend more and more of their time on the platform.
Tristan Harris is a computer scientist and former Google employee who co-founded the Center for Humane Technology to push back against Silicon Valley's addictive design theories. He says the industry's relentless pursuit of user time and attention — as exemplified by Facebook's "active user" metric — led to products that exploited people's weaknesses. "Growth hacking is about doing something unnatural to human social and psychological instincts," he says.
Techniques employed by the growth team — often experimentally at first — had long-term knock-on effects: tapping into people's addictive tendencies, reducing privacy or incentivising the spread of fake news. One of the first challenges the team tackled, in 2008, provides an example. Facebook's data on user retention showed that if new users could not find their friends fairly quickly once they'd registered, they would leave and rarely return. So Facebook needed to rush them to what Schultz and others have called the "magic moment", when a user has at least 10 friends.
To do this, they created the People You May Know feature. New users were told "Facebook is better with friends" as they were pushed to allow the platform to access their contacts — at first from email address books, now from smartphones. This simple tool transformed the company's prospects, says one former Facebooker. "It was supremely important," he says.
But the feature — now common across many apps — warped people's social networks. Facebook acted like a "digital drug lord", Harris says, prodding people to invite their entire address book — not because users necessarily wanted to spend time with these people, but because Facebook wanted new potential users.
The feature violated privacy, encouraging users to expose their friends' contact details without their permission, even if they were not on Facebook. People often did not understand that if their phone number was in anyone's contact book, that person could then allow Facebook to see they were connected and access their contact details. Even insiders realised this was dubious: in a controversial internal memo that leaked after the Cambridge Analytica revelations, Zuckerberg's close lieutenant Andrew Bosworth described it as "questionable contact-importing practices".
To some users, the feature proved extremely invasive. In 2017, Gizmodo journalist Kashmir Hill discovered that one man had his secret biological daughter recommended to him — because he still knew the couple he had donated sperm to, though he was not friends with them on Facebook. Psychologists' patients were recommended to each other, because all their contact details were in the same address book.
People You May Know was an incredibly successful product, so the principles it used spread around the company. Photo-tagging encouraged people to keep returning to the site whenever they were tagged and created a deep well of data. Users were able to tag photos of users they weren't yet Facebook friends with, or who weren't even on the network — another recruitment tool, as these contacts would have to join the platform to see the photos.
When Facebook began allowing developers such as games companies to run apps on the social network, the company worked on the same premise. People could expose their friends' data, usually without realising it. This stored up trouble that led to the massive data leak to Cambridge Analytica. The academic Aleksandr Kogan collected vast troves of information and gave it to the data analytics firm that worked for the Trump campaign. Only 250,000 people took his survey in 2014 but, by capturing the information on their friends, he managed to harvest the data of up to 87 million users.
"There was absolutely a conflict with privacy," says a former Facebook executive.
A masterstroke
Early in 2009, the company rolled out a masterstroke: the "like" button, which kept people coming back for the possible dopamine hit of finding out if their update or photograph had been "liked" by others. As Tristan Harris has written, this kind of design turns our smartphones into "slot machines" offering an addictive range of variable rewards; we endlessly check our notifications or press refresh in the hope of some kind of hit. A range of other design decisions encouraged users to stay on the site for longer: the "bottomless bowl" of infinite scrolling and, in 2013, videos that autoplay.
Internally, Facebook employees were not very concerned that their tactics could addict users. After all, people still spent far more time in front of the TV (and many still do). "We knew that addictive behaviour would happen on the margins. We didn't observe it at the core of the product or as something unique to Facebook — at least in comparison to general internet addiction," says one early Facebooker.
When Facebook saw competitors on the horizon, it would metamorphose to avoid losing users to new rivals. After Twitter gained ground in 2012 and 2013, Facebook quickly pushed the sharing of news on its own platform, encouraged more public conversations and adopted hashtags. Kirkpatrick says the company rapidly expanded its definition of what Facebook was meant to be — turning every user into a miniature broadcaster. "In retrospect, many of those changes led to things that went haywire later, political difficulty in particular," he says.
Facebook users rapidly became dependent on the site for news: two-thirds of US users today say they get news from social media. Twitter also has problems policing fake news and bots, but on Facebook, disinformation can be particularly hard to spot, as people tend to connect primarily with friends and family. Within these filter bubbles, clickbait, fake news and disinformation can spread rapidly and quietly with no one to fact-check them (this problem is even more extreme on WhatsApp, which Facebook acquired in 2014).
Zuckerberg seems to have had an overly optimistic view of human behaviour that prevented him from foreseeing some of the negative consequences of "just pushing things very quickly". In an interview with Time in 2010, when he was named Person of the Year, he said: "I really do think there is this concept where the best stuff spreads." In fact, studies using Facebook data have shown since then that the "best stuff" is not what spreads quickest. In the run-up to the 2016 US presidential election, more people engaged with the top fake news stories than with real news stories. Stories that inspire outrage or fear are among the most likely to be clicked, commented on and reshared, and their high levels of engagement mean they will then be prioritised by Facebook's algorithm-driven news feed.
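The feedback loop is easy to see in miniature. The toy example below, with invented stories and weights that bear no resemblance to Facebook's actual ranking system, shows why a feed ordered purely by an engagement score tends to push the most provocative content to the top.

```python
# A toy illustration of the dynamic described above: if a feed is ranked
# purely by predicted engagement, outrage-inducing stories float upward.
# The stories and scoring weights are invented; this is not Facebook's
# news feed algorithm.

stories = [
    {"headline": "City council publishes budget report",  "clicks": 120, "comments": 8,   "reshares": 3},
    {"headline": "OUTRAGE: they are coming for your jobs", "clicks": 900, "comments": 410, "reshares": 260},
    {"headline": "Fact-check: viral claim is false",       "clicks": 150, "comments": 25,  "reshares": 12},
]

def engagement_score(story, w_click=1.0, w_comment=4.0, w_reshare=8.0):
    """Naive engagement-only objective: a weighted sum of interactions."""
    return (w_click * story["clicks"]
            + w_comment * story["comments"]
            + w_reshare * story["reshares"])

# Rank the feed by engagement alone -- the outrage story wins comfortably.
for story in sorted(stories, key=engagement_score, reverse=True):
    print(f"{engagement_score(story):>7.0f}  {story['headline']}")
```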
Schultz argues that the team would have done things very differently if it had been solely focused on "short-term tactical wins". He praises the team for asking users' permission to access more of their data, even when the operating system did not require it.
"People will only return and continue using Facebook if it's useful and valuable, and if they feel safe, are not harassed, their information is secure, and if we are fighting abuse," he says. "That is what guides this team's work."
But another former Facebook employee told the FT that he doesn't believe Facebook executives when they argue that the company will not stand for bad players and clickbait.
"I say, 'Yes, it is what you stand for because all your metrics are geared to more time on the platform'," he says.
Facebook sells itself to advertisers as a fantastic medium for influencing people's choices — and along with Google, it now dominates the digital advertising market.
Judy Estrin, an internet pioneer and serial entrepreneur, says disinformation comes more from an "intended use" than a "negative consequence" of the platform. "The platform taps into people's emotional reactions to manipulate them: that is essentially what advertising is about — messaging and persuasion — and this is at a new level of scale. Disinformation is using the same tools with malicious intent," she says.
The first billion
When Facebook reached a billion users in the autumn of 2012, Schultz and Olivan celebrated by spraying each other with champagne. Every office was filled with balloons and the company released a video comparing the social network to a chair, a doorbell and an aeroplane. Facebook, in case it wasn't clear, connected people — and had become part of the furniture.
"It was a party, a big f***ing deal," says one former Facebook executive. "No one in the world had done that at that pace." It had taken Microsoft almost 26 years to reach a billion Windows users and 12 years for Google Search to do the same.
Facebook was now a public company, after a wobbly initial public offering in May 2012. Growth became even more important as investors panicked in those first months, worried that users were moving to mobile phones but advertisers would not follow. Monthly active users, then daily active users, became closely watched numbers on Wall Street — and the growth team became accountable to distant shareholders, as well as Zuckerberg.
The people who did spot problems were often ignored. Many of the former Facebook employees who spoke to the FT described a culture where it was hard to challenge the prestigious growth team. "It was like Game of Thrones," says one. "The growth team was certainly put up on a pedestal . . . They were jockeying to show how close they are with Mark, who gave the most recent presentation or understands the way he thinks."
Another former employee says they had voiced privacy concerns about products — but often lost the argument. The growth team's attitude was "active animosity". "It was 'You all ought to be driven out of the company, you are a danger to the company, you are not team players'," he told the FT. "Facebook empowered the wrong people and disempowered people. When push came to shove, oftentimes growth won."
One such conflict became public when the UK parliament published internal Facebook emails obtained as part of a lawsuit, late last year. In a February 2015 email, Michael LeBeau, a Facebook product manager, wrote that the growth team was planning to ask Facebook users with Android phones for access to their call logs, to determine who were their closest friends. LeBeau comments: "This is a pretty high risk thing to do from a PR perspective, but it appears that the growth team will charge ahead and do it." Facebook made the change.
Outside the company, the chorus of critics began to get louder from 2011 onwards. Privacy activists such as the Austrian Max Schrems pushed regulators to examine how the company handled data. In the US, the Anti-Defamation League called on Facebook to remove hate speech from the platform. Academics such as Sherry Turkle wrote about how social media addiction was warping relationships.
Facebook's policy team often met critics such as privacy activists, but many felt it was not listening with open ears. Eli Pariser, who coined the term "filter bubble" and worried about its implications for society, told the FT that he felt Facebook had a "level of overconfidence" in how good it was for the world. The focus on a few metrics made it hard for Facebook to see all the "wildly different" media experiences it was creating for people, he says.
David Madden, a tech entrepreneur living in Myanmar, told PBS last year that he warned Facebook about hate speech on the site targeting the country's Muslim minority in 2015. In a presentation at Facebook's headquarters, Madden told employees that the platform risked playing a key role in a genocide, as radio broadcasts had in Rwanda. The company's response was that it needed to do something substantive — but he felt Facebook failed to take action. The company said it did address individual pieces of content and issues, but admits it was not proactive enough.
As Russia began piloting its new disinformation tactics in Ukraine in 2015, creating a model it would later deploy in the US, the Ukrainian government also warned Facebook about the problem. Facebook denies discussing fake news with Ukrainian officials. But Dmytro Shymkiv, deputy head of Ukraine's presidential administration, told the FT in 2017 that Facebook's response was, "'We are an open platform, we allow everybody the possibility to communicate.' That's all I got".
Schultz pushes back against accusations that the growth team didn't listen. He says it worked closely with others in the company, including employees in the privacy, policy and legal departments. He questions the motives of the former Facebookers who felt sidelined. "I think it is very interesting to look at the motivations of a lot of the people who have left Facebook and why they decide to say some of these things," he says.
He admits that there are "tonnes of things" you can't understand with data alone, and says he doesn't push as hard in those arguments. But he adds: "I think a lot of people I've met with have lost discussions because they are doing stuff based on gut feel that can be addressed with data."
Hear no evil
Zuckerberg initially dismissed accusations that fake news on Facebook could have influenced users' votes in the 2016 US presidential election, calling them "pretty crazy".
But following the revelation that the Kremlin-backed Internet Research Agency had spread divisive disinformation on the platform, he apologised. Facebook admitted in October 2017 that as many as 10 million people in the US saw Russian ads, with 44 per cent of views before the election.
Zuckerberg asked his loyal lieutenants on the growth team to re-engineer the site to discourage and detect fake accounts, fake news and hate speech.
Since then, Facebook has partnered with fact-checkers to downrank stories they identify as false and cut off financially motivated fake sites from its ad network. It has taken down millions of fake accounts and campaigns originating in Russia and Iran, and now requires authentication to prevent foreign actors placing ads. It has created a searchable database of political ads.
The company now has 30,000 people working on safety and security, about half of whom are moderators who take down content. It has published community standards, created an appeals system to try to be fairer about what stays up and what doesn't, and is experimenting with the idea of an independent body to oversee decisions. In Myanmar, it belatedly hired almost 100 Burmese-language experts to review content and has sent policy, research and engineering staff there.
In response to growing concerns about the addictive qualities of digital technology, the company has given users the option to "snooze" or "take a break" from some notifications, and provided a "time spent" dashboard to track their hours on site.
On privacy, it has dropped data-sharing partnerships with data brokers and cut off some developers. Earlier this month, Zuckerberg said the company would focus on creating more private platforms, such as WhatsApp and Facebook Messenger. But his plan to further integrate the messaging apps actually enables Facebook to learn more about who is connected to whom — and the push for privacy does not extend to stopping advertisers using data to target users.
Many former Facebook employees agree with Schultz that the growth team is the right home for integrity, although not all for the same reason. Some believe it may work because its leaders are among the few who might be able to say no to Zuckerberg.
Hoefflinger believes the team's single-minded focus could be powerful when applied to integrity. "In an interesting way, [the growth team] may be the very best people in the world to really focus on integrity," says Hoefflinger. "They are not going to use contact importers to improve integrity. What they will borrow is this dedication to a goal."
The growth team, however, is unlikely to transform how Facebook fundamentally works. It won't change the advertising business, which depends on garnering more of people's attention than its rivals, and using data to target them. It is not dramatically changing the algorithms behind the news feed to prioritise in-depth thinking, abandoning the "like" button or stopping the notifications that pull people back. People You May Know still dredges users' address books, Facebook still collects information about you as you travel around the web, and its methods for obtaining European users' consent under the new privacy rules are being challenged by privacy activists.
Meanwhile, the challenges confronting Facebook are morphing and expanding. The Christchurch terror attack was livestreamed on the platform by the attacker; in the 24 hours that followed, 1.5 million videos of the attack were uploaded to Facebook. More than 1.2 million of those were blocked at upload, but that left 300,000 that could be viewed before they were deleted. New technologies such as "deepfakes", videos manipulated to make people appear to say things they did not, could make this game of Whac-A-Mole even harder.
A recent study by social media engagement tracking firm Newswhip showed that the algorithm change in 2018 aimed at driving "more meaningful social interactions" actually ended up increasing the prominence of articles on divisive topics such as abortion and gun laws in the US.
Facebook's problems are deeply embedded in the platform and culture that Zuckerberg and his colleagues created in the first few years. Some critics, such as early Facebook investor Roger McNamee and Tim Wu, a professor at Columbia Law School, have argued that the only thing to do is to break the company up.
Siva Vaidhyanathan, author of Antisocial Media, is one such critic. But he believes even a break-up would not solve all the problems that the platform amplifies. "Thinking about the problem of Facebook is much like thinking about the problem of climate change. In many ways, it is too big to wrap our minds around," he says.
Facebook and governments are responding to the social network's problems one by one, rather than addressing underlying causes. Tristan Harris believes the company has created a "digital Frankenstein". "By definition, they cannot control it," he says. "I think they don't want to admit that."
Written by: Hannah Kuchler
© Financial Times