Matt Rudd meets the crack team of women taking on the tech giants with a new set of digital guidelines.
When the first steam-powered cars started hurtling — at speeds of up to 16km/h — along the quiet lanes of Victorian Britain, there were two distinct responses. There were those who wanted no regulation — this was the future, don't spoil it with red tape. And there were those who wanted lots of it. These monsters were dangerous. Someone would get killed.
In 1865, after much toing and froing, the Locomotive Act was introduced. Not only did it set a 6km/h speed limit, but it also made it compulsory that a person "shall precede such locomotive on foot by not less than 60 yards (55m), and shall carry a red flag constantly displayed, and shall warn the riders and drivers of horses of the approach of such locomotives". Draconian, cried the nascent car lobby, a put-up job by the train lobby.
What followed, as those car lobbyists predicted, was a whole host of further regulations — brakes became standard, then brake lights, headlights, indicators, seatbelts, airbags, crumple zones, automated collision avoidance systems, windscreen wipers. Yes, people still die in car crashes, but where would the automotive industry be today if it hadn't been for regulation? The red tape didn't stop invention. It made it better.
This month, we have reached the same pivotal point in the digital world. Ahead are two discrete paths. The first lets Big Tech continue to enjoy self-regulation and to "move fast and break things" — as Mark Zuckerberg's strategy for Facebook innovation went. The second leads to the digital equivalent of men walking in front of cars with red flags.
Until now, Silicon Valley has resisted that second path with a combination of intransigence, promises and, quite simply, not turning up to committee meetings. Now a large swathe of that resistance looks as if it's coming to an end, thanks largely to a cross-discipline, cross-bench and cross-house crack team that includes Baroness Beeban Kidron; the information commissioner, Elizabeth Denham; the children's commissioner, Anne Longfield; and Margot James, minister for digital and the creative industries.
The Age-Appropriate Design Code — or Kids' Code for short — is an extension of last year's UK Data Protection Act, which introduced tough rules about how our data can be handled.
"For the first time, kids have special rights in the law," says Denham, who is responsible for the legislation, which is currently in public consultation. "The code translates those rights into practical standards that companies and service providers will take and build into our digital future. Ten years ago, most of the conversation was about safety and security — keeping kids off pornographic sites, making sure that they're not lured by adults online, protection from cyber-bullying. Now there's a broader conversation to have because websites have been developed in a way that really robs children of agency."
The 16 standards set out in the Kids' Code will amount to a comprehensive, radical change in the digital lives of children. Standard one sets the tone. It states that if children are likely to be using a service, then "the best interests of a child should be a primary consideration". As any parent of a YouTubing, Instagramming, Fortnite-playing teenager will tell you, this is not currently the case.
Tech firms will have to make their services "suitable by default" for all children. Adults can then decide if they want to access sections of the service that are not appropriate for children via robust age verification. In other words, the digital world arrives child-friendly and needs to be tailored to display the grown-up stuff, rather than the other way around, as it is now.
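The code sets that outcome without dictating how engineers should build it, but the logic is simple enough to sketch. In the illustrative Python below, where every name is hypothetical rather than drawn from the code itself, a new session starts at an all-ages ceiling and only a verified adult opt-in raises it:

```python
from dataclasses import dataclass

# A minimal, hypothetical sketch of "suitable by default": a fresh
# session is child-appropriate, and adult-only sections unlock only
# after a separate, robust age-verification step.

@dataclass
class Session:
    age_verified_adult: bool = False    # false until verification passes
    content_ceiling: str = "all-ages"   # what a brand-new session may see

def unlock_adult_sections(session: Session, verification_passed: bool) -> None:
    # Raise the ceiling only on an explicit, verified adult opt-in;
    # a child's session never needs to be "switched down" by a parent.
    if verification_passed:
        session.age_verified_adult = True
        session.content_ceiling = "adult"

session = Session()                     # a new user, age unknown
assert session.content_ceiling == "all-ages"
unlock_adult_sections(session, verification_passed=True)
assert session.content_ceiling == "adult"
```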
Companies will no longer be able to share a child's data. They will only be able to collect the minimum amount of data needed to "provide the elements in which a child is actively and knowingly engaged".
Services will be on high-privacy settings by default. Profiling options will be limited drastically. Instead of burying important information about data use in the 147th clause of the 15th page of the terms and conditions, there will have to be clear messaging at the point of entry. If any ulterior motives survive, they will have to be there, in plain sight. Not quite "We are using your data to sell you stuff", but not far off.
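Again, the code prescribes the outcome rather than the implementation, but a hypothetical sketch, with names invented purely for illustration, shows how little machinery is involved: intrusive options default to off, and the message at the door simply reports whatever is actually switched on.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Illustrative defaults in the spirit of the code: everything
    # privacy-intrusive starts switched off.
    profiling: bool = False      # no behavioural profiling by default
    geolocation: bool = False    # no location tracking by default
    data_sharing: bool = False   # nothing passed to third parties by default

def entry_notice(settings: PrivacySettings) -> str:
    # The plain-language message shown at the point of entry,
    # not buried on page 15 of the terms and conditions.
    if settings.profiling or settings.data_sharing:
        return "We use your activity to personalise content and show you adverts."
    return "We collect only the data needed to run this service."

print(entry_notice(PrivacySettings()))
# -> We collect only the data needed to run this service.
```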
There is more. The psychological tricks used prolifically online to keep us addicted will not be allowed under the code if children are likely to be using the service. Likes on Facebook, streaks on Snapchat and games that reward regular log-ons, don't have a save option and give you bonus points if you message friends — all gone.
"When you like or streak or share something," Denham says, "you're leaving a digital footprint. And those digital footprints are what the company wants to encourage you to stay online and to deliver advertising. A larger digital footprint allows more personalisation of content, more delivery of advertising and messages that are increasingly more provocative about subjects that a child is interested in." All of which will stop under the Kids' Code.
Those smart devices — the interactive home hubs, the connected thermostats, vacuum cleaners and speakers, the talking teddy bears and wireless baby monitors that sit quietly in our homes gathering intel — will also have to comply with the code. If there are children in the home, Alexa, Siri and friends will no longer be able to listen in unless parents opt to downgrade settings.
Perhaps most significantly of all, the code will tackle digital rabbit holes. On social media sites and video-streaming services, algorithms are designed to keep you hooked. You found one guide to self-harm, here are 47 more. With restrictions on data-sharing, profiling and geolocating, this psychological snare will no longer be a central part of the digital world's business plan. Should the Kids' Code make it through consultation, the internet will never be the same again.
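No platform publishes its ranking systems, so the Python below is a deliberate caricature rather than anyone's real algorithm, with an invented catalogue and tags, but it captures the feedback loop the code targets: serve whatever most resembles what a user lingered on, and one worrying click compounds into a feed of the same.

```python
# A crude, illustrative recommender: score every unseen item by how many
# tags it shares with recent views, then serve the closest match.
# Similarity compounds with every click, which is the "rabbit hole".

def recommend(history: list[str], catalogue: dict[str, set[str]]) -> str:
    recent_tags: set[str] = set()
    for item in history[-5:]:
        recent_tags |= catalogue.get(item, set())
    unseen = [item for item in catalogue if item not in history]
    return max(unseen, key=lambda item: len(catalogue[item] & recent_tags))

catalogue = {
    "cat videos": {"pets", "fun"},
    "coping with anxiety": {"anxiety", "health"},
    "self-harm diary": {"anxiety", "self-harm"},
}
print(recommend(["coping with anxiety"], catalogue))  # -> "self-harm diary"
```

Restricting profiling, as the code does, is what starves this loop: with no stored history to score against, the darker neighbour is no likelier to surface than anything else.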
Sitting in the riverside tearoom half a corridor along from the House of Lords chamber, Baroness Beeban Kidron, the Bafta-winning filmmaker and children's rights campaigner, remembers the lightbulb moment when she first understood the degree to which children were unprotected online.
"It was 2012 and the smartphone had just hit a price point where a parent would give it to a child," she says. "They were suddenly endemic, these mobile universes, and because they were mobile, they were beyond a parent's reach. I was making a documentary [InRealLife] about kids growing up in a connected world, and I was talking to people at the heart of tech. They all held on to that founder's vision — that technology is democratising, that there is no gatekeeper, that all users are equal. Then it clicked — if all users are treated equally then, de facto, you treat a kid as if they're an adult. In no other area of life do we do that."
Kidron went on to spend hundreds of hours watching the way children use the internet or, more pertinently, the way the internet uses children. "I could see that all of the decisions the kids were being asked to make, all of the critical understanding it took for granted, were based on the assumption that they were adults."
Data is 21st-century gold. The more data a company can gather, the more valuable it becomes. This is why we are followed around the internet by adverts for a shirt we once considered buying. It's why you can have a conversation about pizza and then find an offer for pizza in your inbox.
When I mention that I have a smart thermostat from Nest — a company that Google acquired in 2014 — Kidron takes a long, deep breath like a teacher about to explain trigonometry for the umpteenth time. "If you click through everyone that Nest shares your data with, that's 1,000 sets of terms and conditions you've automatically agreed to, according to research by the University of London," she says. "If it does not worry you that your household details are being shared with at least 1,000 companies, then fine. But if you put 1,000 people in your house and asked them to stand silently watching what you and your family were doing, you might feel different."
Many of us, myself included, don't fully comprehend quite how much of this data gold we give away or the myriad ways it is used to profile us and then modify our spending habits, our behaviour, even the way we vote. We are scandalised when we hear about security breaches, but agree to our personal information, tastes, habits and — in the case of robot vacuum cleaners — floorplans being spread around the ether.
After an eye-opening hour with Kidron, I'm less ignorant and less happy. Thousands of very clever people are devoted to keeping us online for as long as possible. The more Big Tech knows about us, the more it can make us do what it wants. Children need protecting from this. So, perhaps, do adults — but that's for another time.
Three years after her epiphany, Kidron founded 5Rights to campaign for the established rights of children in the analogue world to be applied in the digital world. She is very much the architect of the Kids' Code.
"We are not killjoys," she says. "We're not against tech at all. What we're saying is, 'Give us a truly marvellous new world in which you have considered whether moving fast and breaking things means you're moving fast and breaking our kids.' "
Some in the tech lobby have claimed that regulation amounts to censorship — anathema to the core ethos of Silicon Valley. Kidron has no time for this. "This is about data practice," she says. "If a child seeks out certain things and they try really hard, they will always find it. The point is how you normatively treat them and how you uphold your own terms and conditions. There are strict rules in advertising, in medicine, in publishing. The digital world should be no different."
The children's commissioner, Anne Longfield, describes herself as "the eyes and ears" of children, sticking up for their rights. In the physical world, she tells me, those rights are usually clear and understood, but it's different online. "It's been so fast and so disorganised and, frankly, everyone has been overwhelmed by it," she says. "What is clear with the free flow of content, the algorithms that encourage addictive tendencies and the business model as a whole is that they have not been designed with kids in mind."
She points out that children don't receive the same level of parental support online as they do in the analogue world "because parents themselves haven't got that experience. They haven't grown up with the digital world, so they don't know as much. Children are at the forefront of this new world without protection or guidance."
In November 2017, Molly Russell, a 14-year-old schoolgirl, was found dead in her bedroom. Her father, Ian Russell, believes she killed herself after viewing material about anxiety, depression, self-harm and suicide on social media sites including Instagram. Earlier this year he told BBC News that Instagram "helped kill my daughter". Since then, at least 30 families have contacted the suicide prevention charity Papyrus to say they believe social media was a factor in the deaths of their children.
In January, Longfield wrote an open letter to the biggest social media companies urging them to take "a moment to reflect" on the schoolgirl's tragic death. "The potential disruption to all user experiences should no longer be a brake on making the safety and wellbeing of young people a top priority," she wrote.
Today Longfield says the imagery of self-harm and suicide was "pushed to [Molly] by the algorithm. It understood what she asked to see and sent more of it, and it became more extreme. No one else was understanding or assessing the impact that this might have on a vulnerable person."
Longfield herself searched for the hashtag #selfharm and was, as she puts it, "horrified not only by the volume, but by the explicit nature of what I saw. Tech is not the cause, but it is the accelerant. Kids already have the pressure of growing up. Now they have the pressure of constant communication."
In a recent exercise, the children's commissioner's office translated the terms and conditions of five of the biggest tech companies into plain English, cutting an average 5,500-word document down to one side of A4, and then took the results into schools. "It didn't stop the children going online, but it did make them think about it differently," Longfield says. "They were more careful with what they wrote and what decisions they made."
When the Kids' Code arrives, this level of clarity will become mandatory.
In a straw poll at our school gate, precisely 100 per cent of parents were hugely in favour of the code. Best interests of the child? Finally. Default settings to high privacy? Of course. No profiling? "Can we sign up for this, too?" And this is the whole point. Most parents are worried about technology, but are relatively powerless to police it. With a lot of discussion and some argument, I managed to keep my eldest off Instagram until he was 13. Even though Instagram is for over-12s only, most of his friends already had an account, some for several years. It is a war of attrition, us versus those clever people in Silicon Valley, and it is unwinnable.
In the past, I've blamed other parents. Why let an 8-year-old on social media? But, as Kidron points out, blaming parents is a go-to dodge of Big Tech. How a child exists in the digital world is down to parents, they say. Except, in reality, it's not as easy as that. To monitor at all times is exhausting. Why not make the digital world more child-friendly in the first place?
Part of Margot James's brief as minister for digital and the creative industries is, as she puts it, to "foster an environment where innovation thrives". "This is," she says, "particularly important for small and growing businesses. Otherwise you'll just concentrate more and more power in the hands of the very big companies. And we will all be the losers for that."
She is, however, another advocate for regulation. "It's important that we listen to those who are opposed, but no longer will they be able to make decisions with scant regard to the unintended harms that have arisen from the very liberal environment that tech has enjoyed to date. Until now, the needs of children have been an afterthought and only in response to a big outcry.
"I don't have children myself," she says, "but I do have two great-nieces who are under the age of 12 and so I see it first-hand, the worries and concerns that parents have in this area as children get to the age when they need to be using the internet for all the benefits it offers. Parents are extremely worried, and with good reason."
James was "persuaded in the space of one meeting" that what Kidron was doing was important.
Responses from children's charities, teachers and parent organisations are resoundingly in favour of the draft code, as are some tech firms. Facebook's response has been … lukewarm. It claims the code risks "dumbing down controls for young people who are often highly capable in using digital services". It invokes the UN Convention on the Rights of the Child in arguing against "barriers" to a child's freedom of expression. Google "agrees with the ambitions of the code", but wants to "maintain a degree of flexibility". If, for example, geolocation data is restricted, how is a child going to find the quickest route to school?
"The time for self-regulation — especially from large platforms — is over," Denham says. "People want a more responsible and more human-focused way forward. We didn't expect the big tech companies to like it, but that's no reason to put it in the too-hard pile. I am laser-focused on this and it's not just because I have a young granddaughter who loves playing on her tablet. I want to leave this area of regulation in a better place so that we can truly protect the agency of our kids online."
Kidron is also laser-focused. She tells me the tech sector is now conscious that doing nothing is no longer an option, "but I have been told by numerous sources that they are quietly lobbying to water down the code. This is a moment where they could stand up and make some of the pronouncements they've made recently a reality for children."
"It will happen," Denham concludes. And when it does, I suspect, it will seem crazy that it didn't happen sooner. Almost as crazy as having cars without brakes.
The 16 rules set out in the Kids' Code
1. The child's best interests should be the primary consideration in any services likely to be accessed by a child.
2. Age-appropriate standards must apply to all users. Robust age-verification mechanisms must distinguish children from adults.
3. Privacy information and other terms must be in clear language suited to the age of the child.
4. Do not use children's personal data in ways that have been shown to be detrimental to their wellbeing.
5. Uphold your own published terms, policies and community standards.
6. Settings must be "high privacy" by default, unless you can demonstrate a compelling reason for a different setting.
7. Collect only the minimum amount of personal data you need to provide the service in which a child is actively and knowingly engaged.
8. Do not disclose children's data unless you can demonstrate a compelling reason to do so.
9. Switch geolocation options off by default and provide an obvious sign for children when location tracking is active.
10. If you provide parental controls, give the child age-appropriate information about this, and provide an obvious sign when they are being monitored.
11. Switch options that allow profiling off by default. Only allow them if you have measures in place to protect the child from any harmful effects, in particular content that is detrimental to their health or wellbeing.
12. Do not use nudge techniques to encourage children to provide unnecessary personal data, weaken their privacy protections, or extend use.
13. Ensure any connected toy or device complies with this code.
14. Provide tools for children to exercise their data protection rights and report concerns.
15. Undertake data protection impact assessments.
16. Ensure that your policies, procedures and terms comply with this code.
Written by: Matt Rudd
© The Times of London