YouTube has attracted a massive audience of younger viewers. Photo: Getty Images
The US government is in the late stages of an investigation into YouTube for its handling of children's videos, according to four people familiar with the matter, a probe that threatens the company with a potential fine and already has prompted the tech giant to reevaluate some of its business practices.
The Federal Trade Commission launched the investigation after numerous complaints from consumer groups and privacy advocates, according to the four people, who requested anonymity because such probes are supposed to be confidential. The complaints contended that YouTube, which is owned by Google, failed to protect kids who used the streaming-video service and improperly collected their data.
As the investigation has progressed, YouTube executives in recent months have accelerated internal discussions about broad changes to how the platform handles children's videos, according to a person familiar with the company's plans. That includes potential changes to its algorithm for recommending and queuing up videos for users, including kids, part of an ongoing effort at YouTube over the past year and a half to overhaul its software and policies to prevent abuse.
A spokeswoman for YouTube, Andrea Faville, declined to comment on the FTC probe. In a statement, she emphasized that not all discussions about product changes come to fruition. "We consider lots of ideas for improving YouTube and some remain just that - ideas," she said. "Others, we develop and launch, like our restrictions to minors live-streaming or updated hate speech policy."
The FTC declined to comment, citing its policy against confirming or denying nonpublic investigations.
The Wall Street Journal first reported Wednesday that YouTube was considering moving all children's content off the service into a separate app, YouTube Kids, to better protect younger viewers from problematic material - a change that would be difficult to implement because of the sheer volume of content on YouTube and could prove costly to the company in lost advertising revenue. A person close to the company said that option was highly unlikely, but that other changes were on the table.
YouTube Kids gets a tiny fraction of YouTube's audience, which tops 1.9 billion users logging in each month. Kids tend to switch from YouTube Kids to the main platform around the age of seven, Bloomberg reported this week.
The internal conversations come after years of complaints by consumer advocates and independent researchers that YouTube had become a leading conduit for political disinformation, hate speech, conspiracy theories and content threatening the well-being of children. The prevalence of preteens and younger children on YouTube has been an open secret within the technology industry and repeatedly documented by polls even as the company insisted that the platform complied with a 1998 federal privacy law that prohibits the tracking and targeting of those under 13.
The FTC has been investigating YouTube's treatment of kids based on multiple complaints, dating back to 2015, that argue both YouTube and YouTube Kids violate federal law, according to the people familiar with the investigation. The exact nature and status of the inquiry are not known, but one of the sources said that it is in advanced stages - suggesting a settlement, and potentially a fine depending on what the FTC determines, could be forthcoming.
"Google has been violating federal child privacy laws for years," said Jeffrey Chester, executive director of the Center for Digital Democracy, one of the groups that has repeatedly complained about YouTube.
Major advertisers also have pushed YouTube and others to clean up their content amid waves of controversies over the past two years.
A report last month by PwC, a consulting group, said that Google had an internal initiative called Project Unicorn that sought to make company products comply with the federal child privacy law, the Children's Online Privacy Protection Act, known by its acronym COPPA.
The company that commissioned the PwC report, SuperAwesome, helps technology companies provide services without violating COPPA or European child-privacy restrictions on the tracking of children.
"YouTube has a huge problem," said Dylan Collins, chief executive of SuperAwesome. "They clearly have huge amounts of children using the platform, but they can't acknowledge their presence."
He said the steps being considered by YouTube would help, but "They're sort of stuck in a corner here, and it's hard to engineer their way out of the problem."
Earlier this month, YouTube made its biggest change yet to its hate speech policies - banning direct assertions of superiority over protected groups, such as women, veterans and minorities, and banning users from denying that well-documented violent events took place. Previously, the company prohibited users from making direct calls for violence against protected groups, but stopped short of banning other forms of hateful speech, including slurs. The changes were accompanied by a purge of thousands of channels featuring Holocaust denial and content by white supremacists.
The company also recently disabled comments on videos featuring minors and banned minors from live-streaming video without an adult present in the video. Executives have also moved to prevent the company's own algorithms from recommending content in which a minor is featured in a sexualized or violent situation, even if that content does not technically violate the company's policies.