TikTok has been fined £12.7 million ($19.8m) by the UK’s privacy watchdog for illegally using more than a million young children’s personal data.
The Information Commissioner’s Office (ICO) said that the viral video app had allowed 1.4m under-13s to sign up in 2020 alone, despite claiming to ban them from the app.
The ICO said the Chinese-owned video-sharing app had failed to check people’s ages and remove underage users.
UK data laws require apps such as TikTok to seek parental consent when children under 13 sign up.
John Edwards, the Information Commissioner - and former New Zealand Privacy Commissioner - said: “Our investigation found TikTok breached a number of data protection laws, including failing to use children’s data lawfully. UK data protection law says that organisations that use personal data when offering services to children under 13 must have consent from their parents or carers. TikTok failed to do that. TikTok also failed to take adequate steps to identify and remove underage children from its platform.”
As a result, the children’s data may have been used to track and profile them, potentially delivering harmful or inappropriate content on their very next scroll.
Edwards added: “TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”
The watchdog said concerns had been raised inside TikTok about under-13s using it. In the regulator’s view, the app failed to respond adequately.
The ICO said last year that it could fine the app up to £27m but said on Tuesday it had reduced the figure after representations from TikTok.
It remains among the largest fines issued by the regulator.
In 2019, TikTok was fined US$5.7m ($9m) in the US for misusing children’s data, in what was a record penalty for violating children’s privacy laws.
The app, which is owned by China’s ByteDance, is under growing scrutiny around the world over its ownership and treatment of young users.
The UK, US and EU are among those to have banned the app from government-issued devices. Australia became the latest country to issue a ban on Tuesday.
The app faces a potential ban in the US if its Chinese investors do not sell their stake.
A TikTok spokesman said the company disagreed with the ICO’s decision but was pleased the fine had been reduced.
The spokesman said: “We invest heavily to help keep under 13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“We will continue to review the decision and are considering next steps.”
The company said it has a series of controls to stop under-13s from using it, such as requiring users to enter their date of birth when signing up, and training moderators to spot signs that an account holder may be underage.
Figures from the app show that it removed more than 17m accounts in the last three months of 2022 because the account holders were believed to be too young.
Shou Zi Chew, TikTok’s chief executive, recently told US politicians that he allows his young children to use a version of the app designed for under-13s, which does not exist in the UK.
According to Ofcom research, 53 per cent of 8-12-year-olds use TikTok, making it the most popular social media service in that age group apart from YouTube, which has a dedicated version for children. This compared to 30 per cent for Snapchat and 28 per cent for Instagram.