Information commissioner says app had done ‘very little, if anything’ to check for underage users
TikTok has been fined £12.7m for illegally processing the data of 1.4 million children under 13 who were using its platform without parental consent, Britain’s data watchdog said.
The information commissioner said the China-owned video app had done “very little, if anything” to check who was using the platform and remove underage users, despite internal warnings the firm was flouting its own terms and conditions.
“Our findings were that TikTok were not doing enough to prevent under-13s accessing their platform, they were not doing enough when they became aware of under-13s to get rid of them, and they were not doing enough to detect under-13s on there,” John Edwards told the Guardian on Tuesday. “They assure us that they are now doing more.”
The fine from the Information Commissioner’s Office (ICO) comes weeks after the app was banned from UK government phones amid security concerns. It is fast becoming a flashpoint for the UK’s handling of big tech and Chinese influence.
After the announcement of the fine, one of the largest the watchdog has given, Rishi Sunak was accused of moving too slowly in taking action against TikTok – and was called “naive for assuming TikTok could ever regulate itself”.
UK data protection law does not strictly ban children from using the internet, but it requires organisations that process the personal data of children under 13 to obtain consent from their parents or carers.
TikTok itself bans those under 13 in its terms and conditions. The failure to enforce age limits led to “up to 1.4 million UK children” under 13 using the platform as of 2020, the ICO estimated.
“There was a whole bunch of stuff that they could have been doing that they weren’t,” Edwards said. “All that was required was a self-certification that the applicant was over 13, by clicking a box with no verification, with no extra checks. We understand that there are now significantly more checks and balances in place to detect that kind of thing.”
Edwards added: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.
“As a consequence, an estimated 1 million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.
“TikTok should have known better, TikTok should have done better. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”
The ICO’s investigation found that concern was raised internally but that TikTok did not respond “adequately”.
In a statement, a TikTok spokesperson said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”
TikTok emphasised that it had changed its practices since the period the ICO investigated. Now, in common with its social media peers, the site uses more signals than a user's self-declared age when trying to determine how old users are, including training its moderators to identify underage accounts and providing tools for parents to request the deletion of their underage children's accounts.
The accusations also pre-date the introduction of the ICO’s “age appropriate design code”, which specifies an even stricter set of rules that platforms are expected to follow when handling the personal data of children. That code also makes it more explicit that platforms cannot argue ignorance over the ages of younger users as a defence for failing to treat their personal data carefully.
“We will be looking at other providers of services likely to be accessed by children and checking the ways in which they’re doing age verification,” Edwards said.
In 2019, TikTok was hit with a $5.7m fine by the US Federal Trade Commission for similar practices. That fine, a record at the time, was also levied against TikTok for improper data collection from children under 13. At the time the company committed to improving its practices and said it would begin keeping younger users in “age-appropriate TikTok environments”, where those under 13 would be pushed into a more passive role, able to watch videos, but not post or comment on the platform.
John Hayes, a Conservative MP who pushed Sunak to toughen up the online harms bill, warned the government it would be forced to return to the issue of regulation “time and again because the idea that companies like TikTok will ever regulate themselves is for the birds”.
Labour has accused the government of failing to protect children online. Alex Davies-Jones, the shadow digital and tech minister, said: “Ministers have been passing the buck and platforms have been acting with impunity.”
She added: “The government and tech companies have a clear responsibility to ensure young people are safe online and this episode shows they are failing that duty. Labour has long campaigned for stronger protections online and if this government fails to act the next Labour government will.”