TikTok was fined $15.7 million in the UK for misusing children’s data

by Ana Lopez

TikTok has been fined £12.7 million (~$15.7 million) for breaching UK data protection law, including rules designed to protect children.

The UK’s privacy watchdog, the Information Commissioner’s Office (ICO), announced today that it found the video-sharing platform “didn’t do enough” to check who was using its service, nor did it take sufficient action to remove the underage children who were using it.

According to the ICO, TikTok had an estimated 1.4 million underage UK users over the two-year period the investigation focused on, May 2018 to July 2020, in violation of its own terms of service, which state that users must be 13 or older.

The UK’s data protection regime sets the age at which children can consent to the processing of their data at 13 – meaning TikTok would have needed parental consent to lawfully process these minors’ data (which the company did not obtain).

“We have fined TikTok for providing services to British children under the age of 13 and processing their personal data without consent or the consent of their parents or guardians. We expect TikTok to continue its efforts to implement adequate controls to identify and remove underage children from its platform,” an ICO spokesperson told us.

In addition, the ICO found that TikTok breached transparency and fairness requirements in the UK’s General Data Protection Regulation (GDPR) by not providing users with accurate, easy-to-understand information about how their data was being collected, used and shared.

“Without that information, users of the platform, especially children, would likely not be able to make informed choices about whether and how to interact with it,” the ICO noted in a press release announcing the penalty.

John Edwards, the UK’s information commissioner, added in a statement:

There are laws to ensure that our children are just as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.

As a result, an estimated one million youths under the age of 13 were improperly given access to the platform, with TikTok collecting and using their personal information. That means their data may have been used to track and profile them, potentially delivering harmful or inappropriate content on their very next scroll.

TikTok should have known better. TikTok should have done better. Our fine of £12.7 million reflects the serious impact their failures could have had. They didn’t do enough to check who was using their platform or to remove the underage children who were using it.

TikTok was contacted for comment on the ICO’s enforcement action. The company told us it is reviewing the decision and considering next steps.

In a statement, a TikTok spokesperson said:

TikTok is a platform for users aged 13 and over. We invest heavily to keep children under 13 off the platform, and our 40,000-strong safety team works around the clock to keep the platform safe for our community. While we disagree with the ICO’s decision, which covers May 2018 – July 2020, we are pleased that the fine announced today has been reduced to less than half the amount proposed last year. We will continue to evaluate the decision and consider next steps.

TikTok claims it has taken a number of steps to address the issues it is being fined for today, although it continues to rely on an age gate that asks users to enter their date of birth to create an account (meaning underage users can lie to get around the measure).

However, it says it is supplementing this with improved systems and training for its moderation team to look for signs that an account may be used by a child under 13, so that such accounts can be flagged and sent for review. It also says it responds immediately to parental requests to delete underage accounts – and uses other information provided by users, such as keywords and in-app reports, to help track down potential underage accounts.

TikTok further suggests it has improved transparency and accountability in this area, saying it publishes regular reports on the number of underage users removed from the platform (in the last three months of 2022, it said it deleted more than 17 million suspected underage accounts worldwide, though it does not break this data down by country), as well as offering a family pairing feature to help parents monitor their children’s use.

Despite breaching the UK’s GDPR on grounds of lawfulness, transparency and fairness over a two-year period, the social media platform is only facing an eight-figure fine – well below the theoretical maximum (up to 4% of global annual turnover) – so the settlement looks pretty generous for TikTok.

The figure is also notably less than half the amount originally proposed by the ICO back in September, when the regulator issued a preliminary finding saying the company could face a fine of up to £27 million (~$29 million) for what was then a wider set of suspected infringements.

The reason for the significant haircut to the fine is the regulator’s decision, in response to representations made by TikTok, not to pursue a provisional finding related to the unlawful use of special category data.

Under the GDPR, special category data refers to particularly sensitive types of information – such as sexual orientation, religious beliefs, political opinions, racial or ethnic origin, health data and biometric data used for identification – where the bar for lawful processing is higher than for other personal data; and if consent is the legal basis relied upon, the higher standard of explicit consent applies.

In other words, last year the ICO suspected TikTok of processing this type of information without a legal basis – but the company managed to persuade the regulator to drop that concern.

It’s not exactly clear why the ICO dropped the special category data line of enquiry. But in response to questions from businessupdates.org, a spokesperson for the regulator suggested it comes down to a lack of resources, telling us:

We have considered TikTok’s representations and decided not to pursue the provisional finding regarding the unlawful use of special category data. That does not mean that the use of special category data by social media companies is irrelevant to the ICO, but we need to be strategic with our resources, and in this case the Commissioner has used his discretion not to pursue that provisional finding. This potential infringement was not included in the final £12.7 million fine, and this was the main reason the provisional penalty was reduced to that amount. The amount of this penalty was determined in accordance with our Regulatory Action Policy.

The ICO has a history of inaction over systematic breaches by the behavioral advertising industry – and its failure to clean up the tracking-and-targeting adtech sector could undermine its ability to pursue individual platforms that also rely on data-driven tracking, profiling and ad microtargeting to monetize a “free” service.

Children’s data protection has certainly been a stronger area of focus for the UK watchdog. In recent years, under pressure from campaign groups and parliamentarians, it has produced an age-appropriate design code linked to GDPR compliance (and thus to the risk of fines for those flouting its recommended standards). Active enforcement of the children’s privacy and safety code started in September 2021. While it’s fair to say there hasn’t been a tsunami of enforcement yet, the ICO has opened a number of investigations.

With the UK no longer a member of the European Union, the ICO’s enforcement of the GDPR is UK-only – and it’s worth noting that in the EU, TikTok remains under investigation over how it processes children’s data.

The Irish Data Protection Commission (DPC) opened an investigation into TikTok’s handling of children’s data in September 2021. That pan-EU investigation is ongoing – and we understand the European Data Protection Board is expected to issue a binding decision to resolve disagreements between data protection authorities over Ireland’s draft decision, so the process could take several more months. Also underway in the EU: a DPC investigation into TikTok’s data transfers to China, since the GDPR also regulates data exports (which is, of course, a very hot topic when it comes to TikTok these days).

A point of comparison: last year, rival social network Instagram was fined €405 million for misusing children’s data under the EU’s application of the GDPR. In that case, though, the fine reflected cross-border data processing across the bloc of 27 member states – while the ICO’s enforcement against TikTok covers UK users only, hence some of the difference in the size of the two fines.
