Regulations to protect minors online to be introduced in the European Union

Published on August 5, 2022
Last Updated on August 15, 2022

Introduction

The TaskUs Policy Research Lab provides consultative and value-add services in Trust & Safety and Content Moderation. We cover a wide range of policy areas and content types, with the goal of helping create a safer internet for all.

Focus and Findings

As part of our core responsibility to provide the most up-to-date and relevant information on online safety, the TaskUs Policy Research Lab is continuously studying regulations that promote minor safety online, alongside the European Commission’s adoption of the Digital Services Act (DSA) and the Digital Markets Act (DMA).

In line with these regulations, specific legal actions have been proposed to combat child abuse and protect children online. 

These proposals include milestone strategies such as:

  • In 2022: Co-funding the Safer Internet helplines and hotlines in the European Union (EU), including those recognized in the future as “trusted flaggers” under the DSA, to assist the public, particularly children, when confronted with harmful and illegal content.
  • In 2023: Issuing a standardization request for a European standard on online age verification in the context of the eID proposal.
  • In 2024: Encouraging and facilitating the design of a comprehensive EU code of conduct on age-appropriate design, building on the framework provided in the DSA.

The new EU legislation focuses on child sexual abuse material (CSAM) and was developed by the Directorate-General for Migration and Home Affairs (also known as “HOME”). Moreover, it highlights that existing measures do not go far or wide enough across the tech sector, and that reports of CSAM content on platforms continue to grow.

The rules further include:

  • Requiring assessments of the various risks that platforms operating in Europe pose for the proliferation of CSAM, followed by “reasonable measures to mitigate the risk.”
  • Implementing efficient voluntary measures by providers to detect, report, and predict CSAM based on three categories:
    • “Known”: content that has already been confirmed to be CSAM by companies or child safety groups (see the fingerprint-matching sketch after this list),
    • “New”: CSAM content that has not been fingerprinted yet but should be proactively detected,
    • “Grooming” content (that is, messages and other communications that might, in the future, lead to the creation of CSAM).
  • Establishing a European Centre on prevention and assistance to victims, in the form of a coordination hub managed by the Commission that would work closely with Europol and platforms to implement the regulation and oversee the deployment of new technical infrastructure for CSAM moderation.
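
To make the “known” category concrete, the minimal sketch below shows how a provider might match uploads against a database of fingerprints for previously confirmed material. Production systems rely on perceptual hashes (such as Microsoft’s PhotoDNA) that tolerate re-encoding and cropping; the SHA-256 digest, the KNOWN_HASHES set, and the classify function here are purely illustrative assumptions, not any platform’s or regulator’s actual implementation.

```python
import hashlib

# Hypothetical database of fingerprints for content already confirmed as
# harmful by companies or child-safety groups (the "known" category).
# Real systems use perceptual hashes rather than cryptographic ones;
# SHA-256 is used here purely for illustration.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest serving as the content's fingerprint."""
    return hashlib.sha256(media_bytes).hexdigest()

def classify(media_bytes: bytes) -> str:
    """Route an upload into the regulation's detection categories."""
    if fingerprint(media_bytes) in KNOWN_HASHES:
        return "known"  # already confirmed CSAM: report and remove
    # Unmatched content would be passed to "new"-content classifiers
    # (e.g., proactive ML detection) rather than simply ignored.
    return "needs_review"

# Example: an upload that matches nothing in the database
print(classify(b"example upload bytes"))  # -> "needs_review"
```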

Next Steps

The TaskUs Policy Research Lab continuously incorporates these strategies into its consulting and value-add initiatives. Platforms would have to collaborate with more “trusted flaggers,” anticipate stronger age verification features, and prepare for more monitoring ahead. We will continue to provide you with the most updated information about our ongoing focus on upholding and promoting a safer online experience for all.

Want to know more about our studies and significant findings?

Get in touch with us >
