Social media and gaming platforms operating in the EU may have to adapt their operations under new measures from the European Parliament, which call for stronger child protection standards and personal liability for executives in cases of repeated breaches of the Digital Services Act (DSA).
Members of the Parliament’s Internal Market and Consumer Protection Committee (IMCO) voted 32–5, with nine abstentions, to adopt a report calling for new EU-wide safeguards for minors online. The report urges the European Commission to fully enforce the DSA, warning that companies repeatedly found in violation of child protection rules could face financial sanctions or be barred from operating within the European Union.
The proposal also includes the possibility of holding senior executives personally accountable when their companies fail to meet legal obligations related to minors’ digital safety. Lawmakers said the measures are intended to make compliance a direct corporate responsibility.
Under the proposed framework, the EU would set a digital minimum age of 16 for access to social media, video-sharing platforms, and AI companions; minors aged 13 to 16 could access these services only with explicit parental consent. Lawmakers said the updated age requirements seek to address exposure to harmful content and ensure privacy protections for children.
The report identifies certain engagement systems as contributing to excessive use among minors. It calls for prohibiting mechanisms such as autoplay, infinite scrolling, disappearing stories, and engagement-based algorithms. These design elements, often used to increase user activity, have been linked to extended screen time and negative mental health outcomes in young users.
In addition to social media controls, lawmakers are seeking to ban gambling-style elements such as loot boxes in online games accessible to minors. The report said these mechanics encourage risky spending behavior similar to gambling and can expose children to exploitative monetization models.
The proposals also address “kidfluencing,” a practice in which minors act as social media influencers or content creators in exchange for payment or platform rewards. Lawmakers said this commercial activity can expose children to data misuse and manipulation.
To verify users’ ages, the report supports privacy-preserving age assurance systems that do not compromise personal data. Lawmakers stated that these technologies should not replace companies’ broader responsibility to design platforms that are inherently safe for young audiences.
The scope of the proposal includes artificial intelligence applications capable of generating manipulated or non-consensual images. Lawmakers urged the enforcement of the EU’s new AI Act to prevent the misuse of such technologies. The report also calls for regulating chatbots and AI companions to ensure they do not manipulate users emotionally or financially.
“We need a higher bar for access to social media and stronger safeguards for minors using online services,” said Danish MEP Christel Schaldemose, who authored the report. “My report calls for safety-by-design principles and a ban on the most harmful engagement mechanisms.”
The proposals follow a Eurobarometer survey indicating that young Europeans increasingly rely on digital platforms for entertainment, information, and social communication. The study raised concerns over exposure to influencer-driven content and misinformation, prompting calls within the EU for tighter measures to make online environments safer for children and teenagers.