Page 3: Research news on online child safety regulation

Online child safety regulation addresses legal, technical, and policy frameworks designed to protect minors on digital platforms, particularly social media and search services. Central themes include statutory minimum ages for account creation, mandatory age verification and age assurance technologies, and safety-by-design obligations such as teen defaults and PG-13 content gating. The field also examines platform liability for addictive design and youth mental health harms, constitutional and human-rights challenges to age-gating laws, and the effectiveness of national and regional regulatory regimes in reducing online risks to children.

Internet

Discord adopts facial recognition in child safety crackdown

Messaging platform Discord announced Monday that it will roll out enhanced safety features for teenage users globally, including facial recognition, joining a wave of social media companies introducing age verification systems.

Business

EU tells TikTok to change 'addictive' design

The EU said Friday that it had told TikTok it needs to change its "addictive design" or risk heavy fines, after the Chinese-owned platform was found in breach of the bloc's digital content rules.

Internet

EU says WhatsApp to face stricter content rules

WhatsApp is set to face greater EU scrutiny after the European Commission on Monday added the platform to its list of digital firms big enough to face stricter content rules.
