Page 2: Research news on online child safety regulation

Online child safety regulation addresses legal, technical, and policy frameworks designed to protect minors on digital platforms, particularly social media and search services. Central themes include statutory minimum ages for account creation, mandatory age verification and age assurance technologies, and safety-by-design obligations such as teen defaults and PG-13 content gating. The field also examines platform liability for addictive design and youth mental health harms, constitutional and human-rights challenges to age-gating laws, and the effectiveness of national and regional regulatory regimes in reducing online risks to children.

Business

India ramps up AI rules for social media platforms

India on Tuesday tightened its rules governing artificial intelligence, requiring social media platforms to clearly label AI-generated content and to comply with takedown requests from authorities within three hours.

Internet

Discord adopts facial recognition in child safety crackdown

Messaging platform Discord announced Monday that it will roll out enhanced safety features for teenage users worldwide, including facial recognition, joining a wave of social media companies adopting age verification systems.

Business

EU tells TikTok to change 'addictive' design

The EU said Friday that it had told TikTok it needs to change its "addictive design" or risk heavy fines, after the Chinese-owned platform was found in breach of the bloc's digital content rules.

Page 2 of 12