Research news on online child safety regulation

Online child safety regulation addresses legal, technical, and policy frameworks designed to protect minors on digital platforms, particularly social media and search services. Central themes include statutory minimum ages for account creation, mandatory age verification and age assurance technologies, and safety-by-design obligations such as teen defaults and PG-13 content gating. The field also examines platform liability for addictive design and youth mental health harms, constitutional and human-rights challenges to age-gating laws, and the effectiveness of national and regional regulatory regimes in reducing online risks to children.

Business

'Roblox' game to impose age controls this year

The publisher of "Roblox" has promised to set up age verification mechanisms, after allegations that the video game, massively popular with children and teens worldwide, has fallen short on safety.

Internet

YouTube turns to AI to spot children posing as adults

YouTube has started using artificial intelligence (AI) to detect when users on the popular video-sharing platform are children posing as adults, amid pressure to protect minors from sensitive content.
