Page 7: Research news on Online child safety regulation

Online child safety regulation addresses legal, technical, and policy frameworks designed to protect minors on digital platforms, particularly social media and search services. Central themes include statutory minimum ages for account creation, mandatory age verification and age assurance technologies, and safety-by-design obligations such as teen defaults and PG-13 content gating. The field also examines platform liability for addictive design and youth mental health harms, constitutional and human-rights challenges to age-gating laws, and the effectiveness of national and regional regulatory regimes in reducing online risks to children.

Internet

EU lawmakers propose social media ban for under-16s

European lawmakers on Thursday called for stricter rules to protect minors online, including a bloc-wide minimum age of 16 to access social media and AI companions without parental consent.

Business

EU grills Apple, Snapchat, YouTube over risks to children

The EU on Friday demanded that digital giants including Snapchat and YouTube explain how they are protecting children from online harm, as all but two member states signaled openness to restricting social media access for minors.

Internet

Child protection vs privacy: decision time for EU

Does protecting children justify snooping on private messages? That is the sensitive question facing EU countries Wednesday as they wrangle over a push to combat child sexual abuse material online.
