
Facing an onslaught of ongoing criticism, Facebook said that it's taking several steps to make Instagram safer and more private for teens.

On Tuesday, the social network said it's making changes to Instagram, the photo- and video-sharing app with more than 1 billion users. They include automatically defaulting teen users under 16 into private accounts, making it harder for potentially suspicious accounts to find teens, limiting the options advertisers have to reach those younger viewers with ads, and using AI to detect users' ages.

"We think private accounts are the right choice for young people, but we recognize some young creators might want to have public accounts to build a following," Instagram said in a blog post. "We want to strike the right balance of giving young people all the things they love about Instagram while also keeping them safe."

While several tech experts praised the changes, which took effect on Instagram starting Tuesday, others criticized them, raising questions including whether the changes have been vetted by federal regulators and whether Instagram is doing enough to keep young people safe.

"We've been telling Facebook for years to stop targeting teens on both Facebook and Instagram and to protect their privacy," said Jeff Chester, executive director of the Center for Digital Democracy, a Washington, D.C.-based nonprofit. "We have a lot of questions. We're not alone."

The changes on Instagram also come amid ongoing momentum among several groups and lawmakers to collectively stop a potential Instagram for kids. In May, Facebook received more than 180,000 signatures in a series of petitions from a collective of nonprofit and grassroots organizations including Fairplay (formerly known as the Campaign for a Commercial-Free Childhood), SumOfUs, and a joint effort led by the Juggernaut Project all urging the tech giant to scrap its plans for an Instagram for kids.

The groups collectively claim that a version of Instagram for kids is unsuitable for children under 13.

Meanwhile, Facebook says it's "exploring" a version of Instagram for kids. The company said it will make safety and privacy a priority and "consult with experts in child safety and mental health, and privacy advocates to inform it." Facebook also said it will "not show ads in any Instagram experience we develop for people under the age of 13."

The strategy is backed by both Instagram Head Adam Mosseri and Facebook CEO Mark Zuckerberg, who told lawmakers in March that plans for an Instagram for kids are in "very early stages."

Currently, Instagram doesn't allow users under 13 on the platform. But regarding the teen safety update taking effect Tuesday, the platform acknowledged that creating a safe, private but also fun site "comes with competing challenges."

Users under 16 now defaulted into private accounts

Instagram said everyone who is under 16 years old (or under 18 in certain countries) will be defaulted into a private account when they join the platform.

Instagram said it previously let younger users choose between a public account or a private account when they signed up, but said recent research showed they appreciated "a more private experience." Instagram added that recent testing showed eight out of 10 young people accepted the private default settings when signing up.

Instagram said it will send young users who currently have a public account a notification "highlighting the benefits of a private account" and explain how they can change their settings.

"We'll still give young people the choice to switch to a public account or keep their current account public if they wish," Instagram said.

In the blog, David Kleeman of United Kingdom-based Dubit, a former president of the American Center for Children and Media, said, "Defaulting accounts to private for under-16s encourages young people to develop comfort, confidence and capability as digital citizens during their younger years and help them develop habits to last a lifetime."

However, Josh Golin, Fairplay's executive director, believes the decision to make Instagram accounts for teens private by default is also part of Facebook's commitment to comply with the Age Appropriate Design Code (AADC) in the UK. Under the AADC, Facebook and other tech companies must set the most privacy-protective settings by default for users 18 and under by Sept. 2.

Also, Instagram said it has developed a tool that automatically detects potentially suspicious adult accounts and stops them from interacting with teen user accounts. For example, an adult's account might be marked as suspicious if it has been blocked or reported by multiple teen users.

And in a few weeks, advertisers will be restricted in how they can target users under age 18 on Instagram, Facebook and Messenger. Advertisers will be allowed to target younger users based only on their ages, genders and locations, and will no longer be able to target them based on their interests or activities on other websites and apps, Facebook said.

How AI might help

In an article, Pavni Diwanji, Facebook's vice president of youth products, wrote that the AADC "has been a model example of how Facebook can play a role in informing, and subsequently implementing, thoughtful age-appropriate solutions to protect young people on our services."

Diwanji also said Facebook is creating new ways to stop those who are underage from signing up, "developing AI to find and remove underaged accounts," and building new solutions to verify and estimate a user's age.

Diwanji wrote the company will "train the technology using multiple signals," including when a young user gets birthday messages and the age written in those messages, as well as the age users gave on Facebook.

"(We'll) apply it to our other apps where you have linked your accounts and vice versa—so if you share your birthday with us on Facebook, we'll use the same for your linked (account) on Instagram," said Diwanji, who acknowledged it's a work in progress as the company is also developing options for users to prove their age.

"This technology isn't perfect, and we're always working to improve it, but that's why it's important we use it alongside many other signals to understand people's ages," Diwanji added. "This technology is also the basis of important changes we're making to keep (young people) safe."

"We routinely recommend that teens have all of their social media profiles set to private so that they have better control over who can see what they post," Justin Patchin, co-director of the Cyberbullying Research Center, said in the Instagram post. "Defaulting to private just makes sense as adolescents explore the boundaries of what they want to share with whom."

Although he remains "skeptical," Golin, the Fairplay executive, said that, on the surface, Facebook and Instagram are taking much-needed steps.

"I hope these are real changes," Golin said. "We'll see when this rolls out if this lives up to the hype."

Meanwhile, Chester of the Center for Digital Democracy said he has concerns and has reached out to the Federal Trade Commission to ask it to review Instagram's changes.

Titania Jordan, chief parenting officer at Bark, an Atlanta-based tech watchdog group, said Monday that while it's "wonderful" seeing Instagram and Facebook release specific, concrete measures they believe will keep children safer online, she also has many concerns.

She said that even though Instagram appears to be addressing online predation, underage users and advertising standards for teens, it must also be mindful of other issues, including cyberbullying, suicidal ideation, self-harm, and exposure to misinformation and violent and sexual content.

Facebook's Diwanji said in her article that "with no foolproof way to stop (young) people from misrepresenting their age, we want to build experiences designed specifically for them, managed by parents and guardians."

Diwanji added, "We believe that encouraging (young users) to use an experience that is age appropriate and managed by parents is the right path. It's going to take a village to make this experience compelling enough so that this age group wants to use it, but we're determined to get it right."

Jordan, the mother of a pre-teen, also said that parents need to be constantly notified by both Facebook and Instagram about options to protect their kids.

"Parents are on the front lines when it comes to protecting their children, both online and in real life," Jordan said. "If Instagram and Facebook aren't involving the parents in thoughtful and empowering ways, it's not going to work."

In its blog, Instagram said it is committed to listening to its younger users, their parents, lawmakers and experts to build a trusted platform that won't "compromise on their privacy and safety."

One tech accountability expert said he's curious to see if Facebook will do what it promises.

"These long overdue changes are an important acknowledgment from Facebook of the many harms kids and teens face on their platforms, from manipulative product designs and pervasive surveillance advertising to unwanted contact from predators," said Jesse Lehrich, co-founder of Accountable Tech, a Washington, D.C.-based nonprofit. "It should not take years of tireless advocacy from child safety and tech accountability NGOs to earn bare minimum protections for platforms' most vulnerable users, but it nonetheless speaks to the unprecedented pressure Big Tech is facing over their exploitative business practices."