
Two advocacy groups want the Federal Trade Commission to take a tougher stance against Google, accusing its app store of recommending apps that transmit kids' personal information such as location without their parents' consent in violation of a 1998 law that protects children online.

The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) said they plan to file a complaint with the FTC Wednesday asking regulators to investigate how the Google Play Store promotes apps for kids.

The groups say three recent studies, including one conducted by the American Medical Association last year, found that more than two-thirds of apps—67%—used by kids age 5 and under were transmitting information to third-party advertisers.

"If you're under 13, your data is not supposed to be shared with advertisers, but that's happening routinely, and at scale, on the apps on Google Play," Josh Golin, the CCFC's executive director, told USA TODAY.

Google did not respond to multiple requests for comment Tuesday.

The groups raised similar concerns in 2018 and acknowledge that Google has since made some improvements, including tightening the rules for apps that target kids under the age of 13.

But Google has not done enough, Golin said.

"They need to go after this in a more systematic way," Golin said. "If Google actually did the vetting that they imply that they do, then the apps would be much safer for kids."

Massachusetts Sen. Ed Markey, a Democrat who authored the Children's Online Privacy Protection Act, known as COPPA, said in a statement that he's "disturbed, but not surprised" that the tracking of children remains a problem.

"Children are spending an unprecedented amount of time on their devices right now, and they shouldn't be tracked at every turn," Markey said. "That has to stop. It's time for Big Tech to be held accountable for prioritizing profits over privacy, particularly when it comes to our children."

Parents have become increasingly concerned about how to keep kids safe on the internet during the pandemic when screens are in such heavy use.

In 2019, Google's YouTube agreed to provide new protections for children and paid a $170 million fine to settle a year-long investigation by the FTC and the New York state attorney general into complaints that it illegally collected data from children under the age of 13 to target advertising.

The new complaint comes amid increased government scrutiny of the tech giants. Google faces antitrust lawsuits from the Justice Department and separate coalitions of states.

Concern over Big Tech's effect on children's well-being has also become a rare bipartisan issue on Capitol Hill.

Reuters reported Tuesday that Republican lawmakers requested that Facebook, Twitter, and Google hand over studies that show how their services affect children.

The request came after last week's congressional hearing, during which lawmakers asked whether the companies had conducted internal research on children's mental health.

Facebook CEO Mark Zuckerberg said he believed the company had. Twitter CEO Jack Dorsey said he didn't think his company had. Google CEO Sundar Pichai said his company consults with third-party experts.

At the hearing, Rep. Cathy McMorris Rodgers of Washington state told the CEOs that their platforms are her "biggest fear as a parent."

Zuckerberg confirmed during the hearing that Facebook is exploring building a new product: Instagram for kids.

Kids' privacy advocates say better enforcement is needed, but say developers also need more guidance on how to make their apps safer for children.

"These advocacy groups should ask the FTC to investigate the developers' practices to determine if their apps are healthy or unhealthy for kids," said Girard Kelly, a senior counsel for Common Sense Media's privacy program. "It's up to companies partnering with independent privacy experts and organizations to work with developers on their practices to help consumers make better-informed choices."