Apple will delay the rollout of its new child protection measures.

Apple announced Friday it is delaying the rollout of its controversial new tools for detecting child sexual abuse material, following criticism that the features would undermine user privacy.

The Silicon Valley giant said last month that iPhones and iPads would soon begin detecting known images of child sexual abuse and reporting them as they are uploaded to its iCloud online storage in the United States.

However, digital rights organizations quickly warned that the changes to Apple's operating systems would create a potential "backdoor" into its devices that could be exploited by governments or other groups.

The announcement comes as Apple faces intensifying scrutiny from regulators over what critics say is abuse of its dominance.

The company announced Wednesday a rare and long-demanded concession on how its online app marketplace works.

Apple cited feedback from customers, advocacy groups, researchers and others in its decision to "take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The new technology allows the software powering Apple's mobile devices to match photos on a user's phone against a database of known child sexual abuse images and flag them when they are uploaded to the company's iCloud online storage.

Apple said previously that, at the start of the system's rollout, at least 30 machine-matched images would be required before it flags an account and triggers a verification process.
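For illustration only, here is a minimal sketch in Swift of the threshold idea described above, using an ordinary cryptographic hash as a stand-in for Apple's perceptual "NeuralHash." The names and structure are hypothetical, and the real design relies on blinded on-device matching and server-side threshold cryptography that this sketch does not attempt to reproduce.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in only: real matching uses a perceptual hash and
// privacy-preserving protocols, not a plain SHA-256 set lookup.
struct UploadScanner {
    let knownHashes: Set<Data>   // hashes derived from the known-image database
    let matchThreshold: Int      // Apple cited a minimum of 30 matches

    /// Returns true only if the photos queued for upload contain at least
    /// `matchThreshold` images whose hashes appear in the known database.
    func shouldFlagAccount(photosToUpload: [Data]) -> Bool {
        var matches = 0
        for photo in photosToUpload {
            let digest = Data(SHA256.hash(data: photo))  // exact hash as a stand-in
            if knownHashes.contains(digest) {
                matches += 1
                if matches >= matchThreshold {
                    return true  // only now would a verification process begin
                }
            }
        }
        return false
    }
}
```

The point of the threshold is that no single match, or even a handful of matches, triggers a report; the account is flagged only once the count of matched images crosses the stated minimum.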

'Incredibly disappointing'

Critics of the policy welcomed the delay.

"It's encouraging that the backlash has forced Apple to delay this reckless and dangerous surveillance plan," said Evan Greer, director of digital advocacy group Fight for the Future. "They should shelve it permanently."

Though Apple cited feedback from advocacy groups in its decision, not all welcomed the pause.

"This is incredibly disappointing," tweeted Andy Burrows, head of child safety online at the National Society for the Prevention of Cruelty to Children.

"Apple had adopted a proportionate approach that sought to balance user safety and privacy, and should have stood their ground," he added.

The new image-monitoring feature would represent a major shift for Apple, which has until now resisted efforts to weaken the encryption that prevents third parties from seeing private messages.

The company said it would have only limited access to the violating images, which would be flagged to the National Center for Missing and Exploited Children, a nonprofit organization. Apple has also resisted government efforts to weaken iPhone encryption.

FBI officials have warned that so-called "end-to-end encryption," in which only the sender and recipient can read messages, can protect criminals, terrorists and pornographers even when authorities have a legal warrant for an investigation.

The tech giant has unveiled major changes to its online app marketplace in recent days after years of criticism, as it tries to stave off a swelling push to regulate Big Tech.

A concession announced Wednesday will spare apps that provide newspapers, books, music or video from having to use the App Store payment system, letting them avoid the 30 percent commission.

Experts see the changes as a sign that Big Tech companies have decided to give an inch under pressure, hoping to avoid a collision with government rules they would not control.