Google acknowledged that its Google Assistant recording device can be accidentally triggered by background noise

Internet giant Google acknowledged Thursday that a language expert it partnered with had leaked sensitive Dutch audio data, following a report by the Belgian media group VRT.

Google also acknowledged that its Google Assistant device was prone to "false accept" pickups, in which it mistakenly records without having been activated by users.

The group said it partnered with language experts to create products like Google Assistant, and that "one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data."

VRT said it had listened to more than 1,000 recordings from devices in Belgium and the Netherlands, of which 153 were accidental.

They included users discussing private matters, including their children, and some who revealed personal details such as their address.

"We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again," a Google statement said.

The company emphasised that its experts "only review around 0.2 percent of all audio snippets" and that these snippets are not associated with user accounts.

Google Assistant is designed to be activated either by a button or by someone saying "Hey Google".

In a few cases, background noise or speech that included the word "Google" was enough to trigger the device, the statement said.

Google underscored that users can manage and control their data, including by choosing to auto-delete it every three months or every 18 months.

Another internet giant, Amazon, has also been called to task after it emerged that staff had been told to listen to private conversations, which the company said was done to improve the system.