
New artificial intelligence: Will Silicon Valley ride to riches again on other people's products?

Credit: Pixabay/CC0 Public Domain

Silicon Valley is poised once again to cash in on other people's products, making a data grab of unprecedented scale that has already spawned lawsuits and congressional hearings.

Chatbots and other forms of generative artificial intelligence that burst onto the technology scene in recent months are fed vast amounts of material scraped from the internet—books, screenplays, research papers, photos, art, music, code and more—to produce answers, imagery or sound in response to user prompts.

Technology companies are falling over themselves to leverage this new and potentially lucrative technology. Google, valued at $1.5 trillion, has gone all in with its Bard chatbot after rival Microsoft, valued at $2.4 trillion, invested billions in San Francisco's generative AI pioneer OpenAI. Meta, valued at $680 billion, just announced plans to add chatbots to its apps. Venture capitalists are pouring billions of dollars into generative AI startups.

But a thorny, contentious and highly consequential issue has arisen: A great deal of the bots' fodder is copyrighted property.

In January, Bay Area artist Karla Ortiz joined an Oregon cartoonist and a Tennessee painter to sue UK-based image-generation company Stability AI in U.S. District Court in San Francisco, claiming Stability violated the rights of millions of artists by training its software on more than 5 billion copyrighted images scraped from the internet without permission or compensation.

"It just took them," the lawsuit alleged. Outputs from Stability AI are "derived exclusively" from those images and "will substantially negatively impact the market" for artists' work, the lawsuit claimed.

Stability AI, in an April court filing, argued that its software "enables users to create entirely new and unique images" and that its technology does not produce material with "substantial similarity" to artists' copyrighted work.

The new AI's intellectual-property problem goes beyond art into movies and television, photography, music, and computer coding. Critics worry that major players in tech, by inserting themselves between producers and consumers in commercial marketplaces, will suck out the money and remove financial incentives for producing TV scripts, artworks, books, movies, music, photography, news coverage and innovative software.

"It could be catastrophic," said Danielle Coffey, CEO of the News/Media Alliance, which represents nearly 2,000 U.S. news publishers, including this news organization. "It could decimate our industry."

The new technology, as happened with other Silicon Valley innovations, including internet search, social media and food delivery, is catching on among consumers and businesses so quickly that it may become entrenched—and beloved by users—long before regulators and lawmakers gather the knowledge and political will to impose restraints and mitigate harms.

"We may need legislation," said Congresswoman Zoe Lofgren, D-California, who as a member of the House Judiciary Committee heard testimony on copyright and generative AI last month. "Content creators have rights and we need to figure out a way how those rights will be respected."

Central to the issue is the doctrine of fair use, which allows copyrighted work to be used without permission under certain conditions. Lofgren believes courts will decide that matter before Congress might take any action.

Bay Area lawyer and computer programmer Matthew Butterick launched the first legal salvo late last year with a proposed class-action lawsuit on behalf of two unnamed plaintiffs against Microsoft, its subsidiary GitHub, and its partner OpenAI, alleging the AI-powered coding assistant GitHub Copilot is built upon "software piracy on an unprecedented scale." The defendant companies fired back in January in U.S. District Court in San Francisco, asserting that the tool "crystallizes the knowledge gained from billions of lines of public code," that it "withdraws nothing from the body of open source code available to the public," and that it advances learning, understanding and collaboration.

Furor over the content grabbing is surging. Photo-sales giant Getty is also suing Stability AI. Striking Hollywood screenwriters last month raised concerns that movie studios will start using chatbot-written scripts fed on writers' earlier work. The record industry has lodged a complaint with federal authorities over copyrighted music being used to train AI.

Santa Clara University law school professor Eric Goldman believes the law favors use of copyrighted material for training generative AI. "All works build upon precedent works," said Goldman, an expert in internet law. "We are all free to take pieces of precedent works. What generative AI does is accelerate that process, but it's the same process. It's all part of an evolution of our society's storehouse of knowledge."

Technological advances, however, have a history of skirting copyright protections for content producers, noted renowned wildlife photographer Frans Lanting of Santa Cruz, California. "The sanctity of copyright law has been undermined more and more by new technologies," Lanting said, citing "an assumption by the general public but especially by [technology companies] that individual works can be reproduced … without attribution or any compensation for the creators. Everything becomes for free."

Lanting worries that his own photos, typically presented with stories about human effects on the natural world, could be replicated via AI and presented in ways that undermine trust in his work.

University of California-Berkeley engineering lecturer and venture capitalist Shomit Ghose said generative AI may need regulation to bar direct mimicry of creators' work. But its potential to enhance many forms of creativity, he said, recalls the comic book and movie hero Iron Man, a human augmented by technology. Quite possibly, Ghose said, "the future is Iron Man."

To the News/Media Alliance's Coffey, attention from federal lawmakers provides reason for guarded optimism, particularly in light of Silicon Valley's history: Google and Facebook crippled the news industry by inserting themselves between news producers and consumers to siphon off the lion's share of digital-advertising revenue, and legislators around the world took decades to respond. The Alliance's "AI Principles" say fair use does not apply to unauthorized scraping of publishers' content for AI, and that news producers must be paid through a yet-to-be-developed system, possibly licensing.

Licensing might prove a problematic solution. When tech firms like Apple, valued at nearly $3 trillion, and Spotify, valued at $30 billion, intervened between musicians and listeners to deliver music online, those firms and the record labels, along with a small fraction of music stars, captured the bulk of the revenue, while the majority of musicians earned a relative pittance.

Lofgren wants a solution that does not sacrifice the nation's leadership on the new technology and the advances it promises. "We want to balance our efforts to make sure that artists and other [creators] are treated fairly," she said. "We also don't want to put America in second or third place."

Sunnyvale, California, software engineer Johannes Ernst, CEO of Dazzle Labs, a startup building a platform for control of personal data, said content producers could annotate their work with conditions for use that would have to be followed by companies crawling the web for AI fodder. Debates about legal protections put the cart before the horse, Ernst said.

"We need to figure out what's right and wrong here," he said. "Ignore what the law says for a second and say, "How should it be?" Then see what laws we can use to make it that way, and see what new laws might be necessary."

© 2023 The Mercury News

Distributed by Tribune

Citation: New artificial intelligence: Will Silicon Valley ride to riches again on other people's products? (2023, June 23) retrieved 27 April 2024 from https://techxplore.com/news/2023-06-artificial-intelligence-silicon-valley-riches.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
