AI’s legal reckoning is one step closer
A group of visual artists and illustrators is celebrating a federal judge’s decision this week to allow key parts of their class-action lawsuit against the makers of popular AI image generators to move forward.
The artists allege that tech start-ups including Midjourney and Stability AI violated various laws by training their AI image tools on the artists’ work without consent. They say the tools encourage users to generate images that closely mimic a given artist’s style.
In a 33-page ruling issued Monday, U.S. District Judge William Orrick dismissed some of the artists’ claims but allowed core parts of the suit to go forward. That means the case can proceed to the discovery phase, which could bring to light internal communications about how the companies developed their AI tools.
Amid a flurry of AI copyright lawsuits, the case is now among the furthest along the road to a showdown that could shape the industry’s future.
“Up until now, these cases have been kind of stuck in the starting blocks,” said Blake E. Reid, a professor of internet and copyright law at the University of Colorado at Boulder. While it’s “way too early to tell” how the case will turn out, “this is a pretty big step on a very long path.”
Orrick previously dismissed the bulk of a lawsuit brought by some of the same artists last year, prompting them to amend the suit and refile, adding new plaintiffs. The group, which includes artists Sarah Andersen, Kelly McKernan, Karla Ortiz and fantasy landscape painter Greg Rutkowski, fared better this time.
While Orrick threw out a claim that the companies violated the Digital Millennium Copyright Act, “He allowed the most important copyright infringement claims to go forward,” said James Grimmelmann, a professor of digital and information law at Cornell University. Those include claims “that the AI companies copied plaintiffs’ art in their models, and that they are responsible for users making infringing outputs.”
At stake is the business model behind a generative AI boom built largely on the free, unauthorized use of works of all kinds from across the internet to train sophisticated software capable of producing humanlike works of its own.
That content includes art, music, videos, news articles and social media posts. A key question, as lawsuits move forward, will be whether the use of such material to train AI models constitutes “fair use” or illegal infringement.
But first, Reid said, plaintiffs face the challenge of proving that their work was copied in a meaningful way. That can be tricky for two reasons. First, it’s not always clear exactly what data was used to train a given AI model. Second, the responses an AI tool generates for a user often are not identical to the works it was trained on, though there can be a strong resemblance.
Another high-profile case, brought by a group of writers including the comedian Sarah Silverman against OpenAI, has faced several setbacks, but a judge last month allowed one of its core claims to proceed. OpenAI has also moved to dismiss parts of a lawsuit brought by the New York Times in December. In June, a group of major record labels sued a pair of start-ups that let people generate songs with simple text prompts.
Grimmelmann lauded Orrick’s ruling in the artists’ case as “a fairly healthy development.”
“Some of the arguments that the defendants raised would have meant that an AI company was almost never liable for copyright infringement, regardless of what it trained on and how closely its outputs resembled well-known art,” he said. “Judge Orrick didn’t go along with that.”
“But his opinion is also realistic about what the plaintiffs will ultimately have to show,” Grimmelmann added. “It won’t be enough for them to say the AI was trained on their works; they’ll probably need to show that the AI is actually imitating their specific works.”
Stability AI, Midjourney and the plaintiffs’ attorneys did not respond to requests for comment Tuesday.