ChatGPT wrote this headline about the latest OpenAI news: Publisher lawsuit hits 'company' over alleged content plagiarism — legal battle ensues
What you need to know
OpenAI and Microsoft face yet another lawsuit surrounding artificial intelligence and copyright infringement.
AlterNet, The Intercept, and Raw Story filed separate lawsuits in the Southern District of New York.
The lawsuits claim that ChatGPT reproduces "verbatim or nearly verbatim copyright-protected works of journalism" without properly crediting the source.
OpenAI and Microsoft have faced similar claims before, including a lawsuit by The New York Times.
Microsoft and OpenAI have once again found themselves on the receiving end of a lawsuit. AlterNet, The Intercept, and Raw Story have filed separate lawsuits in the Southern District of New York that claim ChatGPT reproduces their copyrighted journalism without attribution.
"At least some of the time, ChatGPT provides or has provided responses to users that regurgitate verbatim or nearly verbatim copyright-protected works of journalism without providing author, title, copyright, or terms of use information contained in those works," one lawsuit states.
The suits also point to ChatGPT generating answers to queries about current events or the findings of investigative journalism without listing sources.
While Microsoft and OpenAI are separate companies, they have a complicated relationship that sees them both compete and work together. In relation to these lawsuits, the fact that Microsoft hosted the data centers that were used to develop ChatGPT is key. Additionally, Microsoft's Copilot (formerly Bing Chat) uses OpenAI tech and has also been accused of not listing source information.
Here we go again
This isn't the first time that Microsoft and OpenAI have faced lawsuits surrounding AI-generated content. The New York Times sued both Microsoft and OpenAI in December 2023. The news outlet claimed that New York Times articles were used to train the language models that power Microsoft Copilot and ChatGPT.
The New York Times lawsuit claimed that OpenAI and Microsoft's AI tools can reproduce content from The New York Times verbatim. OpenAI disputed those claims and asked a federal court to dismiss parts of the suit, arguing that The New York Times exploited a bug to get ChatGPT to reproduce the outlet's content.
Regardless of whether the complaint from The New York Times has merit, several legal questions surround OpenAI. For example, does a human artist being influenced by other artists differ from an AI model learning from artists? If so, how do you determine a truly original work versus something inspired by other content? And, perhaps most importantly to those financially affected by AI, what is fair compensation when an AI model is trained on someone's work and then used to create similar work?
Microsoft has a solid track record when it comes to facing legal scrutiny, so it will be interesting to see how Copilot and related AI tech fare against an increasing number of lawsuits.