A group of Canadian news outlets — including CBC/Radio-Canada, Postmedia, Metroland, the Toronto Star, the Globe and Mail and the Canadian Press — has launched a joint lawsuit claiming copyright infringement against ChatGPT creator OpenAI.
The lawsuit was filed in the Ontario Superior Court of Justice on Friday morning and seeks punitive damages from OpenAI, along with payment of any profits the company made from using the organizations' news articles.
It’s also seeking an injunction banning OpenAI from using their news articles in the future.
In a joint statement, the companies wrote that “OpenAI is capitalizing and profiting from the use of this content, without getting permission or compensating content owners,” and claimed that OpenAI “regularly breaches copyright” by using content from Canadian media outlets for products such as ChatGPT.
When asked if CBC would stop its employees from using tools such as ChatGPT as a result of the lawsuit, a spokesperson for the Crown corporation declined to answer and referred to the statement from the journalistic outlets.
Canadian action comes 11 months after U.S. lawsuit
In a statement emailed to CBC News, an OpenAI spokesperson said the company’s models are trained on data that is publicly available and said the company is “grounded” in international copyright principles.
“We collaborate closely with news publishers, including in the display, attribution and links to their content in ChatGPT search, and offer them easy ways to opt out should they so desire,” wrote OpenAI.
In the Canadian lawsuit, the domestic media outlets claim that OpenAI has been “well aware of its obligations to obtain a valid licence” to use their content.
In late December 2023, the New York Times filed suit against the tech company. At that time, OpenAI said it respected the rights of content creators and owners, and was committed to working with them to ensure they benefit from AI technology and new revenue models.
That lawsuit is still ongoing, with the Times claiming in April that OpenAI had potentially erased search results that the newspaper may need for its case.
OpenAI’s value has been estimated at $157 billion US in recent months.
Is it just reading an article if AI does it?
Media and technology researcher Richard Lachman says companies such as OpenAI claim it’s not off-base to use publicly available news articles to train an artificial intelligence system.
“The argument of the companies is, ‘We’re essentially reading the news that was on a public website. That’s not illegal. A human can read the news,'” said Lachman, an associate professor at Toronto Metropolitan University’s RTA School of Media.
“Of course, [media] companies push back and say, ‘You’re not reading the news, you are scraping information. And that’s against our terms of service.'”
Lachman compared the situation to a recent offer from a major book publisher to pay authors $2,500 to use their work in training artificial intelligence. He said that was an example of content and media companies realizing they may be able to make money when their content is used by technology giants.
“Clearly, there’s value. The question is, what is that value?” he said. “I don’t know exactly how that calculation happens.”
No exemptions for AI in Canadian copyright law
It’s far from clear whether Canadian copyright law would side with the media organizations or the tech companies in cases like this.
While OpenAI has prevailed in a copyright lawsuit brought by news outlet Raw Story in the United States, it still faces pending cases there from the New York Times and a group of authors.
The scenario in Canada could be very different, lawyers say. Copyright law in this country includes a concept called “fair dealing,” which can allow someone to use content without infringing on copyright if it is for purposes such as “research, private study, education, parody or satire.”
However, fair dealing does not include a category for training AI, according to intellectual property lawyer Gaspard Petit.
“There’s a category for research, so you can do your own [artificial intelligence] model, do your own research, but then if you start building a business around it, you’re out of that exception,” said Petit.
This could mean that victories in the United States do not apply here, he said.
Copyright infringement could be difficult to prove when it comes to news articles specifically, said legal expert Lisa Macklem.
“There is no copyright in facts, which is primarily what news outlets provide,” said Macklem, a lawyer and lecturer at King’s University College at Western University in London, Ont.
She said the media companies would need to prove that the actual output of OpenAI has “substantial similarity” to their existing publications.
However, she added that OpenAI may be vulnerable on claims that it violated the media companies’ terms of use, which in many cases say that use of material from their websites must be for personal, non-commercial reasons.
Mixed opinions on whether law should change in Canada
Macklem said this case points to an “immediate and pressing need to have regulations put in place” around generative AI, a message echoed by Petit.
“I think the government will probably have to intervene at some point,” he told CBC News, but he also said new cases such as this one need to be tried before the court to help establish precedent and case law.
Industry players have previously criticized federal moves to regulate artificial intelligence, saying regulatory changes could stifle innovation and make it harder to compete with companies outside of this country.
The federal government has been holding consultations on Canadian copyright law as it relates to generative AI, but hasn’t reported its findings yet.
Julien Billot heads up ScaleAI, an organization that helps fund and scale up the adoption of artificial intelligence at Canadian companies. He says existing laws already do the trick.
“In most cases, privacy, copyright and other existing laws are already very efficient to protect the companies in Canada,” he said.
“What OpenAI is doing is infringing the existing laws. You don’t need a specific AI law.”