Media must defend itself against free use of its content by AI - TR boss Hasker

News producers could go under if they allow social media companies free access to their content for use in AI, as they had done previously with Google and Facebook, Thomson Reuters President Steve Hasker said.

Speaking at the Thomson Reuters Foundation annual Trust Conference, Hasker said news publishers had made the mistake in the past of supplying news content free to the social media giants, thinking it would “drive eyeballs” to their websites.

“Well, it didn’t work that way. What happened was that the tech companies were extremely good about not paying for the content and keeping the eyeballs,” Hasker said in a panel debate on “The fight to save democracy.” This almost destroyed the news industry, he said at the Oct 22-23 conference.

“So, I am going to be a sort of naïve optimist and say somewhere along the line, we as the news industry learned a lesson.”

As AI (artificial intelligence) is rolled out, it remains to be seen if the news industry has the wherewithal to defend its interests against rich social media companies, said Hasker, who is also CEO of Thomson Reuters.

But if it did not, and took it on trust “that the model providers are going to do the right thing with our content and eventually we will benefit from that, I think it will be the demise of the industry,” he said.

He noted that the New York Times had sued OpenAI and Microsoft for copyright infringement over the use of the paper’s content to train generative AI, one of a string of lawsuits, including some brought by major fiction writers. He said Thomson Reuters had done a number of licensing deals to provide backdated copy to large language models used for AI.

Hasker said the Trust Principles, which guard Thomson Reuters’ commitment to independence and freedom from bias, had been incredibly helpful as a set of governing principles to ensure the integrity of its output as it adopted AI.

He said this commitment had been “a huge competitive advantage” as business leaders increasingly realised the importance of trust. “I am not saying we are the only person left in that territory, but it is fairly rarified air, sadly.”

He said there was cause for some optimism that businesses would see the importance of trust. “I am hopeful that reliance on businesses to do the right thing will by and large carry the day.”

Hasker called for the clear labelling of news content to show whether it was checked fact or opinion, where it came from, and whether it had been produced by AI or by a journalist.

The lack of labelling, he said, had allowed the dissemination of opinion devoid of fact that pushed an angle and called itself news.

Consumers liked to hear things that reinforced their existing beliefs, biases and prejudices, Hasker said. The problem of disinformation and fake news would continue unless material was labelled to show where it came from.

But another member of the panel, journalist and Nobel Peace Prize winner Maria Ressa, disputed that labelling was the solution.

“Even if you label everything, if it is in the design of the distribution platform to manipulate us, to exacerbate the worst of who we are and to play on our fear, anger and hate to polarize us because that makes more money, then anything journalists do actually becomes useless.”

She said this would continue “until it becomes illegal to insidiously manipulate humanity at the cellular level.”

  ■