Artificial intelligence will be the next flashpoint between media conglomerates and technology giants after Nine joined its rival News Corp to push for payments from AI firms that have trained their systems on entertainment and news content.
Programs like ChatGPT and LaMDA, which are being commercialised by Microsoft and Google respectively in their search engines, rely on information from news and other websites to answer user queries but pay nothing to use that intellectual property.
Journalism companies successfully lobbied the Morrison government to introduce a media bargaining code in 2021 that forced Facebook and Google to pay tens of millions annually to the industry for their use of news content on their platforms.
Nine chief executive Mike Sneesby said Australian-made journalism and entertainment, which is expensive to produce, risked being exploited by AI companies.
“That really isn’t greatly different to the challenge that we have today with Google and Facebook and other social media platforms that already are generating value from our content,” Sneesby told staff at an internal Nine forum in Melbourne last Tuesday.
“I think it’s very important that we continue to work closely with the government around expanding that view of news media bargaining into a broader content, not just text and images, but also audio, video, and any form of intellectual property that we create.”
AI-enhanced search could also reduce the number of readers clicking on online news stories because it can summarise their content in the results page. Nine is the owner of this masthead.
Google and Facebook fought hard against the existing bargaining code, with the social media company pulling a huge array of pages, including those of media companies, emergency services and non-profits, from its platform to highlight the risks should the government consider expanding the law.
Sneesby’s view dovetails with News Corp chief executive Robert Thomson, who used an earnings call on Friday to demand payment for AI tools that trained on his company’s content and that refer to News Corp content in their answers. “Our content will certainly be aggregated and synthesised [by AI], and those answers monetised by other parties,” Thomson said. “We expect our fair share of that monetisation. Generative AI cannot be degenerative AI.”
However, both executives flagged some upsides from AI. Sneesby said Nine could delve into its back catalogue of newspapers and broadcast footage to rapidly create documentaries with AI and was already using it to generate sports highlight packages. “I think the opportunities there for us potentially outweigh some of the risks, but we’ve got to be focused on both,” said Sneesby.
Unlike media companies, AI firms have so far been unconstrained by defamation claims, which cost the industry many millions of dollars a year. A Victorian regional mayor and former whistleblower, Brian Hood, has decided to drop what had been billed as a landmark AI defamation case, brought after ChatGPT falsely suggested he was corrupt.
“In the last few weeks, Mr Hood has succeeded in shining a light on the shortcomings of ChatGPT and the need for transparency of sources and responsibility for the information communicated,” said a spokeswoman for Hood and his law firm, Gordon Legal. “Mr Hood successfully removed the defamatory statements from ChatGPT, corrected the public record, and protected his reputation.”
Google and ChatGPT-maker OpenAI did not respond to a request for comment. Microsoft, which has invested billions in OpenAI and is incorporating its technology into its Office suite and Bing search engine, said it supported the bargaining code and other laws overseas to give fair compensation to local public interest journalism. “We remain supportive of these efforts and look forward to engaging in the policy conversations as they evolve around AI and journalism,” a Microsoft spokesman said.
Through a spokeswoman, Sneesby declined an interview request and would not answer further questions about what the future use of artificial intelligence systems could mean for media jobs.