During a panel with news agency CEOs, Veerasingham outlined the opportunities and risks of generative AI and how news agencies must work to keep pace with the fast-moving technology.
“We must invest most of our resources in tackling the threat AI poses to the entire industry,” Veerasingham said. “You’ve got three main areas: transparency, the need to protect our intellectual property and traffic.”
In a separate session, Pace outlined AP’s standards for using generative AI, as well as the news agency’s culture of experimentation when it comes to exploring potential use cases.
“We believe AP needs to be an active participant in the generative AI conversation, and AP’s high standards need to be at the center of the discussion,” Pace stressed. She added that, in the past decade or so, AP has used automation and AI to streamline workflows and free up journalists to do more meaningful work.
Separately, Kaiser outlined the risks to the news industry if generative AI is not developed responsibly or with a proper legal framework.
Kaiser identified three key issues raised by generative AI: copyright infringement, the increased spread of misinformation, and data privacy.
“If appropriate legal frameworks aren’t established, particularly around the protection of intellectual property rights, it could lead to the disruption of our industry and the entire news ecosystem,” Kaiser stressed.
Kaiser argued that the news industry must work together to ensure these new technologies are harnessed for good, ethically, and most importantly, “in ways that preserve the legal frameworks that function as the backbone of protecting the core of what we do.”