The Associated Press has issued guidelines on artificial intelligence (AI), saying the technology cannot be used to create publishable content and images for the news service, while encouraging staff members to become familiar with it.
AP is one of a handful of news organisations that have begun to set rules on how to integrate fast-developing tech tools like ChatGPT into their work. The service will couple this on Thursday with a chapter in its influential Stylebook that advises journalists how to cover the story, complete with a glossary of terminology.
“Our goal is to give people a good way to understand how we can do a little experimentation but also be safe,” said Amanda Barrett, vice-president of news standards and inclusion at AP.

The journalism think tank Poynter Institute, saying it was a “transformational moment”, urged news organisations this spring to create standards for AI’s use and to share those policies with readers and viewers.

Generative AI has the ability to create text, images, audio and video on command, but it isn't yet fully capable of distinguishing between fact and fiction. As a result, AP said, material produced by artificial intelligence should be vetted carefully, just like material from any other news source. Similarly, AP said a photo, video or audio segment generated by AI should not be used, unless the altered material is itself the subject of a story.
That’s in line with the tech magazine Wired, which said it did not publish stories generated by AI, “except when the fact that it’s AI-generated is the point of the whole story”.

“Your stories must be completely written by you,” Nicholas Carlson, Insider editor-in-chief, wrote in a note to employees that was shared with readers. “You are responsible for the accuracy, fairness, originality and quality of every word in your stories.”

Highly publicised cases of AI-generated “hallucinations”, or made-up facts, make it important that consumers know standards are in place to “make sure the content they're reading, watching and listening to is verified, credible and as fair as possible”, Poynter said in an editorial.
News organisations have outlined ways that generative AI can be useful in publishing. It can help editors at AP, for example, put together digests of stories in the works that are sent to its subscribers. It could help editors create headlines or generate story ideas, Wired said. Carlson said AI could be asked to suggest possible edits to make a story concise and more readable, or to come up with possible questions for an interview.
AP has experimented with simpler forms of artificial intelligence for a decade, using it to create short news stories out of sports box scores or corporate earnings reports. That's important experience, Barrett said, but “we still want to enter this new phase cautiously, making sure we protect our journalism and protect our credibility.”

ChatGPT maker OpenAI and The Associated Press last month announced a deal under which the artificial intelligence company will license AP’s archive of news stories for training purposes.
News organisations are concerned about their material being used by AI companies without permission or payment. The News Media Alliance, representing hundreds of publishers, issued a statement of principles designed to protect its members’ intellectual property rights.