
Wyoming Reporter Caught Using Artificial Intelligence to Create Fake Quotes and Stories



HELENA, Mont. — Quotes from the governor of Wyoming and a local prosecutor were the first things that struck Powell Tribune reporter CJ Baker as a little strange. Then there were some phrases in the stories that seemed almost robotic to him.

The dead giveaway, though, that a reporter for a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about comedian Larry the Cable Guy being chosen as grand marshal of the Cody Stampede Parade.

“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved figures,” the Cody Enterprise reported. “This structure ensures that the most critical information is presented first, making it easier for readers to quickly understand key points.”

After doing some research, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who, according to Baker, admitted to using AI in his stories before resigning from the Enterprise.

The editor and publisher of the Enterprise, which Buffalo Bill Cody co-founded in 1899, have since apologized and promised to take steps to ensure it never happens again. In an editorial published on Monday, Enterprise editor Chris Bacon said he was “unable to catch” the AI copy and fake quotes.

“Never mind that the false quotes were the apparent mistake of a hasty rookie reporter who trusted AI. It was my job,” Bacon wrote. He apologized because “the AI was allowed to put words into the stories that were never said.”

Journalists have derailed their careers by making up quotes or facts in stories long before the emergence of AI. But this latest scandal illustrates the potential pitfalls and dangers that AI poses for many industries, including journalism, as chatbots can spit out spurious yet somewhat plausible articles with just a few prompts.

AI has found a role in journalism, including in the automation of certain tasks. Some newsrooms, including the Associated Press, use AI to free up reporters for more impactful work, but most AP employees are not allowed to use generative AI to create publishable content.

The AP has used technology to assist in articles about financial earnings reports since 2014 and, more recently, in some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each story there is a note that explains the role of technology in its production.

Being upfront about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated product reviews online that were presented as having been written by reporters who didn’t actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the reputation of the once-powerful publication.

In his Powell Tribune story breaking the news about Pelczar’s use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Obviously I have never intentionally tried to misquote anyone” and promised to “correct them and apologize and say they are misstatements,” Baker wrote, noting that Pelczar insisted his errors should not reflect on his Cody Enterprise editors.

After the meeting, the Enterprise launched a full review of all the stories Pelczar had written for the paper during the two months he worked there. The review turned up seven stories that included AI-generated quotes attributed to six people, Bacon said Tuesday. He is still reviewing other stories.

“They’re very credible quotes,” Bacon said, noting that people he spoke to while reviewing Pelczar’s articles said the quotes sounded like something they would say but that they never actually spoke to Pelczar.

Baker reported that seven people told him they were quoted in stories written by Pelczar but did not speak to him.

Pelczar did not respond to an AP phone message, left at a number listed as his, asking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that contacted him.

Baker, who regularly reads the Enterprise because it is a competitor, told the AP that a combination of phrases and quotes in Pelczar’s stories aroused his suspicions.

Pelczar’s story about a shooting in Yellowstone National Park included the line: “This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene environments.”

Baker said the phrase felt like a summary of his stories that a certain chatbot seems to generate, in that it adds some kind of “life lesson” at the end.

Another story — about a poaching sentence — included quotes from a wildlife official and a prosecutor that appeared to come from a press release, Baker said. However, there was no press release, and the agencies involved did not know where the quotes came from, he said.

Two of the stories in question included false quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them.

“In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the governor that was completely fabricated,” Michael Pearlman, a spokesman for the governor, said in an email. “In a second case, he appeared to fabricate part of a quote and then combined it with part of a quote that was included in a press release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated copy appeared in the story about Larry the Cable Guy, which ended with an explanation of the inverted pyramid, the basic approach to writing breaking news.

It’s not difficult to create AI stories. Users could feed a criminal statement into an AI program and ask it to write an article about the case, including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the prominent journalism think tank.

“These generative AI chatbots are programmed to give you an answer, no matter if that answer is complete rubbish or not,” said Mahadevan.

Megan Barton, the Cody Enterprise’s publisher, wrote an editorial calling AI “the new and advanced form of plagiarism, and in the field of media and writing, plagiarism is something that every media outlet has had to correct at some point or another. It’s the ugly part of the job. But a company willing to right (or quite literally write) these wrongs is a reputable one.”

Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will “have longer conversations about how AI-generated stories are not acceptable.”

The Enterprise didn’t have an AI policy, in part because it seemed obvious that journalists shouldn’t use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policies.

Bacon plans to have a policy in place by the end of the week.

“This will be a pre-employment discussion topic,” he said.



This story originally appeared on ABCNews.go.com.
