Gannett’s USA Today runs a product review site called Reviewed. It makes money when you click to buy things. And, at least according to the union talent that writes a lot of the reviews, it recently started filling up with some pretty crappy, possibly AI-generated reviews that they didn’t write.
That’s the story from Washington Post Style writer Will Sommer, who appears to be an actual human being. His article describes a flood of poorly written, suspiciously similar reviews under the bylines of people who don’t exist online, or who work for contractors with bios that say they excel at polishing AI-generated copy.
Carrillo, a shop steward for the union, said the mysterious reviews — which appeared just weeks after staff staged a one-day walkout to demand management negotiate on a new contract — harm the reputations of actual employees.
“It’s gobbledygook compared to the stuff that we put out on a daily basis,” he said. “None of these robots tested any of these products.” . . .
But Gannett insists the articles weren’t AI-generated. In a statement to The Post, a spokesperson said the articles — many of which have now been deleted — were created through a deal with a marketing firm to generate paid search-engine traffic. While Gannett concedes the original articles “did not meet our affiliate standards,” officials deny they were written by AI.
“We expect all our vendors to comply with our ethical standards and have been assured by the marketing agency the content was NOT AI generated,” the spokesperson said in an email.
The problem with crap content
Articles that generate income by click-throughs to product purchases are inherently suspect.
Writers who follow a template to crank out copy by the ton aren’t creating “content”; they’re creating crap.
AI is just one way to generate crap. AI in the hands of a talented writer is a helpful tool. The problem isn’t the AI. It’s publishing crap, no matter where it came from.
When you twist your prose to optimize it for search engines, not readers, you are part of the crap-generation problem.
An organization that publishes crap, whether generated by an AI or not, whether optimized for SEO or not, is a crap purveyor. You can trash your journalistic standards only so far before nobody will ever trust you again.
Cory Doctorow describes how tech platforms first favor their users, then favor their advertisers, and finally favor only themselves, a process he calls “enshittification.”
Publications can do the same thing. It’s endemic. And apparently Gannett is ready to climb on board the train to enshittification junction.