
Marching morons; a year in books; AI character names: Newsletter 4 February 2026

Newsletter 134. AI is overtaxing experts, and we’re all going to pay. Plus, the irrationality of the bestseller list, what books sold in 2025, three people to follow, three books to read, and one coach to rule them all.

AI and the expert’s burden.

A friend, who is a fellow ghostwriter and editor, recently came to me with a conundrum. She had been contracted to edit a memoir. The client submitted the first two chapters. They were logical, well organized, and terrible. And they were terrible in a very specific way. They were clearly written by AI, complete with repetition, passages in an odd and inappropriate style, and other AI “tells.”

Now she had a challenge. This was not going to be a simple edit. It would require a line-by-line rewrite to fix the AI’s sloppy prose mannerisms. And then there’s the problem that AI-generated prose can’t be copyrighted. Would a publisher be able to detect it? Would they reject it?

When I told this story around my breakfast table, my daughter, who is working as a project manager in a tech company, shared that she’d had a similar experience. A technical staffer had generated a very detailed process document using AI, but it was full of elements at odds with both logic and reality. So now she and the tech staffer needed to spend extra time dismantling and rewriting the process document.

This is a pervasive problem now. AI creates faulty content rapidly and at scale. The resulting output looks appropriate and useful — until somebody who really knows what they’re doing examines it more closely. Worse, it doesn’t fail in uniform ways. It’s not a simple matter of finding and replacing em-dashes. AI tends to fail in ways that an actual expert can only detect by careful examination. The expert — and in many cases, that’s been me — looks at the result and says, “Ah, this is wrong, but I’ve never seen this particular kind of failure. And fixing it is going to require a lot of my time and knowledge.”

Given the value of experts’ time, that’s expensive.

I’m reminded of the 1951 short story by Cyril Kornbluth, “The Marching Morons.” It describes a world in which the vast majority of the citizenry has become colossally stupid. A small number of intelligent and competent people keep things running, but they’re enormously overworked and constantly undermined by the mass of morons.

AI is helping us to create a world in which almost anyone can create “content” that appears to solve a problem. The unskilled, like my friend’s client and my daughter’s colleague, look at what the AI has spit out and say, “Wow, look how quickly I got that done. Now it just needs a little cleanup.” They turn it over to the skilled expert, like my friend the ghostwriter or my daughter. And the skilled expert now has to spend a load of extra time determining what’s wrong with the AI output and how to fix it. It doesn’t matter whether the output is strategy documents or thought leadership books or code. The pattern is the same: lots of hard-to-untangle crap that simulates good content without actually getting there.

The AI optimists look at this and say “You can train the AI to be smarter and not make those mistakes, and the productivity gains will come from that.”

The AI pessimists look at this and say “It will never generate anything but flawed slop, and any productivity gains are illusory.”

The senior managers look at this and say “Maybe we shouldn’t have laid all those people off. Now the remaining expert workers are overwhelmed and the work is of poor quality.”

And the expert workers wonder if they’re going to spend the rest of their careers on the tedious work of cleaning up other people’s AI-generated messes.

If the knowledgeable people are overwhelmed with AI slop cleanup, we don’t get to apply ourselves to the creative work we do best. And that’s a loss for everyone.

In “The Marching Morons,” which is written as a satire, the solution ends very badly for the morons, who are nearly all seduced to their deaths. I would like to believe that we can come up with a better answer. But if AI reduces 90% of us to morons who can no longer think, and relegates the remaining 10% to slop cleanup, it’s falling a teensy bit short of the utopian future we’ve been promised.

Slop cleanup should have a high cost. That was my ghostwriter friend’s answer. That should become universal. It’s the only way to make sure that cheap, fast, crappy work that demands expert cleanup doesn’t overwhelm us all.

News for writers and others who think

The New York Times bestseller list makes no sense. Its attempts to avoid being gamed make it subjective in ways that are shrouded in mystery. It’s become the Russian judge in the Olympic figure skating competition, and the stakes are similarly high. Read more in “Publishing Confidential.”

The cloud, AI models, social media — these all seem like completely digital abstractions. But behind them all are data centers rapidly coming up against limits of energy and materials. Will we run out of copper? Check out this Cloudforge blog post by Carl Doty.

The New York Times sums up the uneven year in books (gift link). Up: dragons, romantasy, and Bibles. Down: nonfiction. Resurging: print books and bookstores.

Anne Janzer reflects on when to revise a popular but aging nonfiction book.

The Washington Post shut down its book review section amid layoffs. Book reviews in major publications are nearly gone.

Don’t ask AI to name your novel’s characters, unless you want them to have the same names as everyone else who was lazy enough to use AI to name their characters.

Three people to follow

David Goodtree, CEO of Foodgraph, making data about groceries powerful

Jeff Chausse, AI-native product designer

Geoffrey Fowler, tech columnist just liberated from the Washington Post

Three books to read

I Am Not a Robot: My Year Using AI to Do (Almost) Everything by Joanna Stern (Harper, 2026). What happens if you actually try to get stuff done with AI.

Inside the Box: How Constraints Make Us Better by David Epstein (Riverhead, 2026). How limits lead to success.

The Next Renaissance: AI and the Expansion of Human Potential by Zack Kass (Wiley, 2026). If you’re ready to believe that AI can actually cure cancer and everything else wrong with the world, this is the book for you.

Authoring is hard. If only there were a coach you could call on 24 hours a day.

There is. But it’s not free. It costs two bucks.
