“Balance” Makes Us Stupid
In 1998, The Lancet published a study so flimsy it should have blown away in a light breeze. Andrew Wakefield, a British doctor since struck off the medical register, using dubious methods and hiding conflicts of interest, claimed a link between the MMR vaccine and autism. The scientific community quickly shredded the work. Yet for years afterwards, newspapers and television debates gave the subject “balance.” One expert would insist the evidence showed vaccines were safe, while a parent, often emboldened by celebrity endorsement, would claim vaccines were poison. The implication was that truth lay somewhere in between, halfway between medical consensus and medical fantasy. The result was not enlightenment but confusion, a panic that lingers today.
That spectacle of false equivalence is what balance too often amounts to: the middle point between knowledge and a hallucination. Journalists, the better ones at least, learned something from that debacle—that neutrality and truth are not synonyms. But artificial intelligence cannot possibly learn the lesson. It digests information indiscriminately, parroting what it is fed, and in a world where lies are more entertaining and plentiful than facts, the machine will always incline toward the counterfeit middle ground.
Ask anyone outside the media bubble, and they will often maintain that balance is the goal of good journalism.
The myth holds that balance, the hallowed “both sides” treatment, represents truth. But if one person says it’s raining and another says it’s dry, a journalist’s job is not simply to quote both claims; it is to step outside and check which one matches reality. That means looking at the sky, consulting weather data, or finding other reliable sources. Journalists owe readers clear, accurate information, not a transcript of competing assertions left unverified.
Good journalists know this instinctively. Not all “sides” deserve equal weight. We learn it early, sometimes at considerable cost: tobacco executives assured us that cigarettes were not addictive; oil lobbyists swore climate change was a mere theory; automobile companies suppressed evidence of the harm vehicle emissions do to public health and the environment. The reporter who grants such claims equal dignity in the name of balance betrays the public.
Now consider the machines. If human journalism is fragile yet resilient, artificial intelligence is gullible by design. It does not interrogate sources; it consumes them. It has no instinct to cross-examine or to recoil; no hackles rise at the bare-faced lie. It eats what it is fed, nourishment or poison, and regurgitates the slurry as though it were knowledge. Balance, in the AI model, is not even a pose. It is a byproduct of training data soaked in contradiction, producing a smooth, confident blend of sense and nonsense. Studies consistently find that current AI systems struggle to distinguish fact from misinformation, relying instead on patterns in data drawn from reputable and dubious sources alike.
We can see the consequences. AI has recommended adding non-food substances like glue to recipes; AI-generated news websites have circulated false stories about celebrity deaths alongside authentic entertainment news; children’s educational videos made with AI have promoted debunked ideas, such as climate-change denial, right next to legitimate scientific content. What reaches the reader is not truth but a linguistic average between certainty and delusion. To place that anywhere near the tradition of reporting is as absurd as confusing a filing cabinet with a detective.
Human journalism, for all its flaws, is built on suspicion. It proceeds from doubt, fights with power, and insists on questions that cannot be answered simply by copying the existing record. When Orson Welles’ “War of the Worlds” broadcast panicked listeners, journalists rushed, sceptically, to document public hysteria—not to replay the fiction but to expose its effects. When Joseph McCarthy thundered about communists in government, it was journalists, not an algorithm, who eventually dismantled him: Murrow confronting him on television, the press poring over records until his accusations collapsed. An AI, had it existed then, would have swallowed McCarthy whole and said: Some say the State Department swarms with traitors; others, not so much—who’s to say?
The difference is not cosmetic. It is existential. Artificial intelligence cannot recognise that balance itself is sometimes a lie. Human reporters—at least the good ones—learn to resist the narcotic of “neutrality.” They are obliged to tell their readers, sometimes at obscene cost, that cigarettes cause cancer, that the climate crisis is happening, and that elections were not stolen. Reality does not sit halfway between fact and fiction.
The irony, of course, is that while human journalism is indispensable, it is treated as dispensable. Newsrooms are thinned to skeleton crews and budgets slashed in the name of efficiency, while tech firms that peddle regurgitated nonsense command valuations larger than entire media conglomerates.
To reverse this—to insist that truth costs money, time, and human labour—we have to stop indulging the myth of balance. A healthy news culture doesn’t ask whether the truth lies halfway between Galileo and the Inquisition. It doesn’t find equilibrium between a climate scientist and a coal baron. It certainly doesn’t outsource its conscience to a machine trained on the grand library of human error.
Real reporting is not a balancing act but a judgment, made after argument and verification, about what is true. It is adversarial by nature, uncomfortable by design. And unless we have the will to pay for it, to see journalists not as disposable scribblers ceding their vocation to AI but as vital craftsmen of reality itself, we will end up with what the machines already produce: a sterile, frictionless world where lies are politely averaged into truth. A world, in other words, where balance has finally defeated honesty.
I used to recommend readings in this newsletter, a habit I paused for a while. Now, without giving a reason, I am starting again in a new section. This newsletter is, after all, called “Odds & Sods.”

Daniel Miessler is a 25-year cybersecurity veteran who has led security initiatives at Apple and Robinhood and is the founder of Unsupervised Learning, a leading voice at the frontier of tech and AI. When he talks about where technology is headed, the world’s biggest companies, and the top minds in the industry, listen. In August, he wrote a newsletter edition that I can’t stop thinking about. Now I want you to read it and tell me what you think:

Silicon Valley’s soul is at stake. As Trump’s second term reshapes the power landscape, tech’s long-standing liberals are either capitulating or making exit plans—while the most influential CEOs cosy up to the new regime. Venture capitalists and founders who once championed progress are now hedging their bets, caught between fear and self-preservation. WIRED's big story explores how the Valley’s iconoclasts became cautious courtiers, and why the industry’s boldest voices are suddenly so quiet. The revolutionaries who built the future now wonder if they still have a home.

Thank you for reading this newsletter right to the end.
