Fact from Fiction: Bullshit in the Age of AI
- David Ando Rosenstein
- Apr 8
Harry Frankfurt’s seminal essay On Bullshit draws a vital distinction: the liar knows the truth and tries to conceal it; the bullshitter, by contrast, doesn’t care about truth at all. The bullshitter operates with indifference to reality—what matters is persuasion, impression, performance. In the past, bullshit was largely human. Now, with the emergence of AI and machine-generated media, bullshit has scaled.
Welcome to the next era of epistemic chaos.
A New Era of Bullshit
Artificial Intelligence is now capable of creating hyper-realistic text, images, audio, and video—none of which require a direct connection to reality. Deepfakes can manufacture convincing visual “evidence” of events that never happened. Language models can produce articulate, confident prose that masks uncertainty or fiction with fluent bullshit. These systems don’t lie intentionally—they don’t know what truth is. They are perfect bullshitters.
And they’re everywhere.
From news articles written by AI, to comment sections full of bot-generated responses, to synthetic influencers and staged events, we are entering an age where information can no longer be taken at face value. Frankfurt warned that bullshit is more dangerous than lies because it erodes the very conditions that make truth meaningful. Today, those conditions are under siege.
Freedom of (Manipulated) Information
In the past, “freedom of information” was a rallying cry for democracy and transparency. But freedom without structure, standards, or responsibility has become a double-edged sword. Social media—once imagined as a utopian tool for open dialogue—has morphed into an unregulated marketplace of attention, outrage, and manufactured consensus.
These platforms are not built to promote truth or diverse voices. Their underlying systems—their algorithms—are optimised to generate engagement. That means whatever content keeps you scrolling wins, regardless of its accuracy or ethical value. Truth, nuance, and doubt are buried under a pile of rage-clicks, tribal content, and economic incentives.
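The ranking logic described here can be made concrete with a minimal sketch. Everything below is hypothetical—no real platform's code—but it illustrates the structural point: if the objective function scores only engagement, accuracy never enters the ranking at all.

```python
# Hypothetical sketch of an engagement-optimised feed ranker.
# Note that each post carries an "accurate" field, but the ranking
# function never reads it: accuracy is simply not in the objective.

posts = [
    {"title": "Measured take on policy", "engagement": 120, "accurate": True},
    {"title": "Outrage-bait conspiracy", "engagement": 9800, "accurate": False},
    {"title": "Careful fact-check", "engagement": 310, "accurate": True},
]

def rank_feed(posts):
    # Sort purely by engagement, highest first.
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

for post in rank_feed(posts):
    print(post["title"], post["engagement"])
```

The inaccurate outrage-bait post rises to the top not because anyone chose falsehood, but because the objective is indifferent to it—a small-scale illustration of Frankfurt's point encoded in software.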
In such an environment, “freedom of speech” becomes a hollow phrase—co-opted and weaponised by those who wish to dominate discourse, not liberate it.
The End of Observational Certainty
The epistemological shift is terrifying: we may no longer be able to verify anything we haven’t personally observed. But even observation itself can be compromised. A video of a politician saying something inflammatory? Could be AI. A photo of a disaster or protest? Might be a synthetic composite. Written articles, citations, even “expert opinions” can be fabricated or subtly altered.
This challenges one of the core assumptions of modern society—that shared facts exist and are accessible. The new landscape demands a radical shift in how we relate to information.
Beyond Content: Learning to Navigate Frameworks
What’s required now isn’t just teaching people what to think or even what’s true, but how to evaluate information within these new emergent systems. This includes:
Skeptical Epistemology: Treat all media with mindful suspicion—not cynical rejection, but critical curiosity.
Algorithmic Literacy: Understand how information is curated, not just consumed. Know what drives the system.
Source Triangulation: Never rely on a single source. Develop the skill of cross-referencing, not just absorbing.
Pattern Recognition: Learn to spot the signs of generated, biased, or manipulative content.
Value-Centered Reasoning: Ground your judgment in ethical principles, not just fact-checking.
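Of these skills, source triangulation is the most mechanical, and its core logic can be sketched in a few lines. This is a toy illustration with invented data, not a real verification tool: a claim is treated as provisionally supported only when at least two sources independently corroborate it.

```python
# Hypothetical sketch of source triangulation: count how many
# independent sources corroborate a claim before trusting it.

def triangulate(claim, reports, threshold=2):
    """reports maps each source name to the set of claims it corroborates.

    Returns (supported, confirming_sources), where supported is True
    when the number of corroborating sources meets the threshold.
    """
    confirming = [src for src, claims in reports.items() if claim in claims]
    return len(confirming) >= threshold, confirming

reports = {
    "wire_service": {"flood in region X"},
    "local_paper": {"flood in region X", "mayor resigns"},
    "anonymous_account": {"mayor resigns"},
}

supported, sources = triangulate("flood in region X", reports)
print(supported, sources)
```

Even this toy version surfaces the hard part: the code counts sources, but it cannot judge their independence—an anonymous account repeating a wire story is corroboration in form only. That judgment remains a human discernment muscle.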
We need to build a new set of discernment muscles—cognitive, emotional, and ethical tools that help us function in a world where appearance and reality are dangerously decoupled.
Toward an Ethic of Information Integrity
Social media platforms cannot be left to self-regulate. Their incentives are economic, not epistemic or moral. Without structural oversight, their systems will continue to erode public discourse, amplify division, and weaponise freedom against the very people who defend it.
We need new frameworks—legal, social, and technological—that prioritise integrity over virality. We need a civic ethic for the information age.
This is not just a call for individual mindfulness—it’s a demand for collective responsibility. The bullshit is no longer just in our culture; it’s coded into our systems.
And unless we intervene—educationally, ethically, and structurally—we may find that we’ve lost not just the truth, but the ability to care about it at all.