What Is Misinformation? A Plain-English Guide
You’ve heard the word a thousand times. Misinformation. It gets attached to everything from dangerous health rumors to political arguments to honest mistakes. But what does it actually mean — and why does the distinction matter?
Most people use “misinformation,” “disinformation,” and “fake news” interchangeably. That’s understandable, but it’s also part of the problem. When everything is called the same thing, it becomes harder to recognize what you’re actually dealing with — and harder to know what to do about it.
This guide breaks it all down in plain English.
The Three Types of Information Disorder
Researchers who study this field use a framework developed by journalist and scholar Claire Wardle that divides false or harmful information into three distinct categories. Each one works differently — and requires a different response.
1. Misinformation — false, but not intentionally harmful
Misinformation is false information shared by someone who believes it to be true. There’s no malicious intent. The person sharing it is simply wrong — and often doesn’t know it.
Examples: a friend sharing a health remedy that doesn’t work, someone reposting a news story from years ago as if it just happened, or a well-meaning relative forwarding a WhatsApp message full of inaccurate statistics.
The danger isn’t the intent — it’s the reach. Misinformation spreads easily precisely because the people sharing it believe in it.
2. Disinformation — false, and deliberately so
Disinformation is false information created and shared with the specific intent to deceive. Someone fabricated it deliberately, or spread something they knew to be false.
Examples: fabricated news articles designed to look real, doctored photos released to discredit a public figure, or coordinated social media campaigns spreading lies to influence an election.
Disinformation is a weapon. It’s manufactured, targeted, and often funded. The people behind it know exactly what they’re doing.
3. Malinformation — true, but used to cause harm
This one surprises most people. Malinformation is information that is technically true, but shared with the intent to harm.
Examples: leaking someone’s private medical records to damage their reputation, publishing a person’s home address to expose them to harassment, or selectively releasing real but out-of-context information to destroy someone’s credibility.
The content is factually accurate. The harm comes from the intent and the context.
Why Does “Fake News” Fall Short as a Term?
The phrase “fake news” exploded into mainstream use around 2016. It’s catchy and intuitive — but it has two serious problems.
First, it flattens important distinctions. A grandparent sharing a false remedy they genuinely believe in is not doing the same thing as a foreign government running a coordinated disinformation campaign. Calling both “fake news” obscures the difference in intent, scale, and danger.
Second, it has been weaponized as a dismissal. The term is now routinely used to discredit legitimate journalism that powerful people find inconvenient. When “fake news” means everything, it effectively means nothing — and that suits people who want to muddy the waters.
That’s not to say you should never use the phrase casually. But when it matters — when you’re trying to understand what you’re dealing with — the three-category framework is far more useful.
How Misinformation Actually Spreads
Understanding the type of false information is only half the picture. The other half is understanding why it travels so fast.
Research consistently shows that false information spreads faster and farther than true information on social media. A landmark MIT study found that false news stories were 70% more likely to be retweeted than true ones. The reason isn't primarily bots or algorithms, though both play a part; it's human psychology.
False stories tend to be more novel, more emotionally charged, and more surprising than accurate ones. Our brains are wired to pay attention to the unexpected. A story that confirms our fears or our hopes feels urgent in a way that a measured, accurate account often doesn’t.
By the time a correction appears, the false version has already reached millions of people — and corrections rarely travel as far as the original claim.
How to Protect Yourself
- Pause before you share. This is the single most effective thing you can do. Ask yourself: do I actually know this is true, or does it just feel true? Emotional resonance is not evidence.
- Check the source, not just the headline. Who published this? Is it a known outlet with editorial standards, or a site you’ve never heard of? Click through and read beyond the headline before forming an opinion.
- Search for the story elsewhere. If something significant happened, multiple credible outlets will be covering it. If you can only find the story in one place, that’s a warning sign.
- Ask: who benefits from me believing this? Disinformation always serves someone’s interests. Follow the incentives. Who gains if this story spreads?
- Be especially skeptical of content that makes you angry or triumphant. Strong emotions are the vehicle that carries misinformation. If a story makes you want to immediately share it, that’s exactly when to slow down.
- Distinguish between the claim and the evidence. A headline is not proof. A quote is not proof. A statistic with no source is not proof. Ask: what is the actual evidence behind this claim, and where can I verify it?
The Takeaway
Misinformation isn’t a new problem — rumors, propaganda, and false beliefs have existed throughout human history. What’s new is the speed, scale, and sophistication with which false information now travels.
The good news is that awareness is genuinely protective. Studies show that people who understand how misinformation works are significantly less likely to share it. You don’t need to become a professional fact-checker. You just need to build the habit of asking one more question before you believe — and before you share.
That habit, repeated millions of times a day by millions of people, is the most powerful antidote we have.
This is Article 3 in Viralium’s Learn series — practical guides to thinking more clearly in a noisy information environment. Next up: how confirmation bias makes smart people share false stories.