"A new slogan has emerged in the
culture: “Do your own research.” On internet forums and social media platforms,
people arguing about hotly contested topics like vaccines, climate change and
voter fraud sometimes bolster their point or challenge their interlocutors by
slipping in the acronym “D.Y.O.R.”
“Two days after getting the jab, a
friend of mine’s friend had a heart attack,” a Reddit user wrote recently in a
discussion about Covid-19 vaccines. “I’m not saying they’re connected, but
D.Y.O.R.”
The slogan, which appeared in
conspiracy theory circles in the 1990s, has grown in popularity over the past
decade as conflicts over the reliability of expert judgment have become more
pronounced. It promotes an individualistic, freethinking approach to
understanding the world: Don’t be gullible — go and find out for yourself what
the truth is.
That may seem to be sound advice.
Isn’t it always a good idea to gather more information before making up your
mind about a complex topic?
In theory, perhaps. But in practice
the idea that people should investigate topics on their own, instinctively
skeptical of expert opinion, is often misguided. As psychological studies have
repeatedly shown, when it comes to technical and complex issues like climate
change and vaccine efficacy, novices who do their own research often end up
becoming more misled than informed — the exact opposite of what D.Y.O.R. is supposed
to accomplish.
Consider what can happen when people
begin to learn about a topic. They may start out appropriately humble, but they
can quickly become unreasonably confident after just a small amount of exposure
to the subject. Researchers have called this phenomenon the beginner’s bubble.
In a 2018 study, for example, one of us
(Professor Dunning) and the psychologist Carmen Sanchez asked people to try
their hand at diagnosing certain diseases. (All the diseases in question were
fictitious, so no one had any experience diagnosing them.) The participants
attempted to determine whether hypothetical patients were healthy or sick,
using symptom information that was helpful but imperfect, and they got feedback
after every case about whether they were right or wrong. Given the limited
nature of the symptom information that was provided, the participants’
judgments ought to have been made with some uncertainty.
How did these would-be doctors fare?
At the start, they were appropriately cautious, offering diagnoses without much
confidence in their judgments. But after only a handful of correct diagnoses,
their confidence shot up drastically — far beyond what their actual rates of
accuracy justified. Only later, as they proceeded to make more mistakes, did
their confidence level off to a degree more in line with their proficiency.
The study suggested that people
place far too much credence in the initial bits of information they encounter
when learning something. “A little learning,” as the poet Alexander Pope wrote,
“is a dangerous thing.”
Anecdotally, you can see the
beginner’s bubble at work outside the laboratory too. Consider do-it-yourself
projects gone wrong. Power tools, ladders and lawn mowers are easily mishandled
by untrained users who know just enough to put themselves in danger. A study found that U.S. consumer injuries from
pneumatic nail guns increased about 200 percent between 1991 and 2005,
apparently as a result of the increased availability of nail guns that were
affordable for nonprofessionals.
Research also shows that novices
learning about a topic are vulnerable to hubris. Consider a 2015 study by one of us
(Professor Dunning) and the psychologists Stav Atir and Emily Rosenzweig. It
found that when novices perceive themselves as having developed expertise about
topics such as finance and geography, they will frequently claim that they know
about nonexistent financial instruments (like “prerated stocks”) and made-up
places (like Cashmere, Ore.) when asked about such things.
Likewise, a 2018 study of attitudes
about vaccine policy found that when people ascribe authority to themselves
about vaccines, they tend to view their own ideas as better than ideas from
rival sources and as equal to those of doctors and scientists who have focused
on the issue. That sense of authority makes them less willing to listen to
well-informed advisers than they would otherwise be.
There should be no shame in
identifying a consensus of independent experts and deferring to what they
collectively report. Our individual skills at vetting information are spotty.
You can be an expert at telling reliable cardiologists
from quacks without knowing how to separate serious authorities from pretenders
on economic policy.
For D.Y.O.R. enthusiasts, one lesson
to take away from all of this might be: Don’t do your own research, because you
are probably not competent to do it.
Is that our message? Not necessarily.
For one thing, that is precisely the kind of advice that advocates of D.Y.O.R.
are primed to reject. In a society where conflicts between so-called elites and
their critics are so pronounced, appealing to the superiority of experts can
trigger distrust.
The problem is compounded by the
fact that outsider critics frequently have legitimate complaints about advice
provided by insider authorities. One example might be the guidance from public
health officials early in the Covid-19 pandemic that people need not wear
masks.
Instead, our message, in part, is
that it’s not enough for experts to have credentials, knowledge and lots of
facts. They must show that they are trustworthy and listen seriously to
objections from alternative perspectives.
We strive to offer careful guidance
when it comes to our own areas of expertise. Even so, some D.Y.O.R. enthusiasts
may reject our cautions. If they do, we hope that they will nonetheless heed at
least one piece of advice: If you are going to do your own research, the
research you should do first is on how best to do your own research.