- Claire Wardle is an expert on online misinformation and disinformation. She provides training to journalists on how to identify and verify misinformation, especially as it relates to the 2020 US election.
- Wardle discusses the different types of misinformation including disinformation (knowingly false information spread to cause harm), misinformation (false information spread unintentionally), and malinformation (genuine information spread to cause harm).
- She notes that the tactics of spreading misinformation have been evolving since 2016, moving away from fabricated content toward misleading use of genuine information shared out of context. Wardle emphasizes the shared responsibility of platforms, government, journalists, and the public to address the spread of online misinformation.
Online Misinformation with Claire Wardle
CLAIRE WARDLE
DEMYSTIFYING MEDIA
Dr. Claire Wardle is the cofounder and executive chair of First Draft, the world’s foremost
nonprofit focused on research and practice to address myths and disinformation.
Claire is a leading expert on user-generated content, verification, and misinformation who
has worked with newsrooms and humanitarian organizations around the world, providing
training and consultancy on digital transformation. She’s a former research fellow at
the Shorenstein Center for Media, Politics, and Public Policy at Harvard. Her previous
roles also include being the research director at the Tow Center for Digital Journalism at
Columbia University.
Dr. Wardle earned her PhD in communications and an MA in political science from the
University of Pennsylvania. Follow her on Twitter: @cward1e
The Hearst Demystifying Media seminar series was launched in January 2016. Curated
by Damian Radcliffe, the Carolyn S. Chambers Professor of Journalism at the University
of Oregon, it provides a platform for leading media practitioners and scholars to talk
about their work.
Through a combination of guest lectures, class visits, podcasts and TV studio interviews,
the series seeks to help students and faculty at the University of Oregon – and beyond – to
make sense of the rapidly changing media and communications landscape.
Previous speakers have come from a wide range of organizations, including the BBC,
Facebook, NPR and Vox, as well as leading academic institutions such as Stanford, Columbia,
Virginia and George Washington University.
Access the archive at: http://bit.ly/DemystfyingArchive
Welcome back to Oregon (Dr. Wardle
previously spoke in this series in 2016). Can
you tell us a bit about the training you're
doing with journalists around the issues
that are likely to emerge during the 2020
election?
Yeah, so as we head into election season,
our biggest concern is local newsrooms,
because if I was trying to push conspiracies
or rumors or hoaxes, I’m going to go where
the industry is most vulnerable, and that
happens to be small newsrooms, many of
which have been stripped of resources. They
have fewer staff and less editorial oversight, and so
in many ways, if I am going to be successful,
that’s who I’m going to target.
We’ve been very generously funded by a
number of foundations to get out on the
road, essentially, and train small local news
organizations.
We’re going to 10 journalism schools so
that we can also train up some journalism
students, and we’re paying for journalists to
come in, to fly into these trainings, and we
train them on known tactics and techniques,
things that they might see in 2020.
We’re teaching them how to spot this kind
of information, how to verify it, and then
when they see it, should they report on it?
And if they report on it, how should they do
it so they don’t actually cause more harm by
amplifying the rumors?
Dr. Wardle speaking at a public event hosted by the University of Oregon, January 2020.
Journalistic instinct is that if you see
something in the wild which looks a bit
strange, you want to report it, but can that
be detrimental?
Absolutely, because that’s the tactic. The
tactic is if we give something juicy enough to
a newsroom, they're going to run it. Maybe
they run it because they've been fooled, or
maybe they run it to debunk it, to say, "This
is not true."
In the process of doing that, if you’re a
newsroom that has a megaphone, essentially,
you are giving legitimacy and oxygen to that
rumor. But the flip side is:
As a journalist the temptation is to say,
“Well, actually this is the truth,” and
to debunk it immediately. What's the
alternative, if spreading the truth also
just perpetuates the disinformation?
That's what agents of disinformation are
counting on. One of the central tenets of
journalism is that sunlight is the best
disinfectant, but actually they're hoping
that you're going to report on it.
If it’s a small niche rumor, you reporting on
it does exactly what they want, which is a
megaphone. They don’t necessarily have that
megaphone.
The challenge is, should you report on it?
And if you decide yes, actually it’s traveled
far enough that we should, how do you word
the headline?
The famous example is you wouldn’t say
Obama is not a Muslim, because our heads
are like, “Uh.” You sort of connect the dots.
Instead, you should say Obama is a Christian
and show imagery of him going to
church with Michelle, because the way our
brain works is if you tell me that something
isn’t true, I’m like, “Okay,” but you’ve created
a hole in my brain, and I need it to be filled
with the alternative.
Unfortunately, the way that we write a lot
of debunks, the way we talk about it is say,
“This is false. This is a myth,” like “Five Myths
About the Coronavirus.” I might read it as
myths, but five minutes later, if you ask me,
I’d be like, “Oh, I can’t remember if that’s true
or not.” We just have lizard brains.
Journalists are now unfortunately having
to work within that context of us as humans
being not particularly good at making the
distinctions between true and false.
WHEN A RUMOR HAS
BECOME SOMETHING
THAT IS STARTING TO
BE BELIEVED, NEWS
ORGANIZATIONS HAVE A
RESPONSIBILITY TO SLOW
DOWN THOSE RUMORS
AND TO SAY, “NO, WE
KNOW IT’S NOT TRUE.”
BUT THERE’S A SCIENCE TO
HOW TO DO THAT.
You talk about misinformation,
disinformation, and even mal-information.
What is the difference between all of these?
I’m a little bit of a stickler for typologies,
only because if we don’t talk about this with
clarity, it confuses what’s already a complex
situation.
Disinformation is false information that
the people who create and share it know is
false, and they're deliberately trying to
cause harm.
Misinformation is also false information,
but the people sharing it don’t realize it’s
false and they don’t mean to cause any
harm. The bigger problem here is lots of
people on Facebook believing things,
sharing them, thinking they're true, when
they're actually false.
Mal-information is genuine information,
but by sharing it, you’re doing so to cause
harm. For example, revenge porn. That’s
genuine. Or leaking Hillary Clinton’s emails.
That was true information, but it was leaked
to cause harm.
If you’re a whistleblower and you’re doing
something in the public interest, that’s not
mal-information. But unfortunately a lot of
the stuff that we see is genuine; it’s just out
of context, and it’s being shared to cause
harm.
Dr. Wardle and Professor Damian Radcliffe, at a public event
hosted by the University of Oregon, January 2020.
Those distinctions, I think, are useful to
improve the way that we talk about this as
a society, because those distinctions matter.
Is there also an important distinction in
terms of motivation and intent?
Yeah, there are three main motivations for
people to do this kind of stuff.
The first is financial. A lot of this is basically
people wanting to drive clicks to websites, so
they can either sell nutritional supplements
or they can make money off advertising.
The second is political or reputational
benefit. A lot of this we see is trying to get
one up on somebody or even to cause harm
to another business.
But the third category is just social and
psychological.
There are people who just do this because
they want to see if they can, and they like
the idea of hoaxing journalists. Sometimes
people just do it to see what’s possible.
One of the challenges in addressing this,
is that the more you go into this arena, the
more complicated it becomes.
Absolutely. I think just after the 2016 election
in the U.S., America really woke up to the
problem. I think globally there was more
recognition, but really in America that was
when people recognized it was a problem.
I’d go to convenings where you’d have lovely
Beryl from the library who wanted to talk
about media literacy in the same room as
Chad from the CIA, who wanted to talk about
disinformation against countries. It’s like, “I
don’t all think this is the same thing.”
I do think over the last three and a half years,
there’s been a recognition that this is a
really complex space, and we actually need
to be very clear about what’s the difference
between cyber operations, country against
country versus how do we make sure that
my mom doesn’t share a false meme on
Facebook. I think that complexity is becoming
clearer to people.
Are there different types of disinformation
that we should be more aware of than
others? How do you see that evolving?
Lots of people, when they think about this
space, think about purely fabricated content.
There’s a lot of concern now about deepfakes,
which is using artificial intelligence to
generate videos of people essentially saying
anything. The likelihood of that happening,
particularly in 2020, is actually quite low. The
technology is not sophisticated enough yet.
WE HAVE TO RECOGNIZE
THERE ARE THREE DIFFERENT
MOTIVATIONS, AND IF WE’RE
GOING TO TRY AND THINK
ABOUT SOLUTIONS OR
WAYS TO MITIGATE THIS, WE
HAVE TO TACKLE EACH OF
THOSE MOTIVATIONS VERY
DIFFERENTLY.
People like to focus on, “Oh my goodness,
the fully fabricated stuff.” Again, if you’re
trying to sow this stuff, the most effective
disinformation is that which has a kernel of
truth to it.
If you’ve seen it work elsewhere, and it is a
genuine photo or a genuine video or piece
of content, that’s much more likely to be
effective.
That’s the stuff that keeps me up at night.
How has this space changed and evolved
since 2016? This space is moving very
quickly, including the techniques being
deployed and the actors behind them.
Since the election, we’ve seen the
platforms make changes to their policies.
They have become smarter at identifying
bot networks and fake accounts. Not
enough. But because of those changes,
we’ve seen the actors shift.
For example, you can’t now buy ads on
Facebook in a Russian currency. Because of
that, what we're seeing are tactics that try
to get Americans to do this type of work.
We have seen outside actors basically
infiltrate groups on Facebook trying to get
certain narratives pushed.
We see less bot activity and we see
more cyborg activity, which is humans
essentially publishing as many times as a
bot would, but they are real people. So it’s
harder for Twitter to do something about it.
Those kinds of shifts have taken place.
Also, the policies are clear now about what
they will take down. That’s why we see less
false content, because their policies in many
cases say we need to take that down, and we
see more misleading content, which is much
harder to do anything about.
Or the use of satire. If you label something
as satire, then the fact-checkers can’t touch
it. You can say, “Oh, I was just joking.” You
are pushing lines and narratives that are
damaging, but most people don’t necessarily
know that. On Facebook, everything looks
the same.
THE BIGGER CONCERN OF
THINGS THAT WE SEE IS WHAT
WE CALL FALSE CONTEXT,
WHICH IS GENUINE CONTENT
THAT HAS BEEN SLIGHTLY
TAMPERED WITH OR IS USED
OUT OF CONTEXT, BECAUSE
IF YOU'RE TRYING TO
SOW HARM, YOU CAN DO
MORE WITH GRAINY CCTV
FOOTAGE THAT'S BEEN
SLIGHTLY EDITED OR WHERE
YOU CAN'T SEE WHAT'S REALLY
IN IT, OR AN OLD PHOTO THAT
YOU REPURPOSE WITH A NEW
CAPTION. THAT’S THE STUFF
THAT WE SEE A LOT MORE OF.
There have been subtle yet pretty critical
shifts in the tactics. My concern is that if we
prepare to fight what happened in 2016, we
won't be prepared for 2020, because things
have changed.
Would you like to see the platforms doing
more? Where does the responsibility lie
between them - as the distributors of a lot
of this content - versus consumers and the
responsibility we need to take to be more
critical consumers of information?
There isn’t one body that can solve this.
It shouldn’t be all on the platforms, all
on government, all on us. It should be
everybody. The challenge is if we’re all
going to be involved, then we need to get
more from the platforms to understand
what’s happening.
We started this podcast talking about when
reporters should report on something. Well,
if reporters don’t know how far something
has spread on Facebook or YouTube, and
they don’t know how many people have seen
something, they don’t necessarily have the
full information about whether or not they
should report on it.
Without visibility into that spread, it's an
opaque system, and we don't have the data
to make proper decisions about what to
study, what to report on, or the impact
on society.
A full-house for Dr. Wardle’s talk on ‘Disinformation in the
US 2020 Presidential Election,’ January 2020.
We can’t really tell whether or not
disinformation impacted the 2016 election
because we’re not sitting on the data that
Facebook is sitting on.
Unfortunately, our lizard brains mean if we
see content that reinforces our worldview,
as humans, we want to feel like, “I was
right,” and you want to share the fact that
you were right.
The most effective agents of disinformation
play on our emotions and our fears, and that
works.
Until we all realize that what looks like a
random share while you’re standing in line for
a coffee actually adds up to a really polluted
information ecosystem, this will continue.
We’re a huge part of that, and I don’t think
we as a society have recognized our role in
all of this.
What catalyzed you to get into this field?
Back in 2007, I was an academic at Cardiff
University in Wales, in the UK. I was looking
for another research project, and I became
obsessed with the fact that people were
starting to email news organizations, and
they would send pictures of snowstorms or
other things.
I was like, “How would the BBC know whether
or not that’s true?”
So I did a big piece of research for the BBC on
user-generated content. I had one question
on one survey about Facebook, and that
was it. It’s kind of astonishing how far this
world has shifted in a decade.
There’s been a lot of changes, which has
made it continue to be interesting.
But when your very niche, wacky research
project becomes a thing that the rest of the
world cares about, it’s a pretty astonishing
feeling. I feel very, very fortunate that I
can do the work that I do.
If we aren’t able to go to your training
sessions, how should journalism students
here learn verification skills?
It’s still shocking to me that I go to lots of
J-schools and I get asked to come and do
guest lectures, but it’s not in the curriculum.
It’s shocking to me because these are the
skills that newsrooms absolutely need, and
the sad truth is that lots of even national
newsrooms don’t have many staff who know
how to verify material from the social web.
THERE IS A LOT MORE
THEY CAN DO, BUT ALSO
WE NEED TO REALIZE THAT
IF WE DIDN’T SHARE THIS
STUFF, WE WOULDN’T
HAVE A PROBLEM. THIS
STUFF IS EFFECTIVE
BECAUSE WE’VE BEEN
WEAPONIZED.
I say to J-school students, if you want to
stand out from the crowd, this is absolutely
the skillset that you should get. We have
lots of free trainings and materials on our
website for exactly this reason, which is, if
you’re a go-getter, you can do this work.
There’s also an incredible community of
journalists online on Twitter who help each
other, share tools and tips. They do weekly
quizzes around how you can do this better.
It’s a great community to get into.
We have a 200-level class called Fact or
Fiction, which all students in our J-School
have to take, and we teach elements of
this in a social media class, but we need
to expose more people to these issues and
keep on top of how fast this is changing.
I’ve probably trained 5,000 journalists
around the world in the last 10 years; and
I’ve probably changed behavior in 25,
because the truth is, you can come to a
training session and say, "Oh, that was great,
Claire," but if you don't then go back into a
newsroom and practice those skills, you
lose them. The challenge is: "how can we
ensure that actually there are opportunities
for people to just keep their skills up, and
to recognize that every day you can verify
information?”
Even in the last month we’ve had the
downing of the Iranian plane, we’ve had the
coronavirus, you’ve got climate, bushfire
rumors in Australia.
Almost any story now has an element of
misinformation connected to it, and so this
should be built into the curriculum as a
running piece in reporting classes and other
types of classes.
How do we go beyond journalism schools
and newsrooms to reach members of the
public with these skills? They also need to
be equipped to be critical consumers of
news and information.
I think just after the election in 2016,
everybody was like “How do we solve
this problem? If only we just tweaked the
algorithms, we’d get out of this mess.”
I think now there’s a recognition that this
is just our new reality, and actually it’s
about building resiliency. I have freckles, so
if I go outside I have to put on suntan lotion,
and that's just the reality.
I think we need people to recognize that
we’re never going to clean up our polluted
information environment, and instead it’s
how do people navigate that environment.
That means giving them the skills and tools
to do so.
I ABSOLUTELY AM TELLING
YOU NOW, THERE ARE
BIG JOBS OUT THERE
FOR PEOPLE WHO HAVE
SHOWN THAT THEY’VE
GOT THESE SKILLS.
The skills and tools that you would get
taught in Fact or Fiction or in the New York
Times newsroom, most of those tools are
free. Actually, my mom or anybody else
could use those same tools.
We just need to be better at teaching them
and showing what those steps are, because
if you spent 60 seconds on most
misinformation, you could figure out that
it's not true.
The thing is we don't have the muscle memory
to say instinctively, “When I see something
on Facebook, let me just Google that source.
Ah, they’re known for conspiracies,”
“Let me just reverse image search that meme
I’ve just seen on Instagram. Oh, actually, that
was from three years ago. That’s not from
China and the coronavirus.”
We just need as a society to build that habit, and we
need to say to one another, “Hey Damian,
couldn’t help but notice that you just shared
something. I do it all the time, too, Damian.”
I shouldn’t shame you. “I do it all the time. I
don’t know if you know this, but if you just
download this Chrome extension, this might
stop you doing it in the future.”
We need to hold each other to account. If you
throw a can of Coke out of the window, I’m
like, “Damian, I don’t want you to litter. We
both live in this society.”
Dr. Wardle being interviewed in the Demystifying Media TV Studio.
Watch the video on YouTube.
We just aren’t talking about it in those terms.
Are people inclined to go to that level of
effort? A lot of people would think that’s
actually quite hard and “Well, I don’t
have 60 seconds. What’s the six-second
version of this?” or “It’s somebody else’s
responsibility. It’s not mine.”
Well, the six-second version is that we need
to build in friction to the system.
The six-second version is just to say, “We
need to be really aware that what we share,
there’s a responsibility that comes with that.”
I don’t like to talk about shame as a way to
do this, but I do think society has to change.
For example, 50 years ago, if we were at a
party and you were drunk, I might be like,
“Get home safe.” There wouldn’t be any
shame about me letting you drive home.
Now if I didn’t take the keys away from you,
there would be society saying, “No, you don’t
let people drink and drive.”
It sounds silly, but ultimately we have to get
to a position which is if we can share anything
and publish anything, then we get into a
mess. Instead, what’s the responsibility?
Look at the coronavirus: I really worry about
what I’m seeing being spread. Actually, the
fear around it might be worse than the actual
virus. In a situation like that, we have a
responsibility for what we share because we
could cause harm simply by sharing.
We have to have that societal level, which
is if you can’t check it in six seconds, don’t
share, because if you share something
that’s false, let’s recognize that that could
cause harm.
… I have to say, I see individual New York
Times reporters sharing all sorts of false
information that they would never be able to
publish in the newspaper.
I think this conversation is overdue, and
I think there’s now more of a recognition
that as trust in news organizations and
journalists declines, everybody has to take
responsibility.
A badly-thought-out tweet on a Saturday
afternoon can do really serious damage.
ACTUALLY, IF FACEBOOK
REFUSED TO LET ME SHARE
SOMETHING IF I HADN’T
READ IT OR HADN’T OPENED
THE LINK, OR IF TWITTER
REFUSED TO LET ME RETWEET
SOMETHING UNLESS I'D
WAITED 30 SECONDS, THERE'S A
HUGE AMOUNT OF EVIDENCE
THAT THAT WOULD TAKE
OUT A LOT OF THIS STUFF.
OUR LIZARD BRAINS
REACT IMMEDIATELY, BUT IF
YOU SLOW DOWN, YOU
DON'T DO THAT.
So much of this is ego. They want to be
seen to be the person who’s got the most
information. Obviously, a couple of days
ago when Kobe Bryant died, I had to get off
Twitter because it was just an absolute mess.
It was journalists wanting to say, “I know
more than the next person.” It’s like, well,
that’s not helping anybody.
What gaps do you see in terms of our
knowledge and research? What are the
opportunities to do something meaningful
and impactful which has perhaps slipped
through the cracks?
First Draft is a global organization. For
example, this year there’s also an election in
Myanmar. When you’re in the U.S., it’s very
easy to become U.S. focused, but First Draft
as an organization does work globally.
We’re also trying to do a lot more work now
on not just election-related disinformation.
As we just talked about with the coronavirus,
in much of the rest of the world the biggest
concern is health and science misinformation,
or misinformation about food. For me,
there's real harm around that. Even in the
U.S., with the anti-vax movement, to see
measles blow up again when we thought it
had been eradicated is heartbreaking. Same with
polio in places like Pakistan and Afghanistan.
We shouldn’t be moving backwards in this,
and that’s what I really worry about. At First
Draft, we’re trying to do a lot more work in
that kind of area.
In terms of research, there are so many
fascinating research questions that we just
need to work harder on.
I’ll say this, I think the research community
has become a little bit paralyzed by saying,
“We need data from Facebook. We need data
from Facebook.”
They’re not going to hand it over anytime
soon unless there’s government regulation,
but we should be doing more research with
audiences.
How concerned are audiences? How do
audiences really consume information on
their mobile phones? How do they share it
with their friends via email?
We don’t talk about how conspiracies spread
on email from crazy Uncle Bob.
We focus on WhatsApp.
But alongside email, I'm fascinated
by TikTok and new visual platforms where
we are already seeing disinformation and
conspiracies spread.
So, I’d love to see faster turnaround, peer-
reviewed academic research, but in a time
frame that actually helps practitioners
with their work right now.
MY FRUSTRATION SLIGHTLY
WITH THE ACADEMIC SETUP
IS WE’RE SEEING PAPERS
PUBLISHED NOW ABOUT
THE 2016 ELECTION. I’M LIKE,
“THANKS FOR DOING IT, BUT I
NEED TO KNOW WHAT TO DO
NOW FOR 2020.”
I’d also like to see a lot more global
research. Most of the research has been
done in the U.S. with American audiences,
and this space looks very different if you
live in Brazil or you live in Indonesia.
I’d love to see partnerships between and
across borders. There’s so much that needs
to be done.
In the absence of good, rigorous research,
we’re seeing really poor regulation.
Governments are moving because
they’re terrified and they think that they
should do something. Also, as an aside,
they're politicians, so they're worried that
disinformation is going to affect their own
elections. But they are basically passing
laws based on a vacuum of information, so
there's a real need for rigorous empirical
research. We’re not seeing enough of it,
and we’re not seeing enough of it quickly
enough.
One of the things you've talked extensively
about during your visit is encouraging
newsrooms to think – and hire - differently,
by bringing in people who can potentially
help newsrooms avoid making some of the
mistakes that you see and report on.
Absolutely. So much of this is about
understanding internet culture.
If you have grown up with the internet, there
is an understanding of how that culture
works, and sometimes you see newsrooms
write pieces that are really difficult to read
because it’s like, oh, God, you’ve picked up on
dog whistles without realizing that they were
dog whistles. You’ve missed the point of that
meme. So I would love to see that.
So there is a challenge there: a lack of
trust from communities who are
being targeted. They are not talking to their
newsrooms about anything. They’re talking
to their community news organizations.
We’ve been banging on about this for
years and years and years, that diversity in
newsrooms is absolutely critical, always has
been, but when it comes to this topic, it’s
even more important.
THE OTHER PIECE OF
THIS EQUATION IS
COMMUNITIES THAT ARE
MUCH MORE VULNERABLE
TO DISINFORMATION
TEND TO BE COMMUNITIES
OF COLOR, AND THOSE
COMMUNITIES ARE
MUCH LESS LIKELY TO
BE REPRESENTED IN
NEWSROOMS.
Watch full talks from the series on
YouTube
In a hurry? Catch the key lessons in these
TV Studio Q&As
Listen to the Demystifying Media podcast
on iTunes, Spotify and SoundCloud