Craig Silverman is an award-winning Canadian journalist and author who is believed to have coined the term “fake news” in 2014. The founding editor of BuzzFeed Canada and now BuzzFeed’s media editor, based in Toronto, Silverman will speak at True North, Communitech’s three-day ‘Tech for Good’ conference that opens May 29.

From 2004 to 2015, Silverman wrote Regret the Error, a blog about media accuracy, verification, transparency and accountability that became a 2007 book of the same name. He has followed the “fake news” phenomenon since its inception and watched as the term was co-opted by U.S. President Donald Trump and, now, by anyone who seeks to cast doubt on stories they don’t like or don’t agree with.

Born in Nova Scotia and a Concordia University journalism graduate, Silverman has become one of the world’s leading experts on media, propaganda and misinformation. He spoke recently to Communitech News about the fake news phenomenon and the topics he hopes to address at True North.

Q – Welcome, and thanks for taking the time to chat. It must be somewhat surreal to see the way the term “fake news” has morphed from the meaning you originally intended back in 2014 – made-up information – to one that’s now used worldwide as an epithet for any story that someone, including Donald Trump, doesn’t like.

A – [Fake news is] an area I’ve been looking at for years. And people sort of cared about it, but it wasn’t like it was one of the key topics of discussion going on in the world, and then all of a sudden in the wake of the [U.S. presidential] election [in 2016] it was.

So that, first of all, was a crazy thing, and then to see that term become the term of the moment – to see fake news being used first, I think, by a lot of people in the media, and become a topic of conversation, a topic of concern. And the thing that was really wild to watch happen was to see fake news being used as an explanation for why Trump won – which I don’t think is true; I don’t think fake news was the reason he won the election. Then you saw Trump getting upset about that [claim]. You saw him taking that personally, I think, and at a certain point, and I don’t know if it was completely conscious or not, he decided that fake news was not going to be something people labelled him with; it was going to be something that he labelled other people with.

There was a press conference he gave, which to this day is the last real full press conference he has given as president, when he walked out and he called CNN fake news. I think that’s the moment when that definition started to shift. It really is an amazing thing that he was able to achieve: He took a term that had become a negative attachment to him and his presidential victory, and he flipped it around and he [now] uses it as a bat with which to beat the media.

I honestly think, at this point, fake news is more what Trump says it is than probably what anyone else says it is. The term has become so weaponized and redefined that he has made it what he wants it to be. Or at least he has created enough confusion about it that it has been rendered less effective.

Q – As you’ve written about in the past, we now live in a time “when real information is painted as fake and manufactured bullshit is presented as fact.” It’s a strange place to be ...

A – It is. It’s a very complex, strange and destabilizing moment right now.

The good thing is that there is really broad awareness of how easily the media that we interact with every day can be manipulated, whether that is social media accounts that can be bought on the open market, whether that’s people paying to get likes and shares for their content, or whether it’s folks being able to create and spread things that are completely untrue.

On one hand, it’s a media environment that is great because it’s democratized and more voices can play in it, but that also opens it up to incredible amounts of manipulation and misinformation. So we’re in a moment right now where we’re all trying to figure this out and I think that’s why it feels chaotic and difficult for people, because we realize there are all of these problems but we don’t know what all the solutions are yet.


Q – You’ve talked in the past about the need to delineate between honest errors made in the course of reporting and people who lie for profit. How do you arm people with the tools to make that distinction?

A – There are a few key players in this. One is, for the average person, I think we have to find ways to have everyone realize that the information environment that we’re all operating in is drastically different than it was 15 years ago. It’s much more confusing. It’s much more chaotic. We are overwhelmed with information and we have it at our fingertips at all times.

We have more choice, but that means we have more responsibility. We have to think about where [a given] piece of information comes from. For instance, we may see something that was shared by somebody we know and like but we’ve never heard about its [source] website before.

We have to do due diligence on the stuff that comes across our gaze, and I think we have to recognize that we also have some power in what we choose to share or retweet or like: Are we giving it oxygen or starving it of oxygen? Every little action you take does actually have an effect on whether that content is shown to more people or not.

There is definitely a role for the media in doing their job responsibly – not knowingly or unknowingly spreading stuff that’s misleading or false, and also trying to direct people to quality information.

There’s a role for education. How do we equip the next generation, but also how do we help people of all ages navigate this world in better ways?

The last piece is, I think there’s a big role for the technology companies, particularly the big platforms, to think about how they do a better job of keeping bad actors and clear, 100-per-cent false information off their platforms. There’s certainly more that they can be doing.


Q – Big news outlets, the ones that in the past we would have regarded as responsible, are now in a position where, if they make an error and admit it, they open themselves up to a Donald Trump or a Trump-like person pointing and saying, ‘You see? Fake news!’ It’s a difficult place to be when they’re trying to legitimately and honestly report what is going on.

A – I’ve spent a long time looking very specifically at the kinds of mistakes journalists make, intentional ones and unintentional ones. I’ve never seen an environment like this one, where a mistake that is made unintentionally and then corrected in good faith can still be wielded as a weapon against the credibility of that organization.

The traditional way that corrections have worked is that they are trust-building: You make a mistake but you correct it and you do so publicly and quickly and that’s a way to restore trust and show you’re worthy of trust, because you’re actually willing to admit mistakes.

But today, because people are really looking for ammunition and have huge networks of websites and Twitter accounts and other things to push out messages, you are seeing journalists who work in good faith, and make a good-faith mistake, being punished for correcting it. And you see people who make stuff up and never acknowledge it never having to face [any consequence]. That’s a really difficult scenario for journalists to operate in, and the outcome is not good for the public. There needs to be a cost if someone is consistently spreading unreliable information. There should be a reputational cost.

And for those who are willing to acknowledge when they get something wrong, there should be a benefit for them, but it doesn’t always work that way.


Q – You’ve mentioned before that the current environment has become so polarized and politicized that it now jeopardizes our ability to confront it. What do we do?

A – There is a non-partisan conversation to be had about how we balance knocking down misinformation and making sure free speech is not harmed in the process. There is a good, productive, non-partisan conversation to be had about the roles platforms should and should not play in that, about what we expect from political leaders in terms of their accountability and their honesty, and about the appropriate way to call them out when they’re not being honest and not being truthful. That reasonable middle ground feels like it has collapsed; people are behaving in tribal factions, and so the work of looking at this stuff and figuring out what we as a society should be doing becomes harder, because people are off in their corners looking for ammunition to use in a battle rather than sitting down to figure out common ground. That is one of the things that worries me. The conversation around how we deal with online misinformation has become so rife with misinformation and partisanship that we can’t be as productive about it as we maybe could be.


Q – You’ve also spoken before about people being prone to uncritically accepting information they want to believe is true and then sharing it. Is there an answer to that one?

A – When you boil a lot of this down, one of the core things about online misinformation and the whole fake news conversation is that, at one level, it comes down to human behaviour and human psychology. We’re inclined to believe things that align with things we already believe to be true or things that are in our self-interest. That’s not going to change.

What certainly has changed is our ability to be actors and propagators and distributors and creators of that kind of information. So, in the past, sure, you might consume media that aligned with your views, but the universe of the stuff you could pull from was relatively limited. Now it’s unlimited. So we have a lot more extreme stuff. When people encounter that and it aligns with what they believe, they then gravitate towards it. That’s the piece that’s different. Human behaviour hasn’t changed but I do think our natural human behaviours make this environment even more challenging. It’s very easy to gorge on the stuff that aligns with everything you believe. And nobody really wants to spend time reading stuff that challenges them and goes against what they already think or have already decided is true.

There is some good news. A lot of the recent research looking at people’s media diets has found that there certainly is a segment of the population that is off on the extremes. But some research suggests we’re maybe a little more diverse in our media consumption than people had feared. Everyone had come to the belief that we’re all stuck in filter bubbles and echo chambers, and that’s the way it is. We’re not there yet. There is still work to be done to make sure we don’t fully get there and to think about how we pull people back from the extremes, but we’re not so far gone yet.


Q – Democracy and a fair exchange of information can only happen in a place where there’s a reasonable level of education, intellectual curiosity and interest in learning about events. How does one combat a lack of awareness? Or a lack of curiosity? How does one combat gullibility?

A – There are people who are exploiting ignorance and bias for, in some cases, political or ideological reasons, but there’s also a business model in that now more than ever before. The business model for online misinformation is absolutely better developed than at any time in human history and it’s also an international market. You have people overseas who target Americans and Canadians and people in the U.K., because as an advertising audience they’re worth more than the people in that person’s home country. For them, it’s much cheaper and easier to just make stuff up or to steal content that’s extreme or partisan and copy and paste it on a website and promote it. Preying upon folks is a real thing and a real factor. Facebook created this economic model, and now they’re trying to undo that.

Q – Is better education an answer? In the U.S., right now, you have teachers being compelled to march for a living wage and to gain simple classroom resources. Does an eroding education system have a role to play in making people vulnerable to fake news?

A – Education is definitely a piece in this, but I think we have to be careful of thinking it’s going to solve everything. But absolutely, if you have people who aren’t being equipped to navigate this world, not only in terms of information but other elements of society, it’s a problem. An uninformed society is one that’s easier to manipulate, easier to capture and easier to push in directions they otherwise wouldn’t go. An informed society is certainly one of the bedrocks of democracy. So, if we allow educational systems to erode, that’s going to affect society as a whole. It’s going to trickle down to the workforce.

Last night I was on a panel speaking to a room of a couple hundred teachers in Ontario. They’re going to be receiving a new curriculum to help teach information literacy. We’re seeing more investment in that. It was great to see a room full of teachers who were excited about trying to bring these kinds of skills and thinking into their classrooms, and to [help students] evaluate stuff on their own.


Q – Are you looking forward to True North? Can you give us a preview of what you will talk about?

A – It’s an amazing group of speakers who have been brought together for this [conference]. One of the things I want to talk about is this environment we’re in, and the elements of manipulation. And also, for the folks who are trying to build technology and build companies, they need to understand the ways they can be a good player in this environment or a bad player in this environment – to think about, as they’re building products, what kind of bad actors might take [their IP] and exploit it. There are so many examples out there of products where people could not imagine how they could be caught up and used for nefarious means, whether that’s related to information or other things, and suddenly it’s happening.

This interview has been edited for brevity and clarity.