Past Event

Decoding the Disinformation Problem

While the current conversation about information operations – including disinformation – is largely focused on the 2016 and upcoming 2020 US federal elections, the challenge is an age-old one. During the Cold War, both the US and the Soviet Union engaged in soft diplomacy in an attempt to influence outcomes across the globe. Since the expansion of the Internet and the development of sophisticated online targeting tools, organized disinformation campaigns, as well as homegrown disinformation efforts, have influenced elections, incited violence, and posed a serious challenge for policymakers and the major internet platforms.

Moderated by CNN anchor Kate Bolduan, this event examined the overall disinformation landscape in order to frame the policy issues that need to be addressed, offer a historical overview of disinformation campaigns, assess current patterns, and discuss the challenges ahead – including protecting vulnerable places where disinformation can lead to violence.

Selected Quotes

Jane Harman

"It's more important than ever to provide real facts and trustworthy information. Everyone's being tested, including the Wilson Center. We're doing our best, but we need our Fourth Estate — the media — to do its part."

"In addition to deepfakes, state and non-state actors nowadays infiltrate group chats on social media and use fake online identities to seamlessly blend into a user's news feed. For example, in Ukraine, Russians have attempted to rent the Facebook accounts of Ukrainian citizens so they can post political ads and circumvent Facebook's regulations."

Nina Jankowicz

"While a lot of the tactics are the same, the tools have changed. … Social media allows those bad actors [including Russia] to spread those messages much more quickly. They allow them to travel at lightning speed, and also allows those messages to be targeted to the very people who will find them most appealing. That's what makes what we're seeing today so much more difficult to counter."

"Education is really important, and I call these citizens-based solutions — not just looking at children when we're talking about education, but voting-age people and helping them navigate the flow of information online."

Jessica Beyer

"I'm sure everybody here has the experience of someone you trust…[sharing] a story on Facebook or another place, you read the headline, and share it because you trust that person. There's research out there that shows that that type of process is the perfect fertile ground for disinformation to spread. … We don't want people to not trust each other. Societies need trust to function well, particularly democracies, so how do we give people the critical thinking skills to ask that question?"

"We can see how information is moving, we can understand ways in which certain types of platforms are being used to spread that disinformation, we can see people reacting to the spread of memes on Twitter. … My sense is that, can we qualitatively or quantitatively say what exactly happened [due to disinformation during the 2016 election]? Probably not. But we can say that we're in a different landscape in which you have organized actors working to spread information."

Ginny Badanes

"What's the saying? 'A rumor will make it around the world before the truth can get its pants on.' That's just amplified by how we're connected. … Information travels faster between us, and there are benefits to it, but it also means the bad stuff gets through."

"Leading up to 2016, I worked on campaign tech for the company [Microsoft]. So I worked with political campaigns on how they use technology, sort of an evangelist for tech. And that included some security features, but that was not our focus. Clearly, my title now is cybersecurity and democracy. We're focused on campaign security, election security, and disinformation defense. So the whole fact that that team got spun up…as a company we recognized that we had a role to play to help protect democratic institutions."

Katie Harbath

"We're a completely different company than what we were in 2016. I've been at the company now for eight and a half years, and I've not seen such a big shift in the work and the focus of our company on a topic like this since when we did the big mobile shift in 2012, and I would say this one was even bigger. Those changes include more expertise. Expertise in everything from cybersecurity to threat intelligence, to local expertise on the ground."

"There's never going to be a finish line. There's never going to be a point in time where we're like, 'We solved it! Let's move onto the next thing, all the fake news and disinformation is gone.' We've gotten much better in terms of cracking down on fake accounts, more transparency in terms of ads, etc., but … they're now moving onto other areas."

"At Facebook…we believe that I should be able to post on Facebook that the sun rises in the west, but I don't have a right for that viewpoint to be amplified. For much of misinformation, that's why we don't take it down, but we do reduce the virality of it and we do try to provide related articles so that people can see alternative viewpoints."

David Greene

"No matter what the harm we're trying to prevent is, I think we have to be really careful before we embrace a role for government that either decides truth or decides who can speak and who can't speak. This, election interference, is just one of the many legitimate harms that we've had to confront over the course of our democracy where we've had to make that judgment. We've had to say that government just has a limited role in certain aspects here. I don't think government should do nothing about election interference, but what I don't think we want government doing is to either be the ultimate arbiter of truth — which I think is an impossible role to ask government to play in many situations —  and I don't think we want government, especially, telling certain people that they're not able to speak."

"There are both reasons to be extra sensitive right before an election, where you might actually want to take down things because there's not enough time to correct something, but there's also the idea that people — their voice might be especially powerful right before an election. This might be a time we actually want to be especially sensitive to people's right to speak. So you get the problem both ways. … From a user perspective, a lot of the problems with these tools are people don't know how they work, and it makes it very difficult for them to make them work the way they think they're working, and how they want them to work. Very few people understand how they get information on Facebook."

Introduction:

Jane Harman

President and CEO, Woodrow Wilson International Center for Scholars

2:00–3:30 pm – Panel 1: Patterns

Jessica Beyer

Lecturer, Jackson School of International Studies, University of Washington

Nina Jankowicz

Fellow, Woodrow Wilson International Center for Scholars

Ginny Badanes

Director of Strategic Projects, Cybersecurity & Democracy, Microsoft

3:30–5:00 pm – Panel 2: Challenges

Katie Harbath

Global Elections Director, Facebook

David Greene 

Civil Liberties Director, Electronic Frontier Foundation

Hosted By

Science and Technology Innovation Program

The Science and Technology Innovation Program (STIP) serves as the bridge between technologists, policymakers, industry, and global stakeholders.

Digital Futures Project

Less and less of life, war and business takes place offline. More and more, policy is transacted in a space poorly understood by traditional legal and political authorities. The Digital Futures Project is a map to constraints and opportunities generated by the innovations around the corner - a resource for policymakers navigating a world they didn’t build.