Protests, posters, Zuccotti Park, New York City, September 2011 … Occupy Wall Street. Think back to how you first heard about this social movement, which sought to shed light on the issue of economic inequality. Was it in the news? Or was it on social media? What about #BlackLivesMatter? Did you first learn about police brutality against people of color on TV? Or did you spot the hashtag while scrolling on your smartphone?
How wonderful that news can gain attention on social media.
In my conversation with iSchool Professor Jeff Hemsley, I learned about his research interests: politics and virality, fake news, Russian interference, and social media, to name a few.
Jeff, Sam Jackson from Maxwell, and Jenny Stromer-Galley from the iSchool collected tweets from the 2016 presidential election and plotted which candidates retweeted which news sources. One of the news sources being retweeted was @infowars, a conspiracy site that, according to Jeff, “makes shit up.”
How horrible that fake news can gain attention on social media.
Clearly, social media is a double-edged sword when it comes to spreading and sharing information. It has ‘democratizing power,’ allowing laypeople to amplify real news; at the same time, it is susceptible to the spread of fake news by powerful actors.
During my interview with Jeff, he shared how these powerful actors influence social media, and how fake news (once called propaganda) has been an issue for quite some time. More importantly, Jeff suggested what information professionals can do to educate people on this phenomenon. Here’s what Jeff had to say.
Q: Can you tell me how you became interested in social media and misinformation?
My bachelor’s degree was in economics, math, and statistics and the economics topic I was interested in was information asymmetry: when one party has much more information than another party, and how that can skew results and things. So that’s what motivated me to want to go find out about information schools and get a Ph.D. … to think about ways to level the playing field.
In my second year as a Ph.D. student, Occupy Wall Street was starting to happen and the Arab Spring stuff was happening. What I realized was that social media is this place where, to some degree, the information playing field can be kind of leveled. Anybody can get attention. And so I thought, this is a way to think about information asymmetries or a solution to information asymmetries.
Democratization and Social Media
When I started studying social media, it had a lot to do with politics and social movements and the democratizing power of social media. Because in social media, instead of having news organizations as these all-powerful gatekeepers that control information flows, we’re all gatekeepers. And that means no one gatekeeper has as much power as they used to. So that’s what got me into this.
Now, of course, we’re several years later … and what’s happened? Those same forces are still at work. Social media can elevate somebody’s grievances to a public level where lots of people pay attention, and that can cause things to get better: people make a lot of noise, and we end up with policies. That still happens, right? Black Lives Matter essentially made us aware that police brutality happens, and has been happening for a long time.
Social Media, Trolls, and Conspiracy Sites
But it also means that other sophisticated actors … other actors who know how to play the game and get attention can do nasty things. I’ll give you an example. We collected tweets for the 2016 presidential election. And the way that we collect tweets is we use all the candidates’ Twitter handles as collection terms or query terms. So anytime they send a tweet, we get it. But if you mentioned a candidate, we probably have that tweet, too. If you retweeted one of the candidates, we probably have that tweet. If one of the candidates retweeted your tweet, we have those tweets, too.
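The collection logic Jeff describes can be illustrated with a minimal sketch. Everything here is invented for illustration: the tweets are simple dictionaries, and the handles are a hypothetical subset of candidate accounts, not the study's actual query terms.

```python
# Hypothetical sketch of a handle-based collection filter: keep a tweet if a
# candidate wrote it, is mentioned in its text, or is the author it retweets.
CANDIDATE_HANDLES = {"@candidate_a", "@candidate_b"}  # illustrative placeholders

def matches_collection(tweet: dict) -> bool:
    author = tweet.get("author", "").lower()
    text = tweet.get("text", "").lower()
    retweet_of = tweet.get("retweet_of", "").lower()
    if author in CANDIDATE_HANDLES or retweet_of in CANDIDATE_HANDLES:
        return True
    # A mention anywhere in the tweet text also matches a query term.
    return any(handle in text for handle in CANDIDATE_HANDLES)

stream = [
    {"author": "@voter1", "text": "I agree with @Candidate_A on this"},
    {"author": "@voter2", "text": "nice weather today"},
    {"author": "@voter3", "text": "so true", "retweet_of": "@candidate_b"},
]
collected = [t for t in stream if matches_collection(t)]
print(len(collected))  # 2 (the mention and the retweet; @voter2 is excluded)
```

This kind of filter casts a wide net, which is why the dataset captures not just the candidates' own tweets but much of the conversation around them.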
We have 400 million tweets around the election. We have Russian trolls in our data. A couple weeks ago, Congress released the list of Twitter accounts that Twitter gave to Congress … because Twitter had said that they had identified these 2,700 and some odd accounts as being Russian operatives. So I looked and we have them in our data. And that means we can actually start to provide some empirical evidence for what we’ve been hearing about in the news.
“Twitter had said that they had identified these 2,700 and some odd accounts as being Russian operatives. So I looked and we have them in our data.”
Like, there really are sophisticated actors. They have more followers than most other people. Some of the accounts date back as far as 2009. Most were created in 2012 and 2013, which means they had been building followings for a few years. So by the time the 2016 election started, they were already powerful actors.
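Finding the flagged accounts in an existing collection, as Jeff describes, amounts to a set-membership check on tweet authors. This is a minimal sketch with invented handles, not the real released list.

```python
# Hypothetical sketch: flag collected tweets whose authors appear on a
# publicly released list of suspected troll accounts. Handles are invented.
flagged_accounts = {"@troll_account_1", "@troll_account_2"}

collected_tweets = [
    {"author": "@troll_account_1", "text": "divisive post"},
    {"author": "@ordinary_user", "text": "regular post"},
    {"author": "@troll_account_2", "text": "another divisive post"},
]

troll_tweets = [t for t in collected_tweets if t["author"] in flagged_accounts]
print(len(troll_tweets))  # 2
```

Once the matching tweets are isolated, properties like follower counts and account-creation dates can be examined empirically, which is the kind of evidence Jeff mentions.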
If you’ve been following all the discussion around fake news, one of the things that happens is, you have conspiracy sites. One of the conspiracy sites is called Infowars. Infowars just makes shit up. You can quote that.
So there’s this picture on my door. And what this picture shows is it shows these different Republican candidates and the websites that they linked to. Any website that has a black line running to it means more than one candidate linked to it. So when we look at someone like Donald Trump we can see sites that only he linked to. No other candidate linked to some of these sites. One of the sites on this list is Infowars.
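The logic behind that plot can be sketched briefly: count how many candidates link to each site, then subtract the shared sites from each candidate's set. The candidate names and sites below are placeholders, not the study's data.

```python
# Hypothetical sketch of the door-plot logic: which linked sites are
# exclusive to a single candidate? Data is invented for illustration.
links = {
    "candidate_a": {"conspiracysite.example", "bignews.example"},
    "candidate_b": {"bignews.example", "paper.example"},
    "candidate_c": {"paper.example"},
}

# Count how many candidates link to each site.
site_counts: dict[str, int] = {}
for sites in links.values():
    for site in sites:
        site_counts[site] = site_counts.get(site, 0) + 1

# Sites with a count above one would get a black line in the plot.
shared = {site for site, n in site_counts.items() if n > 1}

# What remains for each candidate is linked to by that candidate alone.
exclusive = {cand: sites - shared for cand, sites in links.items()}
print(exclusive["candidate_a"])  # {'conspiracysite.example'}
```

In the actual plot, Infowars would fall into the exclusive set for Donald Trump: no black line, because no other candidate linked to it.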
Q: How has your research evolved since its beginnings?
There’s a website called Dribbble. It’s a site for designers to share their work, so it’s like an art site. There’s so much research that’s been done on sites like Facebook and Twitter, but we haven’t looked at a lot of the niche sites. At least not much. There are over 400 social media sites that we know about, and some of them fill very small niches. So when we say we understand how diffusion works, that’s based on the two or three biggest sites. Is that really fair?
Dribbble and Virality
So I’m looking at diffusion on some of these smaller sites and Dribbble is one of them. It’s particularly interesting because on Facebook and Twitter you have a little button you push: ‘retweet’ or ‘share.’ But on Dribbble you don’t because when people create art, they don’t want you to share their art. They want you to buy their art. They want to be appreciated for their art. They don’t want to lose control of their art.
And yet, there is a concept of virality on that site, which is interesting to me. By studying virality on big sites and small sites, what we hope to do is uncover and describe the basic, underlying human behavior that manifests through the different affordances of these sites. Maybe we’re fundamentally sharing beings.
Q: What role do you feel information professionals play in shedding light on this phenomenon? What actions should or shouldn’t they take?
In a way, this is fairly simple. And it has to do with education. But we have to be careful about how we educate. If we tell people, “You’re spreading fake news,” they’ll just say, “No, you’re spreading fake news.” We’re not going to get past people’s defenses. It doesn’t do any good to argue. Instead, it makes sense to teach some things about how to check sources.
One thing that I try to get some of my students to do is read a news story on, say, The Washington Post and then on Fox News. And you can just see how they’re framed so differently. Universities can actually have classes on how to spot fake news. At the University of Washington, there’s a class called ‘Calling Bullshit.’
If we tell people, “You’re spreading fake news,” they’ll just say, “No, you’re spreading fake news.”
Q: Could you tell me about the Information Visualization class you teach, and the importance of visualizations in exposing these types of issues?
The election visualization draws links to who’s pointing at what information. That’s the point of the visualization. We can create these pictures, plots, and information visualizations, which are often easier for people to make sense of. If I had to describe everything in that plot, it would take a page of text. But I can tell you a few things about the plot in a sentence or two, and then let you make sense of the plot. One of the ways that advertising works is that it tells you a story up to a point and then lets you fill in the blank and come to the conclusion they want you to come to. It’s a strategic way of doing things. You can do that with visualizations, too.
Q: What else should people know about this topic?
Well, I’m going to make a dire prediction. We have a security industry, right? All these companies make tools to try and keep our computers safe, and they don’t always work. Because there are people out there figuring out ways to get around that security and attack our computers. And this has been going on since the 80s. Computers are inherently vulnerable to hacking, and humans are inherently vulnerable to a similar kind of attack: fake news. So what that means is that the problem of fake news and fake information is not going to go away.
Propaganda is what we used to call it, and propaganda has been around for a very long time. It has been weaponized in similar ways, and the United States is guilty of it, like many other countries. Every time we figure out a fix, someone else will figure out a workaround. And the biggest thing we have to balance is maintaining some sort of information freedom, which I think we all value.
At the same time, we have to try to make fake news less attractive to people, or less easy. How do you balance those two things? That’s not an easy problem.
So how to get involved? This is a growing area of research interest. Not just for me, but for the Syracuse University iSchool and Newhouse. They’re both thinking about ways that they can collaborate. So I’m sure that more information about that is going to be coming out. But students can ask people like myself or Caroline about what kind of opportunities there are to get involved.
Jeff Hemsley is a professor at the Syracuse University School of Information Studies. You can learn more about Jeff here and follow Jeff on Twitter. Jeff also contributed two chapters to the book Misinformation and Mass Audiences, coming out in January 2018.