At a time when people have a lot less trust in governments and institutions, and are willing to empower themselves to bring down oppressive systems, how are people using technology today in new ways to force governments, policies, and laws to change? Ethan Zuckerman, Director of the Center for Civic Media at MIT and Associate Professor at MIT’s Media Lab, speaks about how social media is transforming the rules of civic engagement, why building decentralised social networks is critical to loosening the grip of Facebook and Google over the lives of ordinary citizens, and how the onus of combating the menace of fake news lies as much with the media as with governments.
Farah: These are tough times around the world and there's a lot of sense of discontentment. As an industry veteran, as somebody who's seen the technology space advance significantly over the years, how are people using technology today in new ways to force governments, policies, and laws to change?
Ethan: Technology has allowed citizens to organize and show power in some very different ways. We've seen some obvious ones where people use online media as a way of organizing protests, taking to the streets. We saw this of course in Egypt, we also saw it in Turkey. So that's been a very popular vector, but there are many more subtle ways that people use online tools as a way of making change. Sometimes rather than trying to pass laws, they are simply trying to change attitudes. They're trying to change norms. And this is something people do by making media and putting it out into the world. Having a campaign where you say, I'm for equal marriage, I'm for gay rights, can be a very, very powerful force.
Farah: But as you point out, using social media to effect change is not new. We've seen that right from the Arab Spring movement in 2010 to something as recent as the ‘Me Too’ campaign that took place last year. So over the years, how have the rules of engagement online to galvanize political change transformed?
Ethan: I think one thing that we've discovered is that people used to criticize online activism as 'slacktivism' -- as not being real activism. I think people are discovering that real change happens whenever you can organize a lot of people, help them tell their stories, and give them an action to take.
To use an example from the United States: we have had this horrific problem of school shootings. We've seen some of the students who were involved at Marjory Stoneman Douglas, which is the school in Florida that suffered one of these massacres. They've been incredibly effective on social media at putting pressure on companies, and probably their greatest victory is that at two of the very largest retailers in the United States, you can no longer buy these high-powered weapons by just walking into a store and purchasing them. That is a change that probably would not have been possible through passing laws, but it was possible by mobilizing so many people and putting pressure on corporations online. So we're discovering that people can be powerful when they organize themselves and work together in ways that they're sometimes not powerful through traditional politics.
Farah: You're writing a book about civic engagement, and this comes at a time when people have a lot less trust in institutions and governments. What have your observations been? Can you take us through some that will find their way into your book?
Ethan: I'm finishing up this manuscript called ‘Mobilizing Mistrust’. In some countries, mistrust has reached very high levels. In the United States, only 19 percent of people will tell you that they trust the government all or most of the time. And people often mistrust other institutions. They mistrust the press, they mistrust banks, they mistrust corporations. The problem with mistrust is that it can be very paralyzing. If you don't feel like government pays attention to you or your needs, then why would you bother influencing them? Why should I participate in voting? Why should I try to influence my parliamentarian when nothing's actually going to happen? So part of what we have to do is challenge that mistrust. We have to help these systems regain our trust, but we also have to mobilise the mistrust.
We have to take people who are frustrated and mistrustful and give them ways that they can be effective. As citizens, I often think we're most effective as monitors. If we're watching people who are in power and calling them out when they go astray, our mistrust actually becomes our fuel.
I also think that mistrust ends up being incredibly mobilising for people who try to build their own systems. What's wonderful about this is that some of the most inspiring examples I'm seeing aren't just in Europe, they aren't just in North America, they're happening all over the world.
Farah: What you will probably be most remembered for is creating one of the most annoying things that the Internet had seen -- the popup ad -- in the 1990s. What else have you been working on over the last decade? Which of those ideas have the potential to change the internet?
Ethan: Well, thank goodness it's been a long time since I was working on the popup ad! Now a lot of the work that I'm doing is around sort of documenting other forms of civics. So when people aren't involved with elections, what are the other ways that they're being civically active?
A lot of my work these days is around this question: if these internet giants like Google and Facebook are becoming too powerful, can we decentralise? So I think for me that's probably the idea that ends up being most powerful.
In the 1990s, we really believed that the Internet would spread power out equally. What ended up happening was that it concentrated power rather massively. That, I think, was the result of some bad decisions. We sort of thought that power would naturally flow to individuals. And I think we're now realizing that if we want to fight concentration, we actually have to make a conscious choice both in how we build and in what tools we use. And so I actually have an enormous amount of hope that we might be able to change that trend.
Farah: Over the past decade, as netizens we have handed over much of our data to social media corporations like Facebook and Google, perhaps even without realizing we've done that. What this has done is allow these organizations to influence our minds, to influence some very important political outcomes. For example, we've seen what happened in the Cambridge Analytica scandal, where leaked Facebook data went on to influence the outcome of the US Presidential election. Is there a real fear now that online advocacy may increasingly not be merit-based, but may be steeped in some very deep biases?
Ethan: You're absolutely right to ask questions about data ownership, data privacy and the power of the platforms. I don't think these are reasons to pull away from online activism as a space for making change. I think what people are finding is that these online spaces are incredibly powerful tools to help people who are like-minded find each other and work together. I think at the same time, we have to be sensitive to all the possible downsides of this. We know that helping people find the like-minded sometimes leaves us more polarised, leaves us more divided. We also know that these businesses have business models that are based around surveillance.
One of the places where I hope we will see more activism is around challenging the platforms to do a better job taking care of us, taking care of our data, being safe with that information. I'm also a very strong advocate of this idea that we need to decentralize, that we shouldn't have Facebook and WhatsApp be so powerful and have so much control over our lives. We would benefit from having many, many more tools out there. A lot of the work that my lab does is around these questions of how we could build these decentralized social networks, not just because they would be good for us as people, but because they're particularly good for us as activists.
Farah: You wrote this piece in ‘The Atlantic’ about parallel, decentralised social networks. But who would benefit from these parallel social networks, because if you're not going to invest in corporations like Facebook or Google, then you would probably be handing over your data to governments?
Ethan: It doesn't have to be governments who run these social networks. So one of my favorite examples of this is Wael Ghonim, who was one of the organizers of the Tahrir Square revolution. He actually was the guy who started the Facebook group, "We are all Khaled Saeed". After the revolution, when he was disappointed with where Egyptian politics was going, he started a new service called ‘Parlio’. It wasn't going to be the biggest social network, but it was going to be a social network dedicated to bridging people from very different points of view. And so it had very strict content guidelines. If you were rude, if you were impolite, you could be thrown off. It was not pretending to be a free speech zone. It was quite successful and ended up being bought by Quora. So there was real value in what he created. We don't need one social network with one set of rules. We might have as many social networks as we have websites for different purposes.
Farah: Social networks like Facebook and Twitter control the kind of information that we see by controlling our news feeds. Now I believe that you’ve built a tool called Gobo which allows people to aggregate and filter the kind of information that they see on their news feed. Why do you think it's important for somebody to have a tool like that?
Ethan: This is a small university project. But the reason that we ended up doing it was we wanted to demonstrate two things. One is that it's absolutely possible to have a single tool that works with multiple social networks. We built Gobo so that it lets you look at Twitter and Facebook at the same time.
The second is that Facebook controls what ends up in your newsfeed and of course, they don't have to. They could give you a great deal more control over it. So what we did with Gobo is we've put in some experimental filters. My favourite actually is around gender. We have a slider where you can change how many men or how many women you're hearing from including a ‘Mute All Men’ button, which I think a lot of people would find very helpful. It's just a way of demonstrating that this doesn't have to be just Facebook's control. This is control that we could demand and this is control that we could build into our own tools if we wanted to build different social media.
Farah: Public advocacy is pushing Facebook to show more posts from relatives and friends and fewer from brands in our content feeds. Do you think it’s doing a good enough job of it?
Ethan: I think we're finally putting pressure on Facebook to listen to its users about making the network operate differently. I think Facebook has a very strong tendency to try to do what's right for us rather than allowing us to decide what's right for ourselves. I actually think we need companies that let people drive by choice. And so even if I mostly end up clicking on pictures of cute cats, I should have a social network that, if I say I want more international news, if I want to pay attention to what's going on in India, actually meets what I say I want rather than just what I demonstrate that I want.
Farah: The circulation of fake news is a very big problem, particularly in India. I’ll come to India in just a bit, but first, globally, how do you see organisations and governments being able to tackle fake news and its propagation? Do you believe institutions are doing a good enough job of it? Where do the biggest challenges lie?
Ethan: This question of fake news has gotten so complicated, so quickly, in part because the head of the US government has taken the term and really bent it out of shape. Fake news, when we started talking about it, was really news that someone had constructed out of hoaxes. They were putting it out there to make money. Now you have Donald Trump referring to any coverage that he doesn't like as fake news. And I think you're going to see that happen in any country where you have a leader who doesn't like how she or he is being portrayed in the news. So I don't like the term. I actually find it unhelpful because it's a way of sort of dismissing things. I tend to talk more about information quality.
I think one of the things that's very tricky is that when you have very few social networks, they face really tough questions about information quality. For me, diversity in a marketplace is more desirable than having a government decide what we should be hearing and not hearing. That strikes me as a very dangerous direction to go in. I prefer the government not control it. I also prefer that a small number of large platforms don't control it.
I know that in India, mob violence spurred by rumours spread on WhatsApp has really been a horrific problem. I do think that what these companies really have to do is get better at letting civil society, letting researchers, see what's happening on these networks. It's really important that we be able to monitor stories on WhatsApp that are reaching 10,000 or 100,000 people. At that point, it's not private communication. It's really its own form of news media, and you need to study it the same way we study other forms of hate speech.
Farah: I was coming to the problem of fake news in India, which has grown to become a really big menace today. Authorities in India are trying to clamp down on the problem of fake news, with the Department of Telecommunications, for example, recently coming out with a diktat where they were trying to get WhatsApp and Facebook blocked in emergency cases. But policies like these are seeing a lot of pushback from companies over arguments that networks like these are immensely helpful in emergencies, in getting people together. So where's the real solution for a country as diverse as India?
Ethan: In the long run, blocking these networks is not the answer; they're too powerful. One thing that WhatsApp has done that I actually think is very responsible is that they're trying to slow the spread of information on their platform. They're making it harder to spread information to hundreds of people.
In the long run, some of this is going to fall on the media. I actually think that some of our job in the media is probably monitoring what's spreading on WhatsApp and then doing very visible fact-checking of it. I think television in particular has enormous power to say: Look, you may have heard something about this on WhatsApp, let us show you what's actually happening.
Farah: There's a new term that's come about, conscious consumerism, where a group of people seeking political change first find solutions to fake news or echo chambers of disinformation. How do you see that sort of change taking place in a country like India? Do you see real effort being made by a class of conscious consumers?
Ethan: I think there are at least two things that conscious consumers can do around misinformation. The first one is to monitor your own sources of information. When my students study with me at MIT, the first thing they do is keep a journal of what media they encounter over the course of two weeks. I ask them to write it all down so they can reflect on it and think about what they're getting too much of, too little of, and what they're putting into their heads.
I think the second piece of this is that we have to support high-quality media. We all know that there's media out there that we consume and feel bad about, because it wasn't really good for us. But there's also media that you read and end up saying, well, I'm so glad I know that. I'm so glad I got that perspective. It's our responsibility as citizens to support those outlets and to make sure that they survive and that they're capable of growing, because that's the only way in the long run that we fight back against this.
Farah: In the 2014 general elections in India, the BJP had a successful run due to their savvy deployment of social media. Will you be observing how the social media graph is changing for the party in 2019? What are you hoping to see?
Ethan: There's no doubt in my mind that part of the BJP's success was their incredible, thorough embrace of social media. Let’s be perfectly frank, they figured out how to use the tools very well. They've also, in some cases, used the tools in quite dangerous ways. And there are, unfortunately, some supporters of the BJP who have been using social media to intimidate and harass. What I'm hoping to see in 2019 is other parties, not just Congress, but smaller parties as well, really find ways of reaching out and building constituencies online as well as offline.
I think the BJP really pioneered the idea that elections in India are won online as well as offline. I don't think anyone who is serious about running for office in 2019 is going to ignore the online space. They're going to be working as hard there as they are organizing on the streets.
Farah: Lastly, coming to artificial intelligence. Right from Elon Musk to Masayoshi Son, leading business leaders are reposing their faith in artificial intelligence and its ability to transform our daily lives. What is your view on artificial intelligence? Do you really believe it has the power to rewrite the script as some of these leaders have been talking about?
Ethan: In the circles that I hang out in, no one talks about artificial intelligence, because it's not a term that's used seriously. There's machine learning, where we train computers to make generalisations and to categorise information. There are places where machine learning is very, very powerful. I use some of it in my own work, but we are so far away from anything that would constitute generalised artificial intelligence that many of the conversations that we have about AI are simply silly.
It's going to be a very, very long time before AI will be threatening most people's jobs, for instance. Even with machine learning being applied to apparently simple tasks like recognising human faces, we're discovering very serious ethical problems with it. One of my students studies the fact that face recognition systems do much worse at recognising darker skin than they do at recognising lighter skin. And so there's a much higher chance of a false positive if you end up having darker skin rather than lighter skin.
I'm glad we're having these conversations. As for whether I think AI is going to radically transform the world as we know it: I think it's having small effects that will be increasingly important, but I'm not in the camp of people who think that the world is going to change forever within the next 20 years.