Is the internet closing our minds?

Oct 11th, 2015
  1. Transcript for the film: Is the internet closing our minds? (Use for capstone)
  2. [APPLAUSE]
  3.  
  4. Thank you and welcome. Well, I don't mean to be a name dropper. But last week I posed our resolution to two of my dinner partners, Rupert Murdoch and George W Bush.
  5.  
  6. [LAUGHTER]
  7.  
  8. They both had strong views, and both were against the motion. Murdoch was vehement that Fox News is not polarizing--
  9.  
  10. [LAUGHTER]
  11.  
  12. --but presents a more balanced view than the other networks, and attracts an audience that is 40% Democrat, which I thought was an interesting statistic.
  13.  
  14. [LAUGHTER]
  15.  
  16. And President Bush argued that in most of the world, the internet has opened minds to the way of life in the liberal democracies, and has been a hugely constructive force for change. He acknowledged the extreme polarization of American politics, but blamed it on the class warfare rhetoric of his successor.
  17.  
  18. [LAUGHTER]
  19.  
  20. Whatever one thinks of their points, they do illuminate some of the subtleties in the language of our resolution. Murdoch correctly grasped that it does not posit that the internet is the sole or even the primary driver of the narrow ideologies that seem to dominate politics. Rather, he understood the internet as part of a broader trend toward a highly fragmented media. And President Bush, by taking the resolution in a global context, prompts me to clarify that what we expect to talk about tonight is American politics.
  21.  
  22. Why might the internet be closing our minds? The first reason is that the internet makes it very easy to tailor the information we get to conform to a preconceived worldview. We can choose news aggregators on the left or on the right, not to mention highly ideological bloggers, and never encounter any contrary opinion.
  23.  
  24. In a great little essay called "Why Groups Go to Extremes," Cass Sunstein, the Obama administration's regulatory czar, demonstrated empirically that discussing issues with like-minded folks tends to make positions more extreme. In addition to self selection, the internet makes it easy for websites to quote "personalize" our offerings. It seems innocuous, even helpful, when Netflix tells us that the viewers who enjoyed movie A also enjoyed movie B.
  25.  
  26. But if you google "federal deficit" or "Medicare reform," is it healthy for your search to come up differently than mine based on what Google has been able to infer about our political leanings? Those against the motion will argue that the internet is a vast information utility, which facilitates search, learning, and communications.
  27.  
  28. Personalization may be a blessing, simply helping us make the choices we want to make. We can turn personal filters on or off. So there's no real danger here, and little hard evidence that Google has helped us erect mental fortresses through which contrary ideas are not allowed to penetrate. Indeed, the filters from the pre-internet era-- three like-minded networks and a handful of local newspapers-- were arguably less hospitable to views outside a narrow consensus than the open marketplace of information and ideas we have today. To illuminate the complexities here, we have four distinguished experts. And it's now my privilege to turn the evening over to them, and to our moderator, John Donvan.
  29.  
  30. Thank you, Robert.
  31.  
  32. [APPLAUSE]
  33.  
  34. Thank you. And I would just like to invite one more round of applause for the benefactor of the series, Robert Rosenkranz.
  35.  
  36. [APPLAUSE]
  37.  
  38. Yes or no to this statement. When it comes to politics, the internet is closing our minds. The state of the online debate-- that is what we are debating here tonight. Welcome from Intelligence Squared US. I'm John Donvan. We have four superbly qualified debaters-- two against two. And what we're touching on here-- well, it starts with this-- a sampling of this. And see if it's familiar.
  39.  
  40. This is a recent exchange among "Wall Street Journal" readers who were posting to each other about health care-- or actually posting at each other. Glenn the liberal just had his argument attacked by David the conservative. Glenn gets mad: "One wonders how you can even press the keys on the keyboard. Please just go away." David responds, "Now I know you're a liberal, because liberals are rude and will not listen to any reason." A guy named Kevin joins in the attack. And he tells Glenn to read up on economics and civics. Glenn tells Kevin, "Put this in your pipe and smoke it, you pseudo-intellectual rube." From nowhere a guy named Mark weighs in. And he tells Glenn that he is intellectually challenged and advises him that he needs to get some lithium.
  41.  
  42. [LAUGHTER]
  43.  
  44. When it comes to politics, the internet is closing our minds. Is that what we just heard-- our minds closing out there? Or is it a great thing that these guys are actually out there engaging with each other at all? Our debate goes in three rounds. Then the audience votes to choose the winner. Only one side wins.
  45.  
  46. On the side for our motion, "When it comes to politics, the internet is closing our minds," Eli Pariser, a board member and former executive director of MoveOn.org.
  47.  
  48. [APPLAUSE]
  49.  
  50. His partner is Siva Vaidhyanathan, professor and chair of the Department of Media Studies at the University of Virginia.
  51.  
  52. [APPLAUSE]
  53.  
  54. Arguing against the motion that when it comes to politics, the internet is closing our minds, Evgeny Morozov, a Schwartz Fellow with the New America Foundation and author of "The Net Delusion." And his partner, Jacob Weisberg, chairman and editor-in-chief of The Slate Group.
  55.  
  56. Our motion is, when it comes to politics, the internet is closing our minds. Let's meet our debaters now and welcome first Eli Pariser.
  57.  
  58. And Eli, at the age of 20, you joined MoveOn.org to direct its foreign policy campaigns. And then a couple of years later, you became its executive director. You've been an online organizer all of your adult life. But now you are warning about the dangers of the internet. So what changed-- you or the internet?
  59.  
  60. Both. But the internet changed more than I did.
  61.  
  62. All right. When you get up there, we're going to see what you mean by that. Your debating partner-- let's welcome Siva Vaidhyanathan.
  63.  
  64. [APPLAUSE]
  65.  
  66. Siva, you're a professor and Department of Media Studies chair at the University of Virginia. You're also author of this book, "The Googlization of Everything (and Why We Should Worry)." This is actually your second debate with us. And the first time you were debating for the motion, Google violates its "Don't be evil" motto and you won. Do you still think Google is evil?
  67.  
  68. Well, I never thought Google was evil. But I did think that was an impossible standard for any company to hold. So it was actually not that hard to win. I think the really interesting question is, does Google think I'm evil?
  69.  
  70. [LAUGHTER]
  71.  
  72. Thank you. Our motion is, when it comes to politics, the internet is closing our minds. Let's meet the team arguing against, first: Evgeny Morozov.
  73.  
  74. [APPLAUSE]
  75.  
  76. You're a visiting fellow at Stanford University, a Schwartz fellow at the New America Foundation. You wrote a book also-- "The Net Delusion: The Dark Side of Internet Freedom." You have said-- this was in a TED talk-- that when the internet reaches a remote Russian village, people are not going to be sitting there watching reports from Human Rights Watch. You said they're going to be watching pornography, "Sex and the City," or maybe funny videos of cats.
  77.  
  78. [LAUGHTER] So how worried are you about these cat videos?
  79.  
  80. Wow. Well, cats, I think, are the new opium of the masses. And dictators have figured it out, and they exploit it perfectly as a tool of thought control.
  81.  
  82. [LAUGHTER]
  83.  
  84. Thank you and let's meet your debating partner, Jacob Weisberg.
  85.  
  86. [APPLAUSE]
  87.  
  88. You are chairman and editor-in-chief of The Slate Group. Now, you wrote for print for everybody in the old days-- "The New Republic," "New York" magazine, "Financial Times," among other places. But in 1996, you joined a new online magazine called "Slate" as its chief political correspondent. That was very early in the game. So what did you know back then?
  89.  
  90. Well, John, I was an early visionary.
  91.  
  92. No, actually, I got very lucky when my friend Michael Kinsley decided to found "Slate." And it seemed like a fun thing to do. And it turns out, it's impossible to go back. The internet spoils you as a writer, because of the freedom you have and the speed.
  93.  
  94. All right. Thank you, Jacob Weisberg. Our four debaters, ladies and gentlemen.
  95.  
  96. Now in this debate, you, our audience, are our judges. By the time the debate is ended, we're going to have asked you to vote two times-- once before the debate and once again after the debate-- on where you stand on the motion as worded. And what we want to ask you to do now is go to the keypads at your seat.
  97.  
  98. On the right-hand side, you'll see a keypad. And we'll ask you to vote your sentiment on this motion as you come in off the street. Our motion is, when it comes to politics, the internet is closing our minds. If you agree with it, push number one. If you disagree, push number two. If you're undecided, push number three. And you can ignore the other keys.
  99.  
  100. And if you want to correct your entry, just correct it. And the system will register your last vote. So that's locked out now. Again, at the end of the debate, we're going to ask you to vote on the quality of the arguments that were presented here tonight. And the team that has the greatest differential between this opening vote and the closing vote will be declared by you, our audience, our winner.
  101.  
  102. So we go in three rounds. The first round is seven minutes each, uninterrupted, by each speaker in turn. So on to round one, opening statements from each of our debaters. And going up first for our motion, which is, when it comes to politics, the internet is closing our minds, Eli Pariser. He's a MoveOn.org board member, CEO of Upworthy.com, and author of "The Filter Bubble: What the Internet Is Hiding from You." That book was the inspiration for our having this debate tonight. Ladies and gentlemen, Eli Pariser.
  103.  
  104. Good evening, everyone. I find myself arguing that the internet is closing our minds with a bit of regret. This isn't the place that I would want to be. And actually, as of a few years ago, I would've been arguing the other side.
  105.  
  106. But I've come to believe that while the internet is incredibly good at getting groups of like-minded people to get together, think together, work together; that it's actually quite bad at bridging between groups of different people; and that the view that the internet is exposing us to all sorts of new ways of thinking and new ideas is kind of a dated view; that the internet's changing; and that there are a few big companies that would like us to hold on to that idea that the internet is still this kind of open place.
  107.  
  108. So the core of my argument is this. Attention is the most valuable commodity out there right now. If you command attention, then you can direct it towards products or services, and you can make a lot of money. And that's why all of the big companies on the internet are trying to figure out what the best strategy is for gathering as much of it as possible. And most of them are focused on the same strategy, which is gathering as much data about us as they can, and then using that data to give us what they think, what they predict-- based on this data and their algorithms-- we're going to be interested in. Relevance is the big watchword here.
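The strategy described here-- gather click data, predict interest, rank accordingly-- can be sketched as a toy recommender. This is a hypothetical illustration, not any company's actual algorithm; all names and data are invented:

```python
from collections import Counter

def personalize(items, click_history, top_k=3):
    """Rank items by overlap with topics the user clicked before.

    items: list of (title, set_of_topics) pairs.
    click_history: list of topic sets the user previously engaged with.
    Content resembling past clicks scores higher, so agreeable items
    float to the top -- the engagement-driven loop in miniature.
    """
    seen = Counter()
    for topics in click_history:
        seen.update(topics)

    def score(item):
        _, topics = item
        # Missing topics count as 0 in a Counter, so novel content
        # that shares nothing with the click history scores zero.
        return sum(seen[t] for t in topics)

    return sorted(items, key=score, reverse=True)[:top_k]
```

Note that nothing in the loop ever rewards novelty: an item with no overlap with the history can never outrank a confirming one.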
  109.  
  110. So you if you talk to these companies, if you look at what they're saying, it's very clear this is a big part of the business plan. Eric Schmidt says, very soon it'll be nearly impossible to see something that has not in some way been tailored to you. Sheryl Sandberg of Facebook says, within the next few years, it'll be anachronistic to visit a website that hasn't been customized to your personal interests in some way.
  111.  
  112. And Facebook is becoming this growing source of how people get their news and how people get their information. So why are they doing this? Well, I think Evgeny actually put it really well. Why does Facebook employ filters? The more they know about us, the more they can make in advertising revenues.
  113.  
  114. And the thing is that these companies aren't blind to the psychology of all this. They've read all of the studies that show that when you present people with information that confirms what they already believed was true, you can actually see these little bursts of pleasure happening in people's brains. And conversely, when people are presented with information that challenges what they believed, they get cranky. That's just the way we're wired.
  115.  
  116. And so, if you're a company that's trying to meet stockholder demands, and you have this power to present people with information that tends to validate them, why wouldn't you? As a result of this kind of personalization and self-selection, we're more likely to see things we agree with and less likely to see things we disagree with.
  117.  
  118. Now to be clear, I'm not arguing that all personalization is bad. Personalization certainly can have benefits. I'm a Netflix user. But the question is, what are the driving motives behind the kind of personalization that most people use? And what are the effects of that personalization?
  119.  
  120. And I'll argue that because of the motives of these companies, those filters that they're building are going to tend to surround us with information that's agreeable to us, and not with information that's uncomfortable.
  121.  
  122. So a few examples of what this looks like in practice. It looks like one person googling Egypt and seeing lots of information about the Arab Spring, and another person googling Egypt at the same time and seeing nothing about the Arab Spring. This actually happened. I've got the screenshots on my website.
  123.  
  124. And I could run through a number of other anecdotes. But actually there's been a study-- the only peer-reviewed study I'm aware of on the Google search results and the effects of personalization there-- that shows that 60% of the search results on a given front page are usually personalized. Either they're in a different order, or they're actually totally different results based on who Google thinks you are and what it thinks you'll be interested in.
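The comparison behind a finding like that can be sketched in a few lines. This is a hypothetical reconstruction of the kind of measurement involved, not the study's actual code: given the first-page results two users see for the same query, measure how much the pages share and whether the shared links keep the same order.

```python
def personalization_score(results_a, results_b):
    """Compare two users' first-page results for the same query.

    results_a, results_b: ordered lists of result URLs.
    Returns (overlap, same_order): the fraction of URLs the two pages
    share (Jaccard similarity), and whether the shared URLs appear in
    the same relative order on both pages.
    """
    set_a, set_b = set(results_a), set(results_b)
    shared = set_a & set_b
    union = set_a | set_b
    overlap = len(shared) / len(union) if union else 1.0
    # Keep only shared URLs, preserving each page's ordering.
    order_a = [u for u in results_a if u in shared]
    order_b = [u for u in results_b if u in shared]
    return overlap, order_a == order_b
```

Identical pages score (1.0, True); a page counts as "personalized" in the study's sense when either the overlap drops or the shared results are reshuffled.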
  125.  
  126. So what I think we have to begin to do is to tease apart what the internet actually does, from what we wish it would do or what it possibly could do. And in particular, we have to tease apart what's possible to access versus what people do access. For example, it's as easy now to access the front page of "Le Monde" or "Die Zeit" as it is to access the "New York Times."
  127.  
  128. So you would think, given this vast increase in accessibility of foreign policy information of what's going on in other countries, that people would know more about what's going on there. In fact, that's not true. According to a Pew study from 2007, people actually were more informed on foreign policy matters before the internet than in 2007. So when Jacob says you can find more sources than ever, it's kind of beside the point.
  129.  
  130. What's relevant is whether people come into daily contact with more different sources, and in particular, different ideological ones. I agree here again with Evgeny, who says the regular folk don't read sites like Global Voices, an aggregator of the most interesting blog posts from all over the world. Instead they are more likely to use the internet to rediscover their own culture, and dare we say it, their own national bigotry.
  131.  
  132. So to summarize, big companies are rapidly working to personalize your version of the Web. They want to show you what they think you're going to be interested in. And this is passive. This isn't me turning on Fox News or picking up a copy of "The Nation." This is embedded in an increasing number of websites. Yahoo News in 2007 was a gateway for 80 million people that looked the same. But now Yahoo runs 13 million different variations of the Yahoo News front page every day.
  133.  
  134. And they're different for each person. It's hard to even see. We don't know how tailored our view of the Web is, because you have to sit down next to someone else and look at the differences. So the object of this personalization is to get us to click more. It's to get us to like more. And importantly, it's to get us to click on ads.
  135.  
  136. And there's little benefit in that world, to presenting us with information that makes us uncomfortable, that challenges our views, or that makes us think differently. I don't think that this has caused the extreme political polarization that we're seeing right now. But I think it can't help but exacerbate it.
  137.  
  138. Google says that it's trying to provide relevance. But what is the most relevant search result when you're a 9/11 conspiracy theorist googling 9/11? Is it the conspiracy links that Google's algorithm would tend to promote? Or is it the "Popular Mechanics" article that would debunk that stuff?
  139.  
  140. I asked Google this question. And they didn't really have a clear answer. So it's with regret that I think the internet is not living up to its potential. The way that most people actually use this thing isn't to broaden their political perspectives. In fact, the path that we generally travel on online will tend to narrow our political views. Thank you.
  141.  
  142. [APPLAUSE]
  143.  
  144. Thank you, Eli Pariser.
  145.  
  146. Our motion is, when it comes to politics, the internet is closing our minds. And here to speak against the motion, Jacob Weisberg. He is a pioneer in online journalism. Jacob is editor-in-chief of The Slate Group, the internet-based arm of the Washington Post Company.
  147.  
  148. Thank you, John.
  149.  
  150. Eli, you can't have Evgeny. He's committed to me till at least 8:15.
  151.  
  152. [LAUGHTER]
  153.  
  154. Look, I'll concede that the internet narrowing our political perspectives is an interesting theoretical problem. People do have a tendency to prefer listening to what they already agree with. And the internet makes it easier, in theory, for people to live in a solipsistic bubble, where they mostly interact with people who have similar views.
  155.  
  156. But is that a phenomenon that's getting worse? And if it is getting worse, is it getting worse because of the internet? Eli and, I think, Siva are going to argue that because Google and Facebook are bad for us, it must be. But this skepticism about the tools we use every day is kind of the mirror opposite of the cyber-utopians who think the new technology only brings us good things. There are a lot of interesting theoretical problems, from Malthusian food scarcity to the Y2K bug, that are just never borne out in practice. "The Atlantic" magazine gets a cover out of this every month, right?
  157.  
  158. Is Google making you stupid? Is Facebook making you lonely? That's actually the cover this month. Is Twitter destroying your attention span? But cyber-realists like Evgeny and me try to look at questions like this in a more empirical way. So what's the evidence for tonight's proposition that the internet is narrowing our views? I won't say there's none.
  159.  
  160. But I will say it is laughably weak. And there's some really good evidence of the Web's doing the opposite of what our opponents claim-- that it's exposing us to a broader range of perspectives and making us less parochial in our outlook. And it's on that empirical basis, and on the basis of your own experience of not becoming more narrow, that you should vote against tonight's motion.
  161.  
  162. There are, of course, some studies about this. There was one a few years ago that showed left and right wing bloggers don't just talk to their own sides. They respond to each other and link to each other a lot. And it's not all the kind of exchange John cited at the beginning.
  163.  
  164. There's a Pew study from last year that shows-- this is a quote-- "no evidence that social network users, including those who use Facebook, were any more likely to cocoon themselves in social networks of like-minded and similar people, as some have feared." There's another new Pew study that says that people's social network friendships tend to cross political boundaries. The biggest study, which involved 250 million Facebook users-- yes, it was sponsored by Facebook, but it was a good study-- showed convincingly that most people share links from people they aren't close to. That is, we do reach outside of our personal circles for news.
  165.  
  166. So Eli said at the beginning of his very interesting book that he published last year, that Google was tailoring search results to our politics. He said that on the basis of one anecdote from a friend I don't think he named. He had a similar anecdote tonight about Egypt.
  167.  
  168. Allow me to be skeptical. You can do a funky search that gives you weird results at any time of the day for a variety of reasons. But I did a test on this, where I asked friends of mine-- different parts of the country, different political views, different worlds-- to search some parallel, very political, loaded terms like climate change. And basically they got exactly the same thing.
  169.  
  170. And I don't think that there is more than anecdote to support what Eli's talking about. I've looked at the studies. Now since Eli wrote his book, Google has changed. And it now includes results from your social network in search returns. So if you're spending all your time on Google+, and the only people in your circle are Bill O'Reilly and Rush Limbaugh, you may in fact see some ideological bias in the returns.
  171.  
  172. But Google added social searching in a way that actually specifically addresses Eli's problem. He had a lot of influence on this. You can turn customization off by pressing the prominent button that says Hide Personal Results. And you can toggle back and forth if you're curious about the difference. On Facebook it's just as easy to turn off customization.
  173.  
  174. You change the Sort option on your top stories to Most Recent. In other words, the filtering they do-- these giant sites-- is completely transparent and optional. I leave customization on most of the time because the social filter doesn't actually trap you in your own bubble.
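The toggle described here amounts to swapping the sort key on the feed. A minimal sketch, with hypothetical field names-- not Facebook's actual implementation:

```python
def sort_feed(posts, most_recent=False):
    """Order a feed either by predicted engagement or chronologically.

    posts: list of dicts with 'ts' (timestamp) and 'engagement'
    (a predicted-interest score) keys.
    With most_recent=True the personalization filter is effectively
    off: newest first, regardless of what the ranking model predicts
    you want to see.
    """
    key = (lambda p: p["ts"]) if most_recent else (lambda p: p["engagement"])
    return sorted(posts, key=key, reverse=True)
```

The point of the sketch is that the filtering lives entirely in the choice of key; the underlying set of posts is the same either way.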
  175.  
  176. And take music, just as an example I think we can all relate to. The reason I use Spotify, which is a social music service, is that I'm a little bored with the music I already know. And I want to be exposed to stuff I don't know, which I get through my Facebook friends.
  177.  
  178. So back at the level of theory, could an algorithm personalize the news to suit your political perspectives? I think it's much harder to do that than it seems. News organizations are really trying to do that. They actually haven't been very successful, which is why when you go to newyorktimes.com and I go, we get the same home page.
  179.  
  180. Of course, you can get narrow-minded, politically filtered news. But you don't need the internet to do that. You can go to Rush Limbaugh on the radio or Fox or MSNBC on TV.
  181.  
  182. Now it is worth noting that we did used to have the kind of constricting filter bubble Eli is worried about. And that was the mainstream media before the internet. When I was a kid in the '70s, you found out about the world through TV networks, a couple of news magazines, local newspapers-- there weren't any national ones. And it was a very limited range of viewpoints and voices.
  183.  
  184. And if you want to re-create that kind of filter bubble, all you have to do is go back to those earlier technologies-- the television, radio, and newspapers of the pre-internet era. So personally, I get a lot of my news these days through a tool called Twitter. I use it as a filter for stories that are relevant to me. And what it does is the opposite of narrowing my perspective.
  185.  
  186. Thanks to Twitter, I'm regularly getting news from global sources, as well as American ones. I tuned into a lot of really interesting Arab voices during the Arab Spring. I follow China, in part, through an English translation of tweets from Ai Weiwei, the Chinese artist who has become one of the most important dissidents in the world today. And those filters broadened my perspective. Twitter's a serendipity engine.
  187.  
  188. Finally, I don't think you need to look at all these studies or understand algorithms to make up your mind about tonight's motion. All you have to do is ask yourself a pretty simple question. Has the Web narrowed my mind? Has it made me less tolerant of people I disagree with, and less interested in what they have to say? Or has it exposed me to a broader range of voices, sources, and ideas than we all used to get when we relied on a consensus-oriented mainstream media for all our information?
  189.  
  190. Unless you're convinced that the Web makes you narrower, you should vote against tonight's motion. Thank you, Jacob Weisberg.
  191.  
  192. [APPLAUSE]
  193.  
  194. A reminder of what's going on. We are halfway through the opening round of this Intelligence Squared US debate. I'm John Donvan. We have four debaters-- two teams of two-- fighting it out over this motion. When it comes to politics, the internet is closing our minds. We have heard the first two opening statements and now on to the third.
  195.  
  196. Debating for the motion, the author of "The Googlization of Everything (and Why We Should Worry)," chair of the University of Virginia's Department of Media Studies. He's been called Google's gadfly. Siva Vaidhyanathan.
  197.  
  198. Thank you. So--
  199.  
  200. [APPLAUSE]
  201.  
  202. --the internet has so narrowed my mind that I didn't hear a word Jacob said.
  203.  
  204. [LAUGHTER]
  205.  
  206. So in these brief comments, I'll explain what Eli and I mean by the internet, as invoked in the resolution, and what we mean by politics. And I'm going to convince you to support the resolution.
  207.  
  208. First, try to remember 1999. The days of AOL, the days of Prodigy, the apex of our long national nightmare of peace and prosperity. In 1999, Thomas Friedman wrote these words: "The internet is going to be like a huge vise. It's going to keep tightening and tightening that system," meaning the system of globalization, "around everyone in ways that will only make the world smaller and smaller and faster and faster." Well, Thomas Friedman could not have been more wrong. He's been exactly that wrong many times, but never more wrong.
  209.  
  210. [LAUGHTER]
  211.  
  212. Now, in 2012, it's clear that there is no such thing as The Internet, capital T capital I, as Friedman and others so often described in the 1990s. There is no equalizing force, great democratizing force. There is no global network of networks that unites us all and gives us all equal voice to interact with each other across borders and across classes, for that matter. In fact, the alleged network of networks is in 2012 balkanized, nationalized, compromised, anesthetized, super-sized, circumcised, and hypnotized. It's far from global and it's getting less so every day.
  213.  
  214. The online experience of a person sitting in Turkey is so different from the online experience of someone sitting in India, and so different from the online experience of someone sitting in Iran or someone sitting in China. And all of those are even more different from the experiences of people in Brazil, in Russia, or the United States. And we're talking about more than censorship, although in many of those countries, online censorship is a serious and crippling issue.
  215.  
  216. We're also talking about platforms-- the platforms that people use, the platforms that governments will let people use. The rise of kill switches is something we have to take into account as well-- kill switches that we saw a year ago in Egypt to wipe out internet communication in an instant. We've seen experiments in kill switches in China in early 2012.
  217.  
  218. And we've seen people like Senator Lieberman even propose them in the United States. Internet technologies amplify so much of what we already are, and what we already want. And the fact is, we're pretty provincial animals. So you add the internet to it, we just double down. We get more provincial. Platforms matter, defaults matter, and policies matter.
  219.  
  220. So here in 2012-- or now in 2012-- we are not all holding virtual hands with our Facebook friends across the globe, singing "We Are the World." There is no coordinated global movement for justice. There's no sophisticated online debate about our collective human fate or even our basic human needs. The dominant powers governing our digital experiences-- the state, for instance, in China, or corporations in Brazil or the United States-- are not interested in such matters. They are not interested in us being political. Sometimes these powers actively restrict us, as in China.
  221.  
  222. More often than not, these powers seduce us into shallow consumption-- consumptive behaviors like shopping, or giggling at cats, or clicking on cows. Not that there's anything wrong with clicking on virtual cows. But it ain't political. Our minds are closing because our attentions are distracted, fractured, and segregated into niches and nations.
  223.  
  224. But all hope is not lost. There is nothing about the nature of the internet that prevents us from being political, richly political. Many of us actually are.
  225.  
  226. I would suspect that most of us in the audience have done a pretty good job of avoiding these traps. We're pretty elite, we're pretty aware, and we're pretty connected. And we route around a lot of these problems. We're also hungry for information. If you weren't hungry, you wouldn't be here.
  227.  
  228. So many of us-- despite what Facebook does with us, despite what the Iranian government does in Iran-- have managed to be richly political. But we do know that the internet by itself does not topple dictators, does not undermine newspapers. It's just not that simple.
  229.  
  230. If we recognize the biases inherent in many of the platforms of our media systems, we can correct, we can adjust, we can invest, we can invent; we can resist, persist, and thrive. We could build platforms that enhance republican deliberation and extend cosmopolitan perspectives. We just haven't done that yet. We've been really busy clicking on cows. But all that takes work. It takes real human work in addition to invention and imagination. Real human effort. But the first step to realizing that is to recognize what's happening. When it comes to politics, the internet-- most importantly, how we experience the internet-- is closing our minds, one cow-click at a time. Please vote for the motion. Thank you.
  231.  
  232. [APPLAUSE]
  233.  
  234. Thank you, Siva Vaidhyanathan.
  235.  
  236. Our motion is, when it comes to politics, the internet is closing our minds. And our final debater against the motion-- he comes from the former Soviet Republic of Belarus-- something that has shaped his interest in the internet's role in politics. He is a Schwartz fellow at the New America Foundation and author of the book, "The Net Delusion." Evgeny Morozov.
  237.  
  238. Thank you.
  239.  
  240. [APPLAUSE]
  241.  
  242. Well, I'm glad that Siva has chosen to debate with Thomas Friedman rather than with us. But I'd like to begin with a few lessons from history. First, technology always plays the scapegoat whenever it comes to debates about the closing of the American mind. Remember Allan Bloom and his bestselling book, "The Closing of The American Mind," in the '80s?
  243.  
  244. Well, let me remind you, Bloom has actually argued that the closing of the American mind occurs because of CD players and headphones. And he actually argued that those might incite teenagers to kill their parents. We know what the late Allan Bloom would have made of the iPad. Why his reactionary torch is now being carried by the liberal crowd from MoveOn is beyond me.
  245.  
  246. The second lesson from history is that concerns about online polarization are as old as they are inconclusive. As early as 1995, "The Nation" magazine carried an article by Andrew Shapiro which argued, and I quote, "Cyberspace is shaping up to be more like suburbia than Cyberkeley, where you interact only with people of your choosing and with information tailored to your desires." That was 1995. Six years later, Cass Sunstein argued that the internet is "serving as a breeding ground for extremism, because like-minded people are deliberating with one another without hearing contrary views," end of quote.
  247.  
  248. So enter Eli Pariser. In one respect, he follows in the tradition of Shapiro and Sunstein. All of them present virtually no evidence that such online segregation is taking place. But they also differ. Where Shapiro and Sunstein worried that we, the users, might choose the easy way out and simply avoid uncomfortable viewpoints, Pariser argues that filters and algorithms are doing this for us. It's a very important difference.
  249.  
  250. Shapiro and Sunstein blame the filter bubble on us, the users. Pariser blames it on the companies. Now who doesn't like such an exciting conspiracy theory? After all, it's always good to find someone else to blame but us.
  251.  
  252. I think this is a fairy tale for many of the reasons that Jacob has outlined. But let me also provoke and give you three examples of how filters may actually enhance our political culture. So let's take Twitter. Many of you may think that Twitter, unlike Google and Facebook, does not engage in customization and filtering.
  253.  
  254. This is actually not true. Twitter does hide certain types of messages. That is, if you follow me but you don't follow Siva, and I send Siva a public tweet, you do not see that tweet. Just think about it. You choose to follow me, and you probably expect to get all of my messages. But you are actually not seeing my tweet to Siva. And mind you, Twitter made that filtering decision on your behalf. Is it paternalistic? Sure.
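The reply-visibility rule Morozov describes here can be sketched in a few lines of Python. This is an illustrative toy, not Twitter's actual API; the function and names are hypothetical:

```python
# Toy model of the filtering rule described above: a public reply
# is shown to a viewer only if the viewer follows BOTH the sender
# and the person being replied to.
def reply_visible(viewer_follows, sender, recipient):
    return sender in viewer_follows and recipient in viewer_follows

follows = {"evgeny"}  # you follow Evgeny, but not Siva
print(reply_visible(follows, "evgeny", "siva"))  # False -- the tweet is hidden

follows.add("siva")   # once you follow Siva as well...
print(reply_visible(follows, "evgeny", "siva"))  # True -- now you see it
```

The point of the rule, as Morozov argues, is that hiding half-conversations keeps the volume of a large following manageable.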
  255.  
  256. But is such paternalism justified? Well, take my case. I follow more than 2,000 people on Twitter. And I'm very happy with the breadth of news that I get. But if I had to read every single conversation that these 2,000 people have with thousands of people that they know but I don't know, I would never have managed to follow 2,000 people. At best, I would follow a hundred.
  257.  
  258. This is the beauty of it. Twitter's collaborative filter allows me to access more, not less, useful information. Now let's take Facebook. It has 800 million users. Some of them are heavy users. They have 5,000 friends and spend hours on it every day. Others open it every few weeks and only have a few dozen friends.
  259.  
  260. So Facebook has built in this very clever differentiation. If you are a heavy user, it presents you all updates from your friends in chronological order. That is, the most recent updates from all your friends come first. However, if you only use it occasionally, Facebook shows you only the most interesting updates. The assumption there is that if you've been away for three weeks and you have only 30 minutes to catch up, why go through thousands of messages in chronological order? To Facebook and to me, this seems like a reasonable assumption. That's why relevance, rather than recency, is the default filter for these occasional users.
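The two defaults described above (recency for heavy users, relevance for occasional ones) can be sketched as a toy in Python. This is an illustration of the described behavior, not Facebook's actual code, and the field names are made up:

```python
# Toy model of the default News Feed ordering described above:
# heavy users see the most recent posts first; occasional users
# see the most "interesting" (relevant) posts first.
def order_feed(posts, heavy_user):
    if heavy_user:
        return sorted(posts, key=lambda p: p["time"], reverse=True)
    return sorted(posts, key=lambda p: p["relevance"], reverse=True)

posts = [
    {"id": 1, "time": 3, "relevance": 0.2},  # newer, but dull
    {"id": 2, "time": 1, "relevance": 0.9},  # older, but interesting
]
print([p["id"] for p in order_feed(posts, heavy_user=True)])   # [1, 2]
print([p["id"] for p in order_feed(posts, heavy_user=False)])  # [2, 1]
```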
  261.  
  262. If you want to see all the recent messages, all you have to do, as Jacob said, is just click a button called Most Recent. Sure, we can have a broad philosophical debate on whether social networking is good or evil. But as long as we accept that social networking is a legitimate activity, we should also accept that filters make it better. Now let's tackle the elephant in the room, which is Google. Suppose I'm so keen on conspiracy theories that my blackboard is larger than Glenn Beck's.
  263.  
  264. So I believe that 9/11 was an inside job, I believe that Obama is a Canadian-born Muslim, that climate change is a non-issue manufactured by the mainstream media, that the government is hiding the truth about the UFOs, and so on. In other words, I'm exactly the kind of guy that Eli is worried about. Furthermore, suppose that I became all of that before the filter bubble set in-- in that great age of diverse viewpoints that we used to call the era of cable television.
  265.  
  266. [LAUGHTER]
  267.  
  268. So now comes the filter bubble. And Google starts personalizing my search results. That is, instead of seeing generic search results about, say, 9/11, I actually see that some search results have already been endorsed or liked by my friends. In turn, when my friends use Google, they see the links that I visited and liked as well. Now this is the new mutual exposure. Why is it a good thing?
  269.  
  270. Well, if you think that all my friends are nut cases like me, then we do have a problem, because through such new exposure, all of us may end up becoming even more paranoid. But let's leave Charlie Manson and the Unabomber aside for a moment. I don't think they are representative of most internet users and their friends. The way Google and Facebook map out our social connections, they try to be very comprehensive. We see links from people we went to school with, our colleagues, our relatives, and so forth.
  271.  
  272. It's quite likely that many of these people will have radically different positions on 9/11, climate change, Obama's birthplace, and the UFOs. So my point is this: a link to, say, the report of the 9/11 Commission that has been endorsed by someone from my social circle is more trustworthy than a generic Google link that has not passed through a similar social filter. In other words, it's a possibility that people will now pay more attention, or at least more respect, to positions they would otherwise find crazy and conspiratorial, only because their friends are known to endorse those positions.
  273.  
  274. So, to conclude, there are many good concerns about the future of the internet. The loss of privacy ranks very high on my personal list. But the filter bubble is not one of them. It's OK to hate Google and Facebook, but we should hate them for the right reasons. Thank you.
  275.  
  276. Thank you, Evgeny Morozov.
  277.  
  278. [APPLAUSE]
  279.  
  280. And that concludes round one of this Intelligence Squared US debate, where the motion being argued is, when it comes to politics, the internet is closing our minds. Keep in mind how you voted at the beginning of the evening, because we're going to ask you to vote again at the end. And the team that has moved its numbers the most will be declared our winner.
  281.  
  282. Now we go on to round two. Round two is where the debaters address each other directly and also answer questions from you in the audience and from me. We have two teams of two, who are arguing over this motion. When it comes to politics, the internet is closing our minds. The side arguing for the motion, Eli Pariser and Siva Vaidhyanathan.
  283.  
  284. They are arguing that the internet is actually several internets; that it puts up walls between the lands that are occupied by people of differing opinions-- between nations, as much as between liberals and conservatives-- and that, especially with customization designed to give us what we want, the internet is getting worse at bringing together people who are not like-minded.
  285.  
  286. The team arguing against them: Jacob Weisberg and Evgeny Morozov. They're saying, sure, maybe these bad things could happen someday. But there's not much evidence that they are happening a lot right now. And besides, a tool that can and does connect strangers around the world is, almost by definition, a good thing; and customization perhaps helps to organize the way that we think.
  287.  
  288. So I want to put a question to the side that is arguing against the motion that when it comes to politics, the internet is closing our minds. Jacob Weisberg, you said as your sort of slam-dunk point to the audience, you have to decide whether you think the internet is closing your mind. But if the other guys are right, they won't know whether the internet is closing their minds, because part of having their minds closed is not knowing that their minds are being closed. They're saying that it's an insidious thing, a stealth thing; particularly with the customization, it happens in a way that you go through this process unawares. Can you take that on?
  289.  
  290. Well, this is the "minds in a vat" argument--
  291.  
  292. [INAUDIBLE].
  293.  
  294. I'm sorry. This is the "minds in a vat" argument in philosophy: if we were not actually embodied but had our minds in vats being manipulated by space aliens, we would have no way to know. I'm not sure I can answer that. But I do think-- Siva made the point that, well, for those of us in this room, this really isn't a problem. I mean, we're media savvy, we're educated, we're sophisticated. It's a problem for the hoi polloi out there-- the unwashed masses who are just getting spoonfed whatever they get fed. I find that argument condescending.
  295.  
  296. I mean, we live in a democracy and it seems to me that we're all responsible for the information we receive. And not everybody engages as deeply in different subjects. But the difference with the internet is we can measure it. We never knew who read the stories in the B section of the New York Times about Albany and the state legislature. On the internet, you know how many people click on them. But it doesn't mean--
  297.  
  298. It's about 11, I think. Right?
  299.  
  300. Yeah, what the hell, yeah. Exactly.
  301.  
  302. But how many people actually read to the end, or even the beginning, of the story before we had the internet?
  303.  
  304. Well, so then take on, number one, Jacob's point that you're, I think, suggesting a little bit of snobbery here-- that you're saying we all can keep up with things, but that there's a great unwashed public out there that can be deceived by the power of these algorithms that are telling them what needs to show up in their searches.
  305.  
  306. I merely meant that Jacob's test-- whether it works for you-- wasn't a good enough test, largely because you are just you. And you are not we. You are not a greater sample, right? So if we're going to be empirical, let's be good about the social science we deploy. That is the worst possible empirical test-- what happens to you. It's almost laughable in its suggestion. So that can't possibly be the test when you address this question.
  307.  
  308. But wasn't your partner using the "that's what happened to me when I typed in Egypt"?
  309.  
  310. [LAUGHTER]
  311.  
  312. Let's talk a little bit about the studies about this stuff--
  313.  
  314. Eli Pariser.
  315.  
  316. --because I think it's worth digging into this a little bit. There is one-- first off, the reason that it's so hard to study this stuff is because the easiest way to study it is to get inside the black box of these companies. And these companies don't have any interest in letting people go in and prove that these companies are doing bad things.
  317.  
  318. But it's very hard to look at from the outside, which is why there just aren't many studies that haven't been, as Jacob said, funded by Facebook, funded by Microsoft, funded by the companies themselves. The one clear study on Google personalization that has been peer-reviewed is a study by a guy named Martin Feuz. And it's pretty clear. The methodology's good. He looked at how a search history affects the personalization you get. And it's very clear. You know, again, 60% of the search results on the first page are different for people who have a long web history in Google.
  319.  
  320. But actually the study that's the most interesting here is the Gentzkow study that Jacob was referring to. This is a study that was initially described as showing that people are linking to each other and that the internet isn't as polarizing as we thought.
  321.  
  322. And the interesting thing is, if you dig into that study, it actually arguably shows the exact opposite. What they did was they created an isolation index for each different kind of medium. And so the isolation index for cable news that that study used was about 3.3. On the internet-- the internet had an isolation index of 7. And so it's twice as polarized as cable news.
  323.  
  324. Now what's more interesting is that this study was done before the personalized internet. So the main thing that that study was looking at was the fact that a lot of people's online surfing patterns lead them to Yahoo and then from Yahoo out to a whole bunch of different ideologically diverse links.
  325.  
  326. But Yahoo has changed since 2007-- since that study was written. And so now, Yahoo looks at your history-- at what links you've clicked on in the past-- and sends different people out in different directions.
  327.  
  328. You know, let me let the other side come in and take on--
  329.  
  330. One more piece of--
  331.  
  332. OK, go ahead. --which is that the other thing that's changed since that study was done, is the rise of social media as one of the primary ways that people get information. And they did this isolation index calculation on people's sets of friends as well. So cable news is 3.3. The internet in 2007, in the Yahoo era, was 7. And people's groups of friends was 30. Now that 30 has become embedded in every experience that we have on the internet now. So I think, actually, if you look into the studies that have been done, it's quite clear.
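The "isolation index" being compared here (3.3 for cable news, 7 for the internet, 30 for friend groups) is, roughly, the gap in exposure to conservative media between conservatives and liberals, measured in percentage points. A minimal sketch with made-up exposure numbers, not the study's actual data or code:

```python
# Rough sketch of an isolation index: the average share of
# conservative content in conservatives' media diets, minus the
# average share in liberals' diets, in percentage points.
def isolation_index(conservative_shares, liberal_shares):
    avg = lambda xs: sum(xs) / len(xs)
    return (avg(conservative_shares) - avg(liberal_shares)) * 100

# Hypothetical exposure shares (fraction of visits to conservative outlets):
conservatives = [0.62, 0.58]
liberals = [0.53, 0.55]
print(round(isolation_index(conservatives, liberals), 1))  # 6.0 points
```

A higher index means the two groups inhabit more separate media worlds, which is why the jump from 3.3 to 7 to 30 matters to Pariser's argument.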
  333.  
  334. Let's let Jacob Weisberg or Evgeny--
  335.  
  336. I read the same study. It was a little baffling. But it said the polarization index for newspapers was 10. So the old media is well beyond the m-- I think you need to take all that with a grain of salt. I mean--
  337.  
  338. But you cited it. I mean, this was the study that you were citing.
  339.  
  340. Exactly. And it doesn't show that the internet is the most polarizing thing. But there-- I looked at about half a dozen studies for this. And I would have to say, just trying to be as objective as possible, that the preponderance of evidence-- the studies I didn't cite-- say that there's not clear evidence that this is happening. There's no evidence it's getting worse.
  341.  
  342. And what did you think of the Feuz study? Because we tweeted about this.
  343.  
  344. I don't think it showed what you're saying.
  345.  
  346. Just bring your opinion in.
  347.  
  348. You know, I think it's fine to be talking about studies that no one but a few people in this audience have read. So they all sound very authoritative. But I think you also have to keep in mind that a lot depends on what kind of information you're trying to access online. If you're searching for pizza in Manhattan, and I'm searching from my mobile phone and Google knows where I am, I would actually want 99% of the search results to be personalized, probably not 60%. 60% is not high enough, in part because it's very obvious that what I want to do is to order a pizza. Right?
  349.  
  350. So again if I'm searching for political information, maybe the ratio should be 10% or 5% or 0%. So to say that there is 60% of information that's personalized doesn't say much, because it all depends on what it is that's being personalized.
  351.  
  352. The other point is that, again, take something like YouTube. So if you don't sign into YouTube, YouTube will show you at the very front page, videos that are most popular with the rest of the crowd. You'll see all those fascinating cat videos. You'll see all those videos from MTV. You see whatever is now popular in the online world. They will be displayed to you, and you'll see them very prominently.
  353.  
  354. If I sign in with my Google account, with my search history, instead of cat videos, I'll see links to new, exciting videos about history, about culture, about theatre. Why? Because Google and YouTube know that those are the kind of videos I like to watch. So if Google can show me more videos about history than videos about cats, I can't really see how the internet is closing our minds.
  355.  
  356. All right. Siva, you arguing that the internet is closing our minds, for your response.
  357.  
  358. Thank you, Evgeny, for making two of my points. Number one, that the notion of personalization-- either through Facebook or another social network or through Google-- is actually tremendously helpful to us for shopping. It's not so good for learning. When you want to learn--
  359.  
  360. About cats?
  361.  
  362. When you want to learn about anything complex, the worst thing you can do is subject yourself to a social filter. The best thing you can do--
  363.  
  364. You need better friends.
  365.  
  366. --is challenge yourself--
  367.  
  368. [LAUGHTER]
  369.  
  370. The best thing-- you're the person I retweet the most.
  371.  
  372. [LAUGHTER]
  373.  
  374. The best thing you can do--
  375.  
  376. You need to get out of the bubble.
  377.  
  378. The best thing you can do is seek sources of expertise. Seek sources that are particularly good at researching particular areas. Use online sources that are focused in particular areas like health, like science, right? That means call your librarian. That means talk to an expert. And that means go beyond asking a small set of people what the best way to explore a particular issue is. But for pizza, not a bad idea. And we can talk later, because I'll tell you. I used to live in [INAUDIBLE].
  379.  
  380. But why not integrate the 2,000 people that I'm following on Twitter, one of whom is you? Why not integrate those sources in my search results?
  381.  
  382. But--
  383.  
  384. Why shouldn't I be able to see those search results if I like, if I'm not concerned about privacy? Don't you think the links you share or click on are smart or interesting for me?
  385.  
  386. They are for you-- that's why you're my buddy. But that doesn't mean necessarily that what I'm doing is enhancing anybody else's efforts.
  387.  
  388. But why don't you trust me to choose my friends and sources?
  389.  
  390. You know, I think--
  391.  
  392. Eli Pariser. I also think--
  393.  
  394. Do you want to choose them for me?
  395.  
  396. I also think your opponents, who are arguing against the closing of Americans' minds through the internet, have also made the argument that this personalization and customization-- the filtering-- helps you think. It helps you organize. It cuts away the chaff. It brings things down to a manageable amount of material. And I can see the appeal of that argument. So take that on, Eli Pariser.
  397.  
  398. Well, you don't hear me arguing that all personalization is bad, all filtering is bad. That's kind of a nonsensical argument. As everyone has said, filtering has been around as long as there's information to filter. But the question is, what are the filters-- what are the lenses through which we're looking at the world now doing for most people who are looking at the world through those lenses.
  399.  
  400. And I think it's-- you know, we're talking a lot here about Twitter. But it's important to remember that Twitter actually-- it's hard to remember in a group of New Yorkers-- Twitter is a fringe phenomenon on the internet relative to Google or Facebook. Twitter has a tiny, tiny user base compared with these other-- sorry--
  401.  
  402. Only 200 million.
  403.  
  404. Well, but the actual users-- the active users-- if you look at active users on Twitter, it's actually much smaller. It's about 25 million people, which is a lot of people. But not compared to Facebook-- [INTERPOSING VOICES], not compared to Google.
  405.  
  406. And so we have to look at, sort of how most people are using this stuff.
  407.  
  408. Yes, Jacob Weisberg.
  409.  
  410. I just want to ask Eli one other question, since we're dealing with the interesting implications of this thing that I don't think is happening. What kind of regulation do you want to deal with this problem? Do you want to say that Google and Facebook aren't allowed to filter in certain ways? Or do you want them to become the sort of paternalistic gatekeepers that say, you know, if you're searching for a Lady Gaga video, you also have to have a little bit of this interesting study from the nonpartisan Congressional Budget Office?
  411.  
  412. [LAUGHTER]
  413.  
  414. I'm not suggesting that we replace the old media paternalism with new paternalism. I'm suggesting that if we built these tools toward the purpose of helping people get good information, including a diversity of perspectives, we could do that in a way that would draw on the internet's strengths.
  415.  
  416. So I'll give you one example. You know, Facebook-- the main way that you propagate information across Facebook is with the Like button. The Like button has a very particular valence. It's easy to click Like on "I baked a cake." It's hard to click Like on "Massacre in Darfur Continues for 11th Year."
  417.  
  418. And so certain kinds of information propagate very readily. Other kinds of information don't propagate at all. Now without instituting any kind of objective paternalism-- this is more important than that-- you could put an Important button on Facebook that would allow people to elevate the topics that they believe are important, as well as the topics that they think are fun and interesting and that they'd like their friends to see.
  419.  
  420. So there are ways of doing this that are internety, that don't require someone in a room making these value judgments, but that actually lead toward filtering that helps us be good citizens.
  421.  
  422. But it doesn't solve the filtering problem. Come on, we'll still be as closed-minded if you buy into a paradigm with the Important button. It's just that now, in addition to being closed-minded, three people in the audience will care about Darfur.
  423.  
  424. Well, this is better than none. But I think the interesting thing here, that I kind of want to get to is-- I wanted to ask you about this, Evgeny, because you wrote a Slate piece recently which closed with a call that was actually kind of more paternalistic than where I would go.
  425.  
  426. Sure.
  427.  
  428. It closes by saying, "It's not unreasonable to think the denialists of global warming or benefits of vaccination are online friends with other denialists. As such, finding information that contradicts one's views would be even harder. This is a reason for Google to atone for its sins and ensure that the subjects dominated by pseudo-science and conspiracy theories are given a socially responsible, curated treatment."
  429.  
  430. Uh-huh.
  431.  
  432. So can you talk about-- I mean, I was just curious about that, because I can't make it square with your position here.
  433.  
  434. No, I think it's very easy to make it square. I mean, first of all, public health, particularly vaccination, I think is different from political information. Political information is contentious by default. Vaccination-related decisions are far more contentious because public health and often national security is at stake. So there is a difference there, first of all.
  435.  
  436. Are you saying that global warming isn't political?
  437.  
  438. Second of all, it is less political than public health.
  439.  
  440. Really?
  441.  
  442. Well, look, is anyone going to shut down a school or a public square because of global warming tomorrow? They will if there is an outbreak of an epidemic or some contagious disease. Again, you have to understand that this is a very particular context in which, I think, warning people about certain types of websites-- which has nothing to do with socialization, actually, but has everything to do with certain publishers publishing deliberately misleading information about vaccination, for example, or about the risks of flu vaccines, or any other sort of vaccines.
  443.  
  444. I mean, I think it would not be a bad idea for Google to show that there are sites which have actually been approved through peer review and through normal scientific practice-- to show that, hey, you actually have sites which you can trust.
  445.  
  446. I don't think it has much to do with filtering or the social layer. But when you say it's not unreasonable for denialists--
  447.  
  448. Eli Pariser.
  449.  
  450. --to be online friends with other denialists and as such finding information that contradicts one's views--
  451.  
  452. [INAUDIBLE] so it's not unreasonable.
  453.  
  454. But--
  455.  
  456. Let me bring in Siva.
  457.  
  458. Well, I would go to a less paternalistic position. I don't think we should be in the business of telling Google what to do, or telling Facebook what to do. I think we should be in the business of building new platforms, new tools, new ways of relating to each other, and having these conversations. I think everybody should read books about the subject, which is a pretty good way to--
  459.  
  460. Can I ask this--
  461.  
  462. --and then adjust your own behavior and have those ideas echo through your social networks, that perhaps it's a good idea to break out of the filter bubbles.
  463.  
  464. Let me ask the side that's arguing that when it comes to politics, the internet is closing our minds, first, for some concrete examples of this phenomenon that concerns you, actually having caused harm to the body politic-- because it is a little bit of an assumption. And maybe you're not making this point. There's a little bit of this assumption that we all do need to be talking to each other and that there's a middle that we'll reach and some Kumbaya moment.
  465.  
  466. But in fact, what's wrong with having people entrenched in their own camps, angry at each other, as long as the political spectrum is covered overall. What's wrong with that? And what examples of harm can you talk about that have been caused by that?
  467.  
  468. Can I-- just a little story. When I was on the book tour for my book, I was on a radio show in Saint Louis. And the host decided to make this big spectacle of having people Google for Obama and call in and read their search results. It was a really boring radio hour.
  469.  
  470. And the first person called in, and the second person called in, and they interviewed everybody. And you had people do a kind of read off, where they were both reading it off at the same time. And it was exactly the same. And I was thinking, this is the worst book promotion I've ever done.
  471.  
  472. And then a third guy called in. And he said, you know, it's the damnedest thing. When I google Barack the first thing that comes up is this link to this site about how he's not a natural citizen. And the second link is also a link to the site about how he doesn't have a birth certificate.
  473.  
  474. That was his publicist.
  475.  
  476. I was wondering about that. But I think the danger here is-- it's not just that he was getting a view of the world that was really far from-- you know, sort of really off the average here. But he didn't even know that was the view that he was getting. He had no idea how tilted that view was. And that's sort of the challenge. I just want to address one other point, which is that there seems to be this question about whether this is happening. And it's really kind of funny to me, because if you talk to these companies and if you listen to what they're saying, all of these companies are very clear that personalization is a big part of what they're doing and what they're--
  477.  
  478. For pizza-related decisions, they are very clear. And they say, we don't want to do it for politics, we only want to do it for pizza.
  479.  
  480. Right. And the question is, can you trust--
  481.  
  482. Eli, let me bring in Jacob-- Jacob, can you respond? I mean, I think Eli left a pretty good image hanging out there, of these folks truly not knowing how much they don't know, and believing what they're getting and not understanding how slanted it is. That landed pretty well, I think. So can you respond to that?
  483.  
  484. But a guy who called into a radio show? I mean, I know the plural of anecdote is data. But if this were really happening in the way you say it is, wouldn't there be some kind of decent study that actually showed widely varying results? And as I say, I've tried to test this out as best I can-- I tried it myself on a variety of browsers, signed in, signed out. Wikipedia always comes up first. Sometimes it comes up second. Wikipedia's vaccine entry is pretty good. I do not think there is actually the kind of variety you're talking about in searches done most of the time by most people.
  485.  
  486. Siva.
  487.  
  488. So doing social science online is really hard, because almost nothing is replicable. Right? So almost any-- even if you did a broader study than that, even if Eli did a broader study than that-- the third person to do that study would not come up with the same results, for the simple reason that both Facebook and Google are constantly changing their algorithms, constantly tweaking their algorithms for reasons we can't possibly understand and are not allowed to understand.
  489.  
  490. It's also important to remember that there are so many variables in what you get. There are some searches that are more personalized than other searches, again for reasons that Google understands and we are not allowed to understand. There are some ways that you can generate different searches, because there are certain key words that matter more in certain geographic locations than others. So a search for S-O-X in Boston or Chicago will yield particularly strong and personalized results in those areas, whereas S-O-X searched somewhere else in the world might yield gibberish, because it doesn't mean anything. Right?
  491.  
  492. So all of these variables plus the big variable-- whether or not you're a Gmail user, whether or not you're signed into YouTube--
  493.  
  494. But if its--
  495.  
  496. --whether or not--
  497.  
  498. Are you agreeing with your opponent that you can't make the case based on data--
  499.  
  500. No, no. What I'm saying is that the empirical studies that have been done are limited in their utility. What we do know is exactly what the companies say they do. And they say-- and they've said to Eli on a number of occasions-- that they personalize our results and--
  501.  
  502. [INTERPOSING VOICES]
  503.  
  504. --not for social connection.
  505.  
  506. Evgeny Morozov.
  507.  
  508. You have just defended the companies, because I would actually prefer Google and Facebook to personalize my search results based on 500 factors, rather than on just two factors.
  509.  
  510. I'm not interested--
  511.  
  512. I do not want them to personalize my news search results based on my sex and based on my race and based on how old they think I am. I would like them to take a broader view and incorporate 300 factors, 500 factors if it's necessary--
  513.  
  514. OK.
  515.  
  516. [INTERPOSING VOICES]
  517.  
  518. --rather than pigeonhole me into a social group to which I don't want to belong.
  519.  
  520. And I'm not interested in attacking or defending the companies. I'm interested in explaining what's going on to the best of our ability. And we are left rather in the dark, because all of this is in a black box-- private data-- and yet so deeply important to our public lives.
  521.  
  522. But that's how capitalism works. They have trade secrecy. They will never disclose their algorithms--
  523.  
  524. I know.
  525.  
  526. --because if they do, you're going to suffer.
  527.  
  528. I know. That's the problem.
  529.  
  530. Jacob Weisberg.
  531.  
  532. [INTERPOSING VOICES]
  533.  
  534. I think there's another point here. I mean, we're talking mainly about Google and about Facebook, which have these kinds of personalization that you can turn off. Whether, when they're turned on, they do what you say they're doing is a matter of some dispute. But most people don't get their news from Google search. And most people don't get their news from Facebook. They get their news from the news.
  535.  
  536. Now, these are increasing sources of information of different kinds. But, I mean, have you used Google News? It's not a great product. And it's not one of the top ways people get their news. People still get their news both from traditional news sources and from big online sources like CNN and Yahoo News. Yahoo News is the AP. If you go there, you know, it's basically an AP news feed.
  537.  
  538. Not any more.
  539.  
  540. There is a social feature, similar to Facebook and Google, that you can turn off; you can have it personalized or not personalized, as you prefer.
  541.  
  542. So first of all, defaults matter. The fact that you have to know to turn it off is a huge issue. There's a reason those social features are on by default: it's in those companies' interest to make sure that you do get personalized results. It's better for these companies.
  543.  
  544. But it's in our interest. I want it turned on. But you might not realize that it's happening unless you happen to tune into--
  545.  
  546. [INTERPOSING VOICES]
  547.  
  548. --a personalized option. I mean, Siva, it's a chicken and egg problem. You will not realize that you can personalize them otherwise. I don't see it as an argument.
  549.  
  550. All right. I'd like to go to audience questions at this point. We're in the middle of round two. Our motion is, when it comes to politics, the internet is closing our minds. Microphones will be brought forward if you raise your hand. Remember about 30 seconds is what you'll get.
  551.  
  552. You can state a quick premise. But please ask a question that keeps us on the topic of this motion. And we'll move along this debate. And I really do mean it to be a question, not a debating point. But we did put out on slate.com, who's our media partner, a request to them for questions from some of their subscribers.
  553.  
  554. And we also-- is Lenny Gengrinovich here? Did you show up? Yep, there he is. All right, so Lenny took the initiative of sending a question in to us by email. So thank you for that. And I'm going to actually-- you can ask the question yourself; I just wanted you to make it drastically shorter. It's just one page. Or, if you want, I can paraphrase it. Or-- I'll go ahead and paraphrase it.
  555.  
  556. He basically makes this statement: the tiny minority of democratic citizens who are interested in political ideas and issues cannot have a discussion. They have freedom of speech, but there's nobody to speak to. The internet is an ersatz world of discussions, which only hides the problem. Do you agree that our nation needs affirmative action for intelligent conversation? I'll put that to Siva first.
  557.  
  558. Most definitely. But we're doing it right now. Right? It means that we have to have foundations, like the foundation that supports this discussion. It means that we have to have universities and we have to have schools take these issues seriously and make sure that we are all able to understand the environments in which we operate, that we have to understand the nature of these platforms and technologies. And when there is a suboptimal result, we have to know how to correct for it, and if we're not happy with it, to invent something new.
  559.  
  560. Other side, want to respond? Or do you agree? Jacob Weisberg.
  561.  
  562. Well, it's just that I think the low quality of commenting is one of the big problems on the internet. It's one we've been very concerned with. Jacob, can you just come a little closer?
  563.  
  564. Oh, I'm sorry. We've been very concerned with Slate over the years, with improving it. The big breakthrough, I have to say, came from Facebook, through a technology, an interface known as Facebook Connect, which basically brought identity based commenting to every website.
  565.  
  566. And it means that because people are themselves when they comment, they're actually a lot more reasonable-- not necessarily reasonable all the time-- but they don't engage in quite the level of viciousness and ad hominem attacks and just off-point arguing that they do when they're anonymous. So I think this is partly a problem I'm optimistic about improving through technology.
  567.  
  568. All right, let's go out to the audience now. And right in the front row, right behind the clock. Again, you can stand up and hold the mic close to your mouth. Tell us your name, please.
  569.  
  570. Hi, my name is [? Tal ?]. And there are two main points in the premise tonight: one, that our minds are being closed, at least more so than at any other time, and two, that the internet is doing it. And I would like to ask the panel for the motion to address both of those issues. I think they have-- unless you feel that you haven't covered that. I think that--
  571.  
  572. [INAUDIBLE].
  573.  
  574. I would only qualify it by saying it's not as simple as the internet doing it. It is that we are interacting with these platforms at such a high and intense level without realizing what's going on. And that's what's happening. So it's amplifying what we were already willing to do, which is gather among ourselves and reward ourselves with positive affirmation. And that's really mostly what goes on on Facebook.
  575.  
  576. But Siva, first you argue that we are gathering amongst ourselves. But then you argue that the internet is getting Balkanized and we are all in our own little niches. I mean, how do you square the two?
  577.  
  578. Ourselves-- that's one of the niches, right? And we're being Balkanized nationally, which then reduces the global, political [INTERPOSING VOICES].
  579.  
  580. So you think it shouldn't be Balkanized? You aspire to this one global utopia, where people in China feel like people in India and Iran and elsewhere.
  581.  
  582. Now that would be really nice.
  583.  
  584. [INTERPOSING VOICES]
  585.  
  586. There's a great book called "The Big Sort," about how increasingly people live in communities of people who are like them-- homogeneous communities. And one of the sort of fascinating things about this book is that you have two things going on at the same time. On the one hand, there's a broader set of types of communities than there was 30 years ago. Right, that's undoubtedly true. On the other hand, for each individual person in that system, their neighborhood is more full of people like them than it ever was in the past.
  587.  
  588. And so you have local homogeneity and heterogeneity at the top level. So I think when people are trying to make the opposite argument, they point to, like hey, look at the internet, there's all this different stuff out there. It's so heterogeneous. Everybody's talking.
  589.  
  590. But actually the question is, what are people's local neighborhoods like? And to the extent that we're now bringing those local neighborhoods with us online, what effect does that have in amplifying the homogeneity that increasingly surrounds us even when we're offline?
  591.  
  592. I would encourage you to look at the recent issue of the American Journal of Political Science, which actually debunks "The Big Sort" with a little empirical data, and shows that even in the offline world, that has never held true.
  593.  
  594. Can you go up three steps, please, and turn right, ma'am? Yeah. If you're rising-- yeah, on your other side. Just wait for the mic. Thanks.
  595.  
  596. Thanks. My name is [INAUDIBLE]. You all write, I assume, political blogs and political stories, or provide information, or run a politically based organization. And I'm curious about either the growth or the decrease of your audiences, and the quality and the quantity of discourse that's been created over the past, let's say, three years.
  597.  
  598. So how would you relate that directly to the motion? Are you asking whether they are seeing a closing of the minds in the interaction with their readers?
  599.  
  600. In their own personal writing and their own personal blogs and the information that they're providing to--
  601.  
  602. Are they seeing minds closing?
  603.  
  604. Are they seeing minds close?
  605.  
  606. Jacob Weisberg.
  607.  
  608. Well, I think I tried to answer-- I tried to speak to that in my introduction. No, I felt personally just the opposite. I feel like I've never been in touch with a wider variety of viewpoints and people. And a lot of that is through social media, because people act as filters for you. When you find someone interesting-- whether they're a personal friend, whether they're someone in a field you're interested in, whether they're in another field, or just someone you admire-- they bring you a whole range of things on Twitter and on Facebook. So, no, my experience has been just the opposite.
  609.  
  610. Siva?
  611.  
  612. Well, I've written for Slate, so I've had a great experience. But my experience is narrow. My experience is my own. And my experience is based on the particular subjects that I cover, which happen to plug right into these questions. For the same reason that this is not a representative audience of internet users, my readers are not a representative audience of internet users.
  613.  
  614. I just want to speak to one of the--
  615.  
  616. Oh, sure you can.
  617.  
  618. So I think that the fear that somehow diversity online is shrinking-- I think my own case disproves that. I mean, I have an unpronounceable name. I was born in the middle of nowhere in Belarus. And the only reason I actually have the attention and the audience I have is mostly because of the internet-- because of the [INAUDIBLE] that I'm on, because of actively writing for online publications, and doing a lot of online research in Google Scholar. So I would say that the fear that somehow it will become impossible for diverse voices from Iran or India or elsewhere--
  619.  
  620. She's not asking-- she's asking whether your interaction with the public is indicating to you that the-- am I right, ma'am? You're talking about the--
  621.  
  622. But the problem with-- I mean, if I look at the public--
  623.  
  624. [? I'm talking about ?] the quality of discourse. I'm assuming that this debate is about whether discourse is closed, whether the--
  625.  
  626. OK.
  627.  
  628. My pointing to my own case shows that it is actually getting more diverse in terms of voices who never previously would have had a chance to sit in an audience like this or to participate in political debates like the ones I participate in.
  629.  
  630. Just speak, Eli Pariser.
  631.  
  632. When I was at MoveOn, we tried a number of times to do these kind of cross partisan conversations and have people sit down in a room together and talk. And it is amazingly hard to do. And I think it's probably always been hard. I don't think that's a new phenomenon. But the degree to which people live actually in different universes of fact, not just different universes of opinion-- in other words, the degree to which you can't even get to something to argue about because people won't even agree that that's a thing is really striking.
  633.  
  634. And I can't help but wonder, as Evgeny has, as have many of us, if the way that the internet is structured isn't amplifying that, and helping people who subscribe to what is, in my opinion, misinformation-- like the Obama birth-certificate stuff, or the idea that climate change is not something that's partly caused by people. Is this amplified by that?
  635.  
  636. Siva?
  637.  
  638. I have a really quick answer. I misunderstood your question, and I apologize. Before five years ago, I didn't have such a thing as Facebook on which to engage in conversation with my friends. I ran two blogs. And both of them were maddening, because every conversation I had was overwhelmed by harassing people-- people who were merely there to disrupt the conversation, not there to engage in a reasonable and rational way. We call them trolls. And one of the reasons that I closed down my blogs is that it was so maddening.
  639.  
  640. As opposed to taking lithium.
  641.  
  642. Exactly. The quality of discourse in the comments, plus the spam, was just out of control. On Facebook it's a lot more comfortable, right? It's a lot more pleasant. It's because I never see people I disagree with. It's really lovely now.
  643.  
  644. I think a lot of the questions that came in--
  645.  
  646. You follow me on Twitter though.
  647.  
  648. A lot of the questions that came in through "Slate" are similarly focused on this question of the impact on the quality of the American political discourse. And Levi Osborne in Warrensburg, Missouri, asks: is it so much the internet in itself that is closing our minds, as it is the hyper-partisanship of these commentators to whom so many people listen? And by commentators, he's talking not just about people who are online, but also people in television who are blogging, and people in newspapers who are blogging. But he's talking about the blogging aspect of people with hyper-partisan points of view. Jacob Weisberg.
  649.  
  650. Well, I do think the phenomenon of increased polarization in Congress is pretty clearly documented at this point. That's happening. And I deplore it. I just don't think the internet has anything to do with it. I think the big drivers of that are redistricting, which put people in districts that tend to go one way or the other and fewer that swing back and forth. I think it's fundraising, which means politicians spend all their time fundraising and actually don't have human relationships with each other very much.
  651.  
  652. I think hyper-partisan media, of which Fox is probably the best example, have some impact on it. But members of Congress-- if you want to look for a group of people who really aren't on the internet very much, that's them. I mean, I don't think it's what's driving it. Siva. So part of the phenomenon of the punditocracy is indignation. That's the currency. And indignation is a subset of the attention economy.
  653.  
  654. In order to appear on CNN at 2 o'clock in the afternoon, in order to appear on Fox News at 7:00 PM, you have to say something that angers somebody-- either directly to their face-- I'm angering you as someone who disagrees with me-- or, more likely, you gather people on your side to be angry. And the notion that we are so addicted to something now we can measure-- the click, the instant attention, the instant affirmation-- just reinforces that. So it is a phenomenon that's actually moved from the blogosphere.
  655.  
  656. And that's why we've seen so many people who got famous being bloggers now appearing on CNN at 2:00 PM, now appearing on Fox at 7:00 PM, right? That's what's happening. It's why MSNBC is filled with former bloggers. It is because they're already adept at generating indignation toward one side or the other.
  657.  
  658. And is this a mind-closing exercise, or is this just meaningless entertainment?
  659.  
  660. I think the results are--
  661.  
  662. It does have an impact.
  663.  
  664. --mind-closing, yes.
  665.  
  666. I think, just to take us back to the resolution-- you know, I don't think anyone up here would argue that the primary thing driving American-- the very partisan spirit of American discourse-- is the internet. The question is, is the way that most people use the internet helping or hurting?
  667.  
  668. That's the question that you really have to answer. Is the way that people are using the internet helping or hurting in this? And I think it's hard to argue that most people's social networks aren't homogeneous. That's just true. Most people do have social networks that are largely made up of people like them. Now, there are outliers, for sure. But if you're getting your news more from people who you know, you're getting more of your news from people who are like you.
  669.  
  670. OK, let's go back to the questions. You're wearing a lime green tie. You're looking at your tie. That's you. Sorry that was a terrible cue to give you. You gotta look down.
  671.  
  672. My name is Andy. So I have a question about confirmation bias-- I really don't know the answer-- which is: do you think, now or before the internet, it was harder to avoid-- whatever your ideology-- information that would challenge your beliefs, whether it be through the nightly news or an encyclopedia or the newspapers? Whereas now-- you mentioned Wikipedia, and there might be an analog to the encyclopedia of old. But now you could just go to Conservapedia and find out that, wow, Paul Revere actually-- the ride was consistent with what everyone described it as, right? And so it's a lot easier to find that information that's going to confirm your biases, and that would contribute to the closing of the mind.
  673.  
  674. Evgeny kind of addressed that point when he-- I think you said, Evgeny, earlier in your opening statement, that initially the theory was the internet gave us tools to do stuff that we were already doing, but that the other side is arguing that without our knowing it now, the internet is making us do that. And we're unaware of it. So do you feel that your question was answered? Well, yeah, it was more the idea of seeking out information that is going to keep feeding your own biases, whereas you're not exposing yourself-- you're not going to be exposed as much as you would before-- to stuff that's [? going to challenge that. ?]
  675.  
  676. How is it different from 30 years ago that if you could go down to the library and take out all the crackpot books that you wanted to?
  677.  
  678. Well, I mean, that's the question, was that it's available.
  679.  
  680. Do you want to answer the question? Let's let Jacob Weisberg take on--
  681.  
  682. You know, Daniel Patrick Moynihan used to always say that everyone's entitled to their own opinion, but not their own facts. And I think we all agree that in a democracy, you need an informational commons to some extent, where people are arguing about what they think about reality, not about reality itself. But I think Wikipedia is a pretty great informational commons. And it is, again, the first search return on almost everything, or at least one of the first few.
  683.  
  684. And Wikipedia has its issues and problems. But overall, Wikipedia strives to be neutral ground on an informational level. I mean, that's an anecdote. But I think it's an important one. I think the internet cuts both ways. And in some ways it can cut in favor of the confirmation bias that you're talking about. And in some ways it can cut right against it.
  685.  
  686. I think we also need to--
  687.  
  688. Evgeny, let me let the other side come back to that.
  689.  
  690. Eli Pariser, who's arguing the internet is closing our minds.
  691.  
  692. Well, just to go from anecdote to the study: one of the findings of the Feuz-- Martin Feuz-- study was that the more personalized Google search results were, the further down Wikipedia was in the search results. So actually, if you happen to have an empty search history, Wikipedia is the top result. But the larger your search history, the less Wikipedia comes up, and--
  693.  
  694. [INTERPOSING VOICES]
  695.  
  696. --the more it's replaced by-- so this was just-- this was the best study that we have on Google personalization. And it suggests-- I agree that Wikipedia's great.
  697.  
  698. Do you see the logic? Why would Google deliberately want to demote Wikipedia in search results?
  699.  
  700. Totally. I see the logic. I just think that the logic-- you know, they have the means and the motive to provide the links that people will click on the most. That's not the means and the motive to provide the links that are the most useful for people.
  701.  
  702. It's because of that "bursts of pleasure" thing that you talked about in the people's brains.
  703.  
  704. So you think people would walk away from something if Wikipedia comes first.
  705.  
  706. Evgeny, I just want to let him finish the point. Is it because of that "burst of pleasure" principle that you talked about? That in fact, they've actually studied it, and we feel good, and we stick around if we get results that we like.
  707.  
  708. Yeah, I think that's one of the driving--
  709.  
  710. So you never got the chance to respond to that point and I'd like to hear it.
  711.  
  712. Google places no ads on Wikipedia. Google doesn't care what you click on. All they care is that you stay on Google, right? They don't really care whether you go to Wikipedia or Conservapedia or somewhere else, right? And I think right now what Google actually wants to do is to present you the result right away, without you actually having to spend any time on it.
  713.  
  714. [INTERPOSING VOICES]
  715.  
  716. So again, I just don't see the logic on which you're basing that decision. But to respond to the question you asked, I think we need to set realistic goals for ourselves. We're not going to dissuade people who deliberately decide to go to Conservapedia instead of Wikipedia. There is nothing policy-wise that we can do. If people already decided to seek those very biased views, there's very little that we can do.
  717.  
  718. I think we need not to be utopian. We have to set realistic policy goals, and from a public-policy perspective, figure out what we can do-- change design, regulate Google, ask them to do something. But do you expect that Google can suddenly wave their magic wand to make sure that people stop going to foxnews.com or Conservapedia and start going to Wikipedia instead? I think it's an unrealistic goal that you shouldn't even strive for.
  719.  
  720. Siva.
  721.  
  722. Again, thank you for making our point. That's basically what we're saying-- that Google is in the business of making sure that if you're looking for a conservative opinion about something, you're going to get more conservative opinions about things. I'm sorry. If you've expressed a desire to click on conservative things, you're going to get more conservative things.
  723.  
  724. There is no evidence that Google does any personalization in politics.
  725.  
  726. Don't expect Google to magically change--
  727.  
  728. [INTERPOSING VOICES]
  729.  
  730. Evgeny, hold. Let Siva finish.
  731.  
  732. --don't expect Google to change magically. We don't ask Google to change magically. What we ask--
  733.  
  734. It doesn't need to change. It doesn't do any of what you are saying.
  735.  
  736. What we ask is that we pay attention closely to what Google itself says about what it does, as is well documented. And we pay attention to the totality of our information ecosystem and the strong role that Google plays in it.
  737.  
  738. Evgeny Morozov.
  739.  
  740. It's all fine to be paying attention. I'm all for paying attention to Google. It's just that Google itself goes on the record saying that they do not want to personalize politics. All they want to do is personalize pizza. I think from a business perspective, I see no reason why Google would want to serve you biased search results, and show Conservapedia before Wikipedia.
  741.  
  742. It doesn't do that explicitly.
  743.  
  744. It's a political bombshell, if you can produce a study that will prove that.
  745.  
  746. It's a [INAUDIBLE] battle for Google.
  747.  
  748. It does so mathematically, not politically. It doesn't distinguish between a political site and a pizza site, because it doesn't do that sort of content [INAUDIBLE].
  749.  
  750. Yes, it does. Because when you search for GOP, it knows you're searching for GOP and not pizza.
  751.  
  752. It doesn't read what GOP is. It associates it with all the other GOP clicks. It doesn't flag it as political. It just associates it mathematically with all the other behavior going on on the Web.
  753.  
  754. So GOP could be pepperoni as far as the algorithm is concerned.
  755.  
  756. Great Old Pepperoni, yeah.
  757.  
  758. All right. I want to remind you, we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have four debaters-- two teams of two, debating this motion. When it comes to politics, the internet is closing our minds.
  759.  
  760. And sir, right in the center there. Yes. I'm sorry. I meant a few rows back. I'll try to come back to you. My apologies. That was an ambiguous signal. And I didn't mean you either. But you got the mic. You win.
  761.  
  762. Hi, my name's John. So I don't really hear anyone saying that information moving quickly, and lots of information, is closing our minds. I hear that not the internet but internet companies, or even personalization, is closing our minds. So my question for the team arguing for the motion is: would something that wasn't corporate-- like a public search engine alternative to Google that's nonprofit and not commercial-- solve some of the problems? And then my question for the "against the motion" team would be: would you be opposed to something like that as a policy decision, because you love personalization so much?
  763.  
  764. And I'm assuming that this public service search engine would not be doing personalization.
  765.  
  766. Well, perhaps it-- I mean there could be a social aspect like town halls or sort of public spaces. You could have internet public spaces that are social also--
  767.  
  768. So the answer is, perhaps. That's a really intriguing possibility-- a public search facility, or a public search engine, or some sort of noncorporate entity that would help guide us through certain issues, right? That's a nice idea, and I think it's worth pursuing and experimenting with.
  769.  
  770. I'm not willing to predict the results. I think the important thing is that we try everything. The important thing is that we let a few things fail, and we try to learn from our mistakes. But we do so openly. We do so deliberately. And we don't sign away so many important things to a black box.
  771.  
  772. Jacob Weisberg, your point.
  773.  
  774. I was going to say to the Google skeptics, have you tried Bing? There are alternative search engines. But I don't think it's something the government needs to do. I think the idea of a nonprofit search engine is not a bad one.
  775.  
  776. But I don't think it would fail to do personalization, because I think a search engine that wasn't capable of personalization would be tremendously ineffective and disappointing. And nobody would use it if it didn't know where you were, if it didn't gather any information about what kinds of things you are interested in.
  777.  
  778. I don't disagree. I was just talking about transparency.
  779.  
  780. I want to go to another question. But I want to confess that I would love to hear a female voice or two more in the mix tonight. And I may be-- something like every hand went down.
  781.  
  782. All right. I tried. Sir, right here. I'm sorry. I didn't see you. Let the mic come. Stand up. Let us know your name. And justify my choosing you.
  783.  
  784. Hi, I'm Kate. I was recently reading an article by Ethan Zuckerman, I think. Yes, OK, some nods yes. And for those who don't know, it's an article about a theory known as cute cat theory, which they basically applied to the Arab Spring, saying that the majority of the people on the internet might want to watch cute cat videos-- that's what they use it for. There is a small minority who use it for internet activism. Now, a site like YouTube-- the majority of people want to watch cute cat videos-- but it also posts things like ac--
  785.  
  786. I need you to target in on a question.
  787.  
  788. OK. Anyway, as far as-- maybe you guys can explain the cute cat theory a little better than I can. But cute cat theory-- I'm interested to know-- I think that the cute cat theory opens up the possibility for neighbors, when these websites get shut down, to talk about political issues. So for those of you arguing against the motion, I would be interested to hear your thoughts.
  789.  
  790. For the people who have to edit this for television--
  791.  
  792. Yes.
  793.  
  794. --could just restate the question--
  795.  
  796. I hate cats.
  797.  
  798. No, no. I think this is actually a great question and I like the idea that they'll tell you what cute cat theory is. But if you could ask them about cute cat theory and use that phrase in your question.
  799.  
  800. I'm sorry. I'm a little nervous. So I'm rambling.
  801.  
  802. Sure, go ahead.
  803.  
  804. OK, my question basically is about cute cat theory, and whether those who watch cute cat videos-- when a site like YouTube gets shut down-- are more likely to then go find out why YouTube got shut down, thus turning them into internet activists.
  805.  
  806. Eli Pariser, tell us what cute cat theory is and whether it's relevant.
  807.  
  808. I'm not-- I don't feel well versed in cute cat theory, unfortunately.
  809.  
  810. Who's good on cute cats?
  811.  
  812. Siva actually got tenure for his work on cute cat theory.
  813.  
  814. I'm the cat person on this panel, I think.
  815.  
  816. All right, if I understand, I think this is a question about whether people will rally to support these online media that they depend on when they're under threat.
  817.  
  818. So it basically proves my "cats is the opiate of the masses" theory. You sit at home. You watch videos of cats. Internet gets shut down. You go and overthrow Mubarak. If you think it's a good theory, good luck.
  819.  
  820. We actually know that it's really hard to watch YouTube in China right now. And for a while, it wasn't so hard. First of all, YouTube comes over really slowly in the People's Republic because its servers are hosted in North America.
  821.  
  822. Google knows not to put the servers in China. That would be bad news. During the Tibetan uprisings three years ago, YouTube basically went dark. During the Olympics it went dark. And it's been intermittent since. And nobody has banged on the doors of the Chinese government saying I want my YouTube.
  823.  
  824. They're busy with their own video services. They're busy with their own business. They're busy clicking on their own issues, right? So it doesn't necessarily follow in that example. The other thing to remember, when talking about the Arab Spring-- going to a different part of the world-- is that only 30% of the Egyptian population was online at the time. And then 11 days into the protest, 0% were. SMS was a much more important media form for doing anything and Al Jazeera was probably the most important media form for informing people. So what happened in North Africa had very little to do with cats, cute or otherwise.
  825.  
  826. OK, ma'am, right down in the front. In the third row. Sorry, right behind you, sir.
  827.  
  828. You guys have talked a lot about--
  829.  
  830. Can you tell us your name, please.
  831.  
Debbie. You guys talked a lot about personalization of the internet and why that's bad. But I'm asking those who are for the motion-- what about people who go and seek out information because they want to comment on articles and videos that are against their own beliefs?
  833.  
  834. So again--
  835.  
  836. Eli Pariser.
  837.  
It's not that it's impossible or even that hard to find information that you disagree with online. There's a great cartoon of someone staying up late at night. And a person tries to get them to come to bed, and they say, "But someone on the internet is wrong."
  839.  
  840. That's like the experience that a lot of us have a lot of the time. But the question is, sort of, whether the daily routines that we move through online will tend to bring us into contact more with people who make us feel that way, or more with people who actually validate how we feel.
  841.  
  842. And as social networking becomes a bigger phenomenon and, I think, in specific, Facebook-- let's separate Facebook from Twitter-- it's easier than ever to surround yourself with people who mostly agree with you. That's who your friends are. Now just to address Twitter in particular. I agree with these guys that Twitter's like a pretty good medium for filtering the internet and I use it to-- you know, I follow Karl Rove on Twitter. And I get Karl Rove's tweets. And that's great. And I also follow him on Facebook. And I never see his Facebook posts, because Facebook actually does a lot more personalization than Twitter does. Now--
  843.  
  844. So is that a relief to you? Or--
  845.  
  846. Yes. Well, Twitter's a relief.
  847.  
  848. --do you want more Karl Rove?
  849.  
  850. Twitter is great. But Twitter is not how most people use the internet.
  851.  
  852. No, no. I mean, the lack of Karl Rove's presence on your Facebook feed. Do you want more?
  853.  
  854. The fact that he writes boring messages on Facebook, you know, and Facebook doesn't find them interesting--
  855.  
  856. It's all the same. It's just [INAUDIBLE].
  857.  
  858. --doesn't necessarily mean that personalization is so bad.
  859.  
  860. They're all [INAUDIBLE].
  861.  
But there are a lot of caveats [INAUDIBLE] where you could say, oh, Twitter is great, but it doesn't count because it's not big enough. I mean, just to go back to the numbers for a minute. Twitter claims more than 200 million registered users and more than 100 million active users. Facebook claims 800 million registered users and 400 million active users. I actually think Facebook's definition of an active user is much more generous to itself than Twitter's. So maybe Facebook has four times the users-- but Twitter's not insignificant.
  863.  
  864. Americans spend seven hours a month on Facebook. They spend minutes on Twitter. Like that's just if you look at the comScore numbers. I mean, it's just nowhere near as powerful a medium.
  865.  
As I said, people who use Facebook frequently see all the most recent messages. There is no personalization. If you use it, not seven hours a day, but just one day a month, then there is personalization. And then there is no justification for me to see every single Karl Rove message. Because you only have 30 minutes. And you have 5,000 friends. And you have one hour a month to spend on it. There is no way for you to do it, except for personalization.
  867.  
  868. Right, no, I agree with that. I'm just saying a fact of that personalization--
  869.  
  870. For people who spend seven hours on Facebook every day, there is no personalization.
  871.  
  872. No, that's not correct.
  873.  
  874. There is no personalization.
  875.  
  876. EdgeRank only kicks in. I went on [INAUDIBLE].
  877.  
  878. Evgeny's right about that.
  879.  
  880. No, it's not the truth.
  881.  
  882. There's no personalization whatsoever.
  883.  
  884. EdgeRank only--
  885.  
  886. Most recent items come first.
  887.  
All right. We're at an impasse. And that ends round two of this Intelligence Squared US debate, where our motion is, when it comes to politics, the internet is closing our minds. And here's where we are. We are about to hear closing statements from each debater. They will be two minutes each.
  889.  
  890. Remember how you voted before the debate, because this is their last chance to persuade you that they presented the better argument. And you'll be asked once again to vote and to pick the winner in just a few minutes.
  891.  
Onto round three-- closing statements from each debater in turn. Our motion is, when it comes to politics, the internet is closing our minds. And here to summarize his position against the motion, Jacob Weisberg, chairman and editor-in-chief of the Slate Group.
  893.  
I did like one suggestion Eli made, which is that there should be buttons other than Like on Facebook. There should be an I'm Not Buying It button. And I would have been clicking that the whole time I was hearing this argument. The internet doesn't change human nature. It just creates new opportunities for us.
  895.  
  896. And if your view of human nature is that people are naturally inclined to be ignorant and bigoted and extreme, you're going to focus on the way the Web lets them become even narrower and more close-minded. But if that's your view of things, your problem isn't with the Web, it's with democracy, because people like that aren't going to do a very good job of governing themselves.
  897.  
  898. If on the other hand, you think people are capable of informing and governing themselves, you have to appreciate the way that the Web, which is the greatest trove of open information that the world has ever known, empowers us and broadens our political perspective. Evgeny and I have talked about some of those ways tonight. And I think Siva and Eli have talked about a theoretical problem that just hasn't been supported by the evidence or borne out.
  899.  
Does anyone here actually think that our political system would be better off without the Web's democratization of information and the multiplicity of voices-- if Tim Berners-Lee had never invented the World Wide Web? Do you think we would be better off if the Web or Google or Facebook and Twitter and blogs were somehow magically taken away or regulated in some paternalistic way?
  901.  
And I want to ask you again to think about your personal experience, because it may not be an academic study, but collectively it's meaningful. Do you think the internet is making you more narrow? Or is it exposing you to ideas, people, arguments, and points of view that you wouldn't have access to without it? In closing, I just want to say the idea that you and I would be less narrow in our politics without the Web isn't just wrong, it's actually preposterous. That's why you have to vote against them.
  903.  
Thank you, Jacob Weisberg. Our motion is, when it comes to politics, the internet is closing our minds. And here to summarize his position for the motion, Eli Pariser, MoveOn.org board member and author of "The Filter Bubble."
  905.  
  906. So I want to read you a quote from two Stanford researchers from 1997. They said, we expect that advertising-funded search engines will be inherently biased towards the advertisers and away from the needs of consumers. We believe that the issue of advertising causes enough mixed incentives, that it's crucial to have a competitive search engine that is transparent and in the academic realm. Now, the two researchers were Sergey Brin and Larry Page. And this was just months before they launched Google as a for-profit entity.
  907.  
  908. I think that version of them was right. I think that the companies that increasingly control where and how we put information online have mixed motives. And the motives that they use in creating these filters may not be, and in fact aren't, in our best interest. They're more likely to surround us with voices that tell us that we're great and we're right and we're good enough and strong enough and they're less likely to confront us with the places where we're wrong.
  909.  
  910. You know what's been interesting for me since writing the book, is that I've actually been invited to come talk with engineers at all of these companies-- Amazon, Apple, Google, YouTube, and Facebook. We've had conversations with people in each of those places. And it's sort of funny for me to hear people try to suggest that this isn't a big part of what these companies are trying to do. The engineers know that it is. They wrestle with this every day.
  911.  
  912. They wrestle with the mixed motives, with these questions of how they should be building these platforms. And they know, as one Netflix vice president told me, that they can easily end up trapping people in these bubbles where they tend to believe the same thing.
  913.  
So if you believe that the companies whose algorithms decide what we pay attention to will tend to expose us to a broad, diverse set of sources, then you should vote with them. But if you agree with me that we should scrutinize that, and that these commercial interests will tend to use that power to placate people rather than exposing them to broader points of view, then you should vote that the internet has, in fact, unfortunately been closing our minds. Thank you.
  915.  
This is our motion. When it comes to politics, the internet is closing our minds. And here to summarize his position against the motion, Evgeny Morozov, author of "The Net Delusion," visiting scholar at Stanford and Schwartz fellow at the New America Foundation.
  917.  
  918. Well, I wish that Don Draper had a chance to respond to that 1997 paper from Google. Again, you may think that advertising is evil. But again, advertising is the kind of evil that's also inevitable. So Eli has mostly avoided the "what's to be done" question. In his book, he actually is much more, I think, straightforward. And he does want Google and Facebook to intervene and to expose us to more diverse information-- more diverse information diet-- than we would normally opt for.
  919.  
The second [? camp ?] is a Utopian dream that is not realizable in practice. Do we really want Facebook and Google to start nudging us to pay more attention to Joseph Kony in Uganda when we are searching for information about our local city council? OK. But then why pay more attention to Joseph Kony and, say, not to climate change or Syria?
  921.  
Who will adjudicate here? Do you really think that Silicon Valley is capable of this? Do you really think that Silicon Valley should be in this business? Again, you have to think about the proposed solutions. Often they're worse than the disease.
  923.  
  924. Don't forget that in 1995, in that article in "The Nation," Andrew Shapiro proposed that the government should re-nationalize the internet-- right, to bring it back to avoid all that advertising evil that you've just alluded to. Cass Sunstein, in his book, wanted to force, by government, bloggers to link to their ideological opponents. He wanted them-- conservatives-- to link to liberals and liberals to link to conservatives.
  925.  
  926. All of those ideas now seem very ridiculous. And that's what I think we will think about that idea that Google and Facebook should be in the business of actually preserving us and presenting us with a more diverse information diet. Information is not like food. Politics is not like food.
  927.  
Diversity here is a very political matter that will not be settled easily. Ideological conflict here is inevitable. And I think this is why you should vote against this proposition. The sooner we do it, the sooner we can start tackling more important tasks like privacy.
  929.  
  930. Thank you, Evgeny Morozov.
  931.  
Our motion is, when it comes to politics, the internet is closing our minds. And here to summarize his position for the motion, Siva Vaidhyanathan, chair of the University of Virginia's Department of Media Studies and author of "The Googlization of Everything."
  933.  
Evgeny has just done a great job of convincing me that there is no virtue in arguments made by people who are not me, not Eli, and not made during this debate. We have not advocated a government takeover of anything. We have not advocated any Cass Sunstein-type intervention. We merely want you to be aware of the problem and correct for it. Jacob said, what if we had a world without the Web? What if we had the world without the Web? We might get there.
  935.  
Let me tell you why. Because Facebook and Google and Microsoft and Apple all wish to be the operating systems of our lives. They are explicit about this. They don't just want to be on the Web, because the Web is 20 years old and it's actually kind of creaky. What they want to do is be there with you all the time, in your glasses, in your pocket, in your purse, and on your mind always. They want to be your personal assistant.
  937.  
  938. There's quote after quote after quote from every CEO of every one of these companies that that's what they want. And it might make things really cool for us. But it's not going to make things rich and diverse. It's not going to be the wonderful conversation that we could have had on the Web if we hadn't instigated these gated communities, these operating systems of our lives.
  939.  
  940. We're on the way to having a Balkanized world and a Balkanized society because of these gated communities. The fact is, nothing is determined here. We can decide that we like what the Web was supposed to be. We can opt out of certain of these practices.
  941.  
  942. We should encourage diversity, encourage other platforms. We should encourage experimentation. We should recognize that Facebook and Google and Yahoo and Bing and Apple and Microsoft are designed to gratify us immediately. And that's great for pizza. But it's not for politics. Thank you very much. Please vote for the resolution.
  943.  
  944. Thank you. And that concludes our closing statements. And now it is time to learn which side you believe has argued best. We're going to ask you again to go to those keypads at your seat, that will register your vote on which side you feel presented the stronger argument tonight. And we will get the readout pretty much immediately. Press one if you support the motion of when it comes to politics, the internet is closing our minds.
  945.  
  946. Press number two if you are against this motion. And press number three if you remain or became undecided. And you can correct your vote if you think you pressed the wrong button. It'll be locked in right away.
  947.  
And we'll have the results in about 90 seconds. So first, I want to ask for a round of applause for our debaters for the quality of the arguments they brought here tonight. It was witty and it was informative and it went places. Thank you very much to them. Thank you also to everyone in the audience who had the guts to stand up and ask a question. And to all the people who didn't get to ask questions-- whom I wasn't able to call on-- thank you very much for your contribution as well.
  949.  
  950. So going forward, I just want to let you know that our next debate is in May. It's our last debate of the season and our motion comes down to three words. Ban college football.
  951.  
  952. And we had actually set this debate up quite a while back. It was not in relation to the Penn State scandal. It was actually motivated more by reporting that was done about the business side of college football and the sometimes harsh nature of the sport on the bodies of the athletes who play it.
  953.  
So our debaters will include somebody who wrote one of those articles, and Malcolm Gladwell, who is the author of "The Tipping Point" and "Blink." And he will be on the panel arguing for the motion. And he wrote the piece in The New Yorker in which he compares college football to dogfighting. His partner will be Buzz Bissinger. He is the Pulitzer Prize-winning journalist and author of "Friday Night Lights." And that served as the inspiration for a movie of the same name and a critically acclaimed television series, also of the same name.
  955.  
Opposing them we have a football player, Tim Green. He is a former Atlanta Falcons NFL defensive end and Syracuse University All-American. He was inducted into the College Football Hall of Fame in 2002. And his partner, also debating against the motion, Jason Whitlock. He is a foxsports.com national columnist, and he lettered as an offensive tackle at Ball State University.
  957.  
  958. So that's our last debate. That's on May 17. And we would love to see you there. Also something we're trying-- something new tonight after the debate. Any of you who want to join us, we are partnering with The New American Tavern to host a post-debate reception. That will be upstairs at Amity Hall. That's a block away from here on Third Street, between Thompson and Sullivan. Details are in the programs that you have.
  959.  
  960. But our idea was, we notice a lot of times people walk out of here still kind of energized and revisiting the issues and debating among yourselves. We wanted to give you a place to do it and get together, and to share your ideas across the partisan divide. So that'll be at Amity Hall. $5 beer and well drinks. And I will be there for a bit as well, along with other members of the Intelligence Squared staff.
  961.  
And finally, you are encouraged to tweet about this debate. Our Twitter handle is @iq2us and the hashtag is #iq2us. So we'll just wait about-- oh, it's ready already. All right. So the results are all in.
  963.  
Our motion is this. When it comes to politics, the internet is closing our minds. You've heard the arguments for and against over the course of this debate. We asked you to vote before, and again afterwards, where you stand on this and how persuasive the teams were. Here are the results. Before the debate, 28% voted for the motion, 37% against, and 35% were undecided. After the debate, 53% are for the motion-- that is up 25%. 36% are against-- that is down 1%. 11% are undecided.
  965.  
  966. The motion, when it comes to politics, the internet is closing our minds has carried. Our congratulations to that team.
  967.  
  968. Thank you from me, John Donvan and Intelligence Squared US.
  969.  
  970. [MUSIC PLAYING]