CHRIS PELICANO: Good afternoon, everyone. My name is Chris Pelicano. I'm the security engineering manager here at Google. Very briefly, I will introduce our guest this afternoon. He's a blogger. Most of you know him from schneier.com. He's an author of many works, including "Liars and Outliers," which he'll be talking about today. Ladies and gentlemen, Bruce Schneier.

[APPLAUSE]

BRUCE SCHNEIER: If you put that near me, there's feedback and it's all bad. Hi, thanks. I'm actually not going to talk about my book, because I figure if you want to hear about it, you can read it. I'd rather talk about stuff that I've been thinking about since then. This is very much ideas in progress, which makes it good for a talk here. I'm always interested in feedback and comments, and there will be time for that.
What I want to talk about is security and power, because I think that is a lot of what's interesting right now, and going on right now. So basically, technologies are disruptive. They disrupt society by disrupting power balances. And you can look at the history of the plow, or the stirrup, or gunpowder, the printing press, telegraph, radio, airplane, container shipping, disease-resistant and drought-resistant wheat, and see how those technologies changed the balance of power. And there's a lot written about this -- written as history. Harder is doing this in the present -- which is really what I'm thinking about -- on the internet.

The internet is incredibly disruptive. We've seen entire industries disappear. We've seen entire industries created. We've seen industries upended. We've seen the computer industry, itself, upended several times. Government has changed a lot.
We see governments losing power as citizens organize. We're seeing political movements become easier. We're seeing totalitarian states use power. Really, the Obama campaign was revolutionary in how they used the internet to organize and engage people. You could look at how technology has changed the media, ranging from the 24-hour news cycle, to bloggers, and citizen journalism, and two-way communications, and the acute explosion of media sources. Social power -- there's a lot here -- personal publishing, the internet, email. Criminal power -- certain crimes becoming easier -- identity theft, which is really impersonation fraud done to scale, and how the internet has changed things. And I think about how this affects computer security -- which is basically what I do -- and then, how that affects the rest of the world.
So traditionally, computer security has had the model of: the user takes care of it. That's been the traditional model. It's actually a very strange model. We are selling products that aren't secure, aren't any good, and expect the user to make them good. I think of it as an automobile manufacturer, when they sell you a car, saying: that car doesn't come with brakes. But brakes are really important, and we think you should have them. There are some good aftermarket dealers. But you should get some brakes installed pretty quickly, maybe on the drive home. It's a much safer car that way.

In a lot of ways, that's what we would do with anti-virus, with firewalls. We would sell these products and expect the user to do it themselves, to have some level of expertise necessary to secure their environment. There's a lot of reasons why we did this. It's the speed of our industry. It's the youth of our industry. But it was the norm. That model is breaking. That model is becoming less the norm. It's changing, not because we've realized there are better ways to do security, but because of how computers and the net are working today.
There are two trends that, I think, change this model. The first is cloud computing. Now on the one hand, cloud computing isn't anything new. In the '60s, we called it time sharing. In the '80s, we called it client-server. In the '90s -- I had a company -- we called it managed security, or managed services. It's, fundamentally, a balance between the cost of computation and the cost of data transport. In the '60s, computation's very expensive, so it makes sense to centralize computers in their own rooms with their own air conditioning and give people badges. In the '80s, what becomes expensive is large storage, so you end up with a client-server model. In the '90s, it's more services. Right now, the cost of computing is really dropping towards free. The cost of transport is dropping towards free. So what makes sense economically is to put your computers on the places on the planet where they can be run the most cheaply, and access them from wherever you are. That seems to be the endgame. There's nothing cheaper than free. The times where you see computation pushed to the edges are places where you have relatively low bandwidth -- maybe mobile applications -- or a relatively high need for local computation, like gaming. But even those are becoming more of a cloud model. So that's the first trend.
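One way to make the economics concrete is a toy cost model -- a sketch with entirely made-up prices, not anything from the talk: a workload runs wherever compute-plus-transport is cheapest, and as both prices drop toward free, the cheap centralized datacenter wins for almost everything.

```python
# Toy model of the compute-vs-transport trade-off.
# All prices here are hypothetical, purely for illustration.

def total_cost(compute_units, data_gb, compute_price, transport_price):
    """Cost of running a workload at one location."""
    return compute_units * compute_price + data_gb * transport_price

compute_units, data_gb = 1000, 50  # an arbitrary example workload

# Local: no data transport, but expensive computation (you own the hardware).
local = total_cost(compute_units, data_gb, compute_price=0.10, transport_price=0.00)

# Cloud: cheap computation at scale, but you pay to move the data.
cloud = total_cost(compute_units, data_gb, compute_price=0.01, transport_price=0.09)

print(f"local: ${local:.2f}  cloud: ${cloud:.2f}")
# As transport_price falls toward zero, the cloud wins for almost any
# workload; the exceptions are low-bandwidth links (mobile) and
# latency-bound local computation (gaming), as the talk notes.
```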
The second trend is locked-down endpoints. And I think this is more of a trend in businesses than in technology. But nowadays, the computing platforms we buy, we have much less control over. I have an iPhone. I can't clear my cookies on an iPhone. I can't get a program that does that. I can't even get a program that erases files, because I don't have direct control over the memory map. There's weird things going on in your system.

FEMALE SPEAKER: I know. I'm trying to deal with it.

BRUCE SCHNEIER: Wow. She's solving it on her laptop. OK. All right, go.

MALE SPEAKER: Do you want to borrow this?
BRUCE SCHNEIER: So these end user devices, whether they are tablets, or phones, or Kindles, the user has much less control over. On a Kindle, updates are downloaded automatically and I can't even say yes. At least, on the iPhone, I can say yes or no. But I still don't have anywhere near the control I have on my OSs. OSs are moving in that direction as well. Both Windows 8 and Mountain Lion are moving in the direction of these mobile platforms, to give the user less control. And I think this is just purely economics. The companies have realized that the more they can control the supply chain, the better they'll do. So whether it's Apple, with their Apple store -- however the system works, you're just better off if you can control as much of the environment as possible.
So this brings us to a new model of security. And the model is: someone else takes care of it. The model is it just happens automatically, by magic. This happens on my Gmail account. I have no control over Gmail security. I have to simply trust that Google does it. I have no control over my pictures on Flickr, or my Facebook account, any of that stuff. And I have less and less control over the devices where I view these things. So users have to trust vendors to a degree we hadn't before.

There are a lot of good reasons why we do this. All the reasons why these models make sense -- convenience, redundancy, automation, the ability to share things. And the trust can be surprisingly complete. We're living in a world where Facebook mediates all of our friend interactions. Already, Google knows more about my interests than my wife does, which is a little bit freaky. Google knows what kind of porn every American likes, which is really freaky. But it's a trade-off. It's a trade-off we actually do pretty willingly. We give up some control in exchange for an environment that works well for us. And we trust that the vendors will treat us well, and protect us from harm.
On the other hand, we're running out of other options. For most everybody, there aren't any real, viable alternatives. I run Eudora, but I'm, increasingly, a freak. My mother has a way better time on her computer since she got an Apple and has Apple handling every part of her computing environment. She loses a phone, she gets a new one. It just works great. And most of us can't do it ourselves. This is becoming more and more complex. I can't offer advice to tell people to run their own mail servers. That didn't make sense 20 years ago. It really doesn't make sense now. And you can't run your own Facebook.
So the model I think of when I think of this type of computing environment is feudal security -- and that's "feudal" with a "d," not with a "t." It's that we, as users, have to pledge our allegiance to some powerful company who, in turn, promises to protect us. And I like it as a metaphor both because there's a real, rich historical metaphor, and because everyone's watching "Game of Thrones." So you can pull from both sources. And if you go back to classic, medieval feudalism, it was a system designed for a dangerous environment where you needed someone more powerful than you to protect you. It was a series of hierarchical relationships. There were obligations in both directions. It was actually a pretty complex political system. And I see more of it permeating the environment that we work in today.
It has its advantages. For most people, the cloud providers are better at security than they are. Automatic cloud backup is fantastic. Automatic updates are fantastic. All these things are good. So feudal security provides this level of security that most everybody is below. So it will raise them up to whatever level the providers are providing. For those up here, it lowers them down. And where you see barriers to people adopting this are things like the banks, who, naturally, have a higher level of security, and don't want to go down. I assume that at some point, we are going to see a business model of a high-security cloud vendor -- whether it's a Dropbox or an email service -- just something for some of these more high-assurance users.
We also have the problem of regulation. A lot of companies have auditing and reporting requirements. And if you go to Dropbox and say, we're using you for our company, we need to audit your system, they will say, go away -- or to Rackspace. I assume we're going to see some water flow auditing model, where the Rackspace audit flows down to whatever service works on top of that, which flows down to whatever company now uses that service. Because I think we have to solve the regulatory barriers, here.
Feudal security has risks. The vendors are going to act in their self-interest. You hope that their self-interest dovetails with your self-interest, but that's not always the case. It's much less the case when you're not paying for the service, when, in fact, you are a user, not a customer. As we see, vendors will make side deals with the government. And the legal regime is different if the data is on your premises than if it's on their premises. Vendors can act arbitrarily. Vendors can make mistakes. And vendors have an incentive to keep users tied to themselves. You guys are an exception by allowing users to take their data and leave. Most companies don't do that. Because tying the data to the company increases lock-in, increases the value of the company.
So this model is inherently based on trust. It's inherently based on the companies -- the feudal lords -- convincing the users to trust them with their data, their photos, their friends -- with everything. And unfortunately, the business model for a lot of these companies is basically betraying that trust for profit. And that is, depending on which company, more or less transparent, more or less salient. A lot of effort does go into hiding that fact, to pretending it's not true. And as it turned out, these companies have a side business betraying the trust to the government, too. So there is a little bit, or in some cases, a lot, of deceit that this is all based on. And I do worry about how long that can sustain. Some of it seems to be able to be sustained indefinitely. For others, I'm not so sure.
The feudal model is also inherently based on power. And that's what I'm thinking is interesting, right now. And it does dovetail very nicely with the current alignment of power on the internet -- the rise of the controlled endpoints, and the third party holding your data -- those two different poles.
So I started the talk by mentioning the internet changing power. And if you look back at the history of the internet, a lot of us thought that it would flow in a certain direction. The internet was really designed in the way that made the most technical sense. There wasn't a lot of agenda placed on the net, as it was first designed. And if you look back at the literature around that time, you read about the natural laws of the internet, that the internet works a certain way because it's like gravity. It's just the way it has to work. It's the way that makes sense. And a lot of us thought this was inevitable, this was the way the world had to work. I have two quotes.
One is from John Perry Barlow. In 1996, he's addressing the World Economic Forum. And he has something called "A Declaration of the Independence of Cyberspace," which is a great document to read. And he's telling governments things like, "You have no moral right to rule us, nor do you possess any methods of enforcement we have reason to fear." Three years earlier, John Gilmore writes that, "The internet interprets censorship as damage, and routes around it." These are very utopian quotes, but we all believed them back then. We believed that is how the internet works, that the internet takes the masses and makes them powerful, takes the governments and makes them powerless. It turns out, that's just not true. That's not the way it works.
What the internet does, like many technologies, is magnify power. It magnifies power, in general. And what happened is, when the powerless discovered the internet, suddenly they had power. The hackers, the dissidents, the criminals, the disenfranchised -- as those marginal groups discovered the net, suddenly, they had power they didn't have before. And the change was fast, and it was stark. But when powerful interests realized the potential of the internet, they had more power to magnify. They were much slower, but their ability to use the internet to increase their power is greater. The unorganized were more nimble and quick, and the institutions were slower and more effective. And that's where we are today.
So I look around and I see four classes of internet tools of power. And what's interesting about them is they all are tools by which a totalitarian government can increase its power, but they all have viable market reasons for existing. So censorship is also content filtering, or data loss prevention. Propaganda is marketing. Surveillance is surveillance -- I guess, personal data collecting. Surveillance is the business model of the internet. Use control -- in China, programs have to be certified by the government in order to be used on computers there, which sounds an awful lot like the Apple store. I mean we laugh, but this is important. We're building tools that have very different sorts of uses depending on who's using them, and why.
And in both the government and the corporate sphere, powerful interests are gaining power with these tools. Censorship and surveillance are both on the rise. The internet censorship project, which tracks censorship around the world, finds more of it every year. We see more surveillance by governments every year, even before the United States stuff that happened two weeks ago. More personal data is being collected and correlated. More control over our hardware and software. Less purchasing, more licensing -- we saw Adobe move to that model. This is getting harder. I'm trying to find a task list productivity tool, and I can't find a good one that doesn't require me to use the cloud.
And we have corporations -- I think Facebook is one interesting example -- that are actually changing social norms. They are affecting what people think is normal, is regular, for a profit motive. I think propaganda is something we don't talk about a lot, but it's in both companies and governments. I mean we might call it viral marketing, and there are some cute names for it, but basically, it's propaganda. And we're seeing more and more of it.

And now, we're at the point where power basically controls everyone's data. Because in a lot of ways, personal data equals power, both on the government side and the corporate side. Even in non-internet businesses, the need to own the relationship, to know more about the customer, is driving a lot of data collection, and all that back-end correlation. And I worry a lot about the commingling of corporate and government interests here.
We live in a world -- I don't have to go through the details -- of ubiquitous surveillance. Basically, everything is collected. Charlie Stross has written about this as the end of pre-history, that sometime in our lifetime we're going to switch from pre-history, where only some things were saved, to actual history, where everything is saved. Now, we're in a world where most everything is saved. And what's happening now -- and I think it's something I'm not happy about, but try to understand -- is how powerful interests are trying to steer this.
I mentioned Facebook changing social norms. But we're seeing industries lobbying for laws to make their business models more profitable. So that's laws to prevent digital copying, laws to reduce privacy, laws allowing different businesses to control bandwidth. And on the government side, we're seeing international bodies trying to get rulings to make the internet easier to surveil, and to censor. I've heard this called "cyber nationalism." And last November in Dubai, there was a meeting of the ITU -- that's the International Telecommunication Union. Those are the guys that run the phone system. They're not really very tech savvy, but they are very international. They are very non-US centric. And they want to wrest control of the internet from the US. For a lot of reasons, I think this would be a disaster. But there's a strong push. And unfortunately -- I wrote this in my blog, today -- I think all the Snowden documents make their case a lot easier. Because now, when they say, well you can't trust the Americans, everyone will say, oh yeah, you're right. You can't trust the Americans.
So these things are happening now. We're seeing a large rise in the militarization of cyberspace, which will push more of the internet under government control. I very much believe we are in the middle of a cyberwar arms race. And it's heated up a little bit in the past couple of weeks. Because we've been complaining about China for the past few years. I've always assumed we've been giving as good as we're getting. And now, we're getting data that we are giving as good as we're getting, which is just going to make things worse. We're pretty sure that the cyber attack against the Saudi oil company Aramco was launched by Iran in retaliation for Stuxnet, which sounds complicated. But I don't know geopolitics. Maybe that makes sense.
And we're seeing a lot of alignment of corporate and government power. I'm pretty sure I'm quoted in "The New York Times," today, as calling Facebook "the NSA's wet dream." I'm surprised I used those words. It was probably a long interview.
So here's a way to think of it. In our country, we have two different types of law. There's constitutional law, which regulates what governments do, and there's regulatory law, which constrains what corporations do. And they're kind of separate. We're now living in a world where each group has learned to use the other's law to get around its own restrictions. If the government said, you all have to carry tracking devices 24/7, that would be unconstitutional. They could never get away with it. Yet, we all carry cell phones. If they said, you must register whenever you meet a new friend, we'd never allow it. Yet, we all go on Facebook.
And actually, I played this earlier. Two years ago, "The Onion" did a video. Just go to YouTube and type "the onion facebook cia." It's a short news video about Facebook being the new CIA program. It's hysterical. And it's two years old, which makes it kind of sad.

On the other hand, we're seeing corporations use the governments to enforce their business models. If, I don't know, the movie industry said that we're going to go into people's computers and trash them if we think they're copying files, that would be wrong. But they're going to try to get a law to do the same thing. Copyright -- there are a lot of examples where industries are bypassing their own problems by going through government.
And I think this only gets exacerbated as there's more technology. Feudal lords get more powerful. And some of that is just the natural order of bigness in our society right now. The way technology is right now, it favors the big. It doesn't favor many small. It favors two or three on top and nobody else. And it's true in geopolitics, too. Think about it. In any climate change negotiation on the planet, who do you think has more power -- Exxon or Bolivia? It's not even close. Who has more power -- Exxon or the United States? That's actually a discussion. This is weird. So that's one trajectory.
There's another trajectory. There's a counterbalancing one, based on different natural laws of technology. So in the book I'm not talking about, "Liars and Outliers," I discuss something called a security gap. And in that book, I'm talking about, effectively, the arms race between attackers and defenders, and that technology causes disruptions in that arms race, and then, there's a rebalancing. So firearms are invented, fingerprint technologies are invented. All those things upset the balance between attackers and defenders. And one of the things I point out is that, as technology advances, attackers have a natural advantage. Some of it's a basic first-mover advantage. But in general, unorganized attackers can make use of innovations faster.
So imagine someone invents the motor car. And the police say, well that's a really interesting thing. We could use one of those. So they have a group to study the automobile, and they produce an RFP, and they get bids, and they pick an automobile manufacturer, they get a car, they have a training system. Meanwhile, the burglar says, oh look, a new getaway vehicle, and can, much more quickly, use that. We saw that on the internet. And if you remember, as soon as the internet became a commercial entity, we saw a new breed of cyber criminal appear organically, out of the ground, immediately able to commit crimes, and fraud, and identity theft. All of these new things just showed up. Meanwhile, the police, who were trained on Agatha Christie novels, took, what, 10 years to figure it out. And they have figured it out. But if you were around during that time, it was really painful, as they had no idea what cyber crime was, or how it worked.
So there's this delay when a new technology appears. And that's what I think of as a security gap -- the delay between when the non-powerful can make use of the new technology -- the fast and nimble -- and when the powerful, the big and ponderous, can make use of the technology. And that gap gives attackers a natural advantage. And I'll spare you the details, but basically, that gap tends to be greater when there's more technology -- when your curve is greater. And it's greater in times of rapid technological -- actually, it's greater in times of rapid social change due to technological change. And today, we're living in a world with more technology than ever before, and a greater ramp of social change, due to technological change, than ever before. So we're seeing an ever-increasing security gap.
So this is the big question that I do not have an answer to -- who wins? Who wins, and in what circumstance? Does big, slow power beat small, nimble power? And there's going to be some David and Goliath metaphor, or Robin Hood and the sheriff. I guess I'm going to need a more medieval metaphor. But that seems like an open question that we don't know. So for example, in Syria, recently, we saw the Syrian dissidents use Facebook to organize. We saw the Syrian government use Facebook to arrest dissidents. So right now, it's kind of a mess. As this shakes out, who gets the upper hand? Right now, it seems like governments do. It seems like the ability to collect, to analyze, to employ police beats dissidents. It seems like the big corporations win. The need to have a credit card, or be on Facebook, and to do all these things to live your life is such that you can't shut them off. And they win. But it's not clear to me. It does seem clear to me that those that want to get around the systems always will be able to. But really, I'm now concerned about everyone in the middle. The nimble are here. The powerful are here. Here's the rest of us, which, I guess, is the hapless peasants.
And as the powerful get more control, I think we get largely left out of any negotiations. And you see this in arbitrary rules, in arbitrary terms of service. You see this in secret NSA spying programs, or secret overrides to rules, and power aligning with power. It's not clear to me that these actually do catch terrorists. It's pretty clear to me that they don't, actually. But they do affect the rest of us. And I think these power issues are going to affect all of the discussions we have about the future of the internet in the coming decade. Because these are actually complex issues.
We have to decide how we balance personal privacy against law enforcement. How do we balance them when we want to enforce copy protection, or prevent child pornography? We have to decide: is it acceptable for us to be judged by computer algorithms? Is it acceptable for algorithms to feed us search results? To loan us money for a house? To search us at airports? To convict us of drunk driving? How do these algorithms affect us? Do we have the right to correct data about ourselves, or to delete it? Do we want computers to forget? There's a lot of social lubricant in our society from the fact that we are a forgetting species. Do you really want -- I mean, I don't want Google Glass because I don't want my wife to be able to pull up old arguments. That seems bad.
There's a lot of power struggles. And there are bigger ones coming. Cory Doctorow writes about the coming battles having to do with 3D printing. And they're very much the same as the copyright battles. There will be powerful interests that want to stop the execution of certain data files. It was music and movies. In the future, it will be working guns. It will be the Nike swoosh. His favorite example is anatomically correct, interchangeable Barbie torsos, which I never thought of, but would freak out Mattel, probably, rightly so. Or little statues of Mickey Mouse, which will freak out a very powerful company.

RECORDED VOICE: Of Mickey Mouse.

BRUCE SCHNEIER: Who is that?
We see some incredibly destabilizing technologies coming. And this whole debate on weapons of mass destruction -- nuclear, chemical, biological -- the idea is that, as technology magnifies power, society can deal with fewer bad events. So if the average bad guy -- I'm just going to make this up -- can kill 10 people before he's captured, or rob 10 houses before he's captured, we can handle only so many robbers. But if they can now do 100 times as much damage, we now need only 1/100 of them to maintain the same security level. A lot of our security is based on having some low level of badness. But as power magnifies the amount of badness each individual can do, you suddenly start needing much more control. I'm not even convinced that that will work. But that's going to be a huge debate. And that's going to push fear buttons.
  478. are actually very complicated issues. They required meaningful
  479. debate, international cooperation, innovative
  480. solutions, which doesn't sound like I just described
  481. the US government. But we're going to
  482. have to do this. In a lot of ways, the internet
In a lot of ways, the internet is a fortuitous accident. It is a combination of lack of commercial interests, government benign neglect, some military core requirements for survivability and resilience, and computer engineers with vaguely libertarian leanings, doing what made technical sense. That was, kind of, the stew of the internet. And that stew is gone. There are policy battles going on, right now, over the future of the internet, in legislatures around the world, in international standards bodies, in international organizations. And I'm not sure how this is all going to play out. But I have some suggestions for different people.
For researchers, I want to see a lot more research into these technologies of social control -- surveillance, censorship, propaganda, and use control. And especially for you guys at Google, you're in a unique position to study propaganda. There is very little work being done on recognizing propaganda. And what I want is for my internet to come with all propaganda flagged with a little yellow box, kind of like what you do on your search pages. My paid commercial is flagged as such. I would like that to be done automatically. This seems vaguely impossible, but I think we need to start thinking about it. There is some research done around the edges. There's research done in recognizing fake Yelp reviews, recognizing fake Amazon reviews. But there are, right now, questions whether trending topics on Twitter are being gamed. So when we're losing this transparency, there's a lot of questions about the information we get. But I think we need research. Because those four things are going to become very important. And understanding how they work, and how to get around them, is going to become very important.
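To show what "flag it automatically" might even mean, here is a deliberately naive sketch: a bag-of-words Naive Bayes classifier trained on a few hand-labeled snippets, which annotates suspect text rather than removing it. Everything in it -- the labels, the example snippets, the two-class framing -- is hypothetical; as the talk says, real propaganda recognition is an open research problem.

```python
# Naive sketch of a "little yellow box" for propaganda: a tiny
# bag-of-words Naive Bayes classifier. All training data is invented,
# purely to illustrate the idea of flagging rather than censoring.
from collections import Counter
import math

train = [
    ("our glorious product is loved by absolutely everyone", "propaganda"),
    ("everyone agrees the enemy spreads nothing but lies", "propaganda"),
    ("the library update fixes two crashes on 32-bit systems", "organic"),
    ("meeting moved to thursday, bring the draft agenda", "organic"),
]

counts = {"propaganda": Counter(), "organic": Counter()}
totals = {"propaganda": 0, "organic": 0}
for text, label in train:
    for word in text.split():
        counts[label][word] += 1
        totals[label] += 1

vocab = {w for c in counts.values() for w in c}

def log_score(text, label):
    # Laplace-smoothed log-likelihood of the text under the label.
    score = math.log(0.5)  # uniform prior over the two labels
    for word in text.split():
        p = (counts[label][word] + 1) / (totals[label] + len(vocab))
        score += math.log(p)
    return score

def flag(text):
    # The "yellow box": annotate suspect text, don't remove it.
    is_prop = log_score(text, "propaganda") > log_score(text, "organic")
    return ("[SUSPECTED PROPAGANDA] " if is_prop else "") + text

print(flag("absolutely everyone agrees our glorious leader is loved"))
print(flag("the thursday meeting agenda has a new draft"))
```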
We need safe places to anonymously publish. Wikileaks was great, but now seems no more. Right now, the best thing we have is something called Strongbox, that "The New Yorker" is running. I'm in the process of trying to review that system right now. I think we need a lot more of these, all around the world.
We do need research into use limitation. I believe we're going to get legislation on, basically, copy protection for digital objects because of the 3D printers, because of bio printers, because of software-defined radio. And that's going to really hurt our industry. Because lawmakers are not going to get this right. They're going to do something draconian, and it's going to be ugly. So the better we can solve the actual problems, the less likely we are to be handed solutions that won't work, and will hurt everything else.
To vendors, I want people to remember that a lot of the technologies we build have dual use, that business and military uses are basically the same. So you see Blue Coat used to censor the Syrian internet, or Sophos used to eavesdrop on the internet, or social media enabling surveillance. On the one hand, the FBI is trying to get laws passed to have us put back doors in our communication systems. On the other hand, we don't want other countries to do the same thing. This is hard.
The policy prescriptions, I think, are harder. I think in the near term, we need to keep circumvention legal, and keep net neutrality. I think those two things give us some backstop against the powerful becoming even more powerful. Long term, fundamentally, we have to recognize we can't have it both ways, that if we want privacy, we have to want it everywhere -- in our country and abroad. If we think that surveillance is good, we have to accept it elsewhere.
Fundamentally, I want to see power levelled. Because the relationship is really unbalanced. If you think about historical feudalism, or you read about it, it eventually evolved into a more balanced government relationship. So you had feudalism, which started out as this bilateral agreement -- we're in a dangerous world, I need you to protect me, so I will pledge my allegiance to you -- and turned into something very unbalanced -- I'm powerful, I can do whatever I want, I will ignore my agreements; you're powerless, you can't do anything. And that eventually changed with the rise of the nation state, with, basically, rules that gave the feudal lords responsibilities as well as rights, culminating in something like the Magna Carta.

And I think we're going to need something like that on the internet, with the current set of powers on the internet, which will be both government and corporate -- some basic understanding that there are rights and responsibilities, some more balanced relationship. And whether that's limitations on what vendors can do with our data, or some public scrutiny of the rules by which we are judged by our data, I expect -- no time soon, but eventually -- these will come. Because I think this is how we get liberty in our internet world. And I think this is actually a very long and difficult battle. I think some of the results will upend your company. But they might not be coming for a decade or more.

So that's what I have prepared. I'm happy to take questions.

[APPLAUSE]
BRUCE SCHNEIER: So there are some rules about a microphone that are confusing to me.

AUDIENCE: You talked about a game between governments, on one hand, and corporations, on the other, using each other's power systems to essentially get at everyone in the middle. How long do you think that that game can play out? Is it indefinite? Can it continue for the foreseeable future? Or do you see some sort of turning point in which some scandal or something will so threaten the middle as to galvanize them?

BRUCE SCHNEIER: I don't know.
And I think we're very much in uncharted territory, here. We're living in a world where it's very hard for the middle to be galvanized, for lots of different reasons. A lot of people have written about this. I have trouble predicting the future, because things are changing so fast. Right now, it all seems quite dysfunctional, and there's no mechanism for things to change. But of course, that's ridiculous. Things will change. Exactly how, I don't know. And which way they'll change, I don't know. If the world is the terrorists might have nukes, and we're all going to die unless we live under totalitarianism, people are going to accept that. Because when people are scared, that's what they'll accept. And technology is to the point where that actually might be the world. But there are a lot of other ways this can play. I think this is, vaguely, the topic of my next book, so I hope to explore the different avenues we might move out of this. I haven't even begun to have an idea of which is likely to be correct. And I think I wouldn't trust me when I've decided. Just read science fiction from 20 years ago. We're really bad at predicting, not the technical future, but the social future. Everyone could predict the automobile -- that it would make people drive faster. But no one predicted the suburb. It's always the second-order social effects. And that's what this is all about. So I just don't know.
AUDIENCE: You mentioned the convergence of power, convergence of objectives, for the corporations and governments, and also, sort of the convergence of capabilities, like the Exxon Mobil comment. Do you see anything along the lines of the distinction between them vanishing? Corporations as states?
BRUCE SCHNEIER: I think they, largely, are vanishing. And this is my main complaint with libertarianism as a philosophy. In the mid-1700s, it was great, because it identified the fact that power imbalance is bad, and we need to equalize power. But it kind of got stuck there, and didn't notice that power changed. I think there is a lot of blurring. And some of it is the fact that money controls government. And powerful corporations have the money. We have seen blurring at other times in history -- the Dutch East India Company in Africa. There are different examples where corporations were de facto governments in areas where they were operating. This is not as stark. But power is changing. Power is less hard power. Power is more soft power, to use Nye's term. The nature of power is changing, such -- so I do think there is a blurring. But it's different than we thought, when we worried about this. The nature of social control is very, very different now, than it was. The nature of surveillance is very different. And it's going to change again. What is the half-life of these technologies? 10 years? Five? So what's going to happen in five to 10 years that will be completely different? I don't know.
AUDIENCE: I really liked your feudalism analogy, and I see one potential flaw in it. And I wanted your --

BRUCE SCHNEIER: Oh, good. Flaws are good. I love those.

AUDIENCE: -- thoughts about it. As I understand it, feudal lords were pretty much monopolists. Like, the Russian serfs were bound to the land, and so they didn't get a choice of which lord to be with. Whereas, people do, in fact, have a choice when there's two or three big guys, right?
BRUCE SCHNEIER: They have a choice. But is it really a choice? If all three cellphone companies are collecting the same data, and giving it to the same government under the same rules, it's not really a choice.

AUDIENCE: But if I can get a lot more customers by having a very clear privacy policy that respects you in a way the other guy doesn't, then --
BRUCE SCHNEIER: It seems not to be true. It seems you get more customers by obfuscating your privacy policy. And there are a lot of great psych experiments about this, that if you make privacy salient by showing your privacy policy, people will say, whoa, that's bad. Facebook is a great example. They make the privacy policy really hard to find. Because they don't want you to think about it. Because if you don't think about it, you share. So this is the problem with only a few bigs. Your normal market economics, which involves multiple sellers competing on features, only works if you've got a lot of sellers competing on features. If the three companies all do the same thing -- I mean, what's the difference between Apple and Microsoft in operating systems? Is it really that different where privacy matters? Around the edges -- unless the companies choose to compete on those features -- I can't fly less secure airlines where we'd get you through air security quicker. There is no competition in that. Or more secure airlines -- we do a background check on everybody. It's a perfectly reasonable feature to compete on, but there isn't any competition. So especially if some of these deal with government demands, you're just not going to have the competition. And there's a lot of reason to make that go away as much as possible. Because these companies want people to share more.
AUDIENCE: [INAUDIBLE]

BRUCE SCHNEIER: The land is interesting. No. Well, yes and no. It's very hard for someone regular to leave Facebook. That's where your party invites come from. That's where your friends are. That's where your social interaction is. You don't go on Facebook, you don't get invited to parties, you never get laid, you have a really sucky college experience. So you're not bound. But there's a lot of social push to stay. It's very hard to take your data when you leave -- again, Google is an exception, here. Remember the whole battles about cellphone number portability? That was all to bind people to the cellphone companies, to raise the cost of switching. You raise the cost of switching, you can do a lot more to your customers, or users. If the customers can't switch, you can piss them off a whole lot. So, yeah. And the other reason I kind of like the serf model is the notion of people doing stuff online, which is the raw material that companies use to make profits. So it's kind of like you're farming for your life. And I guess Farmville would be perfect for this, right? But maybe that's too much.
The other way that the metaphor works -- and other people have written about this -- the feudal metaphor -- is that, in a feudal system, everything is owned. There's no commons. And we're seeing this on the internet, that no piece of the internet is a commons. In the United States, we have very particular rules about commons -- free speech rules, association rules, rules about protesting -- because you're on a street. You're on a public street. And those rules don't apply in, for example, Zuccotti Park in New York, because that was a privately owned public space. The internet is, entirely, privately owned public spaces. So Apple is well within its rights to say, to an app creator who made an app to show US drone strikes in Pakistan, you can't have your app on my store. Because it is not a free speech -- it is not a protest. This is a private space. Apple gets to decide. So this lack of a public sphere in the world where we are all associating is another way the feudal model works. I don't know how to fit it in to what I'm doing. I'll probably figure it out sooner or later.
AUDIENCE: The feudal model is really appealing at first blush. But another problem with it is that we actually do live in a democracy, at least theoretically. And we do have the power to vote, at least theoretically. The problem seems, to me, not that there are currently all kinds of tricks, like the people who obfuscate the privacy policies win. It's more about the lassitude of those who are being governed by the government they set up, or the corporations they choose to do business with. And so ultimately, the problem is we aren't looking after our own interests. And so that seems to be what needs to be fixed. And it's not feudalism, because we have the opportunity to escape. We're just not taking advantage of it, through tricks.
BRUCE SCHNEIER: I agree with that. It's just getting harder and harder. And some of it is the fact that we are just too good at psychological manipulation. Advertising, political speeches, have gotten too good. I don't know how fair the game is. Yes, you are fundamentally right. The question is, does that translate to being right in practice? The United States is particularly hard. Our political system is not designed for a huge spectrum of political ideas. Go to any other country and you just realize how narrow our political debate is, just because of the way our two-party system is set up. But again, unless the parties choose to compete on these features, we don't really have a choice. And some features they do, and some they don't. But yes, you are inherently right. By the book, that's correct. The question is how that translates into what we can actually do, realistically.
AUDIENCE: So we need to trick Facebook into becoming the EFF.

BRUCE SCHNEIER: I'm game.

AUDIENCE: Does that mean that governments have an incentive to encourage there to be a small number of companies, so that then, they don't compete on things like privacy? If there's only three, it's much harder for them to compete on something like that.
BRUCE SCHNEIER: I don't know. There are a lot of examples we could look at. We could start poking at some of them. Automobile manufacturers -- they do compete on safety, and have for many years. Saab built an industry on "our car is safer than your car." So you do see security features, sometimes. In a lot of ways, the organic food movement is a food safety --

MALE SPEAKER: Saab is gone.

BRUCE SCHNEIER: Yeah, but in the '70s, that was what they did.

MALE SPEAKER: Organic food was safe.

BRUCE SCHNEIER: Yeah, but organic food is believed to be more pure. It's a food purity sale, which is inherently a health sale, and a safety sale. You can argue whether it's real or fake, but it's how the companies are competing. I don't think the government is incenting any particular economic outcome. I think there's just, right now, a very happy confluence. Thank you very much.

[APPLAUSE]