Joe Rogan Experience #1558 - Tristan Harris: Full Subtitles/Captions Transcript

Nov 9th, 2020
[Laughter] [Music]

Tristan, how are you? Good, good to be here. Good to have you here, man. You were just telling me before we went on air about the numbers for The Social Dilemma, and they're bonkers. So what, just say that.

Yeah. The Social Dilemma was seen by 38 million households in the first 28 days on Netflix, which I think broke records. And if you assume a lot of people are seeing it with their family, because parents are seeing it with their kids given the issues it raises about teen mental health, then if one out of ten families saw it with a few family members, we're in the 40-to-50-million-people range, which has just broken records for Netflix, I think. I think it was the second most popular documentary, or film, throughout the month of September.

It's a really well done documentary, but I think it's one of those documentaries that affirmed a lot of people's worst suspicions about the dangers of social media, and on top of that it alerted them to what they were already experiencing in their own personal lives, and highlighted it.

Yeah, I think that's right. I mean, it's a thing everyone's been feeling: the feeling you have when you use social media isn't that this thing is just a tool, or that it's on my side. It is an environment based on manipulation, as we say in the film, and that's really what's changed. I've been working on these issues for something like eight years now.

Please tell people who didn't see the documentary what your background is and how you got into it.
Yeah. So, as the film lays out, it goes back to a set of technology insiders. My background was as a design ethicist at Google. I first had a startup company that we sold to Google, and I landed there through a talent acquisition. About a year into being at Google, I made a presentation that was about how, essentially, technology was holding the human collective psyche in its hands: that we were really controlling world psychology, because every single time people look at their phone, they are experiencing thoughts, scrolling through feeds, and believing things about the world. This has become the primary meaning-making machine for the world, and we as Google had a moral responsibility to hold the collective psyche in a thoughtful, ethical way, and not create the race to the bottom of the brainstem attention economy that we now have.

My background: as a kid I was a magician, and we can get into that. I studied at Stanford, in a class called the Stanford persuasive technology class, that taught a lot of the engineers in Silicon Valley how the mind works, and the co-founders of Instagram were there. Later I studied behavioral economics and how the mind is influenced, and I got into studying how cults work. And I arrived at Google through this lens of: technology isn't really just this thing that's in our hands, it's more like a manipulative environment that is tapping into our weaknesses, everything from the slot-machine rewards, to the way you get tagged in a photo and it manipulates your social validation and approval, these kinds of things.
When you were at Google, did they still have the "don't be evil" sign up?

I don't know if there was actually a physical sign. There was never a physical sign? I thought there was something they actually had. I think there was this guy, was it Paul? Not Paul? What was his last name? He was one of the inventors of Gmail, and they had a meeting and came up with this mantra, because they realized the power that they had, and they realized there was going to be a conflict of interest between advertising on the search results and regular search results. So we know that they knew they could have abused that power, and they came up with this mantra, I think in that meeting in the early days: don't be evil.

There was a time when they took that mantra down. I remember reading about it online; they took it off their page, I think that's what it was. Yeah. And when I read that, I was like, that should be big news. There's no reason to take that down. Why would you take that down? Yeah, why would you? Why would you say, well, let me give you a little evil, let's not get crazy? It's a good question. I wonder what logic would have you remove a statement like that. It seems like a great statement.

Okay, here it is: "Google removes 'don't be evil' clause from its code of conduct," in 2018. Yeah. I wonder why. Did they have an explanation? Did it say anything underneath? "'Don't be evil' has been a part of the company's corporate code of conduct since 2000. When Google was reorganized under a new parent company, Alphabet, in 2015, Alphabet assumed a slightly adjusted version: 'do the right thing.'" Do the right thing. Oh, that's a Spike Lee movie, [ __ ]. "However, Google retained its original 'don't be evil' language until the past several weeks. The phrase has been deeply incorporated into Google's company culture, so much so that a version of the phrase has served as the Wi-Fi password on the shuttles that Google uses to ferry its employees to its Mountain View headquarters." I think I remember that, yeah. Get on the bus and you type in "don't be evil." I wonder why they decided that.

Well, they did change it to "do the right thing." I mean, we always used to say, just to friends, not within Google: instead of saying don't be evil, just say let's do some good here. That's nice. Let's do some good here. Yeah, think positive, think doing good instead of don't do bad. Yeah, but the problem is, when you say do good, the question is: whose good? Because you live in a morally plural society, and there's this question of who are you to say what's good for people. It's much easier to say let's reduce harms than it is to say let's actually do good.

It says the updated version of Google's code of conduct still retains one reference to the company's unofficial motto: the final line of the document is still "And remember... don't be evil, and if you see something that you think isn't right, speak up." Okay, well, they still have "don't be evil," so maybe it's much ado about nothing.
But having that kind of power: right before the podcast, we were watching Jack Dorsey speak to members of the Senate in regards to Twitter censoring the Hunter Biden story, and censorship of conservatives while allowing dictators from other countries to spread propaganda, and why, and what this is all about. One of the things Jack Dorsey has been pretty adamant about is that they really never saw this coming when they started Twitter. Yeah. They didn't think they were ever going to be in this position, where they were going to be, really, the arbiters of free speech for the world. Right, which is essentially, in some ways, what they are.

I think it's important to roll back the clock for people, because it's easy to think we just sort of landed here, and that they would have known they were going to be influencing global psychology. I think we should really reverse-engineer for the audience how these products came to work the way they do. Let's go back to the beginning days of Twitter. I think Jack's first tweet was something like "checking out the buffaloes in Golden Gate Park" in San Francisco. Jack was fascinated by the taxi-cab dispatch system: you could send a message and then all the taxis get it. And the idea was, could we create a dispatch system, so that I post a tweet and suddenly all these other people can see it?

The real genius of these things was that they weren't just offering something you could do; they found ways of keeping people engaged. This is important for people to get: they're not competing for your data, or for money, they're competing to keep people using the product.
So when Twitter, for example, invented the persuasive feature of the number of followers that you have, remember, that was a new thing at the time. You log in, you see your profile, here are the people you can follow, and here's the number of followers you have. That created a reason for you to come back every day, to see how many followers you have. So that was part of this race to keep people engaged. As we talk about in the film, these things are competing for your attention: if you're not paying for the product, you are the product. But the thing that is the product is your predictable behavior; you're using the product in predictable ways.

And I remember a conversation I had with someone at Facebook, a friend of mine, who said in a coffee shop one day: people think that we, Facebook, are competing with something like Twitter, that one social network is competing with another social network. But really, he said, our biggest competitor is YouTube, because they're not competing for social networks, they're competing for attention, and YouTube is the biggest competitor in the digital space for attention. That was a real light-bulb moment for me, because you realize that as they're designing these products, they're finding new, clever ways to get your attention. That's the real thing that I think is different in the film The Social Dilemma: rather than talking about censorship and data and privacy and those themes, it's really about the core influence or impact that the shape of these products has on how we're making meaning of the world, when they're steering our psychology.

Do you think it was inevitable that someone would manipulate the way people use these things to gather more attention? And do you think any of this could have been avoided if there were laws against that? If, instead of having these algorithms that specifically target things you're interested in, or things you click on, or things that are going to make you engage more, someone said: listen, you can have these things, you can allow people to communicate with each other, but you can't manipulate their attention span?
Yeah. I mean, we've always had an attention economy, right? And you're competing for it right now. Politicians compete for it: can you vote for someone you've never paid attention to, never heard of, never heard say something outrageous? No. So there's always been an attention economy, and so it's hard to say we should regulate who gets attention or how.

But it's organic, in some ways, right? Like, this podcast is organic. I mean, if we're in competition, it's organic: I just put it out there, and you watch it or you don't. I don't have any say over it, and I'm not manipulating it in any way.

Sort of. I mean, let's imagine that the podcast apps were different, and while you're watching they had the hearts and the stars and the vote counts going up, and you could send messages back and forth, and Apple Podcasts worked in a way that didn't just reward the things you clicked follow on, but actually promoted the stuff where someone said the most outrageous thing. Then you, as a podcast creator, have an incentive to say the most outrageous thing, and then you arrive at the top of the Apple Podcasts or Spotify app. And that's the thing: we actually are competing for attention. It felt like it was neutral, and it was relatively neutral.
And to progress that story back in time, with Twitter competing for attention, let's look at some other things they did. They also added the retweet, this instant-resharing feature, and that made it more addictive, because suddenly we're all playing the fame lottery: I could retweet your stuff, and then you get a bunch of hits, and you could go viral and get a lot of attention. So then, instead of the companies competing for attention, now each of us can suddenly win the fame lottery over and over again, and we're getting attention. And I had another example I was going to bring up, but I forgot it. What was it... you can jump in if you want.
Apple has an interesting way of handling the algorithm for their podcast app. It's secret, and it's kind of weird, but among the things it favors are new shows, engagement, and new subscribers: comments, engagement, and new shows.

And that's the same as competing for attention, because engagement must mean people like it. Yeah, and there's going to be a fallacy as we go down that road, but go on.

Well, it's interesting, because if you have a podcast that gets, let's say, a hundred thousand downloads, a new podcast can come along, get ten thousand downloads, and be ahead of you in the rankings. So you could be number three and it could be number two, and you're like, well, how is that number two when it's got ten times fewer downloads? But they don't do it that way, and their logic is that they don't want the podcast world to be dominated by, you know, the New York Times, the big ones, whatever's number one and number two and number three, forever.

We actually just experienced this. We have a podcast called Your Undivided Attention, and after the film came out, in that first month we went from being somewhere in the low hundreds to the top five; I think we were the number one tech podcast for a while. So we just experienced this, not because we had the most listeners, but because the trend was so rapid that we jumped to the top.

I think it's wise that they do that, because eventually it evens out over time. You see some people rocket to the top, like, "oh my god, we're number three," and you're like, hang on there, fella, just give it a couple of weeks. And then three or four weeks later, now they're number 48 and they get depressed. Right, well, that was really where you should have been. But the thing Apple does there that I really like is that it gives an opportunity for these new shows to be seen, where they might otherwise have just gotten stuck, because the rankings and the ratings for a lot of these shows are so consistent, and the shows have such a following already, that it's very difficult for new shows to gather attention.
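To make the ranking logic concrete: Apple's actual chart algorithm is unpublished, but a minimal sketch of the kind of velocity-weighted ranking being described here, with invented show names, numbers, and weights, looks something like this.

```python
# Hypothetical velocity-weighted podcast chart. Apple's real algorithm is
# secret; the fields, weights, and numbers below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Show:
    name: str
    downloads_this_week: int
    downloads_last_week: int
    new_subscribers: int

def trend_score(show: Show) -> float:
    # Growth rate matters more than absolute audience size, so a
    # fast-growing 10k show can outrank a flat 100k show.
    growth = show.downloads_this_week / max(show.downloads_last_week, 1)
    return growth * (1 + show.new_subscribers / 1000)

shows = [
    Show("Established Hit", 100_000, 100_000, 200),  # big but flat
    Show("New Upstart", 10_000, 1_000, 2_000),       # small but surging
]
for s in sorted(shows, key=trend_score, reverse=True):
    print(s.name, round(trend_score(s), 2))
# New Upstart 30.0, Established Hit 1.2: the upstart ranks first
# despite having a tenth of the downloads.
```

Ranking on growth rate rather than raw downloads is what lets a surging new show leapfrog an established one, and it is also why a show that stops growing slides back down the chart a few weeks later.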
The problem was that there were some people who gamed the system, and there were companies that could do that for you. Like Earl Skakel: remember, Earl became the number one podcast, and no one was listening to it. Earl has money, and he hired some people to game the system, and he was kind of open about it, laughing about it. Isn't he banned from iTunes now or something? I think he got banned because of that, because it was so obvious he'd gamed the system: he had like a thousand downloads and he was number one.
I mean, the thing is, you can think of Apple Podcasts as like the Federal Reserve, or the government, of the attention economy, because they're setting the rules by which you win. They could have set the rules, as you said, to be whoever has the most listeners, and then you just keep rewarding the kings that already exist, versus who is trending the most.

There's actually a story a friend of mine told me, I don't know if it's true, although it was a fairly credible source, that he was in a meeting with Steve Jobs when they were making the first podcast app. They had made a demo of something where you could see all the things your friends were listening to, just like making a news feed, like we do with Facebook and Twitter. And Jobs said: well, why would we do that? If something is important enough, your friend will just send you a link and say, you should listen to this. Why would we automatically promote random things your friends are listening to?

And again, this is how you get back to social media: how is social media so successful? Because it's much more addictive to see what your friends are doing in a feed. But it doesn't reward what's true or what's meaningful. And this is the thing people need to get about social media: it's really just rewarding the things that tend to keep people coming back addictively. The business model is addiction, in this race to the bottom of the brainstem for attention.
Well, it seems like, hindsight being 20/20, what should have been done, or what could have been done had we known where this would all end up, is that they could have said: you can't do that. You can't manipulate these algorithms to make sure people pay more attention, and manipulate them to ensure that people become deeply addicted to these platforms. What you can do is just let people openly communicate. Right, but it has to be organic.

And then the problem is, and this is the thing I was going to say about Twitter, when one company does, call it the engagement feed, meaning showing you the things that the most people are clicking on and retweeting, trending things like that... Let's imagine there are two feeds. There's the feed called the reverse-chronological feed, meaning showing things in time order: Joe Rogan posted this two hours ago, and after that you have the thing people posted an hour and a half ago, all the way up to ten seconds ago. That's reverse-chronological. They have a mode like that on Twitter: if you click the sparkle icon, I don't know if you know this, it'll show you just-in-time, here's what people said, sorted by recency. But then they have this other feed, sorted by what the people you follow click on and retweet the most, ordered by what it thinks you'll click on and want the most.

Which one of those is more successful at getting your attention: the recency feed, what they posted recently, or what they know people are clicking on and retweeting the most? Certainly what they know people are clicking on and retweeting the most. Correct.
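The difference between those two feeds comes down to a single sort key. Here is a minimal sketch; the Post fields and the predicted_engagement score are stand-ins for whatever model a real platform runs, not any platform's actual API.

```python
# Two orderings of the same timeline: recency versus predicted engagement.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float             # seconds since epoch
    predicted_engagement: float  # model's guess that you'll click/retweet

posts = [
    Post("calm_friend",  timestamp=1000.0, predicted_engagement=0.1),
    Post("outrage_take", timestamp=900.0,  predicted_engagement=0.9),
]

# "Sparkle icon" mode: newest first, no model in the loop.
recency_feed = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Default mode: whatever the model predicts will hold your attention longest.
engagement_feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in recency_feed])     # ['calm_friend', 'outrage_take']
print([p.author for p in engagement_feed])  # ['outrage_take', 'calm_friend']
```

Same posts, different objective: the second sort is the one that wins the race for attention, which is why every competitor converges on it.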
And so once Twitter does that, let's say Facebook was sitting there with the recency feed, just showing you who posted in this time-ordered sequence: they have to also switch to the "most relevant" stuff, the most clicked and retweeted. So this is part of this race for attention: once one actor does something like that, and they algorithmically figure out what's most popular, the other companies have to follow, because otherwise they won't get the attention.

It's the same thing if Netflix adds the autoplay five-four-three-two-one countdown to get people to watch the next episode. If that works at, say, increasing Netflix's watch time by five percent, YouTube sits there and says, we just shrank how much time people were watching YouTube, because now they're watching more Netflix, so we're going to add a five-four-three-two-one autoplay countdown. And it becomes, again, this game-theoretic race of who's going to do more.

Now, if you open up TikTok... TikTok doesn't even wait. I don't know if your kids use TikTok, but when you open up the app, it doesn't even wait for you to click on something; it actually plays the first video the second you open it, which none of the other apps do. And the point of that is that it causes you to enter into this engagement stream even faster. So this race for attention produces things that are not good for society. And even if you took the whack-a-mole stick, you took the antitrust case and you whacked Facebook, and you got rid of Facebook, or you whacked Google, or you whacked YouTube, you're just going to have more actors flooding in doing the same thing.
One other example of this is the time it takes to reach, let's say, ten million followers. If you remember, wasn't it Ashton Kutcher who raced CNN to the first million followers? Right, yeah. So now, think of it: the companies are competing for our attention, and if they find out that each of us becoming a celebrity, having a million people we get to reach, is the currency that gets us to come back for more attention, then they're competing on who can give us that bigger fame-lottery hit faster. So around 2009 or 2010, when Ashton Kutcher did that, it took him, I don't know how long, it was a little while, right? And then TikTok comes along and says: hey, we want to give kids the ability to hit the fame lottery, hit the jackpot, even faster; we want you to go from zero to a million followers in ten days. So they're competing to make that shorter and shorter and shorter.

And I know about this because, speaking from a Silicon Valley perspective, venture capitalists fund these new social platforms based on how fast they can get to something like 100 million users. There was this famous line, I forget the exact numbers, but I think Facebook took about ten years to get to 100 million users; Instagram took, I don't know, three or four years, something like that; TikTok can get there even faster. So it's shortening, shortening, shortening, and that's what we're competing for: who can win the fame lottery faster.

But is a world where everyone broadcasts to millions of people, without the responsibilities of publishers, journalists, et cetera, an information environment that's healthy? And obviously the film The Social Dilemma is really about how it makes the worst of us rise to the top: our hate, our outrage, our polarization, what we disagree about, black-and-white thinking, more conspiracy-oriented views of the world, QAnon, Facebook groups, things like that. And we can definitely go into it; there are a lot of legitimate conspiracy theories, and I want to make sure I'm not categorically dismissing stuff. But that's really the point: we have landed in a world where the things we are paying attention to are not necessarily the agenda of topics that, in a reflective world, we would say are most important.
So there's a lot of conversation about free will, and about letting people choose whatever they enjoy viewing and watching and paying attention to. But when you're talking about these incredibly potent algorithms, and the incredibly potent addictions people develop to these things, we're pretending that people should have the ability to just ignore it and put it away. Right, and use your willpower. Yeah.

I have a folder on my phone called ADDICT, in all caps, and it's at the end: you have to scroll through all my other apps to get to it if I want to get to Twitter or Instagram. The problem is that the app switcher will put it in the most recent, so once you switch apps and you have Twitter in recents, it'll be right there. So if I want to go left and see that... yeah, you can't avoid that. Yeah. It's insanely addictive, and if you can control yourself it's not that big a deal, but how many people can control themselves?

Well, I think the thing we have to hone in on is the asymmetry of power. As I say in the film, we're bringing this ancient brain hardware, the prefrontal cortex, which is what you use to do goal-directed action, self-control, willpower, holding back: the marshmallow test, don't take the marshmallow now, wait for the two marshmallows later. All of that runs through our prefrontal cortex.
And when you're sitting there and you think, okay, I'm going to look at this one thing on Facebook, because my friend invited me to this event, or there's this one post I have to look at, the next thing you know, you find yourself scrolling through the thing for, like, an hour. And you say, man, that was on me, I should have had more self-control. But behind the screen, behind that glass slab, is like a supercomputer pointed at your brain, predicting the perfect thing to show you next. And you can feel it. This is really important: when you're using Facebook, you think that when you flick your finger, it's just going to show me the next thing my friend said. But it's not doing that. When you flick your finger, it literally wakes up this sort of supercomputer avatar, a voodoo-doll version of Joe. Every click you've ever made on Facebook adds a little hair to the voodoo doll; every like you've ever made adds little clothing; all the watch time on videos you've ever had adds little shoes. So the voodoo doll is getting more and more accurate the more things you click on. This is in the film The Social Dilemma: if you notice, as the character uses the thing, it builds a more and more accurate model that the three AIs behind the screen are kind of manipulating. And the idea is that it can actually predict, and prick the voodoo doll with this video, or that post from your friends, or this other thing, and it'll figure out the right thing to show you that it knows will keep you there, because it's already seen how that same video or that same post kept 200 million other voodoo dolls there, because you just look like another voodoo doll.
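Stripped of the metaphor, the "voodoo doll" is a behavioral profile used to score candidate items by predicted dwell time. A toy version follows, with a three-number profile standing in for the learned embeddings and far larger feature sets a real recommender would use.

```python
# Toy "voodoo doll": a vector of past behavior used to score which
# candidate post will keep the user engaged longest. Purely illustrative;
# real systems use learned embeddings, not three hand-picked numbers.

user_profile = [0.9, 0.2, 0.7]  # e.g. click rate, like rate, watch time

candidates = {
    "friend_update":  [0.3, 0.8, 0.1],
    "outrage_post":   [0.9, 0.1, 0.8],
    "calm_long_read": [0.2, 0.3, 0.4],
}

def predicted_dwell(item):
    # Higher score = the model expects this item to hold this user longer,
    # based on how millions of similar profiles behaved on the same item.
    return sum(u * f for u, f in zip(user_profile, item))

best = max(candidates, key=lambda name: predicted_dwell(candidates[name]))
print(best)  # 'outrage_post': the feed leads with the highest predicted dwell
```

Nothing in that objective asks whether the chosen item is true or good for the user; it only asks what similar profiles lingered on.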
So here's an example, and this works the same on all the platforms. Say you were a teen girl and you opened a dieting video on YouTube. Seventy percent of YouTube's watch time comes from the recommendations on the right-hand side, the panel showing the recommended next videos. And what did it show the girls who watched the teen dieting video? It showed anorexia videos, because those were better at keeping the teen girls' attention. Not because anyone said these are good for them, these are helpful for them; it just said, these tend to work at keeping their attention, for people already watching diet videos. So if you're a thirteen-year-old girl and you watch a diet video, YouTube wakes up its voodoo-doll version of that girl and says: hey, I've got like 100 million other voodoo dolls of thirteen-year-old girls, and they all tend to watch these other videos. I don't know the details, I just know they have this word, "thinspo"; thinspiration is the name for it, to be inspired toward anorexia. Yeah, it's a real thing. YouTube addressed this problem a couple of years ago, but when you let the machine run blind, all it's doing is picking stuff that's engaging.

Why did they choose not to let the machine run blind with one thing, like anorexia? Well, now we're getting into the Twitter censorship conversation and the moderation conversation. This is why I don't focus on censorship and moderation: the real issue is, if you blur your eyes and zoom way out and ask how the whole machine tends to operate, no matter what I start with, what is it going to recommend next? If you started with a World War II video, YouTube would recommend a bunch of Holocaust-denial videos. If you started teen girls with a dieting video, it would recommend these anorexia videos.
In Facebook's case there are so many different examples, because Facebook recommends groups to people based on what it thinks is most engaging for you. So say you were a new mom. You had Renée DiResta, my friend, on this podcast; we've done a bunch of work together, and she has this great example: as a new mom, she joined one Facebook group for mothers who make do-it-yourself organic baby food. And Facebook has this sidebar that says, here are some other groups you might want to join. What do you think was the most engaging of those? Because Facebook, again, is picking based on which group, if it got you to join it, would cause you to spend the most time here. So for the do-it-yourself baby food moms, which group do you think it selected? Probably something about vaccines. Exactly: anti-vaccines for moms. Okay. So then, if you join that group, it runs the process again. Now Facebook says: hey, I've got 100 million voodoo dolls, they all just joined this anti-vaccine moms group, and what do they tend to engage with for a very long time if I get them to join these other groups? Which of those other groups would show up? I don't know. Chemtrails. Oh, Pizzagate. Flat earth? Flat earth, absolutely, yep. And YouTube, I'm interchangeably going from YouTube to Facebook because it's the same dynamic, they're competing for attention, YouTube recommended flat-earth conspiracy theories hundreds of millions of times.
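The group-to-group chain described here is, mechanically, simple collaborative filtering: recommend whatever co-occurs most with what you already joined. A toy sketch, with invented members and group names:

```python
# Toy collaborative filtering: suggest the groups that members of your
# current group most often also belong to. All data here is invented.
from collections import Counter

memberships = {
    "alice": {"diy_baby_food", "anti_vax_moms"},
    "bea":   {"diy_baby_food", "anti_vax_moms", "chemtrails"},
    "cara":  {"anti_vax_moms", "chemtrails", "flat_earth"},
    "dana":  {"anti_vax_moms", "flat_earth"},
}

def recommend(current_group, already_joined):
    # Count co-memberships among people in the current group. The counter
    # is values-blind: it surfaces whatever co-occurs most, true or not.
    counts = Counter()
    for groups in memberships.values():
        if current_group in groups:
            counts.update(groups - {current_group} - already_joined)
    return [group for group, _ in counts.most_common()]

print(recommend("anti_vax_moms", {"diy_baby_food"}))
# ['chemtrails', 'flat_earth']: each hop optimizes time-on-site, not truth,
# which is how a baby-food group leads, step by step, to flat earth.
```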
So when you're a parent during COVID and you sit your kids in front of YouTube, because this is the digital pacifier, you've got to let them do their thing so you can do work, and then you come back to the dinner table and your kid says, you know, the Holocaust didn't happen, and the earth is flat... people are wondering why. It's because of this.

And now, to your point about the moderation thing: we can take the whack-a-mole stick after the public yells, and Renée and I make a bunch of noise, or really a large community of people makes noise about this, and they'll say, okay, shoot, you're right, flat earth, we've got to deal with that, and they'll tweak the algorithm. And then people make a bunch of noise about the thinspiration videos, the anorexia stuff for kids, and they'll deal with that problem. But they're doing it reactively. If you zoom out, it's still just recommending stuff from the crazy-town section.

Is the problem the recommendation? Because I don't mind that people have ridiculous ideas about hollow earth, because I think it's humorous. But I'm also a 53-year-old man. I'm not a twelve-year-old boy with a limited education who's going, oh my god, the government's lying to us, there are lizard people that live under the earth. Right, but isn't that the real argument about these conspiracy theories, that they can influence young people, or the easily impressionable, or people who maybe don't have a sophisticated sense of vetting out [ __ ]? Right, and the algorithms aren't making a distinction between who is just laughing at it and who is deeply vulnerable to it. Generally, it just finds whoever is vulnerable to it.
Another way I think about this: you're driving down the highway, and there's Facebook and Google trying to figure out, what should I give you, based on what tends to keep your attention? If you look at a car crash, and everybody driving the highway looks at the car crash, then according to Facebook and Google, the whole world wants car crashes, so we just feed them car crashes after car crashes after car crashes. And what the algorithms do, as Guillaume Chaslot says in the film, he's the whistleblower from the YouTube recommendation system, is find the perfect little rabbit hole for you that it knows will keep you there for five hours. And the conspiracy-theory dark corners of YouTube were the corners that tended to keep people there for five hours.

So you have to realize that we're now something like ten years into this vast psychology experiment, in every language, in hundreds of countries, and it's been steering people toward crazy town. When I say crazy town: imagine there's a spectrum on YouTube. On one side you have the calm, Walter Cronkite, Carl Sagan, slow, kind of boring but educational material, and on the other side of the spectrum you have the craziest stuff you can find: crazy town. No matter where you start, you could start in Walter Cronkite or you could start in crazy town, but if I'm YouTube and I want you to watch more, am I going to steer you toward the calm stuff, or am I going to steer you more toward crazy town? Always more toward crazy town.

So then you imagine just tilting the floor of humanity by, like, three degrees, and then you step back and let society run its course. As Jaron Lanier says in the film, if you just tilt society by one degree, two degrees, that's the whole world; that's what everyone is thinking and believing. And so look at the degree to which people are deep into rabbit-hole conspiracy thinking right now. And again, I want to acknowledge COINTELPRO, Operation Mockingbird, there's a lot of real stuff, so I'm not categorically dismissing it. But we're asking: what is the basis upon which we're believing the things we are about the world? And increasingly, that basis is technology.
And we can get into what's going on in Portland. Well, the only way I know about that is by looking at my social media feed, and according to that, it looks like the entire city is on fire and it's a war zone. But I called a friend there the other day, and he said, it's a beautiful day, there's actually no violence anywhere near where I am, it's just these two blocks, or something like that.

And this is the thing: it's warping our view of reality. I think that's what The Social Dilemma was really trying to accomplish as a film, what the director, Jeff Orlowski, was trying to accomplish: how did society go crazy everywhere, all at once, seemingly? This didn't happen by accident; it happened by design of this business model.

When did the business model get implemented? When did they start using these algorithms to recommend things? Because initially YouTube was just a series of videos, and it didn't have that recommended section. Correct. When was that?
You know, it's a good question. Originally, YouTube was just: post a video, and you can get people to go to that URL and send it around. Once the competition for attention got more intense, they needed to figure out, how am I going to keep you here? And recommending those videos on the right-hand side, I think that was there pretty early, if I remember, because that was sort of the innovation: keeping people within this YouTube wormhole. And once people were in the YouTube wormhole, constantly seeing videos, that was what let them offer the promise to a new video uploader: hey, if you post it here, you're going to get way more views than if you posted it on Vimeo.

And that's the thing. If I open up TikTok right now on my phone... do you have TikTok on your phone? Well, I'm not supposed to, obviously, but more for research purposes. Do you know TikTok at all? No. Okay, my twelve-year-old is obsessed. Oh, really? Oh yeah, if she's standing still for five minutes, she just starts TikToking.

And that's the thing. I mean, 2012. 2012? Oh, so the Mayans were right.
Right: "In 2012 the platform announced an update to the discovery system designed to identify the videos people actually want to watch, by prioritizing videos that hold attention throughout, as well as increasing the amount of time a user spends on the platform overall." Uh oh. "YouTube could assure advertisers that it was providing a valuable, high-quality experience for people." Yeah. So that's the beginning of the end.
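That 2012 update is a switch of ranking objective, from clicks to total watch time. A made-up two-video comparison shows how the same inventory produces a different winner under each metric:

```python
# Clicks versus watch time: the objective change described in the 2012
# update. Titles and numbers are invented for illustration.
videos = [
    # (title, clicks, avg_minutes_watched_per_click)
    ("clickbait_thumbnail", 1000, 0.5),
    ("long_rabbit_hole",     300, 12.0),
]

by_clicks = max(videos, key=lambda v: v[1])
by_watch_time = max(videos, key=lambda v: v[1] * v[2])

print(by_clicks[0])      # clickbait_thumbnail wins on raw clicks
print(by_watch_time[0])  # long_rabbit_hole wins on total minutes held
```

Optimizing for "holds attention throughout" is exactly what rewards the five-hour rabbit hole over the quick, satisfying answer.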
So that's 2012 on YouTube's timeline. In the Twitter and Facebook world, the retweet and reshare buttons get introduced in the 2009-to-2010 period. So you end up with this world where the things we're most paying attention to are based on algorithms choosing for us. And the deeper argument in the film, which I'm not sure everyone picks up on, is that these technology systems have taken control of human choice. They've taken control of humanity, because they're controlling the information all of us are getting.

Think about every election. I think of Facebook as kind of a voting machine, but a sort of indirect voting machine, because it controls the information your entire society is getting for four years, and then everyone votes based on that information. Now, you could say, hold on, radio and television were there, and were partisan, before that. But actually, radio and TV are often getting their news stories from Twitter, and Twitter is recommending things based on these algorithms. So when you control the information an entire population is getting, you're controlling their choices. Literally, in military theory, if I want to screw up your military, I want to control the information it's getting; I want to confuse the enemy. And that information funnel is the very thing that's been corrupted. It's like the Flint water supply for our minds.
I was talking to a friend yesterday, and she was laughing that there are articles written about negative tweets that random people make about a celebrity doing this or that. She was quoting this article, saying, look how crazy this is: a whole article written because someone decided to say something negative about something some celebrity had done. And then it becomes this huge article, and the tweets are prominently featured, and then the responses to those. I mean, really arbitrary, weird stuff. Because it's a values-blind system that just cares about what will get attention. Exactly, and that's what the article was: it was just an attention grab.
It's interesting, because Prince Harry and Meghan have become very interested in these issues, and are actually working on them, and I'm getting to know them just a little bit. Are they really? Yeah. Because it affects them personally? Well, it's actually interesting, and I don't want to speak for them, but I think Meghan has been the target of some of the most vitriolic, hate-oriented stuff on the planet, just from the amount of criticism and scrutiny that they get. Really? Yeah, I mean, news feeds filled with hate about what she looks like, what she says, just constantly. Boy, I'm out of the loop; I've never seen any of it. She's pretty; what do they think she looks like? I honestly don't follow it myself, because I try not to fall into these attention traps, but she just faces the worst vitriol.

And this is the thing with teen bullying, right? I think they work on these issues because teenagers are now getting a micro version of this thing, where each of us is scrutinized. Think about what celebrity status does, and how it screws up humans in general. Take an average celebrity: it warps your mind, it warps your psychology, and you get scrutiny. When you're suddenly followed, and each person gets thousands, or, projecting forward a few years, each of us has tens of thousands to hundreds of thousands of people following what we say, that's a lot of feedback. And as Jonathan Haidt says in the film, and I know you've had him here, it's made kids much more cautious, less risk-taking, and more bullied overall, and there are just huge problems in mental health around this. Yeah, it's really bad for young girls. Right.
  912. health around this yeah it's really bad for young girls right
  913. um especially for celebrities and i've
  914. had quite a few celebrities in here and
  915. we've discussed it i just tell them that
  916. you can't read that stuff just don't read it yeah
  917. like there's no good in it like i had a friend um
  918. she did a show she's a comedian she did
  919. a show and she was talking about this one negative comment that was inaccurate you know that said
  920. she only did a half an hour and her show
  921. sucked she's like [ __ ] her that's not like i go why are you reading that she's like
  922. because it's mostly positive i go but
  923. how come you're not talking about most of it then we're talking about this one person yeah
  924. it's one negative person we're both
  925. laughing about it like she's
  926. she's healthy you know she's not she's
  927. not completely [ __ ] up by it but
  928. this one person got into her head i'm
  929. like i'm telling you it's not the juice
  930. is not worth the squeeze
  931. but don't read those things but this is
  932. this is exactly right and this is based
  933. on how our minds work i mean our minds
  934. literally have something called
  935. negativity bias so if you have a hundred comments and 99 are positive and one is negative just where does the average human's mind go
  936. right they go to the negative
  937. yeah and it also goes to the negative
  938. even when you shut down the screen your
  939. mind is sitting there
  940. looping on that negative comment and why
  941. because evolutionarily it's really important that we look at
  942. social approval negative social approval
  943. because our reputation is at stake in the tribe yes so it
  944. matters yes but it's never been
Yes, but it's never been easier for not just that one comment to gain more airtime, but for it to build into a hate mob, and then you can see the interconnected clicks: I can go in and see ten other people who responded to it. And especially when you have teenagers exposed to this, you can keep going down the tree and see the whole hate fest on you. This is the psychological environment that is the default way kids are growing up now.

I actually faced this recently with the film itself, because the film has gotten crazy positive acclaim for the most part, and there are just a few negative comments, and even for myself, I was glued to the few negative comments. And then you could click, and you would see the other people, people you know, who positively liked or responded to those comments: why did that person say that negative thing, I thought we were friends, that whole psychology. And we're all vulnerable to it, unless you learn, as you said to your celebrity friends, to just not pay attention.

Even mild stuff; I see people fixate on even mild disagreement or mild criticism. And it's also a problem because you realize that someone's saying this, and you're not there, and you can't defend yourself, so you have this feeling of helplessness: hey, that's not true, I didn't... and you don't get it out of your system; you never get to express it. And people can share that false negative stuff, I mean, not all negative stuff is false, but you can assert things and build on the hate fest, and it starts going crazy, saying this person's a white supremacist, or this person's even worse, and that'll spread to thousands and thousands of people. And the next thing you know, you check into your feed at 8 p.m. that night, and your whole reputation's been destroyed. Yes, and you didn't even know what happened to you.
Well, and this happens to teenagers too. I mean, they're anxious. A teenager posts a photo at their high school, makes a dumb comment without thinking about it, and the next thing they know, at the end of the day, the parents are all calling, because 300 parents saw it and are calling up the parent of that kid. We talk to teachers a lot in our work at the Center for Humane Technology, and they'll say that on Monday morning, this is before COVID, they spend the first hour of class having to clear all the drama that happened on social media over the weekend for the kids. Jesus. And these kids are in what age group? This is like eighth, ninth, tenth grade, that kind of thing.

And the other problem with these kids is there's not a long history of people growing up through this kind of influence and successfully navigating it. Yeah, these are the pioneers. Yeah, and they won't know anything different, which is why, as we talk about in the film, they're growing up in this environment. And one of the simplest principles of ethics is the ethics of symmetry: doing unto others as you would do to yourself. As we say at the end of the film, one of the easiest ways you know there's a problem here is that many of the executives at the social media tech companies don't let their own kids use social media.
They literally say at the end of the film: we have a rule about it, we're religious about it, we don't do it. The CEO of Lunchables foods didn't let his own kids eat Lunchables. That's when you know. If you talk to a doctor and you ask, would you get this surgery for your own kids, and they say, oh no, I would never do that, would you trust that doctor? And it's the same thing for a lawyer. So this is a relationship of asymmetry, and technology is influencing all of us.

And we need a system by which... you know, when I was growing up, I grew up on the Macintosh and technology, and I was creatively doing programming projects and whatever else. The people who built the technology I was using would have their own kids use the things I was using, because they were creative, and they were about tools and empowerment. That's what's changed. We don't have that anymore, because the business model took over. So instead of having tools just sitting there, like hammers waiting to be used to build creative projects, or programming to invent things, or paintbrushes, or whatever, we now have a manipulation-based technology environment, where everything you use has this incentive not only to addict you, but to have you play the fame lottery and get social feedback, because those are all the things that keep people's attention.
Isn't this also a problem with these information technologies being attached to corporations that have this philosophy of unlimited growth? Yes. No matter how much they make... I applaud Apple, because I think they're the only company that takes steps to protect privacy: to block advertisements, to make sure that at least, when you use their Maps application, they're not saving your data and sending it to everybody. It's one of the reasons why Apple Maps is really not as good as Google Maps. Right. But I use it, and that's one of the reasons why I use it. And when Apple came out recently, they were doing something to block your information being sent to other places... I forget exactly what it was. In the new iOS, they released a thing that blocks the tracking identifiers. That's right, and it's not actually out yet; it's going to be out in January or February, I think someone told me. And that's a good example: they're putting a tax on the advertising industry, because just by saying you can't track people individually, that takes down the value of an advertisement by something like 30 percent.

Here it is, it pops up: when I use Safari, I get this whole privacy report thing that says, in the last seven days it's prevented 125 trackers from profiling me. Right, and you can opt out of that if you'd like; if you're like, no, [ __ ] that, track me, you can do that, you can let them send your data. But that seems to me a much more ethical approach: to be able to decide whether or not these companies get your information.

I mean, those things are great. The challenge is, imagine you get the privacy equation perfectly right... Look at this: "Apple working on its own search engine as Google ties could be cut soon."
  1088. i started using duckduckgo
  1089. yep for that very reason
  1090. just because it's they don't do anything with it
  1091. you know they give you the information
  1092. but they don't they don't take your data
  1093. and and do anything with it the the
  1094. challenge is let's say we get all the privacy stuff perfectly perfectly right and data
  1095. production and data controls and all that stuff in a
  1096. system that's still based on attention
  1097. and grabbing attention and harvesting and strip mining our brains uh
  1098. you still get maximum polarization addiction mental health problems isolation teen depression and suicide um
  1099. polarization breakdown of truth right right so that's
  1100. we really focus in our work uh
  1101. on those topics
  1102. because that's the direct
  1103. influence of the business model on
  1104. warping society like we need to name
  1105. this mind warp we think of it like the
  1106. climate change of culture
1107. that you know they seem like different disconnected topics much like with
  1108. climate change you'd say like okay we've
  1109. got species loss in the amazon we've got
  1110. we're losing insects
  1111. we've got melting glaciers
  1112. we've got ocean acidification
1113. we've got the coral reefs you know dying
  1114. these can feel like disconnected things
  1115. until you have a unified model
  1116. of how emissions change all those different phenomena right
  1117. in the social fabric
  1118. we have shortening of attention spans
  1119. we have more outrage driven news media
  1120. we have more polarization
  1121. um we have more breakdown of truth we
  1122. have more conspiracy-minded thinking
  1123. these seem like separate events uh
  1124. and separate phenomena but they're actually
1125. all part of this attention extraction paradigm that the companies' growth as you said
  1126. depends on extracting more of our
  1127. attention which means more polarization more extreme material more conspiracy thinking
  1128. and shortening attention spans
  1129. because we we also say like you know if we want
  1130. to double the size of the attention economy i want your attention joe to be split
  1131. into two separate streams
  1132. like i want you watching the tv uh
  1133. the tablet and the phone at the same time
  1134. because now i've tripled the size of the
  1135. amount of extractable attention that i
  1136. can get for advertisers
  1137. which means that by fracking for
  1138. attention and splitting you into
1139. more and more junk attention that's thinner and thinner we can sell that as if it's real
  1140. attention like the financial crisis
  1141. where you're selling
  1142. thinner and thinner financial assets as
  1143. if it's real but it's really just a junk asset
  1144. oh wow and that's kind of where we are
  1145. now where it's sort of the junk attention economy
1146. because we can shorten
1147. attention spans and we're debasing the substrate
1148. that makes up our society
  1149. because everything in a democracy depends on individual
  1150. sense making and meaningful choice
  1151. meaningful free will meaningful
  1152. independent views but if that's all
  1153. basically sold to the highest bidder
  1154. that debases the soil
  1155. from which independent views grow
  1156. because all of us are jacked into this
  1157. sort of matrix of social media manipulation
  1158. that's that's ruining and degrading our
  1159. democracy and that's really
  1160. there's many other things that are
1161. ruining and degrading our democracy but
  1162. that's that's the sort of invisible force that's upstream
  1163. that affects every other thing downstream
  1164. because if we can't agree on
  1165. what's true for example
  1166. you can't solve any problem i think
  1167. that's what you talked about in your
  1168. 10-minute thing on the social dilemma i
  1169. think i saw on youtube yeah um
  1170. your organization highlights all these issues
  1171. in you know in an amazing way and it's very important
  1172. it's hard right so i just want to say
  1173. that this is as a complex a problem
  1174. as climate change um
  1175. in the sense that
  1176. you need to change the business model i
  1177. think of it like we're on the fossil fuel economy
  1178. and we have to switch to some kind of beyond that thing right
  1179. because so long as the business models of these companies depend on extracting attention can you expect
  1180. them to do something different like
  1181. you can't but how could you is it i mean
  1182. there's so much money involved and now
  1183. they've accumulated so much wealth that they have an amazing amount of influence
1184. yeah you know and the asymmetric influence can buy
  1185. lobbyists can influence congress and
  1186. prevent things from happening so this is
  1187. why it's kind of the last missiles
  1188. that's right but you know i think we're
  1189. seeing signs of real change we have the
  1190. anti-trust case that was just filed
1191. against google and in congress we're seeing more hearings what was the basis of that case you know
  1192. to be honest i was actually in the middle of uh
  1193. the social dilemma launch
1194. when i think that happened and my home burned down in the recent fires in santa rosa
  1195. so i actually missed that happening
  1196. it's hard to hear that
  1197. yeah sorry that was a big thing to drop
  1198. but yeah no it's it's awful there's so much that's been happening in the last six years
  1199. i've been uh
  1200. i was evacuated three times where i lived in california
  1201. oh really yeah
  1202. so we got real close to our house
1203. justice department sues monopolist google for violating antitrust laws
1204. department files complaint against google to restore competition in search
1205. and search advertising markets okay
  1206. so it's all about search yeah this is right
  1207. this was a case that's about google using its dominant position to privilege
  1208. its own search engine
  1209. um in its own products and beyond
  1210. which is similar to sort of microsoft
1211. bundling in the internet explorer browser but you know this is all good progress but
  1212. really it misses the kind of fundamental
  1213. harm of like these things are warping
  1214. our society they're warping how our
  1215. minds are working and there's no
  1216. you know congressional action against that
  1217. because it's a really hard problem to solve i think the reason the film for me
  1218. is so important is that
  1219. if i look at the growth rate of how fast uh
  1220. facebook has been recommending people
  1221. into conspiracy groups and
  1222. um kind of polarizing us into separate
  1223. echo chambers which we should really
  1224. break down i think
  1225. as well for people like exactly the
  1226. mechanics of how that happens
  1227. but if you look at the growth rate of
  1228. all those harms compared to
  1229. you know how fast has congress passed
  1230. anything to deal with it like basically not at all they seem a little bit unsophisticated
  1231. in that regard like big big
1232. understatement yeah yeah they are trying to be charitable i want to be charitable too
  1233. and i want to make sure i call out and
1234. there's senator mark warner senator blumenthal um
  1235. several other senators we've
  1236. talked to have been
  1237. really on top of these issues and led i
  1238. think senator warner's white paper
  1239. um on how to regulate the tech platforms
  1240. is one of the best it's from two years
  1241. ago in 2018
  1242. and rafi martina his staffer is an
  1243. amazing human being has worked very hard on these issues so there are some good folks but when
  1244. you look at the broad
  1245. like the hearing yesterday it's mostly
  1246. grandstanding to politicize the issue right
  1247. because you you turn it into on
  1248. the right um hey you're censoring conservatives and on
  1249. the left it's hey you're not taking down enough misinformation and dealing with the hate
  1250. speech and all these kinds of things right and they're not actually dealing with
  1251. how would we solve this problem they're
  1252. just trying to make a political point
1253. to win over their base now facebook
1254. recently banned the qanon pages
  1255. which i thought was kind of fascinating
  1256. because i'm like
  1257. well this is a weird sort of slippery
  1258. slope isn't it like
1259. if you decide that i mean it
  1260. almost seemed to me like well we'll
1261. throw them a bone we'll get rid of qanon
  1262. because it's so preposterous let's
  1263. just get rid of that
  1264. what else like if you keep going down
  1265. that rabbit hole where do you draw the line like
  1266. where are you allowed to have jfk conspiracy theories are you allowed to have flat earth are
  1267. you allowed i mean i guess flat earth is not dangerous is that where they make the distinction
1268. so i think their policy is evolving in the direction of when online content is known to precede
1269. offline harm that's when the platform acts
1270. that's the standard by which platforms are acting what um
  1271. what offline harm has been
1272. caused by the qanon stuff do you know
  1273. um there's several incidents we
  1274. interviewed a guy on our podcast about it um
  1275. there's some armed gunpoint type thing i
  1276. can't remember um uh
  1277. and there's there's things that
  1278. are priming people to be violent
  1279. you know um
  1280. uh these are i just wanna
  1281. say these are really tricky topics right
  1282. i think what i wanna
  1283. make sure we get to though is that there
  1284. are many people manipulating the group
  1285. think that can happen in these echo chambers
  1286. because once you're in one of
  1287. these things like i studied cults
  1288. earlier in my career
  1289. and the power of cults is like they're a
  1290. vertically integrated persuasion stack
  1291. because they control your social relationships they control
  1292. who you're hearing from and who you're not hearing from they give you meaning purpose and belonging they um
  1293. they have a custom language they have
  1294. an internal way of referring to things
  1295. and social media allows you to create
  1296. this sort of decentralized cult
  1297. factory where it's easier to
  1298. grab people into an echo chamber where
  1299. they only hear from other people's views
  1300. and facebook i think even just recently
  1301. announced that they're going to be
  1302. promoting more of the facebook group content into feeds which means that they're
  1303. actually going to make it easier for
  1304. that kind of manipulation to happen
  1305. but did they make the distinction
  1306. between group content and
1307. conspiracy groups like how do you when does group content
  1308. when does it cross a line i don't know i mean the policy teams that work on this are
  1309. coming up with their own standards so
1310. i'm not familiar with it but think about how hard it is to
  1311. come up with a law at the federal level
  1312. that all states will agree to then you imagine facebook trying to come up with a policy
  1313. that will be universal to
  1314. all the countries that are running
  1315. facebook right well then you imagine how
  1316. you take a company that never thought
  1317. they were going to be in the position
  1318. to do that correct and then within a
  1319. decade they become the most prominent
  1320. source of news and information on the planet earth correct and now they have to regulate it
  1321. and you know i actually believe
  1322. zuckerberg when he says
  1323. i don't want to make these decisions i
  1324. shouldn't be in this role where my beliefs decide the whole world's views right he
  1325. genuinely believes that yeah
1326. um and i think he's sincere about that but the
  1327. problem is he created a situation where he is now in that position i mean he got there
  1328. very quickly and they did it
  1329. aggressively when they went into
  1330. countries like myanmar ethiopia
  1331. uh all throughout the african continent
  1332. where they gave do you know about free basics no so this is the program that i think
  1333. has gotten something like 700 million
  1334. accounts onto facebook where they do a
  1335. deal with like a telecommunications
1336. provider like their version of at&t
1337. in myanmar or something so when you get your smartphone facebook's built in
  1338. i do know about that and there's a
  1339. uh asymmetry of uh
  1340. access where it's
  1341. free to access facebook
  1342. but it costs money to do the other
  1343. things so for the data plan so
  1344. you get a free facebook account facebook
  1345. is the internet basically
  1346. because it's the free thing you can do
  1347. on your phone and
  1348. then there's we know that there's fake
  1349. information that's being spread
1350. so the data charges don't apply to facebook
  1351. use yeah i think like the cost
  1352. you know how we pay for data here like i
  1353. think you don't pay for facebook but you do pay for all the other things which creates an
  1354. asymmetry where of course you're going
  1355. to use facebook for most things
1356. right so you get facebook messenger yeah and
1357. whatsapp yeah yeah whatsapp
1358. i don't know exactly about video
1359. because whether
1360. whatsapp has video calls as well in
1361. general they do yeah i just don't know
1362. how that works in the developing world
  1363. but there's a joke within facebook i
  1364. mean this has caused genocides right so
  1365. in myanmar which is in the film
  1366. um the rohingya muslim minority group many rohingya were persecuted and murdered
  1367. because of fake information
  1368. spread by the government on facebook
  1369. using their asymmetric knowledge with
  1370. fake accounts i mean even just a couple
  1371. weeks ago facebook took down
  1372. a network of i think several hundred
  1373. thousand fake accounts in myanmar
  1374. and they didn't even have at the time
  1375. more than something like four or five
  1376. people in their extended facebook
  1377. network who even spoke the language
  1378. of that country oh god so when you
  1379. realize that this is like
1380. i think of it like the iraq war colin powell pottery barn rule where like you know if
  1381. you go in and you break it then you are
  1382. responsible for fixing it
  1383. this is facebook actively doing deals to
  1384. go into ethiopia to go into myanmar to
  1385. go into the philippines or whatever and providing these solutions and then it breaks the society
  1386. and they're now in a position where they
  1387. have to fix it there's actually a joke within facebook that if you want to know which countries will
  1388. be quote unquote at risk
  1389. in two years from now look at which ones
  1390. have facebook free basics
  1391. jesus and it's terrifying that they do
  1392. that and they don't have very many
  1393. people that even speak the language so
  1394. there's no way they're gonna be able to
  1395. filter it that's right and so now if you
  1396. take it back i know we were talking
  1397. outside about the congressional hearing
  1398. and jack dorsey and the questions from the senator about are you taking down the content from the
  1399. ayatollahs or from the chinese
  1400. xinjiang province about the uyghurs
  1401. uh you know when there's sort of speech
  1402. that leads to offline violence in these other countries the issue
  1403. is that these platforms are managing the information commons for countries they don't even
  1404. speak the language of
  1405. right and if you think the conspiracy
  1406. theory sort of dark
1407. corners crazy town of the english-language internet are bad and we've already taken out like
  1408. hundreds of whack-a-mole sticks and
  1409. they've hired hundreds of policy people
  1410. and hundreds of engineers to deal with
  1411. that problem you go to a country like ethiopia where um
1412. there's something like
1413. 90 dialects i think in the country and six major languages where one of them
  1414. is the dominant facebook sort of
  1415. language and then the others get persecuted
  1416. because they actually don't have um
  1417. uh they don't have a voice on the
  1418. platform this is really important that um
  1419. the people in myanmar
  1420. who got persecuted and murdered
  1421. didn't have to be on facebook
  1422. for the fake information spread about them
  1423. to impact them for people to go after them
  1424. right so this is the whole
  1425. i can assert something about this minority group
  1426. that minority group isn't on facebook
  1427. but if it manipulates the dominant culture to go
  1428. we have to go kill them
  1429. then they can go do it and the same thing has happened um
  1430. you know in india uh
  1431. where there's videos uploaded about
1432. hey those muslims i think they're called flesh killings
1433. where they'll say that these muslims
1434. killed this cow and in hinduism um
1435. is it hinduism where the cows are sacred um i want
1436. to get that right anyway
1437. i believe you got it right yeah um the uh
  1438. they will post those they'll go viral on
  1439. whatsapp and say we have to go lynch those uh muslims
  1440. because they killed our sacred the sacred cows
  1441. and they went from something like five
  1442. of those happening per year to now
  1443. hundreds of those happening per year
  1444. because of fake news being spread
1445. again on facebook and on whatsapp about them and again they don't have to be on the
  1446. platform for this to happen to them
  1447. right so this is critical that you know
  1448. imagine you and i are all let's imagine
  1449. all of your listeners
  1450. you know i don't even know how many you
  1451. have like tens of millions right and we
  1452. all listen to this conversation we say
  1453. we don't want to even use facebook and twitter or youtube
  1454. we all still if you live in the us still
  1455. live in a country that everyone else will vote based on everything
  1456. that they're seeing on these platforms
  1457. if you zoom out to the global context all of us don't
1458. we don't use facebook in brazil but in brazil which uh
1459. the last election was heavily skewed uh
  1460. by facebook and whatsapp where something
  1461. like 87 percent of people
  1462. saw at least one of the major fake news
  1463. stories about bolsonaro and he got
  1464. elected and you have people in brazil
  1465. chanting facebook facebook when he wins
  1466. he wins and then he sets a new policy to
  1467. wipe out the amazon
  1468. all of us don't have to be on facebook
  1469. to be affected by a leader that wipes
  1470. out the amazon and accelerates climate change timelines
  1471. because of those interconnected effects
  1472. so i you know we at the center for
1473. humane technology are looking at this
  1474. from a global perspective
  1475. where it's not just the us election
  1476. facebook manages something like 80 elections per year and if you think that they're doing all
1477. the monitoring that they are for the you
1478. know english-speaking american election in the most privileged society
  1479. now look at the hundreds of other countries that they're operating in do
  1480. you think that they're devoting
  1481. the same resources to to the other countries this is so crazy it's like
  1482. [INTERRUPTION]
  1483. is that you jamie that's a weird noise
  1484. you hear like a squeaky
  1485. i heard it too
  1486. yeah maybe it's me i don't think it is
  1487. just might be feedback
  1488. there it is
  1489. it might be me breathing
  1490. i don't know do you have a you have asthma
  1491. i i think i had an allergy coming oh yeah
  1492. i was like sorry
  1493. [/INTERRUPTION]
  1494. um what's terrifying is
  1495. that we're talking about from 2012 to 2020 um
  1496. youtube implementing this program and then what
  1497. is the even the birth of facebook
  1498. what is that like 2002 or three like 2004.
  1499. this is such a short timeline and having these massive worldwide implications from the use of these things
  1500. when you look at the future do you look at this like a runaway train that's headed towards a cliff
  1501. yeah i mean i think right now this thing is a frankenstein that it's not like even if facebook is
  1502. aware of all these problems
1503. they don't have the staff unless they
1504. hired like you know tens or
1505. hundreds of thousands of people minimum to try to address all these problems but
  1506. the paradox we're in
  1507. is that the very premise of these
  1508. services is to rely on automation
  1509. like it used to be we had
  1510. editors and journalists or at least
  1511. editors or you know people edited even
  1512. when on television saying what is
  1513. credible what is true like you know you sat here with you know alex jones even yesterday and
  1514. you're trying to check him on everything
  1515. he's saying right you're researching and trying to look that stuff up
  1516. you're trying to be doing some more responsible communication
  1517. the premise of these systems is that you don't do that
  1518. like the reason venture capitalists find social media so um uh
  1519. profitable and such a good investment is
  1520. because we generate the content for free
  1521. we are the useful idiots right
  1522. instead of paying a journalist
  1523. 70 000 a year to write something credible we can each be convinced to share our
1524. political views and we'll do it knowingly for free actually we don't really know that's
1525. kind of the point of the phrase useful idiots and then instead of paying an editor a hundred
  1526. thousand dollars a year to figure out
  1527. which of those things is true that we
  1528. want to promote and give
  1529. exponential reach to you have an
1530. algorithm that says hey what do people click on the most what do people like the most and then you
  1531. realize the quality of the signals that are going into the information environment that
  1532. we're all sharing
  1533. is a totally different process we went
  1534. from a high quality gated process that cost a lot of money
  1535. to this um
  1536. really crappy process that costs no money which makes the company so profitable
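[CODE SKETCH]
to make the mechanism above concrete here is a minimal python sketch of ranking a feed purely by predicted engagement instead of editorial judgment; this is illustrative only, the names and fields are invented assumptions and this is not any platform's actual code

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # model's guess at clicks this post will get
    predicted_dwell: float    # model's guess at seconds of attention captured

def engagement_score(post: Post) -> float:
    # no editor, no notion of credible or healthy, just attention captured
    return post.predicted_clicks * post.predicted_dwell

def rank_feed(posts: list[Post]) -> list[Post]:
    # the entire "editorial" process: sort by whatever grabs attention most
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([Post("calm news", 1.0, 5.0), Post("outrage bait", 9.0, 40.0)])
print([p.text for p in feed])  # outrage bait ranks first
[/CODE SKETCH]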
  1537. and then we fight back for territory for for values
  1538. when we raise our hands and say hey
  1539. there's a thinspiration video problem
  1540. for teenagers and anorexia
  1541. hey there's a mass conspiracy sort of
  1542. echo chamber problem over here
  1543. hey there's um
  1544. you know flat earth sort of issues and again these get into tricky topics
  1545. because we want to
  1546. you know i i know we both believe in
  1547. free speech and we have this
  1548. feeling that um
  1549. the solution to bad
  1550. speech is better you know more speech
  1551. that counters the things that are said
  1552. but in a finite attention economy we
  1553. don't have
  1554. the capacity for everyone who gets bad speech to just have a counter response in fact what
  1555. happens right now is that that bad
  1556. speech rabbit holes into
1557. not only worse and worse speech
  1558. but more extreme versions of that view that confirms it
  1559. because once facebook knows that that
  1560. flat earth rabbit hole is good for you
  1561. at getting your attention back
  1562. it wants to give you just more and more
  1563. of that it doesn't want to say here's 20
  1564. people who disagree with that thing
  1565. right right so i think if you were to imagine a different system we would ask who are
  1566. the thinkers that are most
  1567. open-minded and synthesis-oriented where
1568. they can actually steel man the other side actually they can do you know for this
  1569. speech here is the opposite counter argument they can show that they understand that
  1570. and imagine those people get lifted up
  1571. but notice that none of those people
  1572. that you and i know i mean we're both
  1573. friends with eric weinstein
  1574. and you know i think he's one of these
  1575. guys who's really good at sort of offering the steel manning here's the other side
  1576. of this here's the other side of that
  1577. but the people who generally do that
  1578. aren't the ones who get the tens of
  1579. millions of followers on these
  1580. surfaces it's the black and white
1581. extreme outrage oriented thinkers and speakers that get rewarded in this attention
  1582. economy and so if you look at how if i
  1583. zoom way out and say how is the entire
  1584. system behaving just like if i zoom out
  1585. and say climate you know the climate
  1586. system like how is the entire
  1587. overall system behaving it's not
1588. producing the kind of information environment we need the thing that troubles me the most is
1589. i clearly see your thinking and i agree with you like i don't see any holes in what
  1590. you're saying like i don't know how this
  1591. plays out but it doesn't look good
  1592. and i don't see a solution
  1593. it's like if there are a thousand bison
  1594. running full steam towards a cliff and
  1595. they don't realize the cliff is there i
  1596. don't see how you pull them back
  1597. so i think of it like we're trapped in a body and um
  1598. that's eating itself so like it's
  1599. kind of a cannibalism economy
  1600. because our economic growth right now with these
  1601. tech companies is based on eating our
  1602. own organs so we're eating our own
  1603. mental health organs we're eating the
  1604. health of our children we're eating
  1605. sorry for being so gnarly about it but
  1606. it's it's a cannibalistic system
  1607. in a system that's hurting itself or
  1608. eating itself or punching itself
  1609. if one of the neurons wakes up in the
  1610. body it's not enough to change that it's
  1611. going to keep punching itself but if
  1612. enough of the neurons wake up and say
  1613. this is stupid why would we build our system this way and the reason i'm so excited about the
  1614. film is that if you have 40 to 50 million people who now recognize that we're
  1615. living in this sort of cannibalist system in which the economic incentive is to debase the
  1616. life support systems of your democracy
  1617. we can all wake up and say that's stupid
  1618. let's do something differently let's actually change the system let's use different platforms
  1619. let's fund different platforms let's regulate and tame the existing frankensteins
  1620. and i don't mean regulating speech i mean really thoughtfully
  1621. how do we change the incentives so it doesn't go to the same race to the
  1622. bottom and we have to all recognize that
  1623. we're now 10 years into this hypnosis
  1624. experiment of warping of the mind
1625. and like you know i'm friends with some
  1626. hypnotists like how do we snap our
1627. fingers and get people to see
1628. that there's an artificially inflated
  1629. level of polarization and hatred right now that especially going into this election i think we all
  1630. need to be much more cautious about
  1631. what's running in our brains right now
  1632. yeah i don't think most people are generally aware of what's causing this polarization i
  1633. think they think it's the climate of society
  1634. because the president and
  1635. because of uh
1636. black lives matter and the
  1637. george floyd protests and all this jazz
  1638. but i don't think they understand that
  1639. that's exacerbated
  1640. in a fantastic way by social media and
  1641. the last 10 years of our addictions to
  1642. social media and these echo chambers
  1643. that we all exist in
  1644. yeah so i want to make sure that we're
  1645. both clear and i know
  1646. you agree with this that um
  1647. these things were already in society to
  1648. some degree right so we want to make
  1649. sure we're not
  1650. saying social media is blamed for all of
  1651. it absolutely not no no
  1652. gasoline is gasoline right exactly it's
  1653. it's lighter fluid for sparks of polarization it's lighter fluid for sparks of you
  1654. know more paranoid which is ironically
  1655. what everybody it was the opposite of
  1656. everybody what everybody hoped the
  1657. internet was going to be
  1658. right everybody hoped the internet was
  1659. going to be this bottomless resource of
  1660. information where everyone was going to
  1661. be educated in a way they had never
  1662. experienced before in the history of the
  1663. human race where you'd have access to
  1664. all the answers to all your questions
  1665. you you know eric weinstein
1666. describes it as the library of alexandria in your pocket yeah well and i want to be clear
  1667. so that i'm not against technology or
  1668. giving people access in fact i think a
  1669. world where everyone had a smartphone
  1670. and a google search box and wikipedia
1671. and like a search-oriented version of youtube so
  1672. you can look up
1673. health issues or how to fix anything yourself sure it would be awesome that would be
  1674. great i would love that just want to be really clear
  1675. because this is not an
  1676. anti-technology conversation
  1677. it's about again this business model
  1678. that depends on recommending stuff to people which just to be clear on the polarization front um
  1679. it social media is more profitable when
  1680. it gives you your own truman show that
  1681. affirms your view of reality every time
  1682. you flick your finger right
  1683. like it that's going to be more
  1684. profitable than every time you flick
  1685. your finger i actually show you here's a
  1686. more complex nuanced picture that disagrees with that here's a different way to see it that
  1687. won't be nearly as successful and the
  1688. best way for people to test this
  1689. we actually recommend even after seeing
  1690. the film to do this is
  1691. um open up facebook on two phones especially like you know two partners or people who have
  1692. the same friends so you have the same friends on facebook you would think if you scroll your feeds
  1693. you'd see the same thing you're the same
  1694. people you're following
  1695. so why wouldn't you see the same thing
  1696. but if you swap phones and you actually
  1697. scroll through their feed for 10 minutes
  1698. and you scroll through mine for 10
  1699. minutes you'll find
  1700. that you'll see completely different information and it won't you'll also notice that it
1701. won't feel very compelling like if you asked yourself my friend emily just did this with
  1702. her husband after seeing the film
  1703. and she literally has the same friends
  1704. as her husband and she scrolled through
  1705. the feed she's like this isn't interesting i wouldn't come back to this right and
  1706. so we have to again realize how subtle and and yeah just how subtle this has been i
  1707. wonder what would happen if i scrolled through my feed
  1708. because i literally
  1709. don't use facebook
  1710. i don't use it at all i only use
  1711. instagram use instagram i i stopped using twitter
  1712. because it's like a bunch
  1713. of mental patients throwing [ __ ] at each other um
  1714. and i uh
  1715. very rarely use it i should say occasionally i'll check some things to
  1716. see like what the climate
  1717. is but uh
  1718. of the cultural climate but
1719. i use instagram and facebook i used to use instagram to post to facebook but i kind
  1720. of stopped even doing that
  1721. because just it just seems gross yeah it's just and
  1722. it's these people in these verbose arguments about the politics and the economy and world events
  1723. and is that medium constructive to solving these problems no just not at all and it's an attention
  1724. casino right the house always wins and we're
1725. you know you might see eric
  1726. weinstein in a thread you know
  1727. battling it out or sort of duking it out
  1728. with someone and maybe even reaching
  1729. some convergence on something but it
  1730. just whizzes by your feet and then it's
  1731. gone right and all the effort that we're
  1732. putting in to make these systems work
  1733. but then it's just all gone what do you do i mean i try to very minimally use social media overall um
  1734. luckily the work is so busy that that's easier um
  1735. i i want to say first that um
  1736. you know on the addiction fronts of these things i you know myself i'm very sensitive and
  1737. you know easily addicted by these things myself and that's why
1738. i think i noticed you were saying in
1739. the social dilemma it's email for you huh
  1740. yeah i well i you know for me if i
  1741. refresh my email and pull to refresh
  1742. like a slot machine sometimes i'll get invited to meet the president of such and such to
  1743. advise on regulation and sometimes i get a stupid newsletter from a politician i
  1744. don't care about or something right
  1745. um so i email is very addictive
  1746. um it's funny i talked to daniel
1747. kahneman he's like the
  1748. founder of behavioral economics he wrote the book thinking fast and slow if you know that
  1749. one and he said as well that email was
  1750. uh the most addictive for him and he you
  1751. know the one thing you'll find is the
  1752. people who know most about these sort of persuasive manipulative tricks they'll say
  1753. we're not immune to them just
  1754. because we know about them
  1755. dan ariely who's another famous
  1756. persuasion behavioral economics guy talks about flattery and how flattery still feels good even
  1757. if i tell you i don't mean it like
  1758. i love that that sweatshirt that's an
  1759. awesome sweatshirt where'd you get it
  1760. you're just gonna [ __ ] me but that's that's the um
  1761. it feels good to get flattery even if
  1762. you know that it's not real
  1763. right and the point being that like
  1764. again we have so much evolutionary
  1765. wiring to care about what other people
  1766. think of us that just
  1767. because you know that they're
  1768. manipulating you and the likes or whatever it still feels good to get those hundred
1769. extra likes on that thing that you posted yeah when did the likes come about
  1770. um well let's see well actually you know in the film you know justin rosenstein who's the
  1771. inventor of the like button
  1772. talks about i think the first version
  1773. was something called beacon and it
  1774. arrived in 2006 i think
  1775. but then the simple like one-click like
  1776. button was like a little bit later like
  1777. 2008 2009.
  1778. are you worried that it's going to be
  1779. more and more invasive i mean you think
  1780. about the problems we're dealing with now with facebook and twitter and instagram
  1781. all these within the last decade or so what what
  1782. do we have to look forward to i mean is
  1783. there something on the horizon that's
  1784. going to be even more invasive
  1785. well we have to change this system
  1786. because as you said
1787. technology is only
1788. going to get more immersed into our
  1789. lives and infuse into our lives not less
  1790. is technology going to get more
  1791. persuasive or less persuasive more
  1792. more is ai going to get better at
  1793. predicting our next move
  1794. or less good at predicting our next move
  1795. well it's almost like we have to eliminate that and i mean it would be really hard to
  1796. tell them you can't use algorithms anymore that depend on people's attention spans
  1797. right it would be really hard but it seems like the only way for the internet to be pure
  1798. correct i think of this like the
  1799. environmental movement i mean some
  1800. people have compared the film
  1801. the social dilemma to um
  1802. rachel carson's silent spring right where that was the birth that was
  1803. the book that birthed the environmental movement and that was in a republican
1804. administration the nixon administration
  1805. we actually passed we created the epa
  1806. the environmental protection agency
  1807. we went from a world where we said the
1808. environment's something we don't pay attention to to we passed a bunch of laws i forget which ones
  1809. we passed between 1963 and 1972 over a decade we started caring about the environment
1810. we created things that protected the national parks and i think that's kind of what's going
  1811. on here that you know
  1812. imagine for example it is illegal to show advertising on youth oriented social media apps between 12 a.m and 6 a.m
  1813. because you're
  1814. basically monetizing loneliness and lack of sleep right like imagine that you cannot
  1815. advertise during those hours
1816. because we say that like a national park
1817. our children's attention is protected between those hours this is
1818. a very minimal example this would be like you know taking the most obvious piece
1819. of low-hanging fruit and it's like
1820. let's quarantine this off and say this is sacred but isn't the problem
  1821. like the environmental protection agency it resonates with most people the idea
  1822. oh let's protect the world for our children right there's not a lot of people profiting
  1823. off of polluting the rivers right
1824. but when you lose i mean over
  1825. hunting you know certain lands or
  1826. overfishing certain fisheries and
  1827. collapsing them i mean there there are
  1828. if you have big enough corporations that
  1829. are based on an infinite growth profit model you know operating with less and less
  1830. you know resources to get this is a
  1831. problem we faced before
1832. for sure for sure but it's not the same sort of scale as 300-something million people
  1833. and a vast majority of them are using some form of social media
  1834. and also this is not something that
  1835. really resonates in a very clear
  1836. like one plus one equals two way like the environmental protection agency
  1837. it makes sense like if you ask people right should you be able to throw
  1838. garbage into the ocean everyone's gonna
  1839. say no that's a terrible idea right
  1840. should you be able to make an algorithm
  1841. that shows people what they're
  1842. interested in on youtube like yeah
  1843. what's wrong with that well it's more like sugar right
  1844. because sugar is always going to taste way
  1845. better than something else
  1846. because our evolutionary heritage says like that's rare and so we
  1847. should pay more attention to it
  1848. this is like sugar for the fame lottery
  1849. for our attention for social approval
  1850. and so it's always going to feel good
  1851. and we need to have consciousness about
  1852. it and we haven't
  1853. banned sugar but we have created a new
  1854. conversation about what healthy
  1855. you know eating is right i mean there's
  1856. a whole new fitness movement in sort of
  1857. yoga and all these other things that
  1858. people care more about their bodies and
  1859. health than they probably ever have i
  1860. think many of us wouldn't have thought
1861. we'd ever uh
1862. you know
1863. get through the period of soda being at the sort of pinnacle of popularity that it was i think
1864. 2013 or 14 was the year that water crossed over as a more successful drink product than soda
  1865. i think really i think that's true
  1866. you might want to look that up but
  1867. so i think we could have something like
  1868. that here we have to
  1869. i think of it this way if you want to
  1870. even get kind of weirdly
  1871. i don't know spiritual or something
  1872. about it which is we are the only species that could even know that we were doing this to ourselves right
  1873. like we're the only species with the capacity for self-awareness
  1874. to know that we have actually like roped ourselves into this matrix
  1875. of like literally the matrix um of of sort of undermining our own psychological weaknesses
  1876. like a lion that somehow manipulated its environment
  1877. so that there's gazelles everywhere and is like overeating on gazelles
  1878. doesn't have the self-awareness to know
  1879. wait a second if we keep doing this
  1880. this is going to cause all these other
  1881. problems it can't do that
  1882. because its brain doesn't have that capacity
  1883. our brain we do have the capacity for
  1884. self-awareness we can name
  1885. negativity bias which is that if i have
  1886. 100 comments and 99 are positive my
  1887. brain goes to the negative we can name
  1888. that and once we're aware of it we get some agency back we can name that we have a draw towards
  1889. social approval so when i see i've been
  1890. tagged in a photo i know that they're just manipulating my social approval we can name social
  1891. reciprocity which is when i get all
  1892. those text messages and i feel oh i have
  1893. to get back to all these people
  1894. well that's just an inbuilt bias that we
  1895. have to get back
  1896. reciprocity we have to get back to
  1897. people who do give stuff to us
  1898. the more we name our own biases like
  1899. confirmation bias we can name that
  1900. my brain is more likely to feel good
1901. getting information that i already agree with than information that disagrees with me
  1902. once i know that about myself
  1903. i can get more agency back yeah and
  1904. we're the only
  1905. like species that we know of that has
  1906. the capacity to realize that we're in a
  1907. self-terminating
  1908. sort of system and we have to change
  1909. that by understanding our own weaknesses
  1910. and that we've created the system that
  1911. is undermining ourselves and i i think the film is doing that for a lot of people it
  1912. certainly is but i think it needs
  1913. more it's like inspiration it needs a refresher on a regular basis right do you feel
  1914. this massive obligation to be that guy
  1915. that is out there sort of as the paul revere of uh
  1916. the technology influence
  1917. uh invasion i just see these problems
  1918. and i want them to go away yeah you know
  1919. i i didn't
1920. i you know didn't desire to wake up running a social movement but
  1921. honestly right now that's what we're trying to do um
  1922. with the center for humane technology
  1923. we realize that before the success of the film
  1924. we were actually more focused on working with technologists inside the industry
  1925. you know i come from silicon valley many of my friends are executives at the companies
  1926. and we have these inside relationships
  1927. so we focused at that level we also worked with policymakers um and we were trying to speak to policymakers
  1928. we weren't trying to mobilize the whole world against this problem but with the film
1929. suddenly we as an organization have had to do that and frankly i'm
1930. speaking really honestly i really
1931. wish we'd had those funnels
  1932. so that people who saw the film could
  1933. have landed into you know a carefully
  1934. designed funnel where we actually
  1935. started mobilizing people to deal with this issue
  1936. because there are ways we can
  1937. do it we can pass certain laws
  1938. we have to have a new cultural sort of
  1939. set of norms about how do we want to
  1940. show up and use the system
  1941. um you know families and schools can
  1942. have whole new protocols of how we want
  1943. to do group migrations
  1944. because one of the problems is that if a teenager says by themselves
  1945. whoa i saw the film
  1946. i'm going to delete my instagram account
  1947. by myself or tiktok account by myself
  1948. that's not enough
  1949. because all their friends are still using instagram and
  1950. tiktok and they're still going to talk
  1951. about who's dating who or gossip about this or homework or whatever on those services
1952. and so the services instagram and
1953. tiktok prey on social exclusion
  1954. that you will feel excluded if you don't participate and the way to solve that is to get
  1955. whole schools or families together like
  1956. put different parent groups or whatever together and do a group migration
  1957. from instagram to signal or imessage or some kind of group thread
  1958. that way
  1959. because notice that when you as you said
  1960. apple's a pretty good actor in this space if i make a facetime call to you
  1961. facetime isn't trying to monetize my attention right it's just sitting there being like
1962. hey how can i help you have as
1963. close to a face-to-face
1964. conversation as possible jamie
  1965. pulled up an article earlier that was saying that uh
  1966. apple was creating its own search engine yeah uh
  1967. i hope that is the case and i i hope
  1968. that if it is the case they apply the
  1969. same sort of ethics that they have
  1970. towards sharing your information that they do uh
1971. with other things to their search engine but i wonder
  1972. if there would be some sort of value in them creating a social media platform that doesn't
  1973. rely on that sort of algorithm yep
  1974. well i think in general one of the exciting trends that has happened since the film is
  1975. there's actually many more people
  1976. trying to build alternatives social
  1977. media products that are not based on
  1978. these business models yeah um uh
  1979. i could name a few but i i don't
  1980. want to be endorsing it i mean there's
  1981. people building marco polo clubhouse
1982. wikipedia is trying to build a sort of non-profit version um
  1983. i always forget the names of these things but okay but the interesting thing is
  1984. that for the first time people
  1985. are trying to build something else
  1986. because now there's enough
  1987. people who feel disgusted by the present
  1988. state of affairs and that wouldn't be
  1989. possible unless we created a kind of a cultural movement based on something like the film that
  1990. reaches a lot of people it's interesting
  1991. that you made this comparison to the environmental protection agency
  1992. because there's kind of a parallel
  1993. in the way other countries handled the
  1994. environment versus the way we do and how
  1995. it makes them competitive i mean that's
  1996. always been the republican argument for um
  1997. not getting rid of certain fossil fuels and coal and
  1998. all sorts of things that have a negative consequence we we need to be competitive with china
  1999. we need to be competitive with these
  2000. other countries that don't have these regulations in effect the concern would be well first of all
  2001. the problem is these companies are
  2002. global right like facebook is global
  2003. if they put these regulations on america
  2004. but didn't put these regulations worldwide then wouldn't they use the uh
  2005. the income and the algorithm in other countries unchecked right
  2006. and have this negative consequence and gather up all this money which is why
  2007. just like sugar it's like everyone
  2008. around the world has to understand and be more antagonistic yeah and not like sugar's evil but just
  2009. you have to have a common awareness
2010. about the problem but how could you educate people like if you're talking about a
  2011. country like myanmar or
  2012. these other countries that that have had
  2013. these like serious consequences
  2014. because of facebook how how could you possibly get
  2015. our ideas across to them if we don't
  2016. even know their language and it's just
  2017. this system that's already set up in this very advantageous way for them
  2018. where facebook comes on their phone like
  2019. how could you hit the brakes on that
  2020. well i mean first i just want to say this is
  2021. an incredibly hard and depressing problem yeah just the scale of it right right um
  2022. you need something like a global i mean language independent global self-awareness about this problem now
2023. again i don't want to be tooting the
  2024. horn about the film but the thing i'm excited about is it launched on netflix in 190 countries
2025. and in 30 languages so [Laughter] well i think you know the film was seen
2026. in 30 languages so
  2027. you know the cool thing is i wish i
  2028. could show the world my inbox i think
  2029. people see the film
2030. and they feel like oh my god this is
  2031. a huge problem and i'm all alone how are
  2032. we ever going to fix this
  2033. but i get emails every day from indonesia chile argentina brazil people saying oh
  2034. my god this is exactly what's going on
  2035. in my country i mean i've
  2036. never felt more optimistic and i felt
  2037. really pessimistic for the last eight
  2038. years working on this
  2039. because there really hasn't been enough movement
  2040. but i think for the first time there's a
  2041. global awareness now that we could then
  2042. start to mobilize i know the eu's
  2043. mobilizing canada is mobilizing
  2044. australia's mobilizing
  2045. california state is mobilizing with prop 24
  2046. there's a whole bunch of movement now in the space
  2047. and they have a new rhetorical arsenal
  2048. of you know why we have to make this bigger transition now
  2049. you know are we going to get all the countries that you know
  2050. where there's the six different major dialects in in ethiopia
  2051. where they're going to know about this
  2052. i don't think the film was translated into all those dialects
2053. i think we need to do more um it's a really really hard messy problem
  2054. but on the topic of um uh
  2055. if if we don't do it someone else will
  2056. you know one interesting thing in the environmental movement was um
  2057. there's a great um
  2058. wnyc radio piece about the history of lead
2059. and when we regulated lead do
  2060. you know anything about this
2061. yeah i do yeah yeah see if this matches
2062. up with your experience
  2063. the my understanding is that obviously
  2064. lead was this sort of miracle thing we
  2065. put it in paint we put it in gas
  2066. it was like great and then um
  2067. the way we figured out that we should regulate
2068. lead out of our sort of infused product supply is by proving there was this guy
  2069. who proved that it dropped kids iq by
  2070. four points for every i think
  2071. microgram per deciliter i think
2072. in other words if you had a microgram of lead per deciliter of
2073. i'm guessing air um
  2074. it would drop
  2075. like the iq of kids by four points
  2076. and they measured this by actually doing
  2077. a sample on their teeth or something
  2078. because lead shows up in your bones i think
  2079. and they proved that if the iq points dropped by four points
  2080. it would lower future
2081. age earning excuse me
  2082. wage earning potential of those kids
  2083. which would then lower the gdp of the country
  2084. because it would be shifting the iq of the entire country down
  2085. by four points if not more
  2086. based on how much lead is in the environment
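[CODE SKETCH]
a back-of-envelope python version of the lead argument above, exposure lowers iq which lowers earnings which lowers gdp; the four-points-per-microgram figure is the one recalled in the conversation, while the earnings and population numbers are invented placeholders, not data from the radio piece

IQ_DROP_PER_UG_DL = 4.0          # iq points lost per microgram of lead per deciliter (as recalled above)
exposure_ug_dl = 1.0             # assumed average exposure for the cohort
earnings_per_iq_point = 5000.0   # placeholder lifetime earnings per iq point
population = 1_000_000           # placeholder cohort size

iq_loss = IQ_DROP_PER_UG_DL * exposure_ug_dl
lost_earnings = iq_loss * earnings_per_iq_point * population
print(f"cohort-wide lost lifetime earnings: ${lost_earnings:,.0f}")
[/CODE SKETCH]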
  2087. if you zoom out and say is social media
  2088. now let's replace the word iq
2089. which is also a fraught term
2090. because there's like a whole bunch of views about how it's
2091. designed in certain ways and not others in measuring intelligence
  2092. let's replace iq with problem solving capacity what is your problem solving capacity
  2093. which is actually how they talk about it
  2094. in this radio episode
  2095. um and imagine that we have a societal
  2096. iq or a societal problem-solving
  2097. capacity the u.s has a societal iq
  2098. russia has a societal iq germany has a societal iq
  2099. how good is a country at solving its problems
  2100. now imagine that what does
  2101. social media do to our societal iq
2102. it distorts our ideas it gives us a
  2103. bunch of false narratives
  2104. it fills us with misinformation it makes
  2105. it impossible to agree with each other
  2106. and in a democracy if you don't agree
  2107. with each other and you can't even do
  2108. compromise people recognize that
  2109. politics is invented to avoid warfare right so we have compromise and understanding so that we don't
2110. get physically violent with each other
  2111. we have compromise and conversation
  2112. if social media makes compromise conversation
2113. and shared understanding and shared truth
  2114. impossible it doesn't drop our societal
  2115. iq by four points it drops it to zero
  2116. because you can't solve any problem
  2117. whether it's human trafficking
  2118. or poverty or climate issues or
  2119. um you know racial injustice whatever it
  2120. is that you care about
  2121. it depends on us having some shared view
  2122. about what we agree on
  2123. and by the way and on the optimistic
  2124. side there are countries like taiwan
  2125. that have actually built a digital
  2126. democratic sort of social media
  2127. type thing audrey tang you should have
  2128. audrey tang on your show she's amazing
  2129. she's the digital minister of taiwan
  2130. and they've actually built a system that
  2131. rewards unlikely consensus so when two
  2132. people who would traditionally disagree
2133. post something online um
2134. and when two people who
2135. traditionally disagree actually agree on something
  2136. that's what gets boosted to the top of
  2137. the way that we look at our information feeds really yeah
  2138. so it's about finding consensus
  2139. whether it'd be unlikely
  2140. and saying hey actually you know you joe
2141. and tristan you typically
  2142. disagree on these six things
  2143. you agree on these three things and of
  2144. things that we're going to encourage you
  2145. to talk about on a menu we hand you a
  2146. menu of the things you agree on
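[CODE SKETCH]
a minimal python sketch of that unlikely-consensus idea, loosely inspired by the polis-style system taiwan uses; the scoring rule and the example statements are assumptions for illustration, not audrey tang's actual algorithm

def consensus_score(votes_a: list[int], votes_b: list[int]) -> float:
    # votes are 1 (agree) or 0 (disagree) from two groups that usually disagree
    agree_a = sum(votes_a) / len(votes_a)
    agree_b = sum(votes_b) / len(votes_b)
    # a statement only scores high if BOTH groups agree with it,
    # so divisive content sinks and common ground rises to the top
    return min(agree_a, agree_b)

statements = {
    "fix potholes faster": ([1, 1, 0, 1], [1, 0, 1, 1]),
    "ban the other party": ([1, 1, 1, 1], [0, 0, 0, 0]),  # polarizing: one side only
}
ranked = sorted(statements, key=lambda s: consensus_score(*statements[s]), reverse=True)
print(ranked)  # the cross-group statement outranks the polarizing one
[/CODE SKETCH]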
  2147. and how did they manipulate that um
  2148. honestly we did a great interview with her on our podcast um
  2149. that people can listen to
2150. uh i think you should have her on honestly
  2151. i would love to but what does your
  2152. podcast again tell people
  2153. it's called your undivided attention um
  2154. and with the interview is with audrey tang is her name
  2155. uh and i think that's this is one model of how do you have
  2156. you know sort of digital media bolted
2157. onto the top of a democracy and have it work better as opposed to letting it just
  2158. degrades into kind of nonsense and
  2159. polarization and inability to agree
  2160. that's such a unique situation too right
  2161. because china doesn't recognize them and there's
  2162. a real threat that they're going to be invaded by china correct and so what's interesting about
2163. taiwan is we haven't
  2164. talked about the disinformation issues but it's under
  2165. like you said not just physical threat
  2166. from china but massive propaganda
  2167. disinformation campaigns are trying to run there right i'm sure and so what's amazing is that
  2168. their digital media system is good at
  2169. um dealing with these disinformation
  2170. campaigns and conspiracy theories and other things even in the face of a huge threat like
  2171. china but there's more binding energy in the country
  2172. because they all know
  2173. that there's a tiny island and there's a
  2174. looming threat of this big country
  2175. whereas the united states we're not this
  2176. tiny island with a looming threat
  2177. elsewhere in fact many people don't know
  2178. or don't think that there's actually
  2179. information warfare going on
  2180. um i actually think it's really
  2181. important to point out to people that um
  2182. the social media is one of our biggest national security risks
  2183. because while we're obsessed with protecting our
  2184. physical borders and building walls and
  2185. you know spending a trillion dollars
  2186. redoing the nuclear fleet
  2187. um we left the digital border wide open
  2188. like if russia or china tried to fly a plane into the united states our pentagon and
  2189. billions of dollars of defense
  2190. infrastructure from raytheon and boeing
  2191. or whatever will shoot that thing down
  2192. and it doesn't get in if they try to
  2193. come into the country they'll get
  2194. stopped by the passport control system
ideally if russia or china try to fly
  2196. an information bomb into the country
  2197. instead of being met by the department
  2198. of defense they're met by a facebook
  2199. algorithm with a white glove that says
  2200. exactly which zip code you want to target like it's the opposite of protection so
  2201. social media makes us more vulnerable i think of it like
  2202. if you imagine like a bank that spent billions of dollars um
you know surrounding the bank with physical bodyguards right like just the buffest guys on
every single corner you just totally
  2205. secured the bank but then you installed
  2206. on the bank a computer system
  2207. that everyone interacts with and no one
  2208. changes the default password
from like lowercase password so anyone can hack in that's what we do when we install
  2210. facebook in our society or you install facebook in ethiopia
  2211. because if you think russia or china you know or
iran or north korea um
  2213. influencing our election is bad just
keep in mind the dozens of countries throughout africa where we actually know
  2215. recently there was a huge campaign
  2216. that the stanford cyber policy center
  2217. did a report on of russia targeting i
  2218. think something like seven or eight
  2219. major countries and disinformation
  2220. campaigns running in those countries
  2221. or the facebook whistleblower who came
  2222. out about a month ago uh
  2223. sophie zhang i think is her name
  2224. uh saying that she personally had to step in to deal with
  2225. disinformation campaigns in honduras azerbaijan um
  2226. i think greece or some other
  2227. countries like that so the scale of what these technology companies are managing
  2228. they're managing the information
  2229. environments for all these
  2230. these countries but they don't have the
  2231. resources to do it so they
  2232. not only that they're not trained to do
  2233. it they're not qualified correct they're
making it up as they go along
and they're way behind
the curve when i had
renee diresta on and she detailed all the issues with the uh
  2238. internet research agency in russia and
  2239. what they did during the 2016 campaign
for both sides i mean the idea isn't just that they promoted trump they were basically sowing the seeds of
the decline of democracy
  2242. they were trying to figure out how to create turmoil
  2243. and they were doing it in this like very bizarre calculated way
that it was hard to see like
  2245. what's the end game here
  2246. well the end game is to have everybody fight
  2247. yeah i mean that's really what the end game was
  2248. and if i'm you know one of our major adversaries
  2249. you know after world war ii
there was no ability to use kinetic warfare like nukes or something
on the bigger countries right
like that's all done
so what's the best way to take down the biggest you know country
on the block you use its own internal tensions
  2255. against itself this is what sun tzu
  2256. would tell you to do yeah
  2257. and that's never been easier
  2258. because of facebook and
  2259. because of these platforms being
  2260. open to do this manipulation
  2261. and if i'm looking now we're four days
  2262. away from the u.s elections or something
like that when this goes out jesus christ we have never been more
destabilized as a country
than now i mean this is the most
destabilized we have probably ever
been i would say um and polarized
  2268. um maybe people would argue the civil
  2269. war was worse but in recent history
  2270. um there is maximum incentive
  2271. for foreign actors to drive up again not
  2272. one side or the other but to drive us
  2273. into conflict so i would really
  2274. you know i think what we all need to do
  2275. is recognize how much incentive there is
to plant stories to actually sow
physical violence on the streets i think
  2278. there was just a story
weren't we talking about this this morning that um
there's some kind of truck i
think in philadelphia or dc
loaded with explosives or something like
this there's such an incentive to try to you know play the agent provocateur
  2284. like throw the first stone throw the first um
  2285. you know molotov cocktail throw the first uh
  2286. you know make the first shot fired uh
  2287. to drive up that conflict
  2288. and i think we have to
realize how much that may be artificially motivated very much so and the renee diresta
  2290. podcast that i did where she went into
  2291. depth about all the different
  2292. ways that they did it and the most curious one being funny memes yep that there's so many of
the memes that you read that you laughed at yeah well it was just so weird
that they were humorous and she said she
looked at probably a hundred thousand memes and the funny thing is you actually can
agree with them right like
you would laugh at them
  2298. like oh you know and they're being
  2299. constructed by foreign agents
  2300. that are doing this to try to mock
  2301. certain aspects of our society
and pit people against each other and create a mockery and you know back in 2016 there was
very little
  2304. collaboration between our defense
  2305. industry and cia and dod and people like that uh
  2306. and the tech platforms and the tech
  2307. platform said it's government's job to
  2308. deal with if foreign actors are doing these things how do you stop something like the ira
  2309. like say if they're creating memes in
  2310. particular and they're funny memes
  2311. well so one of the issues that renee
  2312. brings up and i'm just a huge fan of her and her work uh
  2313. is as am i yeah uh
  2314. is that if i'm you know china
  2315. i i don't need to invent some fake news
  2316. story i just find someone in your
  2317. society who's already saying what i
  2318. want you to be talking about and i just
  2319. like amplify them up i take that dial
  2320. and i just turn it up to ten right so i find your texas secessionists and like oh texas
  2321. that would be a good thing if i'm trying
  2322. to rip the country apart so i'm going to
take those texas secessionists and the california secessionists and i'm just going to dial them up to
  2324. 10. so those are the ones we hear from
  2325. now if you're trying to stop me in your
  2326. facebook and you're the
  2327. integrity team or something on what
  2328. grounds are you trying to stop me
  2329. because it's your own people your own free speech i'm just the one amplifying the one i
  2330. want to be out there
  2331. right and so that's what gets tricky
  2332. about this is i think our moral
  2333. concepts that we hold so dear of free
  2334. speech are inadequate in an attention
  2335. economy that is hackable
  2336. and it's really more about what's
  2337. getting the attention rather than what
  2338. are individuals saying or can't say
  2339. and you know again they've created this
  2340. frankenstein where they're making mostly automated decisions about who's looking like what
pattern of behavior or coordinated
inauthentic behavior here or that and
  2343. they're shutting down
  2344. people i don't know if people know this
  2345. people facebook shut down two billion
  2346. fake accounts i think this is a stat
  2347. from a year ago
  2348. they shut down two billion fake accounts
  2349. they have three billion active real users
do you think that those two billion were all perfectly
identified you know real fake accounts
and they didn't miss any or didn't accidentally
take some real accounts down with it you know
  2354. our friend brett weinstein he just got taken down by facebook
  2355. i think he saw that
  2356. that seemed calculated though
  2357. facebook has shut down 5.4 billion fake accounts this year
and that was in november 2019
  2359. oh my god
  2360. oh my god that is insane that's so many
  2361. and so again it's the scale that these things are operating at
  2362. and that's why you know when brett got his thing taken down
  2363. i didn't like that but i
  2364. it's not like there's this vendetta against brett right
  2365. oh i don't know about that that seemed to me to be a calculated thing
  2366. because uh
  2367. you know eric uh
  2368. actually tweeted about it saying that
  2369. you know you could probably find the tweet
  2370. because i retweeted it
  2371. like basically it was reviewed by a
  2372. person so you're lying
  2373. he's like this is not something that was
  2374. uh taken down by an algorithm he
  2375. believes that it was
  2376. because it was unity 2020 platform
  2377. where they're trying to bring together
  2378. conservatives and and liberals
  2379. and try to find some common ground and
  2380. create like a third party candidate that combines the best of both worlds
i don't understand what policy his unity 2020 thing was going up
  2382. against like i have no idea he's going
  2383. against a two-party system
  2384. the idea is that it's taking away votes
from biden and then it might help trump win right they banned him off twitter as well you
know that too they blocked the
account or something
they banned the entire
unity 2020 account yeah
  2390. unity yeah i mean literally unity
  2391. they're like nope no unity [ __ ] you
  2392. we want biden yeah the political bias on social media is undeniable and that's
  2393. maybe the least of our concerns in the long run but it's a tremendous issue
and it also for sure sows the seeds of discontent
  2395. and it creates more animosity and it creates more conflict the interesting thing is that
  2396. if i'm one of our adversaries
  2397. i see that there is this view that
  2398. people don't like the social media
  2399. platforms that i want them
to be more like let's say i'm russia or
  2401. china right and i'm currently using
facebook and twitter successfully to run information campaigns and i can actually
  2403. plant a story so that they end up
  2404. shutting it down and shutting down
  2405. conservatives or shutting down one side
  2406. which then forces the platforms to open
up more so that russia or china can
  2408. keep manipulating even more i understand yeah so right now they
  2409. want it to be a free-for-all where
  2410. there's no moderation at all
  2411. because that allows them to get in
  2412. and they can weaponize the conversation
  2413. against itself right i don't see a way out of this tristan we have to all be aware of it i
  2414. mean even if we are
  2415. all aware of it it seems so pervasive
  2416. yeah well it's not just pervasive it's
  2417. like we said it's
  2418. we're 10 years into this hypnosis experiment this is the largest psychological
  2419. experiment we've ever run on humanity
  2420. it's insane
  2421. it is insane and it and it's also with
  2422. tools that never existed before
evolutionarily so like we
really are not designed for
the way these brightly lit metal
  2426. devices and glass devices
  2427. interact with your brain they're so enthralling right we've never had to resist anything
like this before the things we've
had to resist are like don't go to the bar
  2430. you know you have an alcohol problem
  2431. stop smoking cigarettes it'll give you cancer right we've never had a thing that does so much right
  2432. you can call your mom you can text
  2433. a good friend you can
  2434. you can receive your news you can get an
amazing email about this project you're working on and it could suck up your time staring
at butts and the
infusion of the things
that are necessary for life like text messaging or like looking something up are infused
  2439. and right next to
  2440. right all of the sort of corrupt stuff
  2441. right and if you're using it to order
  2442. food and if you're using it to
  2443. get an uber and right but imagine if we
  2444. all wiped our phones of all the
  2445. extractive business model stuff and we
  2446. only had the tools
  2447. have you thought about using a light
phone yeah it's funny
those guys just keep being brought up in my
awareness more and more often
um for those who don't know it's like a mini
  2453. one of the guys on the documentary is
  2454. one of the creators of it right
no i think you're thinking of tim kendall he's the guy who brought in
  2456. facebook's business model of advertising
  2457. and he runs a company now called moment that shows you uh
  2458. the number of hours you spend on
  2459. different apps and helps you use it
  2460. someone involved in the documentary was also a part of the light phone team no no no
  2461. not not officially no i don't think so
um but the light phone is
basically a black and white phone that does texts
and i think it plays music now which i was like well that's a
mistake right like that's a slippery slope that's the thing
  2466. and we have to all be comfortable with losing access to things that we might
  2467. love right like oh maybe you do want to
  2468. take notes this time but you don't have
  2469. your full keyboard to do that and are you willing to i think the thing is one thing people
  2470. can do is to take like a digital sabbath
  2471. one day a week off completely
  2472. because at the very
imagine if you got several hundred
million people to do that that drops the
revenue of these companies by 15 percent
  2476. because that's one out of seven days
  2477. that you're not on the system so long as
  2478. you don't rebalance and
use it more on the other days
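the arithmetic behind that estimate, as a quick sketch, assuming ad revenue scales roughly with time spent:

    # one day a week offline removes 1/7 of attention-hours, so roughly
    # 1/7 of ad revenue -- about 14-15 percent -- provided usage doesn't
    # just shift to the other six days
    days_used = 6
    days_total = 7
    revenue_drop = 1 - days_used / days_total
    print(f"{revenue_drop:.1%}")  # -> 14.3%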
i'm inclined to think that apple's
  2481. their solution is really the way out of this that to opt out of all sharing of your information and uh
  2482. if if they could come up with
  2483. some sort of a social media platform
  2484. that kept that as an ethic
  2485. yeah i mean it might allow us to
  2486. communicate with each other
  2487. but stop all this algorithm nonsense and
  2488. it's look if anybody has the power to do
  2489. it they have so much goddamn money
  2490. totally well and also they're like that
  2491. you know people talk about
  2492. you know the government regulating these
  2493. platforms but apple is kind of the government that can regulate the attention economy
  2494. because when they do this thing we talked about earlier of um
  2495. saying do you want to be tracked right and they give you this option when
like 99 percent of people are gonna say no i
  2497. don't want to be tracked right when they
do that they just put a 30 percent
tax on all the advertising-based businesses
because now you don't get as personalized an ad right
  2501. which means they make less money
  2502. which means that business model is less
  2503. attractive to venture capitalists to
  2504. fund the next thing which means
so they're actually enacting a
kind of a carbon tax but on the polluting stuff right
they're enacting a kind of
social media pollution tax they're
taxing by 30 percent but they could do more than that like imagine you know they have this 30 70 split on um
app developers get 70 percent of the revenue
when you buy stuff and apple keeps 30 percent they could modify that percentage based
  2513. on how much sort of social
  2514. value that those things are delivering to society so this gets a little bit weird people
  2515. may not like this but if you think about
  2516. who's the real customer that we want to be like how do we want things oriented how
  2517. should we if i'm an app developer i want
  2518. to make money the more i'm helping
  2519. society and helping individuals not how
  2520. much i'm extracting and stealing their time and attention um
  2521. and imagine that governments in the future actually paid um
  2522. like some kind of budget into let's say the app store there's anti-trust issues
  2523. with this but you pay money into the app store and then as apps started helping people
  2524. with more social outcomes like let's say
  2525. learning programs or schools or things
  2526. like khan academy things like this
  2527. that more money flows in the direction
  2528. of where people got that value
  2529. and it was that that revenue split
  2530. between apple and the app developers
  2531. um ends up going more to things that end
  2532. up helping people as opposed to things
that were just good at capturing attention and monetizing zombie behavior
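a toy sketch of what a social-value-weighted split could look like; the 70/30 baseline is the real app store split mentioned above, but the value score and the swing parameter are purely hypothetical illustrations:

    # hypothetical: shift the standard 70/30 app-store split so apps
    # scoring higher on some agreed "social value" metric keep more
    def developer_share(value_score, base_share=0.70, swing=0.20):
        # value_score in [0, 1]: 0 = pure attention extraction,
        # 1 = clearly helping people (e.g. learning apps)
        return base_share + swing * (value_score - 0.5)

    print(round(developer_share(1.0), 2))  # 0.8 -- a helpful app keeps more
    print(round(developer_share(0.0), 2))  # 0.6 -- an extractive app keeps less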
one of my favorite
  2535. lines in the film is justin rosenstein
  2536. from the like button
  2537. um saying that you know so long as a
  2538. whale is worth more dead
  2539. than alive and a tree is worth more as lumber and two-by-fours than a living tree now
  2540. we're the whale
  2541. we're the tree we're worth more when we have predictable zombie-like behaviors when
  2542. we're more addicted distracted outraged polarized and disinformed
  2543. than if we're a living thriving citizen or a growing child
  2544. that's like playing with their friends
  2545. and i think that that kind of distinction
  2546. that just like we protect national parks
  2547. or we protect you know certain fisheries
  2548. and we don't kill the whales in those areas
  2549. or something we need to really protect
  2550. like we have to call out what's sacred to us now
  2551. yeah it's um
  2552. it's an excellent message
  2553. my problem that i see is that
  2554. i just don't know how well that message
is going to be absorbed by the people that are already in the trance i mean i think it's
  2556. so difficult for people to put things
  2557. down i mean how like i was telling you
how difficult it is for me to tell my
  2559. friends don't read the comments
right you know right it's hard to
  2561. have that kind of discipline and it's
  2562. hard to have that kind of
  2563. because people do get bored and when
  2564. they get bored like if you're waiting in line for somewhere you pull out your phone
  2565. you're at the doctor's office you pull out your phone like totally i mean and that's why
  2566. you know and i do that right i mean this
  2567. is incredible right this is incredibly hard um
  2568. back in the day uh
  2569. when i was at
  2570. google trying to change
  2571. i tried to change google from the inside
for two years before leaving what was it like there please share your experiences
  2573. because when you said you tried to change it
  2574. from the inside what kind of resistance
  2575. were you met with and what was their
  2576. reaction to these thoughts that you had
  2577. about the unbelievable negative consequences of well this is in 2013 so we didn't know
  2578. about all the negative consequences but
  2579. you saw the writing on the wall
  2580. at least some of it some of it yeah i
  2581. mean the notion that things were
  2582. competing for attention which would mean
  2583. that they would need to compete to get
  2584. more and more persuasive and hack more
  2585. and more of our vulnerabilities and that
  2586. that would grow that was the core insight i didn't know that it would lead to
  2587. polarization or conspiracy theory
  2588. like recommendations but i would i did
  2589. know you know more addiction
  2590. kids having less you know weaker
  2591. relationships when did it
  2592. occur to you like what were your initial feelings um
  2593. i was on a hiking trip in the santa cruz
  2594. mountains with our co-founder now
  2595. um aza raskin um
  2596. it's funny enough our
co-founder aza
his dad was jef raskin who invented the
  2599. macintosh project at apple i don't know
  2600. if you know the history there but
  2601. he started the macintosh project and
  2602. actually came up with the word
  2603. um humane to describe the humane
  2604. interface and that's where our name
  2605. and our work comes from is from his
  2606. father's work he and i were in the
  2607. mountains in santa cruz and just experiencing nature and just came back and realized like
  2608. this all of this stuff that we've built
  2609. is just distracting us
  2610. from the stuff that's really important
  2611. and that's when coming back from that trip um
  2612. i made the first google deck that
  2613. then spread virally throughout the
company saying never before in history
have you know 50 designers
you know white 20 to 35 year old engineers who look like me
held the collective psyche of humanity
  2618. and then that presentation was released
  2619. and about you know 10 000 people at
  2620. google saw it it was actually the number one um
  2621. meme within the company they have
  2622. this internal thing inside of google called moma that has like people can post like gifs
  2623. and memes about various topics
  2624. and it was the number one meme that hey
  2625. we need to talk about this
  2626. at this week's tgif which is the like
  2627. weekly thank god it's friday type
  2628. company meeting um
  2629. it didn't get talked
  2630. about but i got emails from across the company
  2631. saying we definitely need to do something about this
  2632. it was just very hard to get momentum on it
  2633. and really the key interfaces to change within
  2634. google are chrome and android
  2635. because those are the neutral portals
  2636. into which you're then using
  2637. apps and notifications and websites and
  2638. all of that like those are the kind of
  2639. governments of the attention economy that google runs and when you work there did they um
  2640. did you have to use android was it part of the requirement to work there no i mean a
  2641. lot of people had android phones i still used an iphone was it an issue no no i mean people
  2642. because they realized that they needed
  2643. products to work on on all the phones i
  2644. mean if you worked directly on android then you would have to use an android phone but we
  2645. tried to get you know some of those things like the screen time features that are now
  2646. launched you know so everyone now has on their phone like it shows you the number of hours or
  2647. whatever is that on android as well it
  2648. is yeah and actually that came i think
  2649. as a result of this
  2650. advocacy and that's shipping on a
  2651. billion phones which shows you you can
  2652. you can change this stuff right like
  2653. that goes against their financial interest people spending less time in their
  2654. phones getting less notifications it does but it doesn't work well correct so it
  2655. doesn't actually work is the thing
  2656. yeah and let's separate the intention
  2657. and the fact that they did it it's like
  2658. labels on cigarettes that tell you it's
  2659. going to give you cancer like by the
  2660. time you're buying them you're already hooked correct i mean it's even worse imagine like um
every cigarette box had like
um a little pencil inside and
there's like little streaks that
  2664. said the number of days in a row you
  2665. haven't smoked and you could like mark
  2666. each day it's like it's too late
  2667. right right like yeah um
  2668. it's just the wrong paradigm um
  2669. the fundamental thing we have to
  2670. change is the incentives and how money flows
  2671. because we want money flowing in the
direction of the more these things help us let me give you a concrete example like let's say um
  2673. you want to learn a musical instrument
  2674. and you go to youtube to pick up ukulele or whatever um
  2675. and you're seeing how to play the ukulele like from that point in a system that was designed in a
  2676. humane and sort of time well-spent kind of way it would really ask you instead of
  2677. saying here's 20 more videos that are
  2678. going to just like suck you down a rabbit hole it would sort of be more oriented
towards what do you really need help with like do you need to buy a ukulele here's a
  2680. link to amazon to get the ukulele are
  2681. you looking for a ukulele teacher let me
  2682. do a quick scan on your facebook or
  2683. twitter search to find out which of
  2684. those people are ukulele teachers
  2685. do you need instant like tutoring
because there's actually a service you've
never heard of called skillshare or something like that where you can get instant ukulele
  2688. tutoring and if we're really designing
  2689. these things to be about
  2690. what would most help you next you know
  2691. we're only as good as the menu of
  2692. choices on life's menu and right now the menu is here's something else to addict you and
  2693. keep you hooked instead of here's a next
  2694. step that would actually be
  2695. on the trajectory of helping people live their lives better
  2696. but you'd have to incentivize the companies
  2697. because like there's so much incentive on getting you addicted
  2698. because there's so much financial reward what would be the financial reward that
  2699. they could have to
  2700. get you something that would be helpful
  2701. for you like lessons or this
  2702. i mean so one way that could work is
  2703. like let's say people pay a monthly
  2704. subscription of like i don't know 20
  2705. bucks a month or something so it's never gonna work i get you but like let's say you pay
some amount you put money into a pot
  2707. where the possibility but then we have
  2708. the problem the problem is like
  2709. it costs some money versus free like
  2710. there was a um
  2711. there's a company that
  2712. still exists for now that uh
  2713. was trying to do the netflix of podcasting and uh
  2714. they they approached us and
  2715. they're like we're just gonna get all
these people together and they're gonna make people pay to use your podcast i'm
  2717. like why would they do that when
  2718. podcasts are free yeah like that's one
  2719. of the reasons why podcasts work is
  2720. because they're free
  2721. right when things are free they're
  2722. they're attractive
  2723. it's easy when things cost money you
  2724. have to have something that's extraordinary like netflix
  2725. yeah like when you say the netflix of podcasting well
  2726. netflix makes their own shows right they spend millions of dollars on special effects
  2727. and all these different things and
  2728. they're really like enormous projects like
  2729. you're you're just talking about people talking
  2730. [ __ ] and you want money right well
  2731. that's the thing is we have to actually
deliver something that's totally qualitatively better right and it would also have to be like someone
like you or someone who's really aware of the
issues that we're dealing with with
addictions to social media who would have
to say this is the
best possible alternative like in this environment yes you are paying a certain
  2738. amount of money per month
  2739. but maybe that could get factored into
  2740. your cell phone bill
  2741. and maybe with this sort of an ecosystem
  2742. right you're no longer being uh
  2743. drawn in
  2744. by your addictions and you know it's not
  2745. playing for your attention span
it's rewarding you in a very productive way and imagine joe if like 15 percent more of your time
was just way better spent like it was
actually spent on you
actually doing the things you cared
  2750. about and it actually helped improve
  2751. your life yeah like imagine when you use
  2752. email if it was truly designed
  2753. i mean forget email people don't relate to that
  2754. because email isn't that popular but whatever it is that's a huge time sync
  2755. for you for me email's a huge one for me
  2756. you know web browsing or whatever is a
  2757. big one imagine that those things were
  2758. so much better designed that i
  2759. actually wrote back to the right emails
  2760. and i mostly didn't think about the rest
  2761. that when i was spending time on you
know whatever i was spending time on that it was really more and more of my life
  2763. was a life well lived and time well
  2764. spent that's like the retrospective view
i keep going back to apple
because i think that they're the only social media comp or
excuse me the only technology company
that does have these ethics to sort of protect privacy
  2769. have you thought about coming to them
  2770. yep have you well i mean i i think that they've made
  2771. great first steps and they were the
  2772. first along with google to do those
  2773. the screen time management stuff but
  2774. that was just this
  2775. barely scratching the surface like baby
  2776. baby baby steps like what we really need them to do is radically um
  2777. reimagine how those incentives and how the phone
  2778. fundamentally works so it's not just
  2779. all these colorful icons and one of the
  2780. problems they do have a disincentive
  2781. which is a lot of the revenue comes from
  2782. gaming and as they move more into
  2783. apple tv competing with hbo and hulu and
  2784. netflix and that whole thing where they
  2785. they need subscriptions so the
  2786. apple's revenue on devices and hardware
  2787. is sort of maxing out
  2788. and where they're going to get their
  2789. next bout of revenue to keep their stock price up is on these subscriptions i am less
  2790. concerned with those addictions
  2791. i'm less concerned with gaming
addictions than i am with information addictions
  2793. because at least it's not
  2794. fundamentally altering your view of the
  2795. world right it's screwing up democracy
  2796. and making it impossible to agree well
  2797. and this is coming from a person that's had like legitimate video game addictions in the past but uh
like my wife is addicted to subway surfers like i don't know what it is it's a
  2799. crazy game it's like you're riding on
  2800. the top of subways you jumping around
  2801. it's like
  2802. it's really ridiculous but it's fun like
  2803. you watch like whoa but i don't [ __ ]
  2804. with video games but i watch it and
  2805. it's those games at least
  2806. are enjoyable there's something silly
  2807. about it like ah
  2808. [ __ ] and then you start doing it again
  2809. when i see people getting angry about
  2810. things on social media i don't see
  2811. the upside right i don't mind them
  2812. making a profit off games
  2813. there is an issue though with games that
  2814. addict children and then these children
there's like you could spend money on like roblox and you can you know have all these
  2816. different things you spend money on you wind up you know you're having these enormous
  2817. bills you leave your kid with an ipad
and you come back you have a 500 dollar
  2819. bill like what did you do yeah this is
  2820. this is an issue for sure but at least
  2821. it's not an issue
in that it's changing their view of
  2823. the world right and i i feel like
  2824. there's a way
  2825. for i keep going back to apple but a company like apple to rethink the way that you know they
already have a walled garden right with imessage and facetime and all this different stuff they can
  2827. totally build those things out i mean
  2828. imessage in icloud could be
  2829. the basis for some new neutral social
  2830. media yeah it's not based on instant
  2831. social approval and rewards right yes
  2832. they can make it easier to share
  2833. information with small groups of friends
  2834. and have that all synced and even
you know in the pre-covid days i was
  2836. thinking about apple a lot i think
  2837. you're right by the way to really poke
  2838. on them i think they're the one company
  2839. that's in a position
  2840. to lead on this and they also have a
  2841. history of thinking along those lines
  2842. you know they had this feature that's
  2843. kind of hidden now but to find my
  2844. friends right they call it find my now
  2845. it's all buried together so you can find your devices and find your friends but in a
pre-covid world imagine they really built out the you know where are my friends right
  2847. now and making it easier to know when
  2848. you're nearby someone
  2849. so you can easily more easily get
  2850. together in person so right now all the
  2851. like to the extent facebook wants to
  2852. bring people closer together they don't
want to and again this is pre-covid but they don't want to incentivize lots and
  2854. lots of facebook events they really care
  2855. about groups that keep people posting it
  2856. online and looking at ads
  2857. because of the category of bringing people
  2858. closer together they want to do the
  2859. online screen time based version of that
  2860. right as opposed to the offline
  2861. apple by contrast if you had little
  2862. imessage groups of friends you could say
  2863. hey does everyone in this little group
  2864. want to opt into being able to see where
  2865. each other are where we all
  2866. are on say weekdays between 5 and 8 pm
  2867. or something like that so you could like time bound it and make it easier for serendipitous
  2868. connection and availability to happen
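a tiny sketch of that time-bounded opt-in idea, just to make the design concrete; every name and default here is a hypothetical illustration, not an apple api:

    # hypothetical: a group member is visible only if they opted in and
    # the current time falls inside the window the group agreed on
    from datetime import datetime, time

    def visible(opted_in, now=None, start=time(17, 0), end=time(20, 0)):
        # default window: weekdays between 5 and 8 pm, as in the example
        now = now or datetime.now()
        if not opted_in or now.weekday() >= 5:  # 5, 6 = saturday, sunday
            return False
        return start <= now.time() <= end

    # friends_to_show = [f for f in group if visible(f.opted_in)]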
  2869. that's hard to do it's hard to design
  2870. that but there's things like that that
  2871. apple's in a position to do
  2872. if it really took on that mantle and i
  2873. think as people get more and more
  2874. skeptical of these other
  2875. products they're in a better and better
  2876. position to do that
  2877. one of the antitrust issues is
  2878. do we want a world where our entire well-being as a society
  2879. depends on what one massive corporation worth over a trillion dollars does or doesn't do
  2880. right like we need more openness to try different things and
  2881. we're really at the behest of whether one or two companies apple or google
  2882. does something more radical
  2883. and there has to be some massive
  2884. incentive for them to do something
  2885. that's really going to change
  2886. yeah the way we interface with these
  2887. devices and the way we interface with social media and i don't know what incentive exists
  2888. it's more potent than financial
incentives well and this is where the
government comes in in the same
way that we want to transition long term
from a fossil fuels oriented economy
to something that
changes the kind of pollution levels
you know we have a hugely emitting
society ruining kind of
business model in this attention extractive paradigm and we could long term sort of like with
a progressive tax on that
transition to some other thing the
government could do that right um
  2901. that's not like who do we censor it's
  2902. how do we disincentivize these
  2903. businesses to pay for the
  2904. sort of life support systems of society
  2905. that they've ruined a good example of this
  2906. i think in australia is there um
  2907. i think it's australia
  2908. that's regulated
  2909. that google and facebook have to pay the
  2910. publishers who they're basically hollowing out
  2911. because one of the effects
  2912. we've not talked about
  2913. is the way that google and facebook have
  2914. hollowed out the fourth estate in journalism i mean
because journalism has turned
into this thing where local news websites
can't make any money except by basically producing clickbait so even to the extent that local
newspapers exist they only exist by basically clickbait-ification with even lower and lower paid you know
workers who are just generating content farms right so anyway so that's an example of
  2920. if you force those companies to
pay to revitalize the fourth estate
  2922. and to make sure we have a
  2923. very sustainably funded fourth estate
  2924. that doesn't have to produce this clickbait stuff uh
that's you know another direction yeah that uh
  2926. that's interesting that they have to pay i mean these are the wealthiest
  2927. companies in like the history of humanity right so that's the thing so we
  2928. shouldn't be cautious about how much
  2929. they should have to pay
  2930. except we also don't want to happen on
  2931. the other end right you don't want to
  2932. have a world where
  2933. you know we have roundup making a crazy amount of money from giving everybody cancer and lymphoma from uh
you know all the chemicals right glyphosate and then they pay everybody on the other end
  2935. after a lawsuit of a billion dollars but
  2936. now everyone's got cancer let's actually
  2937. do it in a way
  2938. so we don't want a world where facebook and google profit off of the erosion of our social fabric
  2939. and then they pay us back
how do you quantify how much money
  2941. they have to pay
  2942. to journalism yeah it seems like it's almost a form of socialism or yeah i mean this
is where like
the lead iq example is interesting
  2945. because they were able to
  2946. disincentivize and tax the lead producers
  2947. because they were able to produce some
  2948. results on how much this lowered the
  2949. wage earning potentials of the entire
  2950. population i mean like how much does
  2951. this cost our society we used to say
  2952. free is the most expensive business
  2953. model we've ever created
  2954. because we get the free downgrading of
  2955. our attention spans our mental health
  2956. our kids like our ability to agree with
  2957. each other our capacity to do anything as a democracy like yeah we got all that for free
  2958. wonderful obviously we get lots of
  2959. benefits and i want to
  2960. acknowledge that but that's just not
  2961. sustainable the real question i mean
  2962. right now we're
  2963. we have huge existential problems we
  2964. have a global competition power competition going on
  2965. i think china just passed the gdp of the us
i believe you know if
  2967. if we care about the us having a future in which it can lead the world in in some meaningful and enlightened way
  2968. we have to deal with this problem
  2969. and we have to have a world where digital democracy outcompetes digital authoritarianism
  2970. which is the china model
  2971. and right now that builds more coherence
and is more efficient and doesn't devolve the way that our current system you know does
  2973. i think taiwan estonia and countries like that where
  2974. they are doing digital democracies are
  2975. good examples that we can learn from
  2976. but we're behind right now well china
also has a really fascinating situation with huawei where google is banned on huawei
  2978. so you can't have google applications on
  2979. huawei so now huawei is creating their own operating system and they have their own
  2980. ecosystem now that they're building up
  2981. and that's you know it's it's weird that
  2982. there's only a few different operating systems now i mean there's a very small amount of
  2983. people using linux phones
  2984. then you have a large amount of people
  2985. using android and iphones and if china
  2986. becomes the first to
  2987. adopt their own operating system and then they have even more unchecked rules and
  2988. regulations in regards to like
  2989. the influence they have over their
  2990. people with an operating system that
  2991. they've developed
  2992. and they control and who knows what kind
  2993. of back doors and
  2994. spying tons yeah it's
  2995. it's weird yeah when you see this
  2996. do you like it feels
so futile for me on the outside looking in but you're working on this
how long do you anticipate this is going to be a part of your
life i mean what does it feel like to you um
  3000. i mean it's not easy right um
  3001. in the film ends with this question
  3002. do you think we're gonna get there
  3003. yeah i just say we have to like i mean
  3004. if you care about this going well i wake
  3005. up every day and i ask
  3006. what will it take for this whole thing
  3007. to go well like
  3008. and how do we just orient each of our
  3009. choices as much as possible
  3010. towards this going well we have a whole
  3011. bunch of problems i do
  3012. look a lot at the environmental issues
  3013. the permafrost methane bombs like
  3014. the timelines that we have to deal with
  3015. certain problems are crunching and we
  3016. also have certain dangerous exponential
  3017. technologies that are emerging
  3018. decentralization of you know crispr and
  3019. like there's a lot of existential
  3020. threats i hang out with a lot with the
  3021. sort of existential threats community
  3022. it's going to take it must be a lot of fun it's uh
  3023. there's a lot of psychological
  3024. problems in that community actually
a lot of depression there's
occasionally a suicide as well
  3027. it's it's uh
  3028. you know it's
  3029. it's hard but i i think we each have a
  3030. responsibility when you see this stuff to say what will it take for this to go well
  3031. and i will say that really seeing
  3032. the film impact people the way that it has i i used to feel like oh my god how are
  3033. we ever going to do this no one cares
  3034. like none of people know
right at the very least we now have about
  3036. 40 to 50 million people
  3037. who are at least introduced to the problem
  3038. the question is how do we harness them
  3039. into a collective movement and that's
  3040. what we're trying to do next i mean i
  3041. i'll say also these issues get
  3042. more and more weird over time my
co-founder aza raskin will say that it's
  3044. making reality more and more virtual over time
  3045. because we haven't talked about how as technology advances
  3046. at hacking our weaknesses we start to
  3047. prefer it over the real thing we start
for example there's a recent vc funded company
that i think is worth like over 125 million dollars
  3050. and what they make are virtual influencers
  3051. so these are like virtual people virtual video
  3052. that is more entertaining more interesting
  3053. and that fans like more than real people
  3054. oh boy and it's kind of related to the kind of
  3055. deep fake world right where like people
prefer this to the real thing and sherry turkle um
  3057. you know who's been working at mit
  3058. wrote the book reclaiming conversation
  3059. and alone together she's been talking
  3060. about this forever that
  3061. over time humans will prefer connection to robots and bots
  3062. and the computer generated thing more than the real thing
think about ai generated music it'll start to sweeten our taste buds and give us exactly that
  3064. thing we're looking for better than we will know ourselves just like youtube can give us the
  3065. perfect next video that actually every
  3066. bone in our body will say actually i
  3067. kind of do want to watch that even
  3068. though it's a machine pointed at my
  3069. brain calculating the next thing
  3070. there's an example from microsoft
  3071. writing this chat bot called
xiaoice i can't pronounce it but
  3073. after nine weeks people
preferred that chatbot to their real friends and
10 to 25 percent of their users
  3076. actually said i love you to the chatbot
oh boy and there are several
  3078. who actually said that it convinced them not to commit suicide to have this relationship with
  3079. this chatbot so it's her
  3080. it's her it's the movie exactly which is
  3081. what so all these things are the same right we're veering into a direction where
  3082. technology if it's so good at meeting these underlying paleolithic emotions that we have
  3083. the way out of it is we have to see that
  3084. this is what's going on we have to see
  3085. and reckon with ourselves saying this is
  3086. how i work i have this negativity bias
if i get 99 positive
comments and one negative
  3089. my mind is going to go to the negative
  3090. i don't see that
i see you in the future wearing an overcoat you are literally laurence
  3092. fishburne in the matrix
  3093. trying to tell people to wake up well
  3094. that's there's a line in the social
  3095. dilemma where i say how do you wake up
  3096. from the matrix if you don't know you're in the matrix well that is the issue right and i even
  3097. in the matrix we at least had a shared
  3098. matrix the problem now is that in the
  3099. matrix each of us have our own matrix
  3100. that's the real kicker
  3101. i struggle with the idea that this is all inevitable
  3102. because this is a natural
  3103. course of progression with technology
and that it's sort of figuring out the best way to have us with
as little resistance embed ourselves into its system and that our idea of what we are
with emotions and with our biological
issues that this is just how life is
  3108. and this is how life always should be
  3109. but this is just all we've ever known
  3110. that's all we've ever known einstein
  3111. didn't write into the laws of physics
  3112. that social media has to exist for
  3113. humanity right right we've gotten rid
  3114. again the environmental movement is a
  3115. really interesting example
because we passed all sorts of laws we got rid of lead we've moved away
from you know some of our pesticides um
  3118. you know we're slow on some of these things
  3119. and corporate interests and asymmetric power of large corporations
you know and i want to say markets and capitalism are great
but when you have asymmetric power for predatory systems
that cause harm they're not going to uh terminate themselves
  3123. they have to be bound in by the public by culture by by the state and um
  3124. we just have to point to the
  3125. examples where we've done that
and in this case i think
the problem is how much of our stock market
  3128. is built on the back of like five companies generating a huge amount of wealth
  3129. so this is similar i don't mean to make this example but um
there's a great book by um adam hochschild
  3131. called bury the chains which is about
  3132. the british abolition of slavery
  3133. in which he talks about how for the
british empire like if you think about it when we collectively wake up and
say this is an abhorrent practice that has to end but then at that time in the 1700s and 1800s
  3136. in britain slavery was what powered the
  3137. entire economy it was free labor
  3138. for you know huge percentage of the
  3139. economy so if you say we can't do this anymore we have to stop this
  3140. how do you decouple when your entire economy is based on slavery right
  3141. and the book is actually inspiring
  3142. because it tracks a collective movement that was through
  3143. networked all these different groups the quakers uh
  3144. in the u.s the uh
  3145. people testifying before parliament
  3146. the former slaves who did first-hand accounts
  3147. the graphics and art of all the people had never seen what it looked like on a slave ship
  3148. and so by making the invisible visceral and showing just how abhorrent this stuff was
through a period of about 60 to 70 years the british empire had to drop their gdp by 2 percent every year
for 60 years and was willing to do that to get off of slavery
  3151. now i'm not making a moral equivalent
  3152. i want to be really clear for everybody taking things out of context um
  3153. but just that it's possible for us to do something
  3154. that isn't just in the interest of economic growth and i think that's the real challenge
  3155. that's actually something that should be on the agenda
  3156. which is how do we one of the major tensions is economic growth
  3157. you know being in conflict with dealing
  3158. with some with many of our problems
  3159. whether it's some of the environmental issues or you know with some of the technology
  3160. issues we're talking about right now
  3161. artificial intelligence is something
  3162. that people are terrified of as an
  3163. existential threat they think of it as
  3164. one day you're going to turn something
  3165. on and it's going to be sentient
  3166. it's going to be able to create other
  3167. forms of artificial intelligence that
  3168. are exponentially more powerful than the
  3169. one that we created
  3170. and that will have unleashed this beast
  3171. that we cannot control
what my concern is with all this yeah
that's my concern my concern is that this is a slow acceptance of drowning
yeah it's like a slow we're okay i'm only up to my knees oh it's fine
it's just uh
waist high it could be boiling water exactly exactly it seems like
  3177. this is like humans have to fight back to reclaim our autonomy and free will from
  3178. the machines i mean
  3179. one clear okay neo it's very much
  3180. the matrix and one of my favorite lines
  3181. is actually when the oracle says to neo
  3182. and don't worry about the vase and he
says what vase and he knocks it over
that vase and so it's like she's the ai
  3185. who sees so many moves ahead in the chess board she can say something which will cause
  3186. him to do the thing that verifies the
  3187. thing that she predicted what happened
  3188. yeah that's what ai is doing now except
  3189. it's pointed at our nervous system
  3190. and figuring out the perfect thing to dangle in front of our dopamine system
  3191. and get the thing to happen which
  3192. instead of knocking off the vases to be
  3193. outraged at the other political side and
  3194. be fully certain that you're right even
  3195. though it's just a machine that's
  3196. calculating [ __ ] that's going to make you you know do the thing when you're
  3197. concerned about this how much time do
  3198. you spend thinking about simulation theory
  3199. the simulation yeah
the idea that if not currently one day there will be a simulation that's indiscernible
  3201. yeah from regular reality and it seems we're on that path
  3202. i don't know if you mess around with vr at all but well
  3203. this is the point about you know the virtual chat bots
outcompeting exactly the technology you know
  3205. i mean that's what's happening is that reality is getting more and more virtual right
  3206. because we interact with a virtual news system
  3207. that's all this sort of click-bait economy outrage machine
  3208. that's already a virtual political environment that then translates into real world action then
  3209. becomes real and that's the weird feedback go back to 1990 whatever it was
  3210. when the internet became mainstream or at least started becoming mainstream
  3211. and then the small amount of time that
  3212. it took the 20 plus years to get to where we are now
  3213. and then think what what about the virtual world and once this becomes something that's
  3214. has the same sort of rate of growth that
  3215. the internet has experienced or that
  3216. we've experienced through the internet
  3217. i mean we're looking at like 20 years
  3218. from now being unrecognizable
  3219. yeah we're looking at i mean it's it
  3220. almost seems like that
  3221. is what life does the same way bees create bee hives you know a caterpillar doesn't
know what the [ __ ] is going on when it
  3223. gets into that cocoon but it's becoming a butterfly we seem to be a thing
  3224. that creates newer and better
  3225. objects correct more effective but we have to realize ai is not conscious and won't be
  3226. conscious the way we are and so
  3227. many people think that but is consciousness essential i think so to us
i don't know are we the only
ones who have it no i don't know that
there might be more yeah
things that have consciousness but
is it essential i mean to
the extent that choice
exists it would exist through some kind of consciousness and is choice essential
  3235. it's essential to us as we know it like
  3236. as life as we know it
but my worry is that we're inessential that like we're thinking now like
  3238. single-celled organisms being like hey i
  3239. don't want to
  3240. gang up with a bunch of other people and
  3241. become an object that can walk
  3242. i like being a single cell organism this
  3243. is a lot of fun i mean i hear you saying
you know are we a bootloader for the ai that then runs that's elon's perspective i mean i think
  3245. this is a really dangerous way to think
  3246. i mean we have to
yeah so is that then dangerous for us yeah i mean what if the next version of life is
the next version run by machines
  3249. that have no values that don't care that
  3250. don't have choice and are just
  3251. maximizing for things that were
  3252. programmed in by our little miniature brains anyway but they don't cry they don't commit
  3253. suicide but then consciousness and life dies that could be the future i think this is
  3254. the last chance to try to snap out of that and is it important in the eyes of the
  3255. universe that we do that i don't know it
  3256. feels important how does it feel to you
  3257. it feels important but i
  3258. i'm i'm a monkey you know the monkey's like i'm staying in this tree man you guys
  3259. are out of your [ __ ] mind i mean this
  3260. is the weird paradox of being human is
  3261. that again we have these lower level
  3262. emotions we care about social approval
  3263. we can't not care at the same time like i said there's this weird proposition here
  3264. we're the only species that if this were
  3265. to happen to us
  3266. we would have the self-awareness to even
  3267. know that it was happening
  3268. right like we can consent like this
  3269. two-hour interview we can conceptualize
  3270. that this this thing has happened to us
  3271. right that we have built this matrix
  3272. this external object which has like ai
  3273. and supercomputers and voodoo doll
  3274. versions of each of us
  3275. and it has perfectly figured out how to
  3276. predictably move each of us in this matrix
  3277. let me propose this to you
  3278. we are what we are now human beings homo sapiens in 2020.
  3279. we are this thing that uh
  3280. if you believe in evolution i'm
  3281. pretty sure you do
  3282. we've evolved over the course of
  3283. millions of years to become who we are right now should we stop right here are we done no
  3284. right we should keep it evolving
  3285. what does it look like if we go ahead
  3286. just forget about social media
  3287. what would you like us to be
  3288. in a thousand years or a hundred thousand years
  3289. or five hundred thousand years you certainly wouldn't want us
  3290. to be what we are right now right
  3291. no one would no i mean i think this is
  3292. what visions of star trek and things
  3293. like that were trying to ask right like hey let's imagine humans do make it and we
  3294. become the most enlightened we can be
  3295. and we actually somehow make peace with
  3296. these other you know alien tribes
  3297. and we figure out you know space travel
  3298. and all of that i mean actually a good heuristic that i think
  3299. people can ask is on an enlightened
  3300. planet where we did figure this out
  3301. what would that have looked like isn't
  3302. it always weird that in those movies
  3303. people are just people but they're
  3304. in some weird future but they haven't
  3305. really changed that much
  3306. right i mean and which is to say that
  3307. the fundamental way that we work is
  3308. just unchanging but there are such
  3309. things as more wise societies more
  3310. sustainable societies more peaceful or harmonious societies ultimately biologically
  3311. we have to evolve as well but our version of like the best version
  3312. is probably the gray aliens
  3313. right maybe so that's the ultimate
  3314. future i mean we're going to get into
  3315. gene editing and becoming more
  3316. perfect in the sense of you know that but uh
  3317. we're going to start optimizing for
  3318. what are the outcomes that we value i
  3319. think the question is how do we actually
  3320. come up with brand new values
  3321. that are wiser than we've ever thought
  3322. of before that actually are able to
  3323. transcend the win-lose games that lead
  3324. to omni-lose-lose where everyone loses
  3325. if we keep playing the win-lose game at
  3326. greater and greater scales
  3327. i like you have a vested interest in the
  3328. biological existence of human beings
  3329. i think people are pretty cool yeah i
  3330. love being around them i enjoy talking to you today my fear is that
  3331. we're a model t right you know and
  3332. there's no sense in making those
  3333. [ __ ] things anymore the brakes are terrible they smell like [ __ ] when you drive them
  3334. they don't go very fast
  3335. we need a better version you know the funny thing is god there's some quote by someone i
  3336. think like i wish i could remember it
  3337. it's something about how much would be solved
  3338. if we were at peace with ourselves
  3339. like if we were able to just be okay with nothing
  3340. like just being okay with living and breathing
  3341. i don't mean to be you know playing the woo new age card i just genuinely mean
  3342. how much of our lives is just running away from
  3343. you know anxiety and discomfort and aversion
  3344. it is but you know in that sense
  3345. some of the most satisfied and happy people
  3346. are people that live a subsistence living
  3347. that have these subsistence existences in the middle of nowhere
  3348. just chopping trees and catching fish right
  3349. and more connection probably yeah more authentic than something else i think that
  3350. probably resonates biologically too
  3351. because of the history of human
  3352. beings living like that is just
  3353. so much longer and greater totally and i
  3354. think that those are more sustainable societies we can never obtain peace in the outer
  3355. world until we make
  3356. peace with ourselves dalai lama yeah but
  3357. i don't buy that guy
  3358. you know that guy he's uh
  3359. he's an interesting case i was thinking of a
  3360. slightly different quote but actually
  3361. there's one quote that i would love to
  3362. if it's possible one of the reasons why
  3363. i don't buy him he's just
  3364. chosen they just chose that guy yeah
  3365. also he doesn't have sex wait how um
  3366. yeah how much can you be enjoying
  3367. life if that's not a party come on bro you wear the same outfit every day get the
  3368. [ __ ] out of here with your orange robes
  3369. can i there's a really um
  3370. important quote that i think would
  3371. really be good to share
  3372. uh it's from the book have you read
  3373. amusing ourselves to death by neil postman no from 1982 no um
  3374. so especially when we get into big tech and
  3375. we talk about censorship a lot and we talk about orwell um
  3376. he has this really wonderful opening to this book it was written in 1982 it literally
  3377. predicts everything that's going on now
  3378. i frankly think that
  3379. i'm adding nothing and it's really just
  3380. neil postman called it all in 1982.
  3381. uh he had this great opening it says
  3382. um let's see we're all looking out for
  3383. you know 1984 when the year came and the prophecy didn't
  3384. thoughtful americans sang softly in
  3385. praise of themselves the roots of
  3386. liberal democracy had held
  3387. this is like we made it through 1984 wherever else the terror had happened
  3388. we at least had not been visited by orwellian nightmares but we had
  3389. forgotten that alongside orwell's dark vision there was another slightly older
  3390. slightly less well-known
  3391. equally chilling vision of aldous
  3392. huxley's brave new world
  3393. contrary to common belief even among the educated
  3394. huxley and orwell did not prophesy the same thing
  3395. orwell warns that we will be overcome by an externally imposed oppression but
  3396. in huxley's vision
  3397. no big brother is required to deprive
  3398. people of their autonomy maturity or history as he saw it
  3399. people will come to love their oppression to adore the technologies that undo
  3400. their capacities to think
  3401. what orwell feared were those who would ban books what huxley feared was that there would
  3402. be no reason to ban a book
  3403. for there would be no one who wanted to
  3404. read one orwell feared those who would
  3405. deprive us of information
  3406. huxley feared those who would give us so
  3407. much that we would be reduced to passivity and egoism
  3408. orwell feared the truth would be concealed from us
  3409. huxley feared the truth would be drowned in a sea of irrelevance
  3410. orwell feared we would become a captive
  3411. culture but huxley feared we would become a trivial culture preoccupied with some
  3412. equivalent of the feelies
  3413. and the orgy porgy and the centrifugal
  3414. bumblepuppy i don't know what that means
  3415. as huxley remarked in brave new world
  3416. revisited the civil libertarians
  3417. and rationalists who are ever on the alert to oppose tyranny failed to take into account
  3418. man's almost infinite appetite for distractions lastly in 1984 orwell added
  3419. people are controlled by inflicting pain
  3420. in brave new world they are controlled by inflicting pleasure in short orwell
  3421. feared that what we fear will ruin us
  3422. huxley feared that what we desire will ruin us
  3423. holy [ __ ]
  3424. isn't that good
  3425. that's the best way to end this god damn
  3426. but again if we can become aware that this is what's happened
  3427. we're the only species with the capacity
  3428. to see that our own psychology our own emotions
  3429. our own paleolithic evolutionary system has been hijacked
  3430. i like that your optimism is probably the only way to live in a meat suit body and keep going
  3431. otherwise it certainly helps yeah it certainly helps
  3432. thank you very much for being here man
  3433. i really enjoy this even though i'm really depressed now
  3434. i really don't want you to be depressed
  3435. i really hope people you know
  3436. i'm kidding we're not
  3437. we really want to build a movement and uh
  3438. you know we're just
  3439. i wish i could give people more
  3440. resources we do have a podcast um
  3441. called your undivided attention
  3442. and we're trying to build a movement at humanetech.com
  3443. but well listen any new revelations or new developments
  3444. that you have i'd be more than happy to have you on again
  3445. we'll talk about them and send them to me
  3446. and i'll put them on social media and whatever you need
  3447. awesome i'm here to help
  3448. awesome man
  3449. great great to be here resist yeah
  3450. grizzly together humanity resist humanity
  3451. we're in this together
  3452. thank you tristan
  3453. i really really appreciate it
  3454. [Music]
  3455.  