SH and TH Podcast 71
Sam Harris 0:00
Today I'm speaking with Tristan Harris. Tristan has been called, by the Atlantic magazine, the closest thing that Silicon Valley has to a conscience. He was a design ethicist at Google, and then left the company to start a foundation called Time Well Spent, a movement whose purpose is to align technology with our deepest interests. Tristan was recently profiled on 60 Minutes; that happened just last week. He's worked at various companies, Apple, Wikia, Apture, and Google, and he graduated from Stanford with a degree in computer science, having focused on human-computer interaction. We talk a lot about the ethics of human persuasion, and about what information technology is doing to us and allowing us to do to ourselves. This is an area which, frankly, I haven't thought much about, so listening to Tristan was a bit of an education. But needless to say, this is an area that is not going away. We are all going to have to think much more about this in the years ahead. And it was great to talk to Tristan; I have since discovered that I was mispronouncing his name. Apologies, Tristan. Sometimes having a person merely say his name in your presence proves insufficient, such are the caprices of the human brain. But however you pronounce his name, Tristan has a lot of wisdom to share, and he's a very nice guy as well. So, meet Tristan Harris.

I am here with Tristan Harris. Tristan, thanks for coming on the podcast. (Thanks for having me, Sam.) So we were set up by some mutual friends; we have a few friends and acquaintances in common. And you are in town doing an interview for 60 Minutes, right? I confess I was not aware of your work. I think I'd seen the Atlantic article that came out on you recently, but I had only seen it; I don't think I had read it. But what you're doing is fascinating and incredibly timely, given our dependence on this technology. This conversation we're going to have, I'm imagining, is going to be something like a field guide to what technology is doing to the human mind, and I think we'll talk about how we can decide to move intentionally in that space of possibilities, in a way that's healthier for all of us. This is obviously something you're focused on. But to bring everyone up to speed, because even I was not up to speed until just a few days ago: what is your background? You've had some very interesting job titles at Google, perhaps among other places; one was resident product philosopher and design ethicist at Google. So how did Tristan Harris get to be Tristan Harris, and what are you doing now?

Tristan Harris 2:52
Well, first, thanks for having me. Really, it's an honor to be here. I'm a big fan of this podcast.

So, yeah, my role at Google, that was an interesting name: design ethicist and product philosopher. I was really interested in, essentially, how a small number of people in the tech industry influence how a billion people think every day, without those people even knowing it. If you think about your role as a designer: how do you ethically steer a billion people's thoughts, framings, cognitive frames, behavioral choices, basically the schedule of people's lives? Because so much of what happens on a screen, even though people feel as if they're making their own choices, will be determined by the design choices of the people at Apple and Google and Facebook. So we'll talk a lot more about that, I'm sure. Prior to that, when I was a kid, I was a magician very early on, and so I was really interested in the limits of people's minds that they themselves don't see. That's what magic is all about: there really is a kind of band of attention, or short-term memory, or the ways people construct meaning or causality, that you can exploit as a magician. That fascinated me as a kid, and I did a few little magic shows. Then, flash forward, I was at Stanford doing computer science, but I also studied in a lab called the Persuasive Technology Lab with BJ Fogg, which basically taught engineering students this kind of library of persuasive techniques and habit-formation techniques in order to build more engaging products; basically, different ways of taking advantage of people's cognitive biases, so that people fill out email forms, so that people come back to the product, so that people register, fill out their LinkedIn profiles, tag each other in photos. And I became aware, when I was at Stanford doing all this, that there was no conversation about the ethics of persuasion. And just to ground how impactful that cohort was: in my year in that class at the Persuasive Technology Lab, my project partners, and very close friends of mine, were the founders of Instagram. Many other alumni of that year, 2006, went on to join the executive ranks of companies we know, LinkedIn and Facebook, when they were just getting started. And, again, never before in history have such a small number of people with this toolset influenced how a billion people think every day by explicitly using these persuasive techniques. And so, at Google, I just got very interested in how we do that.

So you were studying computer science at Stanford? Originally computer science, but I dabbled a ton in linguistics, and actually symbolic systems, because, you know, we were at Stanford.

Sam Harris 5:34
Yeah, that was a great major at Stanford. I was in the philosophy department, and symbolic systems is where philosophy and computer science overlap. I think Reid Hoffman was one of the first symbolic systems majors ever. So the connection between persuasion and magic is interesting. There's an inordinate number of magicians and fans of magic in the skeptical community as well, perhaps somewhat due to the influence of James Randi. But magic is really the ultimate act of persuasion: you're persuading people of the impossible. So you see a significant overlap between the kinds of hacks of people's attention that magicians rely on and our new persuasive technology?

Tristan Harris 6:16
Yeah. I think if you just abstract away what persuasion is, it's the ability to do things to people's minds such that they themselves won't even see how that process took place. And I think that parallels your work in a big way, in that beliefs do things too: having a belief shapes your subsequent experience. In fact, in magic there are principles where you want to start bending reality and creating these aha moments early, so that you can do a little hypnosis trick later, for example; people will be more likely to believe, having gone through a few things that have bent their reality into being more superstitious or more open. And there are just so many ways of doing this that most people don't really recognize. I wrote an article called "How Technology Hijacks Your Mind" that ended up going viral, to about a million people, and it goes through a bunch of these different techniques. But yeah, it's not something people mostly think about.

Sam Harris 7:08
You also said, in the setup for this interview, that you have an interest in cults. What's that about, and to what degree have you looked at cults?

Tristan Harris 7:17
Well, I find cults fascinating because they're kind of like vertically integrated persuasive environments. Instead of just persuading someone's behavior, as in the design of a supermarket or the design of technology products, you are designing the social relationships, the power dynamic between a person standing in front of an audience; you can control many more of the variables. And so I've done a little bit of undercover investigation of some of these things. Not actually joining a cult, but physically and mentally showing up. And many of these things are... none of these cults would call themselves cults. Many of them are simply workshops, sort of New Agey-style workshops. But you start seeing these parallels in the dynamics.

Sam Harris 8:07
Do you want to name any names? Do I know these?

Tristan Harris 8:10
I might prefer not to at the moment. We'll see if we get there.

Sam Harris 8:12
Okay. You have a former girlfriend who's still in one?

Tristan Harris 8:15
No. But one of the interesting things is the way that people I met in those cults, who eventually left, later talked about their experience, and the confusion that you face. The confusion you face when you've gotten many benefits from a cult: you've actually deprogrammed, let's say, early childhood traumas, or identities you didn't know you were holding, or different ways of seeing reality that they helped you get away from. You get these incredible benefits and you feel more free. But then you also realize that it was all part of this larger persuasive game, to get you to spend a lot of money on classes or courses or these kinds of things. And so there's the confusion that I think people experience in knowing that they got all these benefits, but then also felt manipulated, and they don't know, in the mind's natural black-and-white thinking, how to reconcile those two facts. I actually think there's something parallel there with technology, because in my previous work on this, a lot of people expect, if you're criticizing how technology is designed, that you're saying something like, "You're saying Facebook's bad. But look, I get all these benefits from Facebook; look at all these great things it does for me." It's because people's minds can't hold on to both truths: that we do derive lots of value from Facebook, and that there are many manipulative design techniques across all these products that are not really on your team, there to help you live your life. And that distinction becomes very interesting when you start getting into what ethical persuasion is.

Sam Harris 9:47
Yeah, it is a bit of a paradox, because you can get tremendous benefit from things that are either not well intentioned or just objectively bad for you, or not optimal. The ultimate case is all these people who survived cancer and say cancer was the most important thing that ever happened to them. So a train wreck can be good for you on some level, because your response to it can be good for you; you can become stronger in all kinds of ways, even by being mistreated by people. But it seems to me you can always argue that there's probably a better way to get those gains.

Tristan Harris 10:26
This connects, frankly, with your work on the moral landscape. If you're a designer at Facebook or at Google, because of how frequently people turn to their phone, you're essentially scheduling these little blocks of people's time. If I immediately notify you of every Snapchat message (and Snapchat is one of the most abusive, most manipulative technology products out there), seeing a message from a friend, in that moment, urgently, will cause a lot of people to swipe over and not just see that message, but then get sucked into all the other stuff they have waiting for you. And that's all very deliberate. So if you think of it as, let's say you're a designer at Google and you want to be ethical: you're steering people toward these different timelines. You're steering people toward Schedule A, in which these events will happen, or Schedule B, in which these other events will happen. Back to your point: should I schedule something that you might find really challenging or difficult, but that later you'll feel was incredibly valuable? Do I take into account the peak-end effect, where people will remember the peak of an experience and its end? Do I take a lot of their time or a little of their time? Should the goal be to minimize how much time people spend on the screen? What is the value of screen time? What are people doing that's lasting and fulfilling, and when are you, as a designer, steering people toward choices that are more shallow or empty?

Sam Harris 11:45
So you're clearly concerned about time, as we all should be, as the one non-renewable resource. It's the one thing we can't possibly get back, no matter what other resources we marshal. And it's clear that our technology, especially smartphone-based technology, is just a kind of bottomless sink of time and attention. I guess there's the other element we're going to want to talk about, which is the consequence of bad information or superficial information, and what it's doing to our minds; not just the fake news phenomenon, topical as it is, but the quality of what we're paying attention to, which is crucial. But the automatism of this process, the addictiveness of this process, the fact that we're being hooked and we're not aware of how calculated this intrusion into our lives is...

Tristan Harris 12:48
This is the thing that's missing. The most common narrative, and we hear it all the time, is that technology is neutral, and it's just up to us to choose how we want to use it; if people fall for fake news, or if people start wasting all their time, that's just people's responsibility. What this misses is that, because of the attention economy, basically every business, whether it's a meditation app, or the New York Times, or Facebook, or Netflix, or YouTube, is competing for attention. The way you win is by getting someone's attention, by getting it again tomorrow, and by extending it for as long as possible. So it becomes this arms race for attention. And the best way to get attention is to know how people's minds work, so that you can push some buttons and get them not just to come, but to stay as long as possible. So there are design techniques, like making a product more like a slot machine, with a variable schedule of rewards. For example, I know you use Twitter. When you land on Twitter, notice that there's that extra, variable time delay, between one and three seconds, before that little number shows up.

Tristan Harris 13:45
When you return and the page loads, there's this extra delay.

Sam Harris 13:50
I haven't noticed that.

Tristan Harris 13:50
Yeah, hold your breath, and then there's a little number that shows up for the notifications. That delay makes it like a slot machine. When you load the page, you're literally pulling a lever, and you're waiting. You don't know how many there are going to be. Is it going to be 500, because of some big tweetstorm? Or is it going to be...

Sam Harris 14:06
Doesn't it always say 99?

Tristan Harris 14:08
Well, not everyone is Sam Harris.

Sam Harris 14:11
No, no, but I mean, isn't that the maximum? It never says 500, right?

Tristan Harris 14:15
You know, I don't know, because, again, I don't have as many followers as you.

Sam Harris 14:19
Well, I can attest that mine is always at 99, so it's no longer salient to me.

Tristan Harris 14:24
Right, which actually speaks to how addictive variable rewards work. The point is it has to be a variable reward: the idea is that I push a button or pull a lever, and sometimes I get two, and sometimes I get nothing, and sometimes I get twenty. And this is the same thing with email.
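
[Editor's note: a minimal Python sketch of the slot-machine mechanic described above. The one-to-three-second delay comes from the conversation; the notification counts and their weights are invented for illustration, not Twitter's actual parameters.]

    import random
    import time

    def check_notifications():
        """Variable delay plus variable reward: the two ingredients
        of the slot-machine pattern described above."""
        time.sleep(random.uniform(1.0, 3.0))  # the 1-3 second "lever pull" suspense
        # Most pulls pay out little or nothing; occasionally there's a jackpot.
        return random.choices([0, 2, 20, 99], weights=[40, 35, 20, 5])[0]

    for _ in range(5):
        print("You have", check_notifications(), "notifications")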

Sam Harris 14:39
Well, let's talk about what the interest of the company is, because I think most people are only dimly aware. I mean, they're certainly aware that these companies make money off of ads; very often they sell your data. So your attention is their resource. Take an example: something like Twitter seemingly can't figure out how to make money yet, but Facebook doesn't have that problem. Let's take the clearest case. What is Facebook's interest in you as a user?

Tristan Harris 15:10
Well, obviously there are many sources of revenue. But whether it's data or anything else, it all comes down to advertising and time, because of the link that more of your attention, more of your time, equals more money. They have an infinite appetite for more of your time.

Sam Harris 15:30
So, time on your newsfeed. This is literally what they want?

Tristan Harris 15:34
That's right, and this is literally how the metrics and the dashboards look. They measure the current distribution of time on site. Time on site and seven-day actives are the currency of the tech industry. Really, the only other industry that measures users that way is drug dealers: you count the number of active users who log in every single day. That, combined with time on site, are the key metrics, and the whole goal is to maximize time on site. So Netflix wants to maximize how much time you spend there; YouTube wants to maximize time on site. They recently celebrated people watching more than a billion hours a month, and that was a goal. Not because anyone is evil or wants to steal people's time, but because of the business model of advertising, there is simply no limit on how much attention they would like from people.
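
[Editor's note: to make "time on site" and "seven-day actives" concrete, here is a toy Python computation over a hypothetical session log. The users, timestamps, and minutes are invented.]

    from datetime import datetime, timedelta

    # (user_id, session_start, minutes)
    sessions = [
        ("alice", datetime(2017, 4, 10, 8, 0), 25),
        ("alice", datetime(2017, 4, 13, 21, 0), 40),
        ("bob",   datetime(2017, 4, 1, 9, 30), 5),
    ]

    def time_on_site(log):
        """Total minutes per user: the number the dashboards maximize."""
        totals = {}
        for user, _, minutes in log:
            totals[user] = totals.get(user, 0) + minutes
        return totals

    def seven_day_actives(log, as_of):
        """Users with at least one session in the trailing seven days."""
        cutoff = as_of - timedelta(days=7)
        return {user for user, start, _ in log if start >= cutoff}

    print(time_on_site(sessions))                                    # {'alice': 65, 'bob': 5}
    print(seven_day_actives(sessions, as_of=datetime(2017, 4, 14)))  # {'alice'}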

Sam Harris 16:23
Well, they must be concerned about the rate at which you click through to their ads, are they not?

Tristan Harris 16:29
They can be concerned about that, and ad rates are depreciating, but they can make money just by showing you the thing, and there's some link between showing it to you and you clicking. You can imagine, with more and more targeted ads, that you're seeing things that are profitable, and there's always going to be someone willing to pay for that space. But as this starts to saturate, because we only have so much time, then just to hold on to your position in the attention economy, what do you do? You have to ratchet up how persuasive you are. So here's a concrete example: if you're YouTube, you add autoplay for the next video.

Sam Harris 17:03
If you use YouTube, you've seen that show up in the last year. I find that incredibly annoying. I wonder what percentage of people find it annoying. Is it conceivable that it's still a good business decision for them, even if 99 percent of people hate that feature?

Tristan Harris 17:19
Well, it's the whole exit-voice-loyalty question: if people don't find it so annoying that they're going to stop using YouTube, and of course there's no way they're going to stop using YouTube, then of course it can still be a good business decision. And that's what these companies often hide behind, this notion that if you don't like it, you can stop using the product. But while they're saying that, they have teams of thousands of engineers whose job is to deploy the techniques I learned at the Persuasive Technology Lab to get you to spend as much time as possible. But just with that one example: let's say YouTube adds autoplay for the next video, and let's say that increases your average watch time on the site every day by 5 percent.

Tristan Harris 17:55
So now they're eating up 5 percent more of this limited attention market. So now Facebook is sitting there saying, well, shoot, we can't let this go. We've got to add autoplaying videos to our newsfeed. So instead of waiting for you to scroll and then click play on the video, they automatically play the video. They didn't always used to do that.

Sam Harris 18:16
Yes, another feature I hate.

Tristan Harris 18:17
Yep. And the reason they're doing that, which people miss, is that it's not by accident. The web and all of these tools will continue to evolve to be more engaging and to take more time, because that is the business model. And so you end up in this arms race for, essentially, who's a better magician, who's a better persuader, who knows these back doors in people's minds, as a way of getting people to spend more time.

Sam Harris 18:38
Do you see this as intrinsically linked to the advertising model of revenue, or would this also be a problem if it were a subscription model?

Tristan Harris 18:50
It's a problem in both cases, but advertising exacerbates the problem. So you're actually right that, for example, Netflix also maximizes time on site. What I heard from someone through some back channels was the reason they feel they have to: for example, they have this automatic countdown into watching the next episode.

Sam Harris 19:10
Right. They don't have to do that. Why are they doing that? Frankly, I like that feature. Try to figure that out, psychologists among you.

Tristan Harris 19:16
Well, and this is where it gets down to what ethical persuasion is. Because that's one persuasive transaction, where they are persuading you to watch the next video. But in that case, you're happy about it.

Sam Harris 19:26
I guess the reason why I'm happy about it there is that, at least nine times out of ten, it is by definition something I want to watch, because it's in the same series as the one I'm already watching. Whereas YouTube is showing me just some random thing that it thinks is analogous to the thing I just watched. And then there's the feature I've seen on embeds in news stories, like on the Atlantic or Vanity Fair: the moment you bring the video into the frame of the browser, it starts playing. I just find that annoying, especially if your goal is to read the text rather than watch the video.

Tristan Harris 20:00
But again, because of the game theory of it, when one news website evolves that strategy (you can think of these as organisms that are mutating new persuasive strategies that either work or don't at holding on to people's attention), so you have some neutral playing field, and one site mutates this strategy of autoplaying the video when you land, let's say at CNN. Now the other news websites, if they want to compete with that, assuming CNN has enough market share that it makes a difference, have to start trending in that direction. And this is why the Internet has moved from being this neutral-feeling resource, where you're just accessing things, to feeling like there's this gravitational, wormhole-ish sucking quality that pulls you in. And this is what I think is so important. You asked how much of this is due to advertising and how much is due to the hyper-competition for attention; it's both. One thing we have to do is decouple the link between how much attention we get from you and how much money we make. We actually did the same thing in energy markets, where it used to be that energy companies made more money the more energy you used. So they had an incentive: please leave the lights on, please leave the faucet running; we're making so much more money that way. But of course that was a perverse incentive, and so a regulatory commission got established that basically, it was called decoupling, decoupled the link between how much energy you use and how much money they make.
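
[Editor's note: a toy Python model of the arms-race dynamic just described. The sites, persuasion levels, and best-response rule are invented for illustration; the point is that once one site escalates, everyone matches it, shares end up roughly where they started, and only the escalation remains.]

    sites = {"CNN": 0, "NYT": 0, "WaPo": 0}   # persuasion level per site

    def shares(levels):
        """Attention share grows with relative persuasion level."""
        total = sum(2 ** v for v in levels.values())
        return {s: round(2 ** v / total, 2) for s, v in levels.items()}

    sites["CNN"] += 1            # one site mutates the autoplay strategy
    print(shares(sites))         # {'CNN': 0.5, 'NYT': 0.25, 'WaPo': 0.25}

    for s in sites:              # rivals match the most aggressive player
        sites[s] = max(sites.values())
    print(shares(sites))         # equal thirds again, but everyone escalated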

Sam Harris 21:28
Well, and there are some ads online where I can't even figure out how they're working or why they're there. There are these horrible ads at the bottom of even the most reputable websites. Like the Atlantic: you'll have these ads, usually framed with "From around the web," and it'll be an ad like, "You won't believe what these child celebrities look like today."

Tristan Harris 21:51
Taboola and Outbrain. There's a whole market of companies that specifically provide these related links at the bottom of websites.

Sam Harris 22:00
But I mean, they're so tawdry and awful. You can go from reading literally the best long-form journalism to one garish ad after another. And the thing that mystifies me is, when you click through to these things, I can't see that you'd ever land at a product that anyone who was reading that article would conceivably buy. You're just going down a sinkhole into something horrible. Everything looks like a scam.

Tristan Harris 22:30
It all comes down to money. I actually know a lot about this, because the way I arrived at Google was that they bought our little startup company for our talent. We didn't do what this market of websites did, but we were almost being pushed by the publishers who used our technology to do it. One of the reasons I'm so sensitive to this time-on-site stuff is because I had a little company called Apture, which provided little in-depth, background pieces of information without making you leave news websites. So you'd be on the Economist, and it would mention Sam Harris, and you'd say, who's Sam Harris? You'd highlight it, and it would give you a sort of multimedia background, and you could interactively explore and go deeper. And the reason we could sell this, the reason the Economist wanted it on their website, is because it increased time on site. So I was left in this dilemma, where the thing I got up to do in the morning as a founder was, let's help people understand things and learn about things, but the actual metric was, is this increasing time on site or not? And publishers would push us to either increase revenue or increase time on site. So the reason that the Economist and all these other reputable websites have these buckets of links at the bottom is because they actually make more money from Taboola and Outbrain and a few others.

Sam Harris 23:43
Now, time on site seems a somewhat insidious standard, except if you imagine that the content is intrinsically good. I'm someone who's slowly but surely building a meditation app, so in that case, time on my app would be time spent practicing meditation. And insofar as I think that's an intrinsically good thing for someone to be doing, anything I do in the design of the app so as to make that more attractive to do, and in the best case irresistible to do, seems fine. I mean, the truth is, I would like an app in my life that got me to do something that is occasionally hard to do, but that I know is worth doing and good for me to do, rather than wasting my time on Twitter: something like meditation, something like exercise, eating more wisely. I don't know how that can be measured in terms of time, but there are certain kinds of manipulations of my mind, speaking personally, that I would happily sign up for. So how do you think about that?

Tristan Harris 24:48
Absolutely; this is a great example. Because the attention economy is constantly ratcheting up these persuasive tricks, the price of entry for, say, a new meditation app is that you're going to have to find a way to sweeten that front door, so that it competes with the other front doors that are on someone's screen at the moment they wake up in the morning. And of course, as much as many of us don't like to admit it, the Twitter and the Facebook and the email ones are just so compelling first thing in the morning, even if that's not what we'd like to be doing. And because all of these different apps are neutrally competing on the same playing field for morning attention, not for some specific thing like helping Sam wake up best in the morning, many meditation apps I personally know start realizing: oh, shoot, Facebook and Twitter are notifying people first thing in the morning to get their attention, so if we're going to stand a chance of getting in the game, we have to start notifying people too. And then everyone starts amping up the arms race, and it's this classic race to the bottom. You don't end up with a screen you'd want to wake up to in the morning at all. It's not good for anybody. But it all came from this need to race to get attention first. So wouldn't we want to change the structure of what you're competing for, so it's not just attention at all costs?

Sam Harris 26:09
So you have called for what I think you've termed a Hippocratic oath for software designers: first, do no harm. What do you think designers should be doing differently now?

Tristan Harris 26:22
Well, I think of it less as the Hippocratic oath; that's the thing that got captured in the Atlantic article. A different way to think about it is that the attention economy is like a city. Essentially, Apple and Google and Facebook are the urban planners of this city that a billion people live inside of. And that city is designed entirely for commerce; it's maximizing attention at all costs. That was fine when we first got started, but now this is the city people live inside of. I mean, people wake up with their phones, they go to sleep with them, they check them 150 times a day; that's actually a real figure. So now what we'd want to do is organize that city the way Jane Jacobs did when she created the livable cities movement. She said there are things that make a great city great, things that make a city livable. She pointed out eyes on the street; she was talking about Greenwich Village in New York. These are things that make a neighborhood feel different, feel more homey, livable, safe: values people have about what makes for good urban planning. There is no such set of values for designing this city for attention. So far it's been the Wild West: let each app compete on the same playing field to get attention at all costs. So when you ask me what app designers should do, I'm saying it's actually a deeper thing. That's like asking what the casinos, who are all building stuff in the city, should do differently. A casino is there, and the only way for it to even be there is to do all the same manipulative stuff that the other casinos are doing; it will go out of business if it doesn't. So the better question to ask is, how would we reorganize the city, by talking to the urban planners, by talking to Apple, Google, and Facebook, to change the basic design? Let's say there are zones, and one of the zones in the attention-economy city is the morning-habits zone. Now you get things competing over the best way to help people wake up in the morning, which could also include the phone being off: the option of the phone being off for a period of time, with your friends told that you're not up until ten in the morning, or whatever, could be one of the things competing for the morning part of your life, the life zone there. And that would be a better strategy than trying to get meditation-app designers to take a Hippocratic oath to be more responsible, when the whole game is just not set up for them to succeed.

Sam Harris 28:56
Well, to come back to that question, because it's of personal interest to me, since I do want to design this app in a way that seems ethically impeccable. If the thing you're directing people to is something you think is intrinsically good, then, forgetting about all the competition for mindshare that you spoke about, it's just hard to do anyway; people are reluctant to do it. That's why I think an app would be valuable, and I think the existing apps are valuable. So if you think that any time on the app is time well spent, which I don't think Facebook can say, and I don't think Twitter can say, but I think Headspace can say, then, whether or not that's true (someone else can decide), I think they can feel, without any sense of personal hypocrisy, that if you're using their app more, that's good for you, because it's intrinsically good to meditate. And I'm sure any exercise app, or the Health app, or whatever it is, feels the same way, and they're probably right. Take that case, and then we can move on to a case where everyone's motives are more mercenary, and more time on the app just means more money for the company. When time on the app is intrinsically good, why not try to get people's attention any way you can?

Tristan Harris 30:22
Well, so this is where the question of metrics is really important. Because in an ideal world, the thing that each app measures would align with the thing that each person using the app actually wants. So "time well spent" would mean, in the case of a meditation app, asking the user, not just having the app decide. The user would say: okay, in my life, what would be time well spent for me in the morning, waking up? And then imagine that whatever the answer to that question is becomes the ranking in the App Store, rewarding the apps that are best at exactly that. So that, again, is a systemic answer: the systems, the app stores and the ranking functions that run a search, Google search or the Facebook newsfeed, would want to sort things by what helps people the most, not by what gets the most time.

Sam Harris 31:13
And the measure of that would be the user's own evaluation, some questionnaire-based rating: "Yes, this is working for me"?

Tristan Harris 31:23
Yeah. And in fact we've done some initial work on this. There's an app called Moment on iOS that tracks how much time you spend in different apps; you send it a screenshot of your battery page on the iPhone, and it captures all that data. They partnered with Time Well Spent to ask people: which apps do you find are most time well spent, which apps are you most happy about, once you can finally see all the time you spent in them, and which apps do you most regret? And we have the data back: people regret the time they spend in Facebook, Instagram, Snapchat, and WeChat the most. So far, at the top of the rankings for time well spent are things like MyFitnessPal and podcast apps, and a bunch of other ones that I forget.
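
[Editor's note: a small Python sketch of the ranking Tristan describes, sorting apps by the share of users who say the time was well spent, rather than by raw time captured. The survey rows are invented.]

    # (app, minutes_tracked, felt_well_spent)
    survey = [
        ("Facebook", 95, False), ("Facebook", 40, False), ("Facebook", 30, True),
        ("MyFitnessPal", 10, True), ("MyFitnessPal", 15, True),
        ("Podcasts", 60, True), ("Snapchat", 45, False),
    ]

    def time_well_spent_rank(rows):
        """Sort apps by fraction of 'yes, well spent' answers, descending."""
        stats = {}
        for app, _, happy in rows:
            yes, total = stats.get(app, (0, 0))
            stats[app] = (yes + happy, total + 1)
        return sorted(stats, key=lambda a: stats[a][0] / stats[a][1], reverse=True)

    print(time_well_spent_rank(survey))
    # ['MyFitnessPal', 'Podcasts', 'Facebook', 'Snapchat']: regret sinks to the bottom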

Sam Harris 32:09
The irony is that being ranked first in regret is probably as accurate a measure as any of the success of your app.

Tristan Harris 32:20
Yeah, exactly. And this is what the economy isn't doing: aligning things with what we actually want. I mean, if you think about it, everything is a choice architecture, and you're sitting there as a human being picking from the menu. Currently the menu sorts things by what gets the most downloads, the most sales, what gets most talked about: the things that most manipulate your mind. If you assume marketing is as persuasive as it is, then at a bigger level, the whole economy reflects what's best at manipulating people's psychology, not what's actually best in terms of delivered benefit in people's lives. So if you think about this as a deeper systemic question, how would you want the economy to work? You'd want it to rank things so that the easiest thing to reach for would be the thing people found to be most time well spent in their lives, for whatever category of life choice they're making at that moment. It's a matter of making choices easy or hard, because you can't escape it: in every single moment there is a choice menu, and some choices are easy to make and some choices are hard to make.

Sam Harris 33:20
Except we run into a problem that behavioral economists know quite well, something Daniel Kahneman has spoken a lot about: there's a difference between the experiencing self, moment to moment, and the remembered self. When you're giving someone a questionnaire asking whether their time on all these apps and websites was well spent, you are talking to the remembered self. Danny and I once argued about this, how to reconcile these two different testimonies, but at minimum you can say that they're reliably different. If you were experience-sampling people along the way, say for every hundred minutes on Facebook, every ten minutes asking, "How happy are you right now?", you would get one measure. If, at the end of the day, you asked them how good a use of their time it was to be on Facebook for a hundred minutes, you would get a different measure. Sometimes they're the same, but they're very often different. And the question is who to trust: which data are you going to use to assess whether people are spending their time well?
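
[Editor's note: a tiny Python illustration of the divergence Sam describes, contrasting the experiencing self with the remembered self for one hypothetical hundred-minute session. All numbers are invented.]

    # "How happy are you right now?" sampled every 10 minutes (0-10 scale)
    in_the_moment = [7, 7, 8, 6, 7, 7, 6, 7, 7, 6]
    # End-of-day answer: "How good a use of your time was that?" (0-10)
    remembered = 3

    experienced_avg = sum(in_the_moment) / len(in_the_moment)
    print(f"Experiencing self: {experienced_avg:.1f}/10, remembered self: {remembered}/10")
    # The two measures can reliably diverge; a ranking has to pick one to trust.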

Tristan Harris 34:24
Well, the problem right now is that all of the metrics relate only to the present-self version. Everything is measuring what gets clicked the most or shared the most. So, back to fake news: just because something is shared the most doesn't mean it's the most true. Just because something gets clicked the most doesn't mean it's the best. Just because something is talked about the most doesn't mean it's real or true. The second that Facebook took away its human editorial team for Facebook Trends, they fired that whole team and left an AI picking what the most popular news stories are, within twenty-four hours it was gamed, and the top story was a false story about Megyn Kelly and Fox News. So right now we're delegating all of these topics to AI, and the AI essentially has a pair of eyes, or sensors, that can only pick up these impulsive, in-the-moment signals. It doesn't have a way of being in the loop, in conversation with our more reflective selves; it can only talk to our present, in-the-moment selves. So you can imagine some kind of weird dystopian future where the entire world is only listening to your present, in-the-moment feelings and thoughts, which are easily obtainable by persuasion.

Sam Harris 35:37
Although there is still the question of how to reconcile the difference between being pleasantly engaged, moment by moment, in an activity, at the end of which you will say, "I kind of regret spending my time that way." There are certain things that are captivating, where you're hooked for a reason, whether it's a video game, or whether you're eating french fries or popcorn or something that is just perfectly salted so that you can't stop. You're bingeing on something because in that moment it feels good, and then, retrospectively, you very often regret that use of time.

Tristan Harris 36:17
Well, one frame for this is the shallow-versus-deep distinction. What you're getting at is the sense that something can be full but empty, and we don't really have words in the English language for this, or something can be full and fulfilling.

Sam Harris 36:29
Things can be very engaging or pleasurable, but not fulfilling. Yes.

Tristan Harris 36:36
And, even more specifically, regretted. And then there's the set of choices you can make for a timeline, if you're, again, scheduling someone else's life for them, as the people at Google and Facebook do every day. You can schedule a choice that is full and fulfilling. Now, does that mean we should never put choices on the menu that are full but that you regret? Should we never do that, from Google or Facebook? That's one frame. But let me flip it over and make it even more philosophically interesting. Let's say that in the future, YouTube is even better at knowing exactly what every bone in your body has been meaning to watch: the professor's lecture you've been told is the best lecture in the world, or whatever every bone in your body tells you would, in fact, be full and fulfilling for you. And let's imagine this future DeepMind-powered version of YouTube actually puts those perfect choices next on the menu, so now it's autoplaying the perfect next thing, the thing that is also full and fulfilling. There's still something about the way the screen is steering your choices that is not in alignment with the life you want to live, because it's not in alignment with the time dimension. It's blowing past boundaries; you have to bring your own boundaries.

Sam Harris 37:48
Right. You have to resist the perfect.

Tristan Harris 37:52
You have to resist the perfect. And by the way, because of this arms race, that is where we're trending. People don't understand this about the attention economy: because of this need to maximize attention, that's where YouTube will be in the future.

Tristan Harris 38:07
And so wouldn't you instead say: I want Netflix's goal to be to optimize for whatever is time well spent for me? Which might be, let's say, watching one really good movie a week that I've been meaning to watch. And that's because I'm defining it; it's in conversation with me about what I, reflectively, would say is time well spent, not just trying to maximize as much as possible. And for that relationship to work, the economy would have to be an economy of loyal relationships, meaning I would have to recognize, as a consumer, that even though I only watch one movie a week, that's enough to justify my relationship with Netflix. Because they found, in this case, that if they don't maximize time on site, people actually end up canceling their subscription over time. And so that's why they're still trapped in the same race.

Sam Harris 38:52
Right. And what concerns you most in this space? Is it social media more than anything else, or is everything that's grabbing attention engaged in the same arms race, and of equal concern to you?

Tristan Harris 39:07
As a systems person, it's really the system, the attention economy, the race for attention itself, that concerns me. For one thing, people in the tech industry very often appear to me to be blind to what that race costs us. Take the fake news stuff, but instead of calling it fake news, let's call it fake sensationalism. The newsfeed is trying to figure out what people click the most, and if one news site evolves the strategy of outrage, well, outrage is a way better persuasive strategy for getting you to click. And so the newsfeed, without any person at the top of it, any captain of the ship, saying, "Oh, I know what will be really good for people: outrage," or "That'll get us more attention," just discovers this as an invisible trait that starts showing up in the AI, and it starts steering people toward news stories that generate outrage. And that's literally where the newsfeed has gone over the last few months. This is where we are.

Sam Harris 40:06
True or fake, it's an outrage machine.

Tristan Harris 40:08
And then the question is, how much outrage is there? If you think about the world, is there any lack of things that would generate outrage? There's an infinite supply of news today that would generate outrage, and there was even ten years ago. And if we'd had the perfect AI ten years ago, it could also have delivered you a day full of outrage.

Sam Harris 40:29
That's a funny title.

Tristan Harris 40:31
How easy would that be to market: a day full of outrage? Nobody thinks they want that. But we're all acting like that's exactly what we want.

Tristan Harris 40:39
Well, and I think this is where the language gets interesting, because when we talk about what we want, we talk about what we click. But in the moment right before you click, and I'm kind of a meditator too, I notice that what's going on for me right before I click is not, as you know from your work on free will... how much is that a conscious choice? What's really going on, phenomenologically, in that moment right before the click?

Sam Harris 41:00
None of your conscious choices are conscious choices, right? You're the last to know why you're doing the thing you're about to do, and you're very often misinformed about it. We can set up experiments where you will reliably do the thing for reasons that, when you're forced to articulate them, you are completely wrong about.

Tristan Harris 41:18
Absolutely. And moreover, people who are about to click on something don't realize there are a thousand people on the other side of the screen whose job it was to get them to click on that, because that's what Facebook and Snapchat and YouTube are all for. So it's not even a neutral moment.

Sam Harris 41:34
Do you think that fact alone would change people's behavior, if you could make it transparent? It just seems it would be instructive for most people to see the full stream of causes that engineered that moment for them.

Tristan Harris 41:49
Well, I've got some friends in San Francisco who talk about this. People don't realize it, especially when you start applying some kind of normativity and saying, you know, the newsfeed is really not good; we need to rank it a different way. And they say: whoa, whoa, whoa, who are you to say what's good for people? And I always say: this is status quo bias. People are thinking that somehow the current thing we have is set up to be best for people. It's not; it's best for engagement. If you were to give it a name: Google has PageRank; Facebook has engagement rank. Now, let's take it all the way to the end. Let's say you could switch modes as a user, and you could actually switch Facebook to addiction rank. Facebook, I'm sure, could deploy a version of the newsfeed where they just tweak the variables to show people the things that will addict them the most. Or we could have outrage rank, which shows you the things that will outrage you the most. Or we could have NPR rank, which shows you the most "boring," long comment threads, where your whole newsfeed is these long, deep, in-depth threaded conversations. Or you could have Bill O'Reilly mode, something I know you care about, these sort of attack-dog-style comment threads where people are yelling at each other.

Sam Harris 42:59
You can imagine that these feeds could be ranked in any one of these ways. Actually, there's this form of choice already implemented on Flickr, where, when you look for images, you can choose "relevant" or "interesting." So you could have that same drop-down menu for any of these other media.
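
[Editor's note: a minimal Python sketch of the pluggable ranking modes being described: one scoring function per mode over the same content. The posts and their scores are invented.]

    posts = [
        {"text": "Long thoughtful thread", "clicks": 50,  "outrage": 1, "depth": 9},
        {"text": "Celebrity meltdown",     "clicks": 900, "outrage": 8, "depth": 1},
        {"text": "Local news explainer",   "clicks": 120, "outrage": 2, "depth": 6},
    ]

    modes = {
        "engagement_rank": lambda p: p["clicks"],   # today's default
        "outrage_rank":    lambda p: p["outrage"],  # what the feed drifts toward
        "npr_rank":        lambda p: p["depth"],    # long, in-depth threads
    }

    def feed(mode):
        """Same content, different drop-down choice, very different feed."""
        return [p["text"] for p in sorted(posts, key=modes[mode], reverse=True)]

    print(feed("engagement_rank"))  # ['Celebrity meltdown', 'Local news explainer', ...]
    print(feed("npr_rank"))         # ['Long thoughtful thread', 'Local news explainer', ...]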

Tristan Harris 43:16
And this is your point, that people don't see transparently what the goals of the designers who put that choice in front of you are. So the first thing would be to reveal that there is a goal; it's not a neutral product, not just something for you to use. You can, obviously, with enough effort, use Facebook for all sorts of things. But the point is that the default compass, the North Star on the GPS that is Facebook steering your life, is not steering your life toward "help me have the dinner party that I want to have," or "help me get together with my friends on Tuesday," or "help me make sure I'm not feeling lonely on a Tuesday night."

Sam Harris 43:53
There seems to me a necessary kind of paternalism here that we just have to accept, because it seems true that we're living in a world where no one, or virtually no one, would consciously choose the outrage tab. Basically: "I want to be as outraged as possible today; show me everything in my newsfeed that's going to piss me off." Nor the addiction tab, nor the superficial-uses-of-attention tab: just cat videos, just give me the Kardashians all day long, and I'll regret it later. No one would choose that, and yet we are effectively choosing it by virtue of what proves to be clickable in the attention economy.

Tristan Harris 44:33
In the service of the greater goal of advertising. Again, that goal wasn't arrived at by accident. In fact, what's so interesting when you talk to the people who make the products is that, of course, there's no one there who says, "I want to steer you away from the life choices that you want to make." No one thinks that way. The narrative at Facebook, of course, is that we're helping make the world more open and connected, and it's hard to argue with that, because it does do that. The problem is that that's not the thing they're measuring every day. And also, what would that mean? What would be the values, or the measurable outcomes, or the teleological frames you'd be choosing for? Instead, imagine a time-well-spent rank, which would basically be a life rank: what do you want most in your life? Instead of having those thousand engineers working to get me to scroll or click on the next thing, those thousand engineers would be working to help me schedule the next moments of my time, in ways that I would find take me closer to the life I want to live.

Sam Harris 45:29
You keep using this phrase, "time well spent." That is both a website and a foundation you started?

Tristan Harris 45:36
Yeah, it's a nonprofit movement I started when I left Google and my work as a design ethicist there. I realized that there was this fundamental conflict of interest, that the attention economy's maximization of time spent was just never going to go away. And so if you want to change that core currency of success, you have to go outside. It's kind of like what the organic food movement did. Before organic, food was just a race to the bottom to provide the cheapest lettuce on the shelf.

Sam Harris 46:05
And it's whoever can put out the cheapest lettuce.

Tristan Harris 46:07
And then someone figures out, we can get even cheaper lettuce by using this pesticide, and no one's discovered the pesticide yet.

Sam Harris 46:14
So that farmer starts to win.

Tristan Harris 46:16
And it isn't until we have a new standard, a movement that says we want organic food, we want a different kind of lettuce, that things change. So you need that, or some kind of intervention like it, to gradually change the success metrics. And we called it "time well spent" just because it encapsulates the distinction with "time spent," which is the metric today.

Sam Harris 46:35
So, to use that analogy, the differentiator there is a person's willingness to pay for something that's harder to grow, harder to provide; they're willing to pay more than they'd pay for the cheapest possible lettuce. What differentiates time stolen or squandered from time well spent in the marketplace? Just imagine five years from now we have solved all of these problems; our technology is as good for us as possible. How did we get there? What is it we changed?

Tristan Harris 47:12
Good question. There are different questions about how you get there, whether it's regulation, or enlightened founders of big technology companies suddenly choosing to rank everything differently. But you can imagine a world where Apple and Google recognize that the current designs of smartphones invisibly maximize, let's say, the time people spend in all the apps, and where, instead of maximizing for that, instead of just wanting every app developer to be successful at all costs, they say: we're going to reorganize home screens and notifications and app stores to rank success in terms of how time-well-spent people found these things to be for that part of their life. So news media are ranked on, whatever you'd call it, epistemic credibility, or truth-seeking (again, there's a bunch of different values that could be in that category); morning habits and meditation apps are ranked by whatever people find helps them wake up best in the morning.
  330.  
  331. Unknown Speaker 48:14
  332. Things would be ranked by the thing that matters for that category.
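To make that concrete, here is a minimal sketch—in Python, with entirely hypothetical apps, ratings, and field names—of ranking a category by user-reported benefit instead of raw engagement:

# A minimal sketch: rank apps in a category by user-reported
# "was this time well spent?" ratings instead of raw engagement.
# All data, names, and numbers here are hypothetical.
apps = [
    {"name": "MeditateNow", "hours_used": 2.0, "avg_tws_rating": 4.6},
    {"name": "InfiniteScroll", "hours_used": 30.0, "avg_tws_rating": 1.9},
    {"name": "MorningPages", "hours_used": 5.0, "avg_tws_rating": 4.1},
]

def engagement_rank(app):
    # The status-quo metric: more time spent ranks higher.
    return app["hours_used"]

def time_well_spent_rank(app):
    # The alternative: how well spent users say that time was.
    return app["avg_tws_rating"]

print([a["name"] for a in sorted(apps, key=engagement_rank, reverse=True)])
print([a["name"] for a in sorted(apps, key=time_well_spent_rank, reverse=True)])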
  333.  
  334. Unknown Speaker 48:17
335. Although it seems to me you're asking a lot of people to keep telling you whether things are working for them. Because with a click, they've already told you—without taking any time, without being aware of having answered a questionnaire. The way they're using their attention is giving everyone the information about whether this stuff is hackable in the way the status quo now demands. In your world, we need people to assess the effects of their uses of attention and report back. What do you actually picture? How do people give the information back to the system, so these rankings reflect time well spent?
  336.  
  337. Unknown Speaker 48:54
338. So here's a concrete example—and I totally hear this concern, by the way: do you want a world where everything is constantly asking you for a rating, every single day, for all of these things? The answer is no. Obviously the question, from a time well spent angle—where time is the finite resource to manage—is what amount of rating we would want from people. But let's make it concrete. Let's say your phone, once a week, shows you a reflection of the biggest surface areas of your phone's footprint in your life. Say it shows you this mirror: hey, Sam, this is how I see you waking up in the morning. Here's what morning looks like for you—I have the data; it's right here on the phone, stored locally, not on a server. I notice that when you wake up, you usually spend about 20 minutes surfing Twitter, you're in your email for about 15 minutes, and then you get stuck in the Apple News app for about 30 minutes. So you've got about an hour and fifteen minutes before you've really done anything. And notice you haven't moved from your supine position yet.
  345.  
  346. Unknown Speaker 50:03
347. Let's even call that the "bed zone" part of the reflection—you haven't even gotten up. Which, by the way, back to persuasion: if you can persuade someone just to lie down while they're doing all this stuff, you've inherently changed the choice architecture. They're more likely to stay in the inertia of lying down than if you had stood them up—we know that from Brian Wansink's work in physical space. Okay, so we have this moment where you see your reflection: you've got this hour and fifteen minutes, approximately split between these different apps. And the phone asks: is that time well spent for you in the morning? What would be time well spent for you? And you say: actually, not these things at all. It says: great, what would it look like instead? And you say: hey, I want to meditate with my friends two days a week. And it says: great, who else meditates? It would link directly into your meditation app and basically say: when you and the other person wake up around the same time, do you want to opt into a mode where, if they also woke up around then, you can swipe over and you're in a meditation experience with them? And so now your meditation app wins on the basis of how well it actually helps people in the morning—not how well it can hijack you, throw a notification in, create the bottomless bowl, or use these other persuasive techniques. That's an example where the phone asks you for reflection at an infrequent enough cadence that it doesn't feel taxing—say it's once a month, even—and then it reorganizes the choice architecture of the home screen to be most aligned with empowering you to make those choices. But again, now you're talking at the level of the city. This is not something any one app does; it's what Apple can do. Frankly, you need Apple and Google—or whoever makes the platform, whether it's virtual reality or augmented reality or an earpiece—to make these changes. In general, there are going to be these choice architectures, and we need the platform makers, the urban planners, to be thinking about it this way.
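A minimal sketch of that weekly "mirror," assuming a hypothetical local log of app sessions—the apps, minutes, and one-hour budget are invented for illustration:

from collections import defaultdict

# Hypothetical local log of (app, minutes) sessions from this morning.
morning_sessions = [
    ("Twitter", 20), ("Mail", 15), ("Apple News", 30), ("Twitter", 10),
]

def morning_reflection(sessions, budget_minutes=60):
    """Summarize morning screen time and ask whether it was well spent."""
    totals = defaultdict(int)
    for app, minutes in sessions:
        totals[app] += minutes
    total = sum(totals.values())
    print(f"This morning you spent {total} minutes on your phone:")
    for app, minutes in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"  {app}: {minutes} min")
    if total > budget_minutes:
        print("Is that time well spent for you? What would it look like instead?")

morning_reflection(morning_sessions)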
  348.  
  349. Unknown Speaker 51:54
350. Yeah, I want to talk about virtual reality and other technologies. But to get my bearings here: do you think mostly in terms of mobile, or is the web just as much of a concern?
  351.  
  352. Unknown Speaker 52:03
353. I'm concerned with both, because the total time people spend on a screen every day is enormous, and it increasingly takes place on mobile. The thing about mobile is that it's the most ubiquitous, so it's the thing that affects everyone, no matter what socioeconomic background you have.
  357.  
  358. Unknown Speaker 52:18
359. Do you know how the time is split between mobile and the web?
  360.  
  361. Unknown Speaker 52:21
362. Not off the top of my head. But obviously there are knowledge workers who spend a third of their day in email at a desktop. There's a wide range. The question is just, for each of these core screens, how well is the choice architecture aligned with what people want?
  363.  
  364. Unknown Speaker 52:35
365. So now—what do you think is coming? What do you think virtual reality will do to us? And do we get new concerns there, or just the same concerns, but more pressing?
  369.  
  370. Unknown Speaker 52:47
371. Well, one interesting thing about virtual reality: if you were to rate a medium in terms of how persuasive it is, there's an upper bound on how persuasive your mobile phone can be, right? It can't convince you of a whole new belief system. But virtual reality actually can.
  372.  
  373. Unknown Speaker 53:05
  374. It's been shown that,
  375.  
  376. Unknown Speaker 53:07
377. you know—looking at this whole problem through the lens of persuasive technology, you're always concerned with the dimensions of persuasion: can I persuade people's beliefs? Can I persuade their attitudes? Can I persuade their identity, to think of themselves differently? And one thing that's been shown—this guy Jeremy Bailenson at Stanford has run a bunch of experiments on this—is that you can have someone in virtual reality cut down a tree, feel the haptic feedback of the saw jogging back and forth in their hands, and that actually changes their relationship to paper—to wasting paper. You can do other things where people are embodied as their opposite gender and experience something like harassment, or embodied as a different race or ethnic background, and they have some embodied experience of being looked at a certain way in that mode—and it changes their feelings about what the related policy issues might be, later, in the real world. So VR is really interesting, because it's the most persuasive medium that we have. The problem, though, is that people tend to think about the future of technology as an uncertain thing—we don't know what's going to happen; it's like an open, grassy field. And it's not that, because the attention economy will still create this race for who's better at seducing your attention, keeping it, holding on to it—which means that things that are more like porn, or more like the Candy Crushes of the VR realm, will probably outcompete other things there. The other stuff will exist, but it will be niche. So what I would say is: this is the opportunity, now, before those things come to be. If we had a different philosophical lens on what rankings we would want the virtual-reality app store to have, we'd want to rank things in terms of what persuades people in a positive way. What is time well spent for them?
  378.  
  379. Unknown Speaker 55:01
380. It's interesting, because on some level what we're talking about isn't new at all. It's the way people were wasting time two thousand years ago, presumably. When you look at how people like the Buddha talked about the use of a human life, and the obstacles—in this case to practicing meditation enough so as to have a good experience doing it, or the obstacles to becoming a truly ethical person—it was the same dynamic: doing things that you will later regret, things that you will discover were not as gratifying as they seemed, and were never going to be as gratifying as they seemed. And if you could have had a wiser, top-level view of the situation, you would have agreed to cancel some of those opportunities to squander your time in advance, right? Because however captivating they are, it's just more soft drinks or candy. But one could push back and say: Sam, who are you—who is anyone running Google—to say what things we should put on the menu and what stuff we should leave out? Because—
  381.  
  382. Unknown Speaker 56:02
383. —we have to put some default choices on the menu, right?
  384.  
  385. Unknown Speaker 56:04
386. Right. So it's this ancient problem, except what we have now are technologies that can not only exploit our bugs—if not to the maximum degree, then certainly to a new degree—so that we can be manipulated by others to waste our time; we can actually design technologies that will change not only what we wind up doing but what we want to do. Yep. The fundamental question is: what sort of person do you want to be, what sort of life do you want to live, and what sort of life will you want to have lived when you're on your deathbed, looking back? How much regret will you have? We know the answer to the question "How much regret do you want to have?"—everyone's answer is none, or as little as possible. But that won't change the fact that they'll still go for the donut, or go for the exhilarating experience. Cass Sunstein has a crazy line about this: just because there are these things people regret later doesn't mean life should be, in his words, long, dry, and chocolate-free—
  387.  
  388. Unknown Speaker 57:03
389. Right. There is a value there—some of our peak experiences in life are these sorts of impulsive moments. But there is a question of groundedness, of frequency: do you want that all the time? Do you want the default choices to always be set that way?
  390.  
  391. Unknown Speaker 57:16
392. You really get this raising kids, because as a parent you're constantly in the position of doling out empty pleasures, or even unhealthy pleasures, to your kids—as sparingly as is commensurate with your philosophy on these things. So, whatever it is—ice cream: do you have ice cream every day? No. But ice cream is a treat, and it's fun, and you don't want to live a life without ice cream, as far as I can tell. So you are, in a conservative way, exploiting your child's really bottomless capacity for joy around certain things which shouldn't, in the end, be the focus of life. You don't want a kid who, when he or she becomes an adult, has no way to console herself or himself but to go for a tub of ice cream—you don't want to engineer that problem for them. But you still want a life with ice cream. And everything is like that on some level.
  393.  
  394. Unknown Speaker 58:10
395. And the problem is, the new ice cream could be just the number of likes that someone has. So many teenagers have been convinced that their self-worth, their popularity, is the number of likes they have.
  396.  
  397. Unknown Speaker 58:21
398. You know, or Minecraft. I have an eight-year-old daughter who's now obsessed with Minecraft, and I can't even figure out whether Minecraft is good, neutral, or bad for her—people have strong opinions on that; tell me what you think about Minecraft. It's captivating to a degree that worries me. If I said, listen, you can just do as much Minecraft as you want, she would disappear into that virtual space and never come out again. Is she on Snapchat? No—nothing like that; we will be late adopters of all of that. But yeah, I can see the concern. Then you have the social aspect, and social media seems to me to bring at least two levels of concern that don't exist in those other areas, where the worst-case scenario—for an individual, siloed app or website—is that you're just misusing your time and will regret it. With social media—and this again connects to the fake-news problem—you are very likely consuming misinformation that is manipulating you, which is bad not only for you but for society. And then you have this other level: your sense of self-worth being leveraged, and in many cases destroyed, at an early age, in ways that can be difficult to recover from, by your interactions with friends and strangers.
  399.  
  400. Unknown Speaker 59:47
401. This is where it relates to cults, right? Because cults partially manipulate you by orchestrating your social environment as well. Just like the Facebook ranking we were talking about before, you can imagine a jealousy rank—a ranking of the newsfeed, or of Instagram, optimized to make you the most jealous of everybody else, or to feel the worst about yourself. That's just a mode you could create. And, you know, I've written about this: we have not just this list of individual cognitive biases; we also have these social psychological biases—we really care about others' approval. So think of the moment when people post a new profile photo on Facebook. That's a moment when you're putting your whole self-identity on the line: here's my new photo. Knowing this—and I don't know if Facebook does this—I would say it's the easiest opportunity to exploit people's sense of self-worth: what I'm going to do is time-delay how often I notify you of new likes. Over a period of three days, I'll keep showing the photo to other friends strategically, because then they'll keep liking it, and you—caring about who's seen and who's liked your new photo—will keep falling back into the loop, right? That would be one way of hijacking people's attention by controlling their sense of identity. And Snapchat—I don't know how much you follow Snapchat, but they have this feature called Snapstreaks, which I think is really manipulative. Do you know it?
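To show the shape of the time-delay idea described above—purely illustrative, with invented names and delays, and not a claim about what Facebook actually does—a minimal sketch might look like this:

import random

# Hypothetical: instead of notifying the poster of all likes at once,
# spread the notifications over three days to maximize return visits.
def schedule_like_notifications(likes, days=3):
    """Assign each incoming like a delivery day, drip-feeding them out."""
    schedule = {day: [] for day in range(days)}
    for like in likes:
        schedule[random.randrange(days)].append(like)
    return schedule

likes = [f"friend_{i}" for i in range(12)]
for day, batch in sorted(schedule_like_notifications(likes).items()):
    # Each batch is a fresh pull back into the app.
    print(f"Day {day + 1}: notify about {len(batch)} new likes")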
  402.  
  403. Unknown Speaker 1:01:12
404. No, I've never used Snapchat.
  405.  
  406. Unknown Speaker 1:01:15
407. I mean, I don't either. But they added this thing called Snapstreaks, which shows you the number of days in a row that you have exchanged messages with someone. And people actively—yeah, it's like in a meditation app, where you'd have the same—
  408.  
  409. Unknown Speaker 1:01:29
410. Right—but I see the implications here, where—well, there's a reciprocity issue when you're communicating with someone.
  411.  
  412. Unknown Speaker 1:01:36
413. And the thing is that, as Snapchat, you can manufacture reciprocity. You can know that a particular person over there is vulnerable to needing to reciprocate. Let's say you have a person whose psychology, specifically, is that they have to say thank you back when someone says thank you to them—you can just make sure they see thank-yous everywhere. Facebook, frankly, does this with the happy-birthday thing, right? It can orchestrate someone into saying happy birthday to someone else, so that the other person has to come back into Facebook and respond. And this is actually how LinkedIn works: they say, we're going to suggest to Sam, right after our conversation, that he add Tristan on LinkedIn, because they know I'll feel compelled to come back. And again, this is like a cult: you don't just exploit individual biases; you can orchestrate people's whole social psychology without them knowing.
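A minimal sketch of that kind of manufactured-reciprocity prompt, with invented names and events—this is not a description of LinkedIn's actual system:

# Hypothetical: after an interaction is detected, prompt one party to
# send a connection request, relying on the other party's felt
# obligation to reciprocate to pull them back into the product.
recent_interactions = [("sam", "tristan"), ("alice", "bob")]

def reciprocity_prompts(interactions):
    prompts = []
    for a, b in interactions:
        # Prompting A is the setup; the notification to B does the real
        # work, since an open request is socially uncomfortable to ignore.
        prompts.append(f"Suggest to {a}: add {b} to your network?")
        prompts.append(f"Notify {b}: {a} wants to connect (reciprocity pull)")
    return prompts

for p in reciprocity_prompts(recent_interactions):
    print(p)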
  414.  
  415. Unknown Speaker 1:02:24
416. So we've spoken a little about what the designers of the city should do—whether they will do any of these things, and over what time frame, is certainly an open question. What should our listeners do, in light of the fact that nothing will get done today, to help them marshal their attention in a way they will not regret?
  417.  
  418. Unknown Speaker 1:02:47
419. Well, I try not to get into the productivity-hacks game—though I do know a lot about that space. I think the biggest thing is to change, culturally, the perception that technology is neutral. When we're using this stuff, we have to recognize that there are a thousand engineers on the other side of the screen whose goal, in designing what I'm looking at right now, was not to empower me to make the choices about my time that I want to make, but to get me to spend time on the screen. Knowing that, you'll be able to spot some of these techniques more easily, because you can start to become aware of what you're actually being steered to do in any given moment. Now, that's not a pleasant way to live, and that's why I think we actually need a new conversation about persuasion: we don't want to sit in a world that's going to get better and better at persuading us, forced to notice all these steering mechanisms in our lives and feel taxed and vigilant all the time. The ideal would be to deploy conscious energy only for the choices that matter, and not be forced to keep steering away from the donuts. You can imagine an adversarial persuader who puts donuts right next to your bedside table, right when you wake up. You could say, what's the big deal? You could simply choose not to eat them. But of course we often just do, and that would be disempowering—we have to expend some amount of conscious energy. This is the Roy Baumeister work on willpower: there's some amount of energy it takes to resist this stuff when it shows up. And we don't want a world where, as we're navigating to Facebook or anything else, we have to constantly say: okay, shit, they're going to try to get me to do something other than the one task I came for—looking up a contact, or finding out where that event is tonight. So the whole idea behind a time well spent world is that things are designed to be aligned with you. The biggest thing about ethical persuasion is that the goals of the persuader are aligned with the goals of the persuaded.
  423.  
  424. Unknown Speaker 1:04:48
425. Persuasion is in most cases neutral, and we want to be persuaded to do things which we will feel constituted time well spent, right? So if there are ten things on the menu that I really want to get done, want to immerse myself in—there's no possibility that I will regret doing any of them—and two of them are readily captivating, which is to say there's basically no friction impeding my paying attention to them, but the other eight are, on some level, a matter of my eating my vegetables as opposed to the ice cream: I want to be persuaded to do those things. Right, right. So we're not trying to get rid of persuasion; we want some wisdom in the system. We want to be able to tell the system what it should be persuading us to do.
  426.  
  427. Unknown Speaker 1:05:37
428. Right—let's imagine we're playing the AI game here, and we're training an AI to be a hyper-intelligent persuader. Imagine you even give the AI an encyclopedia of every magician and cult technique in the book, so we literally have an AI that can persuade you to do anything—a thought experiment. We would then really care about what it means for that AI to persuade you ethically, or for the good, right? You could say, well, look: it's persuading you to click on this thing, and you keep clicking on it, so you keep reinforcing it—that must be good. But clearly there's something missing in that signal; the clicks aren't enough. Yeah. In the AI community they talk about human-in-the-loop computing, where they keep a human in the loop of, say, the self-driving car. This is essentially reflective-self-in-the-loop persuasion. Just as I said with the phone example: it cares about what is time well spent for you in the morning, it reflects that back to you and asks how it's going—and it only does that for the things that are meaningfully important in your life, where it actually matters to do that reflection. Part of ethical persuasion, I think, is actually caring, and helping the other person reflect on whether they're getting what they came for.
  429.  
  430. Unknown Speaker 1:06:54
431. It's interesting that there is a disconnect between what people will say they want and what they actually want, and it runs through all of this. Your analogy to self-driving cars reminded me of the fact that if you ask people—if the car is driving down the street and it's going to hit a group of schoolchildren, or it can drive off a cliff—speaking generically, they think the car should probably drive off the cliff. But nobody wants to be in that car. People have to get better at choosing from the menu, even in their most reflective moments.
  432.  
  433. Unknown Speaker 1:07:29
434. Well, people have to have values. And to your point earlier about children: I think the dangerous thing—and this happened with consumerism—is when the businesses' values became our values. Which is successful advertising—
  435.  
  436. Unknown Speaker 1:07:45
  437. that is successful advertising.
  438.  
  439. Unknown Speaker 1:07:47
440. They generated the desire—now you want the thing they got you to want, right? Again, what's so bad about that if, on reflection, you feel happy about it? That's one thing. But there's still another thing, which is when you realize that someone manipulated you into doing that, into wanting that, and you weren't even aware it was happening. People sometimes have a different point of view about whether they feel good about that. And this is the cult thing, right? "I was manipulated into an experience that then made me more free on the other end, and I feel really good about that"—but then, in retrospect: "Oh, they were manipulating me that entire time." Suddenly people feel differently. And in cult deprogramming, showing people how they were manipulated is one of the most powerful ways to help them get out of the programming they were under.
  441.  
  442. Unknown Speaker 1:08:37
443. I guess, on some level, transparency of intention is crucial there, insofar as that relationship with other people is concerned. If someone's trying to get you to do something and you understand their motives—you understand, one, that they're actually trying to get you to do something, and two, you know why, and their stated why is the actual why—well, then there's no problem, right? It's a completely consensual situation. You're not going to feel manipulated. But you could feel pressured; you could feel a lot of things. You could have a very hard-charging performance coach who is trying to get you to change, and it could be uncomfortable—
  444.  
  445. Unknown Speaker 1:09:18
446. —but they could yell at you. They could know that you're traumatized by a particular form of confrontation, and so they use that form of confrontation. That might be harsh, right? But you're happy on the other end—
  447.  
  448. Unknown Speaker 1:09:27
449. Yeah—and you understand what you signed up for, right? What gets interesting, and seemingly demeaning, is when there's a mismatch between what you think is going on and what is actually going on in the other person's head. And it doesn't even have to be personalized: if it's just an algorithm—a black box that no one understands, but that's designed to maximize clicks—that's one thing. But then you have people who are, you know, twirling their mustaches. Have you followed the Cambridge Analytica-type stories about political advertising and automated political ads?
  450.  
  451. Unknown Speaker 1:09:59
452. A little bit—just in terms of, you know, what's happened with the Trump phenomenon.
  456.  
  457. Unknown Speaker 1:10:04
458. Yeah. So there's a lot of controversy over whether Cambridge Analytica specifically was as effective as it was claimed to be in the election. But it really doesn't matter. What we're talking about is that there are, metaphorically, things that are very good at using personalized persuasive targeting to persuade you. The famous example in their deck was: if you're the kind of person who values tradition and authority and old-school values, and you're male, it would show you a sunset picture of a grandfather and a boy with a gun, saying, you know, this is just like the homeland, when we used to go hunting. And if instead it's, say, a libertarian woman in Kansas who really values the Second Amendment, they'll show a thing about Obama trying to grab your gun—
  459.  
  460. Unknown Speaker 1:10:59
  461. or something.
  462.  
  463. Unknown Speaker 1:11:00
464. And knowing what I would know about your specific psychology, I could persuade you toward a particular attitude or belief. Now imagine that that exists and is very effective, and that Facebook, a hundred million times a second, runs this auction: it's got Sam's eyeball, and it's basically asking, who's going to pay me the most to persuade Sam? It doesn't care whether your intention is nefarious, or you want to help him, or you're trying to sell him some shoes. It can't make that distinction. We don't have language in English for the subtle distinctions in persuasion—what is the difference between manipulate, coerce, seduce, drive, steer? There is no vocabulary. And this is what I actually think is most important—and where I think your work is so related: how do we come up with a language of persuasion such that, in an increasingly persuasive world, it's one we actually want to live in? The biggest things are that the goals of the persuader are aligned with the goals of the persuaded, and that there's transparency, like you said.
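The auction being described might be sketched, very roughly, like this—hypothetical bidders and bids, where the key point is that nothing in the objective encodes the bidder's intent:

# Hypothetical attention auction: whoever pays the most gets to put
# their message in front of the user. Note what is absent: any field
# describing whether the persuasion serves the user's interest.
bids = [
    {"advertiser": "shoe_store", "bid": 0.40, "message": "New sneakers!"},
    {"advertiser": "political_pac", "bid": 1.25, "message": "Targeted ad"},
    {"advertiser": "meditation_app", "bid": 0.15, "message": "Breathe."},
]

def run_auction(bids):
    """Sell the user's attention to the highest bidder, intent unseen."""
    return max(bids, key=lambda b: b["bid"])

winner = run_auction(bids)
print(f"{winner['advertiser']} wins and shows: {winner['message']}")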
  465.  
  466. Unknown Speaker 1:12:03
467. Now—what about this recent decision in Congress to let internet service providers sell all of our browsing history? This just happened, I think in the last week—the last 24 hours, even—and I haven't followed up. I'm sure there's going to be pushback against it; I don't know if it's a fait accompli yet or not. But I can't imagine more than ten percent of the American population would want this to be true. Politically, it's hard to see how this is—
  468.  
  469. Unknown Speaker 1:12:33
470. —easily accomplished. One only has to imagine that the telephone companies had funded the politicians who, you know, signed off on this.
  471.  
  472. Unknown Speaker 1:12:42
473. People will freak out completely over this, and it will get rolled back. Yeah.
  474.  
  475. Unknown Speaker 1:12:46
476. I would hope so. But as an example—people don't make the connection. If you talk about persuasion and invoke the frame of "oh, they have your data"—your data, the data, big data—that doesn't mean anything to someone in America today. There's nothing alarming about it. Some people do hold on to it; they don't like the idea of their privacy being taken away. But data is kind of boring. It's when you realize that your data could be used to figure out exactly what persuades you. With this specific act of Congress: I now know which websites you visit, so I know what kind of news consumer you are, which means I know something about what persuades you. Knowing that, I can probably cross-pollinate that information with something else—voter data, say—and I would know exactly how to persuade you in the next election. And Facebook—because it can't make a distinction between who wants to persuade you for good and who's trying to, say, manipulate you in an election; again, we don't have language for what kind of persuasion we would want to enable or allow—has to simply, a hundred million times a second, let whoever pays the most put whatever message they want in front of you. I think this really happened in the election, and we haven't had a conversation about it. It's an existential threat to the belief systems we have. We haven't ever named or protected this other thing—the functioning of a critical mind, a sovereign mind, if there is such a thing, given these deep questions about free will—that we would want to protect. Are there certain ground rules, a Geneva Convention of persuasion, where we would want to say: hands off this kind of thing?
  477.  
  478. Unknown Speaker 1:14:21
479. I think the master value here—first, it comes back to the thing you said, which is that you have to have values. And one thing you have to value is the truth: the fact-fiction distinction, the fact-fantasy distinction. If that is somehow up for grabs, if persuasion slips the rails of a reality-based conversation—as it does in politics, as it does in advertising, as it does when people just want to manufacture and spread mere sentiment—then not only do people not care about what's true; there's a kind of orgy of not caring. They're celebrating their immunity to truth. They'll cry "fake news" about news they don't like; they'll call something a lie when they're not actually even claiming it's false—they call it a lie so as to disparage it.
  480.  
  481. Unknown Speaker 1:15:26
482. Well, they're persuading. That's the thing: at the end of the day, all of these statements, which we're calling free speech, are actually forms of persuasion. By calling something a lie, they are invoking a persuasive transaction, which has an impact on people with different predispositions. And that's the conversation we're not having: that people are persuadable. We don't even think of ourselves that way. Especially, you know, educated people believe that they're somehow exceptional—that they're not part of this persuadable group, that they could never be persuaded. But we actually have to live in a world where we realize that we're all persuadable. Here's a good example: there was a Science Post article that said 70% of articles on Facebook are shared before even reading the article. Now, the thing about this article was that, when you clicked on it, only the first paragraph was real text—
  483.  
  484. Unknown Speaker 1:16:15
485. right? The rest was gibberish—Latin. And the test was—
  486.  
  487. Unknown Speaker 1:16:19
488. —how many people would share the thing. And a lot of people shared it; it was a joke. But the reason I invoke this example is that it's something all of us had a predisposition to believe. And this speaks to whatever your predisposed beliefs are: you can layer in whatever you want on top of them, and that's how our minds work. It's this kind of "if you have a hammer, everything looks like a nail"—and the question is, who set the first layer of hammers that then goes on confirming the next layer of things? As a universal, everyone would be persuaded by an example like that. We haven't acknowledged that there's a way our minds work: if you repeat something to someone multiple times, you're embedding it in their mind whether they want it there or not. If I start a rhythm—ba-dump-bump—right? I mean, you fill in—
  489.  
  490. Unknown Speaker 1:17:03
491. —the rest in your mind. I mean, magic is all about these automatic processes. But we don't walk around saying: yeah, I've got this massive hole in my brain, you know, just waiting for anyone to go in there and pop something in. That's actually how we are. And instead of talking about that, we just talk about free speech—instead of saying that there are actually some people tossing some pretty dangerous stuff into those holes, because they know the rules better than the other guys. And unfortunately, some of the rules are inconvenient for aligning your beliefs with reality. For instance, there's the much-publicized backfire effect, where you give people evidence against their cherished opinion and they only redouble their conviction, no matter what the quality of the evidence. Even more troublesome, there's a truth bias, where even if you have only been told a story in the context of hearing that it's false, you will tend to have a false memory of its being true. This is Elizabeth Loftus's—
  492.  
  493. Unknown Speaker 1:18:05
494. —research. And it's also the same as the 70-percent Facebook thing, right? It actually is true that people share more articles before reading than after reading—but if I just repeated that number ten times, what I've done is made it salient to your subconscious, to your memory. That's the thing you're going to remember later, when someone says: oh yeah, it's like 70 percent or something.
  495.  
  496. Unknown Speaker 1:18:25
497. Yeah. But again—even if you were only told it in the context of "this is widely believed, but it is not true"—instead of—
  498.  
  499. Unknown Speaker 1:18:34
500. —remembering it that way. It's the memory version of "don't think of an elephant"—it's just the same thing; it causes you to think of an elephant. Yeah.
  501.  
  502. Unknown Speaker 1:18:38
503. There are many things that are bugs rather than features of the human mind, and we need to have a systems-level view of how to optimize ourselves.
  504.  
  505. Unknown Speaker 1:18:48
506. To do that, we would want to have almost a map of all the ways people get hijacked—starting with the fact that people can be habituated into an addiction with a slot machine, starting with the idea that you can manipulate people's social approval or social comparison. Starting with all these things you can do, what would we then want the choice architecture—the environment, the social environment—to look like? If we had full control over who and what and when, how would we want to orchestrate it in an ethical way? Because this is actually what's happening. People talk about runaway AI, but the Facebook newsfeed is already a runaway AI, in a sense: it's just maximizing clicks and engagement. It never knew it was invisibly maximizing outrage, polarization, fake news. The humans don't even have control over this stuff anymore, and it is persuading us, because we don't have a language for what we want and what we don't want. And we can't wait to correct it at the other end. A lot of people in the industry have told me: well, you're probably right, but we're going to correct this later—society will kind of self-correct.
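A toy sketch of that proxy-metric problem: the objective below mentions only clicks, but because outrage happens to correlate with clicks in this invented data, the ranking ends up promoting outrage anyway:

# Hypothetical feed items with a click-through estimate and a hidden
# outrage score that the optimizer never looks at.
feed = [
    {"headline": "Local charity hits its goal", "p_click": 0.02, "outrage": 0.1},
    {"headline": "You won't BELIEVE what they said", "p_click": 0.09, "outrage": 0.9},
    {"headline": "A calm explainer on the budget", "p_click": 0.01, "outrage": 0.0},
]

# The objective mentions only clicks...
ranked = sorted(feed, key=lambda item: item["p_click"], reverse=True)

# ...but the top of the feed maximizes outrage anyway.
for item in ranked:
    print(f"{item['headline']}  (outrage={item['outrage']})")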
  507.  
  508. Unknown Speaker 1:19:53
509. But, you know—the fake news thing, for example: everybody scrambled on that after the election. Oh, yeah. So we need a kind of real-time correction. We still—
  513.  
  514. Unknown Speaker 1:20:02
515. —don't know, obviously. We know that if it had any effect, then—given how close the election was—any effect was too much and arguably determined the outcome. But we still don't know the extent to which we were manipulated by bots and propaganda, and—
  516.  
  517. Unknown Speaker 1:20:20
518. We have no idea. And that's why—you know, Anne Applebaum was just on the show, and that's what propaganda is, the same thing, right? Compromising people, all of these disinformation strategies and campaigns: confusion, overwhelming people with information.
  519.  
  520. Unknown Speaker 1:20:33
521. These are all nameable plays in a playbook. And you can find the people who consciously do these things.
  522.  
  523. Unknown Speaker 1:20:40
524. Yeah, it exists. On the tech side, I went to school with all of them. And there's a whole industry that does it—there are conferences, there are books; there's one next week in San Francisco. There's this whole dark art form. But because there is no language for it, and because we walk around with this illusion that we're not persuadable—I sometimes feel like, you know: imagine you could beam enlightenment down into an ant, and the ant could open its eyes and see all the ways its ant architecture worked. You literally gave it a full download of the matrix and showed it: here are all the ways your pheromones manipulate you, here are all the ways you walk, here are all the social rules. And at the end of the day, when it opens its eyes and knows all this stuff, the next wave of pheromones comes along—it's still inside of its ant body, and it still wants what it wants. We are all trapped inside of our architecture, and we just don't have a name for the architecture we're inside of. And I think it's actually the existential problem. Because when you talk about runaway AI—AI getting better and better—what it is, is something greater in control or power or intelligence than the people who made it. And persuasion is kind of like that: there's something that can subvert my architecture. I can't close the holes that are in my brain; they're just there for someone to exploit. The best I can do is become aware of some of them—but then I don't want to walk around the world being vigilant, all the time, to all the ways my buttons are being pressed.
  525.  
  526. Unknown Speaker 1:22:03
527. Again, this has always been true to some degree, but it's getting more and more true. You can decide to change your architecture, insofar as it's changeable, or have the system manipulate it differently—live in the part of the city that will manipulate it in ways you won't regret.
  528.  
  529. Unknown Speaker 1:22:19
530. So that's a relevant form of agency: does someone have the ability to move to the part of the city where they're not exposed to the manipulation they don't want to be exposed to?
  531.  
  532. Unknown Speaker 1:22:29
  533. right, that's one form of agency.
  534.  
  535. Unknown Speaker 1:22:31
536. The thing I'm so interested by with our phones and with Facebook is that these are things people live by. I mean, per the Pew Internet study, Facebook was basically the number-one news source for 50 percent of the US population. So this is something people are checking every day, and it runs their beliefs about the world. And it's so invisible. When you just take a breath and look at where you are—you're at the beach, say—that's a very different phenomenological experience from the beliefs that get put inside me by Facebook. And we just don't talk about this.
  537.  
  538. Unknown Speaker 1:23:05
539. I think the issue that concerns you—and, perhaps paradoxically, this is my concern as well, even though I think the notion of free will makes no sense—is a loss of agency: a loss of control over our lives, a loss of the ability to get what we want out of a human life. That's what we're both worried about. I've talked about this in other podcasts and written about it, and it will seem a paradox to listeners who haven't followed me down that rabbit hole, but you don't need to believe in free will to worry about a loss of control, or a loss of agency, or a loss of getting what you want out of life. I don't know what your views on free will are, but—
  540.  
  541. Unknown Speaker 1:23:46
542. I mean, I've studied it a little—the experiments in Daniel Wegner's book The Illusion of Conscious Will, and that kind of stuff. And I do think that there is an illusion there. But I think, per Dennett's comments on your podcast, there are relevant degrees of freedom—
  543.  
  544. Unknown Speaker 1:24:00
545. —that can be taken away. Yeah.
  546.  
  547. Unknown Speaker 1:24:01
548. And what to me is so concerning about persuasion—if you really trust that persuasion is real, that you can be persuaded—
  549.  
  550. Unknown Speaker 1:24:09
551. is that—you know, there's even research on this:
  552.  
  553. Unknown Speaker 1:24:12
554. people named Larry are statistically more likely to become lawyers; people named Dennis are statistically more likely to become dentists; and people are more likely to like faces that look like their own. You can imagine a future version of an election where I can manufacture a candidate whose name and appearance resonate with the things that are truest for you—and it could actually feel as meaningful, as real, as anything else. And we haven't even talked about, for example, the persuasiveness of these false-video-generation systems, where you can generate live video of a celebrity saying anything—along with the new audio version, where, with twenty minutes of someone's voice, you can generate them saying anything.
  555.  
  556. Unknown Speaker 1:24:56
557. Yeah. This is something I've commented on briefly, in a joking way, but it worries me: you could take a two-minute sample of this podcast and then make me say anything, in a way that will be—
  558.  
  559. Unknown Speaker 1:25:11
560. —indistinguishable from you actually saying it. You know, there's this video—a YouTube video where someone took your podcast and has you saying really interesting gibberish. I'll have to send it to you.
  567.  
  568. Unknown Speaker 1:25:21
569. Someone did just that. I did two podcasts with the psychologist Jordan Peterson, and these were famously frustrating for most listeners. So someone chopped those up, getting us to say just a string of outrageous things. But the edits are obvious enough that it's pure comedy, and I totally support it. The moment this editing gets good enough to misrepresent me in a way that's undetectable, though, it becomes a real concern. I don't know—I'm sure there must be some technological fix for this in the end—
  570.  
  571. Unknown Speaker 1:25:52
572. Whether there will be—it's going to be a cat-and-mouse arms race, right? The guys who are good at simulating your voice and video will have to be matched by an AI that can detect and catch those things, because it'll be outside of human awareness. And this is what's so scary: it's just one example of something that can overload the human architecture—your ability to discern what is real evidence. Not that that problem hasn't existed before, as we both know. But when you really take it all the way—the idea that persuasion can undermine your beliefs, your attitudes, your behavior, even your sense of identity; you can shift all of that—then the places where we put authority after the Enlightenment, which was in the individual feelings, thoughts, and beliefs of people, for elections and for markets—both of those are now subject to question. Because if you can persuade someone to have different beliefs about a product—how much it'll benefit them, how much it will make them happy—and it doesn't actually do any of those things; or persuade someone of a candidate, with all these techniques, and the candidate doesn't do any of those things—this is really, really dangerous. And it undermines where we've put authority in our current age. This opens up some very big questions about what kind of persuasive world we want to live in, because it's only going to get better.
  573.  
  574. Unknown Speaker 1:27:12
575. Yeah. And it cuts directly against the normative value of democracy—of putting anything to the will of the majority. Because if the majority is just an ocean of attention that can be diverted and manipulated, now more or less at a whim, and reliably, so as to guarantee a given outcome—well, then not only is democracy not a good thing; it becomes just another cog in the machine of totalitarian control. Exactly.
  576.  
  577. Unknown Speaker 1:27:43
578. And wouldn't we want, given that this exists, to walk around every day with that understanding, and to build the world and our institutions and our technology in service of that view of our nature? Meaning: we wouldn't want to forget it all and go back to business as usual, whether it's with technology that pretends we don't have biases, or slot-machine-type things that manipulate us. We'd want to say: okay, that's a real thing about how people work, and we want to fix it so that we don't manipulate people that way. And here are some ways people are getting manipulated in elections; here are ways people are manipulated in marketing. We do do this with things like the FDA checking the claims someone makes in marketing—certain claims and benefits. We actually say: we want to check that that's true. And essentially, to end this race-to-the-bottom arms race for persuasion, we're going to need some other, more nuanced, more philosophical set of checks on what kind of persuasion we want to live with—and what we don't.
  579.  
  580. Unknown Speaker 1:28:42
581. So much of this, for me, comes back to honesty and the consequences of lying. What terrifies me about the current political environment is that not only is there no penalty for being caught lying—it becomes almost a point of pride, if you have enough power and a sufficient lack of concern for how you're perceived by your detractors. To be caught lying to the world—everyone knows I'm talking about Trump and his supporters here, but it's almost a singular example—his brand is not harmed by his obviously lying to everyone, in a moment when he's believed by no one. It's not a successful lie; it's not a lie crafted so as to be believed, even by the people who support him. It's just this naked declaration of: I'm not bound by your norms of discourse. He's so successfully persuaded people that the epistemological architecture—
  582.  
  583. Unknown Speaker 1:29:43
584. Yeah—that it doesn't even exist or matter anymore. Right.
  585.  
  586. Unknown Speaker 1:29:46
587. The fact-falsity distinction doesn't matter to me or my supporters, and we're winning—that's the structure of the communication. It's amazing to me. We have to get back to a place where being out of harmony with what is demonstrably true pays a penalty. And the value we all have to embrace is that we have to care to be in register with the truth. And we have to care—especially when people are in power, whose decisions affect the lives of millions—whether they're in register or out of register with what's true. And we have to care whether they care. All of that has to be of a piece.
  588.  
  589. Unknown Speaker 1:30:27
590. You know, I've been talking with a professor at Stanford about propaganda, and actually we're in the process of starting a group that would help define a sort of code of ethics for persuasion—it would be great to talk with you about it, actually. One of the things we've been talking about is that there used to be something called the democratic personality, or the democratic psychology, in the 1940s—meaning there was a way a mind ought to work in order to function as intended in a democracy; it has to fulfill certain basic requirements. And I think this is something that you're really interested—
  591.  
  592. Unknown Speaker 1:31:03
593. in. You know—openness to updating on evidence; being able to dispassionately say, "What would convince me otherwise?" Right? That's a cognitive move someone has to be able to make. It's—
  597.  
  598. Unknown Speaker 1:31:19
599. —basically a Cirque du Soleil-level move in the current environment, but it's the most basic feature of human sanity, you would think: to say, how would the world have to be different so that I would believe differently?
  600.  
  601. Unknown Speaker 1:31:32
602. Right. And what's weird is that there's going to be a whole list of these kinds of moves that would be part of the democratic psychology. And there's also a developmental angle to this, where some minds are going to be developmentally capable of making a given move, and others will not. For example, if you ask a child before age four—before it has theory of mind—to do a thing that requires theory of mind, it can't do it. There's no way to explain it or convince them; they just can't make that move. And there are going to be moves like asking someone to simulate a thought experiment—some people, I'm sure you've had conversations with them, actually can't make that move. They get caught up in the concrete details of it:
  603.  
  604. Unknown Speaker 1:32:15
605. Right—"that could never happen." Unfortunately, I've discovered that Noam Chomsky appears to be one of these people. But unlike many other areas, where people's deficits are obvious—where even the holder of the deficit will acknowledge the consequences of not being able to do certain things—in intellectual and moral space you tend to get confusion. Athletics is the perfect case: people not only have a fairly good estimation of their own athletic ability, they will recognize that they're nothing like the best in the world. You don't have people walking around imagining that they're as good as anyone on Earth at golf or tennis or basketball, unless they are pretty close to being that good. President Trump—I don't know; he has actually said some insane things even in that domain. But in intellectual and moral space, people assume that they are reliably doing what even the best of the experts are doing. And this is related to the idea that no one believes they can be persuaded—or to the Dunning-Kruger effect and its variations: what is it, 90 percent of people think they're better-than-average drivers? And the stat that reveals this extends to a fairly high level of education: I think it's 95 percent of college professors who think they're above-average professors. I'm sure that study is about 30 years old, but—
  606.  
  607. Unknown Speaker 1:33:45
608. It's just universal. And that's the thing about this—I've been calling it persuasion this whole time, or the architecture, but it's really the universal ways that we'll, say, overestimate ourselves, or assume that we have the moral or cognitive moves that everybody else has.
  609.  
  610. Unknown Speaker 1:33:59
611. There ought to be some curriculum where you would not only expose these problems in yourself but could address them—some part of the educational system, for everyone, should be revealing these glitches in human psychology and addressing them in a way that improves people. Though I guess you get to some interesting liminal cases, where some mismatch between your view of yourself and the true statistical view of you is actually beneficial to you—where perhaps believing that you're a little better than you are actually makes you a little better, right? In certain cases—
  612.  
  613. Unknown Speaker 1:34:40
614. —it's like you're persuading yourself in an ethical way, because it leads to you reaching for the thing that you may not have—
  615.  
  616. Unknown Speaker 1:34:46
617. —yet. I guess there may be cases where too much information about yourself, psychometrically, in some way or another, could be a bad thing—and certainly, other people having that information about you could be a bad thing. Pedagogically, people who know much more about education than I do would need to vet this, but it seems like you want to collapse that distance as much as is good for us: the distance between private delusion—belief that's uncoupled from reality—and reality itself. Part of this—we've been calling it the democratic psychology—is that a reasonably functioning mind would not have such wide gaps between what it believes to be true and what actually is. And it would care—again, you have this master value of human flourishing, for lack of a better word—to collapse that distance, to the degree that doing so lets you lead a life worth living. And knowing that your estimation of what is good is itself constantly open for refinement and dialogue: someone could come into this room right now and convince us—if we have the sorts of minds that we're advertising—that many of the things we value are not the things we should value, that there are better things to value. And I certainly believe that: I'm open to argument on almost anything I value. That's a conversation I want to have. What worries me is that so much of our public life—certainly all of our politics—seems to be optimized for not allowing conversation to proceed in a direction of changing minds in anything like a systematic way that produces good outcomes. People start at some distance from sharing the same values, or even the same fact space. I mean, take climate change: we can't even talk about climate change and have a rewarding conversation—about the value of facts, or the value of expert opinion. And it's just completely destabilizing from a political point of view. And I think also—
  618.  
  619. Unknown Speaker 1:36:55
620. —linking it with the technology conversation: these kinds of conversations we're talking about—the style of openness to evidence and discourse that leads to a convergence of values, and a negotiation of what actually matters—where is that happening on, say, Twitter or Facebook, or wherever people are having conversations? And where is it not happening? Let's imagine Facebook—I keep going back to it so often; no, you don't use it as much—being the place where people are having these conversations. I would say: where could we help? Could we build a classifier that notices when people are having these kinds of open, vulnerable, reconciliation-style, values-based conversations? Build a dispassionate-openness-to-truth identifier, and just find the key phrases people use when they make that move.
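A toy sketch of such an identifier—just keyword matching over invented marker phrases, far short of a real classifier, but it shows the shape of the idea:

# Hypothetical marker phrases for "open to changing my mind" discourse.
OPENNESS_MARKERS = [
    "what would convince me",
    "i hadn't thought of that",
    "you've changed my mind",
    "good point",
    "i could be wrong",
]

def openness_score(comment):
    """Count openness markers in a comment: a crude stand-in for a classifier."""
    text = comment.lower()
    return sum(marker in text for marker in OPENNESS_MARKERS)

thread = [
    "You're an idiot.",
    "Good point, I hadn't thought of that. I could be wrong here.",
]
for comment in thread:
    print(openness_score(comment), "->", comment)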
  621.  
  622. Unknown Speaker 1:37:45
  623. You can imagine
  624.  
  625. Unknown Speaker 1:37:46
626. them making a tweak, and suddenly those things are everywhere. That actually becomes the thing that gets rewarded, whereas the current system just rewards wherever people are talking a lot to each other.
  627.  
  628. Unknown Speaker 1:37:56
629. You could have an additional button, in addition to Like, I don't even know what the buttons are on Facebook, but you could have something that indicated that your mind was changed. Yeah, I love that. Something that says, yeah, my mind was changed by a tweet, by the article that was linked, or by a comment.
  630.  
  631. Unknown Speaker 1:38:16
632. Well, this is the thing people miss: there's this whole landscape of opportunities, meaning that's one example of so many. People mistake these services as being static and fixed: Twitter's just Twitter, Facebook's just Facebook, how else could it be? But there is a just insane set of positive possibilities if the people at those companies would, say, hire a bunch of nonviolent communicators and philosophers and ask, what actually constitutes the conversations we want? Quora, you know, the website, has a Thanks button, so when someone provides an answer you like, you say thanks, a different gesture than a like or a comment or a share. I mean, another gesture might be, you know, when you post something controversial, a button right there to organize a dinner to talk about it, a super lightweight thing. People just add themselves right there underneath the story and they can just,
  633.  
  634. Unknown Speaker 1:39:06
635. you have a dinner on Tuesday,
  636.  
  637. Unknown Speaker 1:39:08
638. you know? And the question would become, again, if we're steering people's choices about their time: we don't just have to use the time now; we can be scheduling all sorts of things, in the immediate term versus the longer term. And all of these choice architectures are possible in digital media, if the people making them
  639.  
  640. Unknown Speaker 1:39:25
  641. ask the deepest question, which is what would be the
  642.  
  643. Unknown Speaker 1:39:28
644. time well spent for this set of people talking about this thing? Right. I mean, it is just astonishing how different it could be if we were thinking differently.
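To illustrate the point that these choice architectures are just design decisions, here is a small sketch; the Action and Post types, the labels, and the one-click "organize a dinner" gesture are all hypothetical, meant only to show that the menu of verbs attached to a post is swappable data rather than a fixed fact of the platform:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    label: str   # what the button says
    intent: str  # what the gesture means socially

@dataclass
class Post:
    text: str
    actions: list = field(default_factory=list)

# Today's familiar menu of verbs.
DEFAULT_MENU = [
    Action("Like", "approval"),
    Action("Comment", "reply"),
    Action("Share", "amplify"),
]

# One of countless alternative menus a design team could ship instead.
ALTERNATIVE_MENU = DEFAULT_MENU + [
    Action("Thanks", "gratitude"),                  # the Quora-style gesture
    Action("Changed my mind", "persuasion"),        # the button floated above
    Action("Organize a dinner", "meet in person"),  # one-click offline plan
]

post = Post("A controversial claim about climate policy...", ALTERNATIVE_MENU)
for action in post.actions:
    print(f"[{action.label}] -> {action.intent}")
```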
  645.  
  646. Unknown Speaker 1:39:38
647. But I think there are two parts to engineering that change. One is to expose how it is and why it's that way, to make us less comfortable with the status quo. And you got at that earlier on, when you were talking about putting the frame around this: this is your day on outrage. Right? This is
  648.  
  649. Unknown Speaker 1:39:58
650. the outrage scenario.
  651.  
  652. Unknown Speaker 1:40:00
653. That reveals that people are selecting for a variable that they maybe had never considered. And maybe it took some years for even the engineers to realize that clicks meant outrage rather than something else, right? Rather than desire satisfied. But then there's envisioning the changes that we could make that would open up space, you know, in the intellectual space and the social space, in ways that no one has thought of yet.
  654.  
  655. Unknown Speaker 1:40:26
656. And that's both a design question, because someone has to be designing and putting those verbs, those actions, those other choices on the menu on the screen, and a metrics question. I mean, this whole thing about what the AI is maximizing, the paperclips thing, right? It maximizes paperclips until it turns the earth inside out. Well, the Facebook thing is currently maximizing clicks and attention until it turns the world into outrage. Right. And so, with the AI, the question is: what else should that AI be aware of? What other values should it be listening for, tuning into, so it's not just maximizing the one thing? And this is what Yudkowsky and all these other guys are talking about when they ask how we build in these other values, these complex value mixture models of saying, well, this is how much this matters for a conversation: how much do long comments matter? How much does putting evidence in there matter? How much does the head nodding, or the 'you changed my mind' button feature, matter? Right? And each of those is a philosophical conversation. And I do want to say that, you know, one of the structural problems right now is that if you view this attention economy as this city that these three or four companies basically built,
  657.  
  658. Unknown Speaker 1:41:39
  659. that city isn't
  660.  
  661. Unknown Speaker 1:41:42
662. we don't have public representation in that city. You know, when we see a pothole, or we see this sort of intersection where people are getting in car crashes all the time, the equivalent of that in Facebook is people getting into sort of weaponized arguments all the time. We don't have a way of saying,
  663.  
  664. Unknown Speaker 1:41:57
665. that's not even acknowledged to be a car crash. It's, 'oh, that was more actions; we're a
  666.  
  667. Unknown Speaker 1:42:02
668. platform, people will do what they do.' And it's, no, that's not true. It's that the size of the text box matters, whether the three actions are Like, Comment, or Share versus something else, versus organize the dinner conversation, that one-click button to get together.
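The "complex value mixture" idea from a moment ago can be sketched as a weighted score over several conversation signals rather than a single engagement number; the signal names and weights below are invented for illustration and bear no relation to any real ranking system:

```python
# Hypothetical quality signals, each assumed to be normalized to 0.0-1.0.
SIGNAL_WEIGHTS = {
    "clicks": 0.1,          # raw engagement, deliberately down-weighted
    "long_comments": 0.3,   # sustained, substantive replies
    "evidence_links": 0.3,  # citations and sources offered in the thread
    "minds_changed": 0.3,   # taps on a "changed my mind" button
}

def conversation_score(signals: dict) -> float:
    """Weighted sum of the signals; missing signals count as zero."""
    return sum(weight * signals.get(name, 0.0)
               for name, weight in SIGNAL_WEIGHTS.items())

outrage_thread = {"clicks": 1.0, "long_comments": 0.1}
open_thread = {"clicks": 0.4, "long_comments": 0.8,
               "evidence_links": 0.7, "minds_changed": 0.6}

print(round(conversation_score(outrage_thread), 2))  # 0.13
print(round(conversation_score(open_thread), 2))     # 0.67
```

A ranking tweak as small as promoting threads with high scores like these is the kind of "tweak" under discussion here.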
  669.  
  670. Unknown Speaker 1:42:16
671. Yeah, I mean, there's also even confusion over what it means to like something or to forward something. People have forwarded or liked things that they actually hated, right? Just because that's the way you spread this thing you hate that you want everyone to see.
  672.  
  673. Unknown Speaker 1:42:30
674. Well, that's actually literally what outrage is, right? You share it, and then you comment, 'I can't... can you believe what they said?' Right? I mean, that's why that outrage thing is so powerful.
  675.  
  676. Unknown Speaker 1:42:37
677. I remember there was an adorable moment of confusion. I was in a debate with this sometimes-comic, full-time Muslim apologist, Dean Obeidallah. I think we were on CNN together, and he had posted the video on his YouTube page. And the video got a ton of likes, and, he actually said this on Twitter, the aftermath of this was somewhat bloody, he thought that the number of likes he was getting suggested that people thought he had won the debate, right? I mean, because we had disagreed, you know, viciously. Yeah. And as far as I could tell, 99% of people who saw that video thought he had not acquitted himself well, but he defended himself on the basis of how many likes he was getting. It was naive as
  678.  
  679. Unknown Speaker 1:43:25
680. well. And people can be confused in all sorts of
  681.  
  682. Unknown Speaker 1:43:27
683. ways around this. Because, you know, a lot of teenagers think that the number of likes they get on a post is representative of how valuable they are. In fact, I don't know if you know this, but people actually delete their posts. People will post a photo on Instagram, and if it doesn't get enough likes, they'll actually take it down, because they feel like their self-worth is threatened. That's how thoroughly these things have mediated people's values. Just like we talked about with consumerism, their values became my values; their representation system, the number of likes that I got, became the way that I value myself. It's as if I am a robot and they've just drilled a hole in the back of my head: I now value myself based on that.
  684.  
  685. Unknown Speaker 1:44:07
  686. And
  687.  
  688. Unknown Speaker 1:44:08
689. this is why people, to your point, I think, need to see first that this is deliberate, in deliberate service of maximizing attention and engagement. It's not all built to help us. Well, let's change it. Let's do it. That's what we're working on.
  690.  
  691. Unknown Speaker 1:44:24
692. No doubt there'll be more to talk about in the future, because I'm barely using social media, apparently, and I have spent about 15 minutes in virtual reality. So you'll come back on the podcast when everything is much more Matrix-like than it is at present. I look forward to it. Hopefully, that'll be in about six months, you know? Well, thank you, Tristan. Thanks for having me. Before we go, how can people find out more about you? Are you more on Twitter, or where? Yeah, I'm on Twitter at Tristan Harris. Yeah. Is there any irony in pointing people to your Twitter page, given what we've just said?
  693.  
  694. Unknown Speaker 1:44:58
695. There is. I always feel uncomfortable about how much I use these products. But the thing is, again, there's incredible benefit to them as well. It's just a matter of: can we change their design to be more aligned with all these things that we care about?
  696.  
  697. Unknown Speaker 1:45:09
698. Tristan Harris. Well, to be continued. Thanks. And
  699.  
  700. Unknown Speaker 1:45:17
701. if you find this podcast valuable, there are many ways you can support it. You can review it on iTunes or Stitcher or wherever you happen to listen to it. You can share it on social media with your friends. You can blog about it or discuss it on your own podcast, or you can support it directly. And you can do this by subscribing through my website at samharris.org. There you'll find subscriber-only content, which includes my Ask Me Anything episodes. You also get access to advance tickets to my live events, as well as streaming video of some of these events. And you also get to hear the bonus questions from many of these interviews. All of these things and more you'll find on my website at samharris.org. Thank you for your support of the show. It's listeners like you that make all of this possible.