Erik Torenberg: Upstream of the Revolution

May 6th, 2023
  1. When I was getting started in tech, you know, 12 years ago, you didn't really see 20-year-olds,
  2. 22-year-olds coming out of nowhere and being able to build audience, you know, social credibility
  3. from people who matter. You kind of had to get attention in different ways, and often it was by
  4. building something really interesting. One thing I think to really appreciate about dating for
  5. men is that it gets a lot better as you get older and I think it's hard to fully appreciate that.
  6. You asked me about SF politics and I said to mostly, you know, so I'm interested to the extent that
  7. there is a platform where I want to describe it a little bit. It's something like it's okay to be an
  8. elite. It's okay to want to be elite. It's okay to want your children to be an elite. Hi, hi,
  9. welcome. Welcome. This is The From the New World Podcast. Today is an episode I've been hoping to
  10. bring to you for a long time. I'm speaking with Erik Torenberg, the founder of On Deck and Village
  11. Global, and, as we'll discuss in the episode, first employee at Product Hunt. He's also started three
  12. new podcasts, Upstream, Cognitive Revolution, and Moment of Zen, on which I've appeared as a guest. We
  13. discuss a variety of topics. Really, you'll see why I see Eric as one of the most insightful public
  14. commentators, and also someone who gives good advice on many of the topic areas
  15. that both I and hopefully all of you are interested in. We discuss his early career, venture
  16. capital, Silicon Valley as a growing media power, particularly the All-In podcast, the ideas of
  17. Curtis Yarvin, investing for public impact, social desirability bias, elite theory, his idea of
  18. talk left and act right, and egalitarianism, artificial intelligence, and dating and advice for
  19. young people. Really, we've hit almost all of the From the New World bullet points that you're familiar
  20. with. So, it's a great episode. It's almost four hours and I really enjoyed all of it. So, without
  21. further ado, here's Erik Torenberg. So, you told me you were just coming in from another podcast.
  22. What was the most interesting thing you talked about there? I recorded two podcasts, so I will give
  23. you two different bullet points. So, one I did with someone in the creator space, her name is Steph
  24. Smith, and we were talking about the MrBeastification of creators, where basically creators are
  25. treating themselves as entrepreneurs in the sense that they're not just focused on how do they build
  26. the biggest content machine, but they're focused on how do they build their content as a wedge to
  27. get distribution, to then build products, to then sell to their audience. So, Mr. Beast has his
  28. company Feastables, which made 100 million in revenue already, where he's just selling. I believe
  29. it's burgers, maybe it's chocolates. So, it's kind of like more basic products, but Mr. Beast just
  30. raised money from investors at over a billion-dollar valuation. And so, the question is, in order
  31. to do that, investors need to believe it has the potential to 100x their money. So, can Mr. Beast
  32. build or partner with people who build software products? And it kind of turns the sort of venture
  33. creation model on its head, where people typically focused on building products first, and then
  34. finding distribution. But now, in partnership with creators, companies can get distribution first,
  35. and then build products on top of. So, that was one conversation. And then the second conversation
  36. that I paused was with Mike Solana for Moment of Zen, and we were comparing the response to AI
  37. with the response to crypto. Because, for the past few years, the response to crypto, the mainstream
  38. response is, hey, this doesn't really do anything. What's the use case? This is all hype, et
  39. cetera, or worse, fraud. And the response to AI is kind of the opposite. Like, this does too much.
  40. This is going to take over. This is way too powerful. And another difference is that a lot of the
  41. critics in AI are technologists. So, the call is coming from inside the house, so to speak, where
  42. there's kind of an intra-tech debate as to whether we should be accelerators or pausers or doomers,
  43. whatever you want to call them. So, those were two conversations. Right. It's interesting, because I
  44. have, I mean, especially now, I'm trying to really circulate through the beltway through DC. And
  45. that's not the picture I get about AI policy there. You know, like AI policy on one hand, there are
  46. some people who are worried about really just fitting it into the same old political shoe boxes. You
  47. know, what if the AI says, you know, racist facts? And then there are people who are, once again,
  48. fitting it in their old shoe boxes, like, oh, what does this mean for great power competition? And
  49. actually, that I'm a little bit more amenable to both in terms of their kind of people who I find
  50. more respectable. And I think it's a much more serious issue. Like China is a big deal, but is it,
  51. you know, is it the entirety of the deal when it comes to AI? Is it really the kind of Schelling point
  52. that everyone should be focusing on? I don't think so either. So I think it's the case where it
  53. might be true that among the people who are building the technology, this is the kind of mainstream
  54. discussion. It's like, oh, is AI going to create an apocalypse? Are we going to get a runaway, out-of-
  55. control, misaligned intelligence? But I think, like, when it comes to the question of whether AI is,
  56. or whether and how AI is regulated, there'll be much more of an influence from the kind of like
  57. academic, you know, quote-unquote, woke types, more so probably at the FTC. And then you'll get, you
  58. know, I'm not sure if they're still considered right-wing anymore, but at least people who you would
  59. used to consider right-wing national security hawks, yeah, at the Department of Defense. Where have
  60. you netted out in terms of your interests with AI? Because, you know, early on, you were, you know,
  61. writing about kind of the excesses of, you know, the woke excesses of ChatGPT, etc. And you
  62. haven't been as vocal about it since. I'm curious how you've evolved your thinking or as sort of the
  63. chessboard has evolved, too. Yeah, it's kind of been focused around what I just said, right? I can, and
  64. I do say like I do like talking to the people who listen to the podcast or who read my writing. I'm
  65. very good friends with them, but at the end of the day, there's kind of like two things that I could
  66. do, right? I could spend more time doing basically public conversations or public writing or like
  67. private conversations and private writing. And at least from my current point of view, I think the
  68. latter is just, you know, thousands of times more effective, like literally, you know, one second
  69. spent, you know, talking to, say, a national security contact, right, is more likely to have
  70. influence than, like, one hour writing a blog post. And, you know, there's like a mean,
  71. right? I'm going to still be working on the newsletter and working on a very long article about
  72. actually specifically about the kind of AGI effective altruist concern and my skepticism on why
  73. machine learning progress is not going to be kind of continually exponential is not going to
  74. continue at the rate that it is for much longer because of many specific technical factors. We can
  75. talk about this a bit if you'd like, but I really like to dive into actually some of the stuff that
  76. you've been working on as well. Sure. Yeah, I'll let you go. Right, yeah, I just want to make sure I
  77. just want to make sure because some people are really interested in the AGI arguments and then some
  78. people aren't, right? So you've been a very successful investor already. You have multiple companies
  79. as well. And now you've started a podcast. It seems to me like the opposite trajectory of what
  80. a lot of people might do, where they will start a podcast, gather some kind of social media
  81. following, and use that as leverage, use those kinds of contacts. But like you said earlier, a bit
  82. like you said earlier, get the distribution first and then try to build a startup afterwards. So why
  83. do you think, you know, the best use of your time is doing a startup, or sorry, not a startup, doing
  84. media, starting three podcasts? Do you think that? And if
  85. so, why? Yeah, it's interesting. When I was getting started in tech, you know, 12 years ago, you
  86. didn't really see 20 year olds, 22 year olds coming out of nowhere and being able to build audience
  87. and, you know, social credibility from people who matter the way that, you know, you're able to
  88. today, the way that Dwarkesh is able to today, the way that a number of people are able to today.
  89. And so you kind of had to get attention in different ways. And often it was by building something
  90. really interesting. And the thing that I got a little bit of attention for and I was, you know, the
  91. first employee, founding team, not the CEO, was Product Hunt. And so that's what put me on the map, in
  92. the sense that you've gotten on the map via your podcast and via your writing. And I used
  93. Product Hunt to, you know, be able to help entrepreneurs, basically, because we could help them get
  94. customers. You know, Product Hunt is a discovery platform for startups. It's almost the
  95. equivalent of being a journalist in that you get to give people traffic, but without any of the kind
  96. of, you know, emotional loading of or connotations and negative connotations of being journalists,
  97. only the positive parts. And I would say it's closer to something like Expedia, right? Where, like,
  98. Expedia ranks hotels and flights, Product Hunt ranks, you know, software startups. Yes, yes, and
  99. highlights them. And I used that to build a network of entrepreneurs. I then started investing
  100. in those entrepreneurs that were doing well on Product Hunt and using that as a way to get
  101. on cap tables. And, you know, when startups raise money from investors, the
  102. collection of investors is known as the cap table. Right. So you basically get in the door, you're
  103. investing early in some tech startups. Yes, if you're in the music world, it's like A&R
  104. for a record label, you know, you're discovering artists, you're discovering startups. And some
  105. investment firms would give me money on their behalf to invest because I had the access and because
  106. I had the relationship. And so my, my advice to anyone who wants to be an investor is to find a way
  107. to build a deal flow machine that makes them money. Because if they can build a deal flow machine
  108. that makes them money, they can sustain themselves. The make-money part, it's paying them a salary.
  109. And they can get access to deals, which they can use the money to invest in. It used to be that you had
  110. to work your way up the totem pole at a VC firm, which could take 10, 20 years. But because of
  111. these new ways of building media, whether it's a newsletter where you're an expert on a
  112. different category, or a podcast, or kind of a, you know, community ranking or
  113. rating site like we built with Product Hunt, it's easier to get in the door and connect directly to
  114. entrepreneurs and investors will really respect that. And so what I'm doing now is basically
  115. realizing how helpful it was to have built Product Hunt for my investing career, how helpful it was to
  116. have built On Deck for my investing career. And you can think of On Deck as, for people who don't
  117. know, it helps people find co-founders. So if Product Hunt helped startups find customers, get
  118. distribution, get traffic, On Deck helped them find early hires and co-founders. And so I'm interested
  119. in building more tools like On Deck and Product Hunt to have an investing advantage, as we'll get
  120. into venture capital in a bit. But, you know, money is green and money is a commodity. And
  121. there's thousands of investors out there all competing for the best entrepreneurs. And venture is
  122. different than other asset classes in that the entrepreneur picks the investor. And so you have to
  123. be, you have to bring something to the table. And so if you don't bring, you know, 20 years, 30
  124. years of operating experience at the highest level, which I didn't, when I started, I have a little
  125. bit now, but certainly not to what I just described, you need to have actual products or services
  126. that help entrepreneurs at a meaningful scale. And, I don't want to ramble too much
  127. about VC, but one thing... Don't worry, this is a pro-rant podcast. Yes. I'm sure a lot
  128. of people in my audience want the insight about VC. Yeah. So I am building a podcast network and
  129. also a newsletter network, because of basically this trend that I mentioned, where people
  130. are starting to, you know, build companies audience-first. And so I want to build different
  131. media properties in different categories for different positions. And I want them to be able to
  132. access this distribution, you know, these customers that are previously hard to reach or otherwise
  133. hard to reach and then invest in companies that sell to them or build companies that sell to those
  134. customers. So right now I'm starting kind of more, you know, general interest, more, more high level
  135. to build the, build the audience. But I'm going to launch a series of more dedicated, specific,
  136. vertical specific podcasts. My first one is an AI one, but also a position based podcast that, you
  137. know, we've seen this in the sports world. There are these, like, media conglomerates, like The Ringer
  138. or Barstool or The Athletic. And tech media hasn't really innovated much. There've been a few
  139. individuals. Lenny Rachitsky, who's got a product management newsletter. Harry Stebbings has a VC
  140. podcast, and he's only like 25 years old and he's got $400 million under management just because
  141. he's built an amazing VC podcast. Packy McCormick is another example. And so I want to, I believe
  142. that there are more of these people out there who are practitioners, who are experts, who can make
  143. more money doing what Lenny, Packy, and Harry do than they do at their sales job, or their
  144. engineer job, or whatever job they currently have, and have a more interesting life. And so I wanted
  145. to aggregate those people, help fund them, get them off the ground and build a collective of them.
  146. And in some way, it also disintermediates the journalists, too, because these people are actual
  147. experts. And it's expert-to-expert sort of, you know, content, as opposed to just a, you know,
  148. general middleman who's not an expert at all, and who is often, you know, in a different class and
  149. has counter incentives in some ways. And so it's, it's very funny. I remember talking about this
  150. with Jonathan Rauch. He kind of describes journalists, and I think this is actually accurate
  151. to some degree. He compares kind of like journalism and social media, and journalism as being
  152. between a more kind of, he doesn't use this word, but this is how I interpret it, kind of
  153. elite, basically elite consumers of news who have basically norms around filtering
  154. out a lot of, I think this is, in fact, a filter on low status, but one that I think correctly
  155. filters out a lot of incorrect information, with some biases, of course, in the information that they
  156. fail to filter out. But when I was speaking with him, I actually had, you know, I had the
  157. same understanding of kind of tech media, right? That a lot of tech media is kind of like insular.
  158. And it's insular in a similar way, I think, to how kind of legacy media is insular. In that
  159. you have these concentric circles: you have the general public. They have certain
  160. predispositions that lead to bad media patterns. And then the New York Times comes in, and, like,
  161. Richard Hanania kind of convinced me of this, right, our mutual friend. He's convinced me that,
  162. you know, there are a lot of virtues in the New York Times, Jonathan Rauch as well. There are a lot of
  163. virtues in kind of circumscribing this smaller circle in which you have to observe these norms.
  164. And then tech media, I think, does kind of the same thing in creating an even smaller
  165. circle. Like, not a lot of people, I think, can really listen to a podcast with
  166. Nathan Labenz, Cognitive Revolution, and really get everything, right? Maybe some people can listen
  167. to it and still enjoy it and get a lot of things. But to me, it feels very insidery, in a
  168. way that I think is generally positive in terms of its effect on the content there. But
  169. the reason I'm saying this is it all leads to this one big trade-off, right, which is kind of
  170. private power or elite power versus public power, right? You want to, on one hand, create, you know,
  171. if you're going for Mr. Beast, then you're creating this kind of public, really, like, everyman appeal,
  172. or if you're going like Cognitive Revolution, or if you're going even further than that, I don't
  173. know. Actually, I'm not completely sure on a tech example, or, a tech example actually, like
  174. a machine learning example, is just papers, right? Like you're publishing at NeurIPS
  175. or something, right? Where would you put your project around that? Like, is it closer to the elite
  176. end or is it closer to the public end? I would say most of it is closer to the elite end if
  177. we're talking about, like, you know, how technical versus accessible. But I mean, the best
  178. things, you know, All-In, for example, the tech podcast, is appreciated by insiders
  179. and it's broad, meaning it's deeply popular. It's one of the most popular tech podcasts in
  180. the world. I think it's number one or two. And so I think, I mean, in some sense, like, you know,
  181. I wonder if it's a false trade-off, if All-In and others are able to achieve both. Like,
  182. Lex is another example, right? Like, I mean, All-In is truly
  183. insidery in that they get in the weeds and stuff; Lex is more like what's the meaning of life and
  184. stuff. Yeah. All-In, I think, is a better example because, right, I listened to it and I see
  185. it as, you know, like I often have to follow up a lot, especially with a lot of the financial stuff,
  186. I'm less experienced with that and just almost everything that Friedberg mentions with like the
  187. biotech. I'm always, you know, having follow up Google searches. To me, it feels like a very, yeah,
  188. that, that I think is actually a spectacular example. What do you think is the secret of their
  189. success? Because I do feel it as like a, it definitely gives very strong kind of elite vibes to me
  190. and to like not just to me, but to a lot of other people who I know. But it also, like you said, it
  191. has like just empirically, it has kind of mass market appeal, right? I think there's a few things
  192. going on there. First off, they're undoubtedly successful and they're undoubtedly, like they're more
  193. successful and more intelligent than other people doing tech podcasts or tech journalism. And then
  194. they're also good friends slash have good banter. And that's entertaining. And you get a window into
  195. their, you know, their personal lives a bit, which are also very entertaining. The personal kind of
  196. like, you know, how they spend their wealth. So it's always interesting. Like, hearing, you know,
  197. centimillionaires talk to each other kind of candidly is not something that most people hear
  198. about. But then they're also, you know, quite good at describing current events in ways that are
  199. easy, easy-ish to understand. So you feel like you're getting smarter listening to it. You're
  200. enjoying it. But then also, and I think one big thing is, they've introduced, basically, like, they
  201. have moved the Overton window pretty significantly. And they're not like a right-wing podcast
  202. or anything. But, like, to give you an example of the rest of tech media, TechCrunch's main
  203. podcast is called Equity. Like, that just gives you a sense. Is that a pun, you know, like raising
  204. equity? I think they mean it in the reducing-disparities sense. That is the number one interview question.
  205. I'm not half teasing, but, you know, to ask people, like, what does equity mean? It's either reducing
  206. disparity or a stake in companies. And that kind of, the emphasis, tells you what you
  207. need to know. And, like, Chamath, you know, like a year ago or something said, like, you know, equity
  208. is problematic or whatever. Like I don't believe in equity or whatever. Like, and that like, you
  209. know, it's interesting because you came into this world in the last few years. But relatively
  210. recently, like, it would have been interesting for you to come in like 2014 or 2015, like every
  211. podcast episode you have would have gotten you canceled. I'm teasing, but I say that to say that the
  212. Overton window was just so different. There was just such a strident, like narrow window of what was
  213. acceptable. And to question equity would have been insane. And I think a lot of tech people, there
  214. was just this wave of massive preference falsification because people saw what was happening in
  215. their companies either were getting kind of like destroyed by activists or, you know, and they saw
  216. what was happening in San Francisco if they lived in San Francisco. And like, okay, that's getting
  217. destroyed. And they wanted to push back. But anytime they did push back, they were met with, oh, are
  218. you a Trump supporter? And faced with the option of, like, you know, being on their knees, you
  219. know, apologizing to leftists, or being seen as a, you know, normie, you know, rude Trump
  220. sympathizer, they would rather be on their knees. And once that kind of went away, in terms
  221. of the specter of Trump going away, basically it became okay. There was just this wave of massive
  222. preference falsification that all in helped kind of pierce where they were just saying common sense
  223. things. And a number of people were like, I actually believe them. And so it just gave them a new
  224. voice. I think Mike Solana has done a fantastic job of this as well, of not being, like, overtly right-
  225. wing, but just being common sense, having common sense, being funny, being smarter, like, just
  226. better on every dimension. And I think that's why both Solana and Pirate Wires and All-In have just
  227. blown up, because they offer an alternative perspective that just resonates more with tech people
  228. than the one that journalists have, because they are tech people. Right. What do you think of the
  229. marketplace of ideas hypothesis, right? The idea is that, you know, if you just let
  230. people discuss, the best ideas will ultimately win out. It does not seem that the best ideas, or it
  231. does not seem that the truest ideas win out. You had Curtis on your show and he has written and
  232. talked a lot about that. I mean, there are a lot of untrue ideas that, you know, seem to pass on for
  233. many, many generations. So I think the marketplace of ideas maybe makes sense for like fit ideas,
  234. ideas that just like have some sort of, you know, unfair advantage. But it seems like, and this is
  235. what I would say, you know, the Jonathan Rauchs of the world, you know, he wrote Kindly Inquisitors
  236. in 1992. And I think that was like a really seminal book. That's like the best case for like free
  237. speech, best case for classical liberals and best case for marketplace of ideas. But it feels like
  238. it's turned out that, you know, when you're advocating for free speech, you're not going to censor
  239. anyone, you're kind of putting a hand behind your back. And when someone sees that you
  240. play by those rules, they will, you know, leverage those rules against you. They will, you know,
  241. censor, they will encourage free speech when it supports their aims. But when they are using speech
  242. that, you know, when they're using illiberal methods and, you know, the only way to fight them is to
  243. use illiberal methods back, they will use your liberalism against you. So, I mean,
  244. there's a couple of different responses to it. I don't think true ideas win; I think fit ideas win.
  245. And, you know, if someone plays with a hand behind their back, they're often going to lose.
  246. And that's what happens to classical liberals, or happens to libertarians. It's what happens to
  247. many people in the gray tribe who are not willing to, not really willing to fight. Yeah, it's
  248. interesting to me because at least my interpretation of the past few years is that as kind of the
  249. limits of censorship have really been reached by, you know, like vaguely left, like I didn't even, I
  250. wouldn't really even call them left wing at this point. But you know, like people concerned with
  251. quote unquote disinformation. As the limits have been reached, they kind of like actually improved
  252. their arguments. Like, for example, just Bidenism over Clintonism, right? Bidenism is like a genuine
  253. innovation, both in terms of affect, like, a lot of right-wingers who listen to this podcast underrate
  254. this. But a lot of normal people, even like center-right people, you know, really, you
  255. know, like the Biden vibe, they like the Biden kind of, almost like appeasement, right? That's what
  256. it kind of seems like, right? Not in terms of foreign policy, but in terms of like, you know, I hear
  257. you guys, like, really, like, maybe still disagreeing with the right-wing voters he might meet on the
  258. street, but basically saying, you know, like, he wants to live in the same country as people who
  259. disagree with him. Now, like, you might argue, like, it's fair to argue that that's not actually
  260. reflected in these policies, but as a kind of like political figure, right? I think like people,
  261. people under-analyze Bidenism, right? People are still analyzing, like, Trumpism. Like, I think
  262. there's a reason Biden won. And I do think Biden won. And like it's actually like pretty obvious, at
  263. least like the places to get started in thinking about Bidenism. And so like to return to the root
  264. topic, like the reason why I bring this up is like, there's this quote from the last psychiatrist,
  265. another kind of online blogger, who says, who said knowledge is a defense against not having power.
  266. So like people who don't have power, they make like logical arguments because that's basically the
  267. only thing that they have. And I think this is like a very good description of EA of effective
  268. altruism. But yeah, I do think that's like a better predictor of kind of what's where good arguments
  269. will emerge from than just the mainstream like marketplace of ideas hypothesis. So where will they
  270. emerge from? They'll emerge from the places that that have reached basically what they can do with
  271. power, right? Basically, if I've done all I can with hard power, I'll do what I can with soft power.
  272. This is the idea. Yeah, a few responses. So one, I think right-wing people have to be surprised
  273. with Biden on a few, not just vibe, you know, topics, but also actual policies, like in terms of his
  274. hard line against China and some other things. It's funny, right? Like people who are more right
  275. wing than me, they want to be, you know, tough on China. Actually, I think that the trade war stuff
  276. is probably self-destructive and has worsened inflation. But it's interesting. Yeah, like people who
  277. are, you know, right enough to support, like, or like people who basically support the Trump
  278. policy on China, like, Biden has emulated a lot of that. Yeah, I remember Curtis said before Trump,
  279. before 2020 that he wanted Biden to win because, you know, the way he evaluates candidates is
  280. whether they help his friends and punish his enemies. And he saw Trump as helping
  281. his enemies and hurting his friends, which is to say that even on, if you had Trump's goals and aims
  282. or sympathized with them, you'd prefer Biden. And at the time, people, you know, people were very
  283. skeptical of Curtis's claim, like, oh, it sounds clever and cute, but, you know, winning by losing
  284. doesn't seem to make much sense. But if you're, you know, anti-woke or right wing to some degree,
  285. you have to admit that the culture has radically shifted in your favor since Trump got out. And so
  286. it is interesting. Like, if you think that the, if you think as Curtis does, that the president has
  287. very little power and, you know, a fraction of the power that they should have, then maybe Curtis,
  288. you know, maybe that view keeps making the argument that, hey, let's get more Bidens, people who are
  289. effectively, you know, ineffectual, but don't present the same boogeyman to the left that enabled
  290. the kind of excess leftism to expand significantly, as we saw from 2016 to 2020. Here's a question I want to
  291. ask. So, how much of the post, you know, post-Trump new right do you think is, like, downstream of
  292. Curtis? Do you think it agrees with him on his major precepts? I think, you know, there's the
  293. quote about Ayn Rand, where her heroes are fake, but her
  294. villains are real. I think similarly, like, it's like Curtis's solutions are fake, but his, like,
  295. identification of the problems is real, like, his analyses are real. Like, I think he's convinced a
  296. lot of libertarians of why they're losing, to be more right-wing. I think
  297. he's given people a mental model for governance, for how governments work that makes sense. I mean,
  298. I think he's been very influential in helping tech people just understand politics and power and the
  299. kind of, you know, the combination of corporate and public power combined, in terms of, like,
  300. crowning winners and cementing, you know, incumbents. And I think Twitter is this kind of great
  301. culmination of it where it's a thing that Curtis has, has, you know, wanted for a long time, which
  302. is someone to take over an existing entity. In his case, Curtis wants it to be a government. In
  303. Elon's case, it was Twitter. And just say, hey, you know, there's a new sheriff in town. And actually,
  304. you're not verified, New York Times. And actually, Dogecoin is going to be the, like, just
  305. basically, like, shit all over the symbols that they hold dear. And, you know, I think to some
  306. degree, he's done the playbook; to some degree, he's probably botched it significantly. But, you
  307. know, does that happen in an era without Curtis? I don't think so. You know, I think that, you know,
  308. does Elon know who Curtis is? I don't even know. I don't, he certainly hasn't read Curtis probably.
  309. But I think it's been filtered through a number of people. So, yeah, I do think Curtis is pretty
  310. influential. I do think he's not getting credit for that, probably for reasons that people don't
  311. want to publicly associate with him. But I think that, now Curtis has drastically cleaned up his
  312. image. And I think he's done a fantastic job in the past, like, a couple of years. So he's changed
  313. things for himself. He's not courting controversy in the same way that he was perhaps in the past
  314. decade. But I think you do see a new tech right. It doesn't use the word "right."
  315. And it shouldn't. And it truly is not, you know, it's not right-wing in the sense
  316. that it's pro-choice. It's pro-gay-marriage. It's probably pro-immigration. So I think on
  317. meaningful, you know, policy decisions it differs significantly. Are "libertarians who care about power" a
  318. good label? It's, it's not a bad label. I'd first say gray tribe or something, because
  319. it's not libertarian. Like, they actually want to use the state. You know, it's
  320. like, even Tyler Cowen, he's not a libertarian. He's a state-capacity libertarian. What's the difference
  321. between state capacity libertarianism and populist nationalism? I think it's, it's like two things.
  322. I mean, like, I'm not, you know, I don't want to speak for Tyler, but I will do so anyway. I don't
  323. know, like, this is not necessarily representative of what he actually thinks, but is my best guess
  324. at what he thinks. I'll say that, right? Like, he would distance himself from, from kind of like the
  325. new right or from kind of nationalist populism by kind of, he basically thinks that the populists
  326. are far too skeptical and are far too antagonistic towards people who are left-wing. He would engage
  327. much more and compromise. I mean, Tyler does. And I would think that's the main difference. Yeah. I,
  328. I think it's mostly a brand, like, you know, conservatives have a bad brand. Populist nationalism has
  329. a bad brand. It has a low-IQ brand. It has a kind of reactionary brand. I mean, just look at Richard
  330. Hanania's disgust towards them. And so, the gray tribe is trying to be
  331. something that's different. It's a little bit, you know, like, one way of putting it, it's the midwit
  332. meme. And they're on the other side, the high-IQ side, that kind of has some similar
  333. conclusions to the common-sense lower-IQ side, but just doesn't want to be associated with them. So
  334. they needed a new word. And I think this is where libertarians used to be, but I think
  335. the main difference here is that they realize just kind of like how infeasible, you know, and
  336. unlikely libertarianism is, and how we actually do need a functioning government. We need state
  337. capacity. We need a government to get out of the way in a bunch of different places, but we also
  338. need it to do the things that it does well, like, you know, handle crime. So yeah, I would say more,
  339. more gray tribe that understands power. Right. I would say that, like, a big, um, a thing that
  340. libertarians don't get credit for is that they kind of had the political theory, right? Like this is
  341. just like public choice theory, right? Yeah, concentrated benefits, dispersed costs. It's very
  342. much, that's actually quite similar to the Burnham, to the Burnham understanding of political
  343. theory, right? James Burnham, someone who's quoted a lot nowadays, especially in those kind of like
  344. gray tribe circles. But what's most interesting, I think, is that, like, this is a
  345. central question to, yeah, this I think is actually the central question to like what should the
  346. strategy be? And like the thing that actually separates kind of like, you know, new rights and
  347. libertarians is like, Democrat exceptionalism versus Republican exceptionalism, right? Like, like,
  348. the libertarian view is that, you know, Democrats are just much better at creating people who would
  349. actually be willing to work for the state. They're much more, basically like, predisposed for
  350. geographic and psychological reasons. And, you know, the Republican equivalent of Lina Khan is not
  351. going to spend a decade toiling away to be, you know, FTC chair. He's going to become a very
  352. successful entrepreneur and maybe that's better for society even, right? But it means that they're
  353. not going to have as much power and the best thing you can do is just destroy the federal
  354. bureaucracies and defund them. Or like the new, or like the new right view is that actually this is
  355. due to specific strategic decisions, the reason why the right loses in bureaucracies is because it
  356. does not have basically a talent pipeline for creating, for creating people who would actually take
  357. these positions, or for, you know, finding them, recruiting them, helping them along their journey.
  358. And yeah, to me, like, it's not, it's not one or the other, right? Both of these things matter to
  359. some extent. But what do you think? Where do you fall on the kind of spectrum there? Well, firstly,
  360. let me just say we were, you know, you mentioned effective altruism earlier and effective altruism
  361. is getting bashed a lot these days. It's a punching bag on Antonio's and my podcast. Seeing what
  362. the regime press is doing to Eliezer Yudkowsky is just so sad. Like, you know, like, I am one of the
  363. people who makes very strong arguments against the kind of A.I. Doom things. And like, man, it still
  364. feels, it almost feels like a false victory, you know, to see, really, like, basically, the
  365. slander coming from, you know, Vox and from Time and whatever, right? Or not Time. Time actually
  366. published Yudkowsky, but like a lot of mainstream press or, you know, regime press are really just, I
  367. think on this issue in particular, it just like hits their sweet spot of like, oh, we're just going
  368. to like drop all pretenses of being intellectually honest and caring about the facts. Like, they
  369. could have written in like an intelligent, like, I'm going to write an article, very long article,
  370. like specifically because this is a difficult problem to argue for, you know, putting a lot of time
  371. into writing a clean article. Like, if you're the New York Times, if you're like, supposedly like an
  372. educated, intelligent person, like, that's what you would do. And instead, no, they're just writing
  373. really, like, I think this is, this is the affect of this podcast that I think makes it special. Is
  374. that, like, Richard Hanania, like, has this, he describes left-wing ideology as kind of like
  375. a disgust at right-wing aesthetics or right-wing preferences. Yeah. And like, I
  376. think like this podcast is kind of like the nested version of this, right, where it's like, and not
  377. just like, not just simply disgust, right? But that is the vibe, or like, contempt, contempt at like
  378. this, this like, not quite egalitarian, but like, status-oriented way of approaching things that
  379. like, is I think like closer to like, if you think of like truth and whatever we're comparing it to,
  380. there's maybe, you know, like a 10% correlation of like, whatever the, you know, whatever the
  381. conservatives are doing, like, really like the normie cons, like the boomer cons
  382. are doing. And then, like, maybe there's like a 20% correlation with whatever the New York Times is doing.
  383. And, you know, looking at this, it's just like 20%, really, and you guys, like, dare to call
  384. yourselves, you guys really dare to, like, want to censor, it's just so childish and lacking
  385. in foresight. Where did I cut you off? No, go ahead. Let me say a couple of things. So first, and we
  386. can return to this, I just want to say, it's, you know, effective altruism has been a punching bag,
  387. but there's another universe where the markets didn't tank, and SBF was still, you know, worth $30
  388. billion, or whatever FTX was worth, and EA had tens of billions of dollars under its direction, and
  389. could actually accumulate some real political power. Let's say, you know, Biden wins the next
  390. election, and SBF's the biggest donor, and he's, you know, he's where Thiel was in 2016, he's putting
  391. all EA people in government, although, you know, Thiel probably could have done a lot more of that,
  392. but there probably weren't that many people to put in. And so EA is actually an interesting example,
  393. because it's both a punching bag, but it's also like, they got close to something. They got
  394. closer to something than I think libertarians have ever gotten. Like EA, no, no, go
  395. for it. I don't, I don't see any picture of DC where you can, like, like, I'm honestly, like, do
  396. you, like, my first instinct was like, do you actually believe this? Like, I don't, no, especially
  397. like democratic policy is so inert, and like, well, I guess EA was basically democratic policy.
  398. Like, like, I guess what I'm saying is, I see EA the movement getting power in that world. I don't
  399. see, like, um, EA pushing back on any core, you know, Democratic principles, or, you know, DNC
  400. principles, but I see them getting their pet causes in there, you know, animal rights, I don't know,
  401. like, things that the DNC doesn't really care about. Or is... Sure, sure. Okay, this is a
  402. hypothetical, but I mean, we can also bet on the current case, right? I would bet, you know, pretty
  403. insane odds that, you know, that the next democratic, whenever the Democrats want to do some kind of
  404. enforcement regulation on AI, it will be on, like, quote unquote, like equity grounds instead of on,
  405. um, EA grounds, right? Like, they're like, for every, for every staffer who's kind of convinced by
  406. the AI doom arguments, there are at least 10 who are convinced by like the disinformation argument,
  407. and, like, another 10 who are convinced by, like, the equity arguments. Like, I don't, like, just on an
  408. empirical level, I just don't believe this is true. We're saying different things. I agree with you,
  409. and everything you just said. I don't think Eliezer Yudkowsky, although we'll get to him
  410. in a minute, would be, like, you know, having some role, you know, like, yeah, he's just too smart,
  411. like, this is the main thing, right? Like, when Hanania calls, like, the
  412. Republican party, like, the stupid party, like, here's the thing, the Democratic party is,
  413. like, impermeable to smart people, like, it is, like, its entire kind of, like,
  414. credentialization and routinization system is, like, a system specifically filtering out, like,
  415. people who would actually be kind of like, people exactly like Eliezer Yudkowsky,
  416. actually, this is the thing, like, the regime press specifically, as, like, a mechanism, if
  417. we were to construct an archetype of the person that the regime press is meant to politically
  418. assassinate, it would be Eliezer Yudkowsky, this kind of, like, extremely smart, obviously, like, I
  419. disagree with him on technical issues, but like, passionate, you know, focused, and honestly, like,
  420. somewhat autistic, you know, like, the democratic party is kind of anti-intelligence in that way,
  421. and the Republican party is like, maybe, you know, like, I kind of agree with, or like, I think this
  422. is empirical evidence that, you know, like, the average, like, the median Republican voter is like,
  423. slightly lower IQ, and that's fine, but like, the elite levels of the Republican party, both
  424. contain, you know, many people, even in our, like, mutual circles, right, contain many people who
  425. are, you know, these exceptional people, and it is just so much more permeable, like, this is the
  426. reason why I just, like, completely disagree with this characterization, like, the Democratic party
  427. is a party that cannot be smart, while the Republican party is, like, a party of, like, high
  428. variance, and to me, like, you're familiar with, you know, like, the founder
  429. theory, you know, like, Samo Burja's stuff, right? Like, to me, like, if you are a smart
  430. person who, who wants, like, some kind of career, like, let's say you just have, like, no, kind of,
  431. like, aesthetic preference, right? I think it would be, like, abundantly clear that the Republican
  432. party is kind of like the home for where you would, where you would go for that. A couple of
  433. responses there. One is the, I think, EA would end up looking more like, like, think more Bill Gates
  434. than Eliezer Yudkowsky, right? Like, Bill Gates, he's very smart. Sure. You know, I've been in rooms
  435. with Bill Gates, and he's talking to founders with different, in different industries, and he knows
  436. the industries better than the founders. Like, so, it would be this corporatized, you know, DNC-
  437. sanitized version. I think what EA had, meaning EA, in my opinion, had transcended Eliezer, like, in
  438. a bad way, had transcended Yudkowsky. I mean, Richard Hanania wrote about this, right? He said EA
  439. is, like, at a fork in the road, and they could, you know, go woke, or, you know, they must be
  440. anti-woke or die, right? Yes. And I think they had already, I think Richard Hanania was, you know,
  441. posting that, like, hopefully, but I think that decision had already been made. And I think, you
  442. know, short of SBF having, you know, an Enron-plus-Madoff-level, you know, extinction, I think EA
  443. was on the path to kind of DNC-establishment-level politics. Nothing to do with
  444. Yudkowsky, EA in name only. It's like, you know... talk about it. Yeah. Yeah. I was, although I
  445. think it actually does go deeper than that. Like, I was arguing about this with Rocco, like, EA is
  446. fundamentally based on the harm principle. It's, it's fundamentally based on like, JS Mill, you
  447. know, like, how do we stop people from being harmed? And I think that kind of, like, neurotic affect
  448. is intrinsically left-wing. Like, like, maybe it's smarter, right? Like, I agree, or like, I
  449. definitely would agree that it's like much less dumb and not subject to kind of, like, the specific,
  450. you know, conspiracy theories that the left-wing establishment believes. Yep. But I still think it's
  451. intrinsically left-wing. In the same way, that kind of like, you know, like, I don't know, it's hard
  452. coming up with a version of this on the right, because, like, a lot of the right is also kind of
  453. intrinsically left-wing. Like, Christianity is sort of intrinsically left-wing. But, um, yeah, in
  454. the same way that like Nietzsche, this is a good example. Like Nietzsche, you know, like, would not
  455. necessarily, you know, support either party. I think I talked with Brett Anderson about this. He would
  456. not necessarily support either party today, but I think, like, that kind of affect is
  457. intrinsically right-wing. But, sorry, go on. One thing we have to wrestle with, or any person has to
  458. wrestle with is that, you know, most competent people, whether they're running big corporations, if
  459. they're, you know, technical engineers, if they are, you know, running hedge funds or very
  460. successful in finance, um, most of them are like, left-wing. There are some that are disagreeable
  461. and just seeking truth above all else, but most people at that level tend to be, tend to be left-
  462. wing. And not just, like, in the last five years, although it's obviously been more pronounced lately,
  463. like, there's been such a polarization. But in general, they tend to be more left.
  464. And so, to your point earlier about it filtering out smart people, I think it filters out, like,
  465. disagreeable, truth-seeking above all else. But that is separate from, like, deep competence and
  466. even, like, technical competence. And so, um, those things... I think you can filter... No, I don't
  467. mean that, like, the Democratic Party says, you know, like, if you're a smart person, don't vote for
  468. us. But you, if you want to have, like, the maximum impact as, like, a smart person, if you want to
  469. enter and change the direction of the Democratic Party versus the Republican Party, it's
  470. overwhelmingly the Republican Party. Like, for example, it's just, like, a pair trade, like, Peter
  471. Thiel versus, like, Bill Gates. Right, like, Peter Thiel, I, I... oh, and who is
  472. significantly less wealthy than, uh, Bill Gates. Yep. And I think, like, this is true intra-
  473. Democratic-Party, too, right? The Democratic Party is much more, uh, amenable to, kind of, cultural
  474. shifts than to Bill Gates. Like, just look at, like, how much, like, Bill Gates really wanted to
  475. talk about, like, basically, like, the global poverty stuff, right? He was, yeah, he was kind of,
  476. like, a proto-EA, um, or in the same direction, or, like, some of the pandemic preparedness stuff.
  477. And the results speak for themselves: how much, you know, pandemic preparedness stuff did,
  478. uh, Bill Gates actually do, right? Like, next to none. Yep. Right? Like, like, in practice, that,
  479. like, that amounted to, like, next to no impact, in fact, maybe even negative impact on the US's
  480. pandemic policy. Um, which I mean, like, I don't, you know, I'm not one of those people who are
  481. angry at Bill Gates for trying. But, like, you know, maybe, maybe donate to Republicans next time,
  482. you know? The things that get you power aren't necessarily the things that, you know, make you be
  483. successful once you have that power. Um, and so. Yes, exactly. This is exactly what I mean. This is
  484. the way that, like, the way to win within, like, the Democratic party. And, like, this is true within
  485. the Republican Party in elections specifically as well, right? In both parties, the way to win in
  486. elections is to be, like, dumber than you actually are. But, like, the way to win, like,
  487. marginally, in, like, Democratic policy is also to be dumber than you actually are, right? In
  488. terms of the race stuff, for sure, in terms of being anti-market. Um, and this isn't to say
  489. that, like, there aren't some people who can win despite their kind of intelligence and despite the
  490. fact that their policy preferences are intelligent, right? Like Tyler Cowen, um, who has, I'd say,
  491. been one of the best-faith actors; he's been more amenable to, basically, like, oh, a lot of the
  492. YIMBY stuff is a great example, right? I'm not saying that it's, like, completely futile. I'm just
  493. saying, like, just do the pair trade here, right? It's not, it's not that you can't accomplish
  494. anything within the Democratic party. That's a much stronger position than I actually believe.
  495. It's that just like, basically, like, the impact ratio is just going to be so much higher in the
  496. Republican party. Like, like, the Republican party is so much more open to, like, good, new and
  497. unique ideas. It's, um, there's a greenfield there, for sure. And that does bring two things
  498. to the table. One is, you can compare Tyler Cowen and Richard Hanania and their relative impact at
  499. the moment, as part of a broader conversation of how truthful one should be, because, I mean,
  500. Tyler is a, you know, epic intellectual. He's also much more Straussian and much more careful. Or
  501. at least, that's my read as someone who's an admirer of him, and also, you know, he's been great to me
  502. and is a friend. And then Richard Hanania is much more truthful, almost to the point of, you know,
  503. is he seeking controversy sometimes? Like, I can, I can rely on Tyler to maybe hold back and, and
  504. Richard to maybe lean in, like, a bit more. I mean, Richard has said this publicly, right? He said
  505. this on Twitter that he kind of has, like, a reaction to audience capture. Like, like, like, the
  506. more his audience is, like, made up of a certain group of people, the more he notices, like, the
  507. stupid things that they believe. Right. Yeah. The, um, and then, in terms of the Republican thing, I
  508. mean, it's interesting because Vivek, who's now running for president, I just had him on my podcast
  509. and I'm going to release it soon. And it's going to blow my audience's mind, meaning
  510. they're going to be like, wow, who is this guy? He's really controversial. Vivek is really
  511. interesting because he is, you know, certainly an elite. He's been a very successful entrepreneur.
  512. He went to Harvard, Yale. He's very smart. Um, he's, um, an Indian guy. Um, and he is almost like
  513. running, he's trying to, like, up-level the MAGA stuff, basically. I mean, he's not saying the election
  514. was stolen, but he is like going on all their issues, he's pro-life. He's like anti-climate. I mean,
  515. he's leaning into culture war issues and taking the right-wing side, like, to the nth degree. And,
  516. and it's interesting to see how he's going to be perceived, because Trump did that. But Trump was
  517. way more one of them. And, of course, people would say, oh, Trump's a billionaire or whatever,
  518. you know, or he's not a billionaire. But like, how was he, you know, speaking to mainstream America?
  519. Well, he had their affect. He had their, like, uh, crassness. He had their, like, he was a man of the
  520. people in vibe and style. And it just happens when you're like a real estate, you know, person, you
  521. do actually communicate with, um, and work with, you know, normal people. And you build up a, uh,
  522. you know, ability to connect with them. And to your point about Biden, like, Biden has done that
  523. too in his way. And so, but Vivek has not done that, right? Vivek has been on an elite track, um,
  524. for his entire life. Now, he's from Ohio and, you know, I don't know him super well. So maybe
  525. I'm underrating something. But Vivek comes across as slick, um, whereas Trump came off as, like,
  526. much more smooth. And, you know, where people might resonate with Vivek on a policy issue, or
  527. they're glad that he's fighting for them. I wonder if there's going to be enough emotional
  528. resonance. But last thing I'll say about Vivek is he's just so interesting because he didn't need to
  529. do this. Like, he could have just been a successful entrepreneur and investor. Like, you see Joe
  530. Lonsdale, you see Peter Thiel, right? Like, you know, these people don't get into politics directly.
  531. They just get into it indirectly, right? They start companies that sell to the government and
  532. solve the problems. They get their friends in office. You know, people they think are
  533. credible, um, in, you know, in the case of Peter Thiel and JD Vance. Um, and they start, like, policy
  534. orgs in the case of Joe Lonsdale. Vivek could have done that. But Vivek instead decided to be a
  535. culture warrior himself. Now, he's so wealthy. He doesn't need to like make more money. But it's
  536. interesting to see in the next few years which path is going to have the most success. And it does bring it
  537. full circle. Like, Vivek is optimizing for distribution first. Like, he's trying to build
  538. almost like a Tucker Carlson level audience. And then what's he going to do with that? Like, mate,
  539. you know, so I don't think he wants to like, this is what's interesting, right? Like, I don't think
  540. Vivek, like, I don't, I don't know him personally. Um, but like, you can kind of see Vivek. You as
  541. like the Andrew Yang of 2024. Right? Like, someone who's obviously way too smart for his party is
  542. way less is way too like nonpartisan for his party. Um, and this isn't to say that Vivek isn't, you
  543. know, like not a real Republican or something like that. I think he's like Republican, but he's
nowhere near — like, he said he's not that extreme; like, he doesn't want to lean into kind of, like, the culture war fights. I think he's much more technocratic than that, just from what he said publicly, right? And on podcasts and stuff like that. No, I think Vivek is upping his tone. So — I mean, on his main points — I think there are some ways in which he overlaps with Andrew Yang: both, you know, very, very smart, and I think it's actually key that they're both Asian — you know, they're not white or Black. That was Wesley Yang's, you know, reason
why Andrew Yang could win, because he could be a unifier. I think there's some overlap — they're both entrepreneurs and, you know, respected by entrepreneurs. But I think there is a key difference. I think Andrew Yang did have more of a man-of-the-people vibe, and Andrew Yang, to your point, was nonpartisan. I mean, Vivek is, you know, trying to, like, end affirmative action. He's trying to, you know, end the 'climate religion.' Like, Vivek is becoming very aggressive in his tone on certain issues, even to the right of other, you know, right-wing
  556. people, now he's not going to say the election was stolen. He's not going to say, you know, like ban
all immigration. Like, he's too smart for certain dumb issues. But where the midwit meme, like, makes sense — you know, and maybe it's not affirmative action, maybe it's on some other things — like, Vivek is going all in. So, um, in some ways, that seems like a key difference. Right. Like, I don't think — you know, practically, this will happen if it's not done by, like, a Supreme Court ruling, because I think there's a non-zero chance that it's done by a Supreme Court ruling. But I think there's just so much energy. Whoever is elected as the next candidate — even Trump, as kind of, you know, incompetent as I think Trump is — I think, here's the funny thing, right? Like, the Republicans talk about, you know, the deep state, and the Republican version of the deep state, whatever that is, is actually, I think, kind of the hero in this circumstance, in that there's just enough — like, there's a critical mass of think tankers and staffers on the Republican side — that affirmative action is going to get severely curtailed. Like, it's not, like, literally inevitable, but, you know, I would put it very high in the probability distribution. So, yeah, the vector is kind of
like, it's interesting, because I think, first of all, he's almost certainly read Yarvin, and he's read Burnham, right? I think he actually referenced Burnham in some podcast interview, although I might be misremembering that. Yeah, he knows, you know, he knows it's not really a democracy. So, like, why are you running for president if you know it's not really a democracy? Because there are people in think tanks, because there are policy staffers, congressmen, senators who'll notice me, because I'm running for president and I'm, you know, actually having some rise in the polling — I think that's the
  578. answer. Maybe I'm projecting a little too much, once again, I don't know Vivek personally, but like
that's my best guess. Well — I don't think he's actually trying to win the presidency. Sorry, go on. Well, yeah, it's interesting, because Andrew Yang, it's like, okay, you
  581. run for president, and then, you know, you lose, and then you're a media entrepreneur, or, you know,
  582. a media person, and that's a pretty nice life. But I mean, you know, Andrew Yang was an accomplished
entrepreneur in his own right, but Vivek was on another level of accomplishment. And so, like, you know, Joe Lonsdale or Peter Thiel — I mean, Vivek is very smart and accomplished; I'd put him in maybe a similar camp if Vivek had kept on the tech path for another two decades. Like, they don't
  586. want to be media people. So meaning like if Vivek loses, like having Andrew Yang's career is, is
  587. probably not what he wants. And so I think he thinks he's going to, well, I think he thinks he wants
  588. to win. And I think he thinks he's maybe going to win at some point in the future. You know, not
  589. now, but maybe, you know, eight years from now, 12 years from now, I don't know. But I think he's
  590. going for it. And to your point, I mean, it's an example of a really brilliant person saying, hey,
I'm just going to take a chance at reforming this party. And I mean, I tend to sympathize with the critique — it's kind of a horseshoe theory of why no one should want Trump in office. Because if you hate him — if you hate what he stands for — you certainly don't want him in office. And if you actually are sympathetic with
  594. stands for, you certainly don't want it in office. And if you actually like are sympathetic with
  595. what he stands for, well, it seems like if there's any repeat of 2016-2020, you're just going to
  596. like lose on all the issues that you care about. And so to the extent that there's like some grand
  597. unification in, you know, not wanting Trump to win, then, you know, the more smart people who
challenge him, the better. I think a bet on Trump is, like — well, this is not originally my argument. I'm not sure if the people who made this argument to me would want me to give their names. But, yeah, some pretty, like, new-right people basically said that a bet on Trump is, you know, basically a bet on, like, the new-right deep state, right? It's basically a bet that, you know, Trump knows who his friends are, and that those people are going to be the people who actually care about political power. Um, where were they in 2016 or 2018? Yeah.
Yeah. Like, this whole movement escalated after that. Like, I don't actually believe this, right? I think the best predictor of future action is past action. But, you know, the probability of — I call this the 'this time we'll get it right' argument — I don't think it's, like, zero. I think it's low, but I don't think it's zero. But at least, you know, that's the consideration that those people would make. Yeah. It is
interesting. I mean, to support Trump, even in the most charitable view, there's such a sacrifice in terms of status, in terms of being seen in good standing with employers. Like, have you ever read 'The Flight 93 Election' by Michael Anton? I have, a long time ago. The punchline was basically that, like, California is dying and, like, this is all the marbles — like, if Republicans don't win, it's just so over, as they say. Right, right. The idea is that with, yeah, basically with the left-wing kind of racial scapegoating that they are basically planning to undertake. Yeah. And I think this is actually — like, I was
  616. skeptical of the take then, but I think it's actually kind of aged better. But the idea that, you
know, like, if they don't stop Trump, then many previously neutral elements of government, of private life, of — I mean, certainly academia at this point, right, that part I think is undeniable — will be weaponized permanently against conservatives. Um, I
  620. think life for conservatives has gotten better from 2020 to 2023 than it has been from 2016 to 2020.
Now, not in all areas — like what they're trying to do in K through 12, or with China. Like, you know, the left had a ton of soft power and, like, hidden hard power. And now they're losing a lot of soft power, and they have, you know, more hard power, but it's constrained. And, yeah, I mean, they've been losing a lot of the cultural sort of momentum that certain movements had. And the lack of a boogeyman on the right has allowed the more moderate left people to have more power — hence, like, Bidenism. Um, so, I mean, that's not—
Yeah. Yeah. Like, my counterargument to the Michael Anton thing is that Trump won and it still happened. Yeah. Right. Exactly. Like, it happened even more — like, you can't do counterfactuals, but yeah. Yeah. Like, there was a kind of Flight 93 — like, there was a flight — but, you know, Trump did not successfully storm the cockpit. It didn't work. It was not a solution. Yeah. That would be — I still think that's kind of the best counterargument, but that was not the argument
  633. back then. Right. Yeah. Or like, some people for sure believe that, I don't know. But most people
were saying, you know, like, this wouldn't actually happen — like, that was the line of attack on Anton. And I think in hindsight, it's funny: I think in hindsight, you know, Trump was both much more incompetent than we expected, and Anton was more right than we expected, at least on the half that was, like, you know, this is going to happen — not necessarily that Trump would succeed in stopping it. But, you know, to his credit, he says that, right? That's, like, a crucial point of the essay: we don't know whether it would work. Um, and it didn't. Um, yeah. I'm intrigued by the question that you've been asking, which is, like, how do you get really talented people into government? How do you redirect a lot of talent, you know, to serve? And, you know, you have to make it high status for them to — No, not high status. Not high status — a high, um, like, work environment. Like, here's the thing: I have this quote that I've said on Twitter a few times, but I don't think I've said on my podcast, which is: the free market selects against the free market. Um, because when you have a free market, it creates just, like, such awesome
talent — or such awesome opportunities for talent — that, like, I know, you know, the most brilliant — I had a math Olympiad, computer science Olympiad background — like, all the most brilliant people are, you know, working in tech now, right? They're working technical jobs. Yep. They're not involved in politics. They're not gaining any sort of power or connection or network. They're working on important technical problems. You know, many of them — there's an exceptional number of them at OpenAI specifically, you know, on the technical teams. And in fact, you can even argue that they've contributed a lot more to society by doing so. But it means that those are exactly the people who are not vying for power. Yep. Um, and of course, you know, in order to keep this prosperous system that
allows them to work so happily running, you have to — you know, I'm sure there were a lot of brilliant nuclear engineers who, upon the construction of the Nuclear Regulatory Commission, were condemned to basically being irrelevant, or basically working in purely theoretical physics and never applying it, right? Like, that is a possible path for machine learning today, right? And so, me specifically, I've had to make this decision, right? Like, do I want to make money, or do I want to — like, this specific fact, the fact that I just talked about, about the free market selecting against the free market, is specifically the reason why I'm at least mostly convinced that I do want to do something involving policy in the future. Well, I'll sort of state it a bit, you know, crudely, or simplified, which is: if you have two options, and one of them is work
at OpenAI, work at big tech, whatever, you know — or the other one is to kind of work, to use your words, like, against the regime. Like, one option can help make you rich, and is seen as, you know, more noble or something by a number of, you know, college grads, which a lot of people care about, and kind of by more people in, at least, society. The other path has probably less economic upside — or certainly less economic upside. And, depending on which circles they're in, certainly, you know, perhaps less approval from, like, a certain mainstream, elite person. But what it does have is, you know, a certain integrity — intellectual integrity — to it, to the extent that they are truth-seeking. And so — I think OpenAI engineers are really truth-seeking, though. I don't think that's an advantage. But they're selectively truth-seeking, right? Sure. Now, like, they can be honest about their political beliefs a lot of the time too. Like, I don't know, maybe if they were very, very based, you know, they wouldn't be. Yeah. I mean, most of them are, like, normie, apolitical — you know, like, a lot of them support Andrew Yang. Yeah. Um, or they support, you know, like, Nikki Haley or whatever, right? Like, Chamath supports Nikki Haley, right? Like, all of them are kind of, like, apolitical or, like, centrists or, like, center-libertarians. I guess what I'm saying is that in order
  682. to get people on this path, because it's not going to be as economically competitive, you have to
compete on some other axis. And I think, like, let's just say, for example, that Elon Musk said: hey, you know, I, Elon Musk, have said that we need a balanced government — I tweeted about that; that's why I said you should vote Republican in the last election. And I, Elon, acknowledge that the Republican party is totally broken — hey, Democrats, you agree with that too — and we need to fix it. And so I, Elon, am going to have, like, a Thiel Fellowship for policy. And it's going to be the Elon Fellowship. And 20 people are going to get selected, 2,000 are going to apply, and, you know, I'll pay you a decent salary — it's not OpenAI-level salary and equity, but, you know, you'll be fine. And it'll be a two-year thing, a three-year thing. And then I'm going to help you start a company or something. A program like that would get people flooding in in droves, because they're not sacrificing their, like, career upside — and that's, like, long-term status upside. In fact, maybe they're accelerating it. And then you would get people applying who aren't even truth-seeking — like, they're actually just, like, careerists, you know, climbing — and maybe they're so competent that you actually want them there. Um,
and so I feel like that's the kind of direction that, you know, things need to move in if you want to, like, shape where people go. And I think the same thing applies in higher ed too — I know you wanted to chat about that — which is, like, you can have something like UATX, which I know you, you know, went to — or, you know, you attended some summer class or something — and which I really admire. You know, I'm friends with Joe and Bari and these people. But, like, I think, you know, you're not going to get people who are, you know, avoiding Harvard or Stanford and picking UATX over them, because it's
  703. sacrificing career upside. And the best people aren't going to do that or don't want to do that.
  704. There would be some who are so, you know, intellectually pure that they're willing to sacrifice some
upside. And, you know, UATX — I talked to Joe about this. He said he's going to, you know, make it such that they're not sacrificing, by getting companies like Tesla and SpaceX to, you know, guarantee jobs and stuff like that. So you can work with that. But, like, an organization that has a much bigger chance of competing with Harvard and Stanford is the Thiel Fellowship. Because that has more prestige: one, because it borrows Thiel's name, and it's just extremely selective, and they advertise how selective they are in the same way that Harvard does; and then two, because they have a track record — like, the people who've gone through it are incredible. Now, if Elon tomorrow similarly said, hey, we're starting the Elon, um,
you know, degree, and it's competing with Harvard and Stanford, we have 2,000 people, and we have signed up all these companies that have promised to hire from here — I think he too could compete. But you need to borrow some level of prestige, either from individuals or from corporations, and use that to create a very selective program that gets the best people and, most importantly, excludes everybody else, so that people are making the Pareto-optimal decision, non-ideologically, when they're choosing between Harvard and whatever competitor. And until you're creating something that is just long-term better for them on the, you know, career and status and prestige axis, you're not going to compete for the most prestigious — sort of, you know, for the students that are chasing the most prestige, i.e., the most status-seeking and often most competent and talented. Right. Yeah, it has to be kind of philanthropic, but that's how the kind of incentive problem gets squared. Yeah, I think that makes sense. It doesn't — it doesn't have to be philanthropic. Like, Elon could be doing it as a for-profit venture, like he could
  726. charge $40,000, like he could charge the same price. It just, it has to be extremely selective and
  727. has to be associated with some level of prestige, right? Like, we have university competitors, UATX,
Minerva — which I think is trying to do this to some degree; they had Benchmark fund them, they have some Silicon Valley pedigree. You know, there's the — the university... Is that an Austen Allred thing? No, he's doing Lambda. Um, okay. Which is interesting. I mean, he's not going after the most brilliant students. He's going after people who are trying
  732. to learn how to code. Yeah, like the marginal software engineer. Yeah. Yeah. But, but he did a good
  733. job of borrowing prestige in order to become the definitive bootcamp. Um, but, I mean, I think we're
  734. doing such a bad job in general. If we care about getting the top students, or even like,
directionally, the top students out of Harvard or Stanford over the next 10 years — like, it doesn't seem like we're making any real progress toward doing that. And you ask Silicon Valley, like, what are they doing about that? They're like, well, we think the education model is broken, so we're just going to wait until the next innovation, you know, happens — the next, like, platform shift or something — because we don't want to do all the dirty work of creating a new university. And this is why, actually, Thiel looked into this. He looked into, you know, creating a university — like, a real university that offered degrees and that, you know, competed head to head. And he said it would just become a copy. It's just too much — whatever — he didn't want to compete in the same way. And I think that's a shame, because the Thiel Fellowship has had a tremendous
  744. impact for the people that went through it. It had a tremendous impact in terms of shifting the
conversation. Like, when I was in college in 2008, 2010, saying that college was sort of, like, a joke or a cartel or all these things was pretty controversial. Like, Thiel's fellowship was extremely controversial. And now it's, like, fact — you know, even people who previously would have thought it was a joke or, you know, beyond the pale now agree with it. And yet there's, like, zero change. And so he's definitely shifted the conversation. But I think we need people to really understand how these institutions are competed with. And it's not just through, like, a better product — meaning a better education, or a better network, necessarily — it's by the whole package. And that whole package
includes a more prestigious option. Right. I think, like, a big problem here, honestly, with universities specifically, is that if you're trying to basically convince people to, like, allocate their resources inefficiently, you have to do one of two things, right? You have to either pay them the requisite amount — so that they're being compensated for misallocating their resources, and then it becomes actually optimal — or, you know, you have to convince them to dislike the market for some reason. Or, you know, not actually to dislike it, but to say, like, okay, here, I'm going to do this instead. And, like, I am still, you know, I am still pro-market in almost all areas. It's just that you need some people to exit the market — or at least for that not to be the main thing that they're working on — in order to actually defend it in the first place, right? Like, you know, we could have taken even just, like, 20% of the top nuclear engineering talent, and if we could have gotten them to basically work in DC and make sure that they stop banning nuclear technologies, then we would have a much more thriving nuclear engineering sector. So, like, yeah, this is the problem, you know: the free market does not select for the free market existing. Let me give an interesting example. And, like, the biggest problem is that in the policy space, right, because of a selection effect, left-wingers have an intrinsic advantage — like, people who hate markets just have an intrinsic advantage — because if you hate markets, where are you going to go work? Sorry, go on. There was a famous conversation that
  771. happened with a very ambitious, um, very smart person and someone high up in effective altruism
  772. where he said, hey, I'm really interested in effective altruism. How can I have the biggest impact
on effective altruism? And the person high up — it might have been Will MacAskill, I don't know who it was — said: get rich, and come back and give some money, and make a big impact that way. And that person, very famously, was Sam Bankman-Fried. And he followed the formula and it worked — short of, you know, the massive, unparalleled global fraud — but he transformed the EA community for a few years, and the community's prospects of making a difference. And so, you know, if you're thinking about, hey, how do I maximize my impact over the next 20 years, or whatever period of time you want to optimize your impact over, you're probably asking yourself the question, yeah: do I go into policy, or do I get rich first, be successful first? And it seems that, unfortunately,
  781. the way the world works is like, once you're really successful, once you have a successful startup
or you're a successful investor or whatever it is that you do, you not only have money that can then, you know, direct other people's time and start organizations and stuff like that, but you also have
  784. a level of credibility and prestige that can even further shape where, you know, labor and capital
  785. goes. And so, now, I don't know how you think about it for yourself, but that's one framework of
thinking about it. Yeah, I think that makes sense. The counterpoint — there's this quote, right: there are, like, personal billionaires and then there are, I forget what it was, like, manager billionaires or something like that, right? Like, the idea is, the city of San Francisco has, like, billions of dollars being managed. Obviously states have more than billions of dollars being managed. And that is all being directed, and a lot of it is just under the discretion of whoever is the executive, right? Like, just what can be done with executive orders, right? Obviously, becoming president is quite difficult. But you have all these positions which are, you know, in effect billionaires — people who control billions of dollars — and in fact maybe have even more discretion, or at least are not punished in spending that money in the same way that you would be punished, for example, for selling stock, where the stock price would drop. So, like, this is true in some cases, I'm sure of that, right? It might have been true, you know, if Sam Bankman-Fried was running, you know, an actual profitable company — it might have been the case that you would have done more by doing that instead of by, you know,
  799. influencing some kind of policy. I think it's definitely true if you want to influence the
Democratic Party, because of, once again, how just anti-intelligence they are — both in terms of, like, denying it and, I think, like, this is the reason, right? Like, I think there is an actual kind of virtue ethics here, where, like, the lack of valuing intelligence philosophically leads to a lack of valuing intelligence practically. Yeah. Let me counter
what I just said, which is: every movement — and I do believe this — you know, EA needed a Will MacAskill — and I might be pronouncing Will's name wrong, but it needed a Will — and it needed a Dustin Moskovitz and a Sam Bankman-Fried. Like, you need the actual capital, and you need kind of moral or intellectual capital. And Will was an idea entrepreneur, and he was an amazing aggregator of talent and capital, and Will could only do that by having spent, you know, a decade or more really immersed in ideas, and also really immersed in kind of, you know, local community organizing, so to speak, or some of this political work, even. So Will was, you know, very successful in terms of his, like, movement impact — more so than he would have been if he had tried to be an entrepreneur, because it's so hard to be an SBF or a Dustin Moskovitz; there's a lot of luck involved. At the same time, you know, it's hard to be a Will too, but
if people have that kind of, you know, unique prowess with ideas — and I remember having a conversation with Balaji as he was thinking about his next thing, because Balaji is both an idea entrepreneur and an actual entrepreneur, and I think one thing he said to me, which I don't think reveals anything in confidence, is: you know, I, Balaji, am a competent entrepreneur — I helped start Counsyl, which sold for $300 million; I was CTO of Coinbase — but there are lots of companies and entrepreneurs out there, and I, Balaji, you know, am not Elon Musk. But in terms of idea entrepreneurs within technology, there actually are not that many. And so, yes, it is interesting — and Balaji moved to Singapore — well, I'll take that out, sorry, Balaji — you know, wait, I think you said that publicly. Okay, so maybe we can leave it in — but Balaji left the country. I say that to say that he hasn't seen most people in person since COVID, and yet he's become way more influential than he was prior, and that's based on ideas, that's based on, you know, publishing The Network State, that's based on, you know, wading into the discourse. And so, if
  826. you're good at ideas, there's a lot of power in that, so I don't mean to undermine that, but one has
  827. to go all in, and one has to, you know, put their time in. Now, when you talk about doing policy
stuff, you just have to find the right medium — and maybe it is podcasting and newsletters plus, like, some version of community organizing or movement building — but, you know, those are a couple examples of people who've done that well. Which puts Dwarkesh in the same position, right? Like, he's also brilliant. He came on your podcast — for the audience who doesn't know, they should check that out. And he can start a company, or he can keep pursuing
  833. his intellectual work, which, like yours, seems to be really resonating, and seems to be pretty
  834. differentiated and pretty novel. I mean, you guys are both in your early 20s, and some of the best
idea entrepreneurs in Silicon Valley already, which just shows the opportunity. Right, that's interesting, because I think that phrase is really good, because, yeah — okay, maybe I shouldn't, I won't mention who I was talking to, but I was talking to someone, and I basically said, like, I consider myself to be a very poor writer, and, like, not an amazing podcaster either, in terms of, like, charisma, and in terms of really creating a kind of feeling of, like, relatability or interest. I think I've just kind of, like, speculated — I've kind of, like, bet on ideas that have become much bigger. Right? And that's been the lane of, like, okay, if you come to — you know, if you're listening to the From the New World podcast, if you're subscribed to the newsletter — you won't get, like, the most compelling paragraph about a new idea that will matter a lot to you in, like, five years, or at least will matter a lot to a lot of people in five years, but you will get that idea, right? You will get that idea in some form,
and I think that is the draw, I think, for a lot of people, including people who have talked to you about it. Um, right. I mean, just to spend another minute on that — if you look at Richard Hanania's work, like, what is he most known for, what is his great intellectual contribution? From my perspective, it's a few — and, you know, he's got books and stuff, so I don't mean to undermine them — but it's a few blog posts. It's, you know, wokeness is civil rights law. It's, um, the, you know, liberals-versus-conservatives analysis — 'Liberals Read, Conservatives Watch TV.' Yeah, it's a classic. It's some of the stuff on gender in terms of free speech — you know, like, how gender impacts organizations; I'm being a little vague, but you can go read it. And then also, yeah, I think the headline on that was something like, uh, the free marketplace of ideas favors women's tears. Yes.
Um, and then, you know, separate from that is just kind of his sense of humor on Twitter — or his antics on Twitter, as some might say. Um, and, you know, I think one question I have for you is, like, are you also someone who's going to write kind of seminal, you know, blog posts that will explain a concept that people didn't know how to understand? I mean, I think you were the only one on the wokeness-and-AI front, for example. You know, this woman Renée — I forget the last name, DiResta — has built a whole career on kind of, like, riding the, you know, misinformation wave. You know, she's from the other side, of course — on the left, via the social media stuff. But the sense is that AI and censorship are going to be a major issue, and it's going to require someone really technical to figure that out, who also understands some of the politics stuff. So, I mean, that's an interesting angle. Um, yeah, I think it's interesting just to think about — we don't need to spend too much time on it — but if you, you know, really take the public intellectual path, just, like, what is the way to break out in a way that, you know, Richard and a couple of these others have not done? You know — yeah, I think
like, I don't know — that's been an interesting thing for me. Um, because, you know, I really am kind of a true believer in elite theory, or public choice — you know, this is a point that I've made a lot, right? They're pretty similar; it's almost the same thing. But, yeah, a real believer in elite theory, a real believer that it matters — kind of like, you found out about me by listening to this podcast, right? And, like, I think there are many such cases. Once again, to go back to the beginning, I do think there are trade-offs. I think there are pretty strong trade-offs between public and private appeal — like, public and elite appeal, right? I think that, you know, if I had basically, like, a clickbait thumbnail and headline on every single one of my episodes, right — like, this was something that I actually discussed with a different podcaster; he suggested, you know, putting in clips of, like, news articles or whatever, right? And I think, from a pure kind of growth perspective, that makes sense. That's true, right? And it would make sense, you know, not to do, like, four-hour podcasts. But on the other hand, I think that it's actually crucial that you have some of these signals that basically say, like, actually, you know, this is not a normal podcast. Yeah. Right? Like, I think that that kind of, like, counter-signalling actually really matters. I think it kind of matters — or I think it matters a lot — that, like, the From the New World podcast rarely touches on kind of, like, first-order culture war issues, and preferentially touches on, like, second-order culture war issues, right? Like, a good example of a second-order cultural issue
is, like, Richard Hanania's writing on, like, affirmative action, right? Like, not just saying, you know, oh, they're going after your kids or whatever, right? But saying — and this doesn't even, like, necessarily mean you have an underlying policy disagreement, right — it's like: here is why, you know, affirmative action is so influential in each of these companies; it is due to these specific laws, and you should repeal these specific laws, right? Like, I think that,
  896. that's like, both proper context in terms of, like, what I actually want to do, right? Like, in
  897. terms of what I actually care about and want to focus on, but it's also like, the proper context
for a kind of, like, recruiting — or, like, yeah, we can go with recruiting, right? Like, attracting — that's a better word — attracting a kind of very interested audience, who's just much more likely to actually do things in the future. Yeah, it's interesting. I think there are trade-offs in everything. Um, and I think that path that you just outlined, it's kind of
like, you know — in a different way — but, like, the Curtis path, right? Like, he is, you know, sort of undesirable to a normie audience, and that attracts a certain level of die-hard followership. And, you know, he's not the person they'd name, but his ideas have influenced, you know, someone like a Vivek, or someone like a Thiel, or even Elon, even if indirectly — and that is impact, and that is power. At the same time, there's always a question of: did this person succeed because of the decisions they made, or in spite of certain decisions they made? And, you know, someone like Balaji, maybe, is a bit of both, in that, you know, he's both an idea entrepreneur and kind of, like, a good actual entrepreneur slash, like, community organizer — or, like, yeah, he was elite much more before he was, like, publicly well known. Yes. And, um, I guess what I'm saying is,
it's a potential cope, this idea that, you know, I can't have my ideas spread widely because it would sacrifice some of the main points of why I even do this. Because, like we were talking about with All-In before, some things are able to be mainstream and also appeal to elites. Now, you know, it's certainly watered down — although in their case I actually don't think they're watering it down — but it's watered down relative to, like, a purist, or someone, you know, who spends all their time thinking about certain things. And, you know, some people, because they don't want to be seen with the normies, sort of turn their nose up at it, but it certainly has a bigger impact. I mean, I think you need everything, but what I was saying is, yeah, I wouldn't rule out a more accessible approach. Like, I don't think Richard Hanania has sacrificed a ton by, you know, growing his audience 10x in the past, you know, couple of years. And, you know, if Richard Hanania grows his audience 10x again in the next — like, I don't think — now, he might be audience-captured, to your point earlier, in kind of a different kind of way, where instead of catering to his audience he starts to, you know, hate his audience — but I just don't want to rule that out, is what I'm saying. I think there could be successful models in both, you know, deliberately niche ways and also in ways that, you know, transcend that need. Yeah, I think the
trade-off is kind of a lot simpler than that — like, this might have been my fault, that I was miscommunicating it, right? But the practical trade-off is, like, you know, I can go to the DC meetup, or I can write another article, right? And maybe the time scales on that aren't quite right, but it's literally, like, a time trade-off in terms of elite versus public influence. Yeah. Right? Like, they're literally, you know, the same time slot. Yeah. So, like, I'm wondering — like, where do you think the most value is generated? Right — I know you talked about earlier, already, that it's kind of saturation-dependent, right? Whether we have a lot of idea entrepreneurs, or a lot of actual entrepreneurs. Like, to me, this is actually something — like, I think about
  937. this a lot, actually, in terms of just, like, speculating on the kind of, like, talent metagame,
right? Like, something that pushed me — like, people don't know this, the people in my audience don't know this, right? — I was really interested in machine learning in, like, 2018, 2019, right? And so was, like, the rest of the entire, you know, computer science Olympiad scene. And I just saw so many people — especially so many people who I personally respect, who I knew personally, who were just extraordinary people — going into machine learning, and, like, man, do I really want to be, like, the n-plus-first machine learning engineer? Like, how impactful — how much will that actually matter, as opposed to, like, doing literally anything else, right? And, you know, I've kind of returned to that indirectly over
the past year or so. But I think that philosophy still kind of applies, right? Yes. Now, applying that to here — I think, maybe this is a controversial take, but I think the current environment of public intellectuals is very good. Like, there are a lot of very good public intellectuals — like Balaji, like Richard, like Scott Alexander, and, like, Curtis — of all kinds of differing ideologies. And, in my experience, the quality of, like, a well-known public intellectual is higher than the quality of, like, a DC lobbyist. I'll put in Ezra Klein and Noah Smith just to get some more diversity. Sure, yeah — I endorse that. Here's the way I would look at it.
  954. I mean, there's a few different dimensions of the question. Because in some ways, yeah, there are a
  955. lot of great entrepreneurs. There are a lot of great idea entrepreneurs. And yet, at the same time,
there's, you know, there's a shortage — like, there's only one Elon Musk, or, you know, there's only, like, a few people on that level. And there's only a few, you know, sort of Tyler Cowens or Richard Hananias. And so at the top level, you know, you could always have more. So a question I'd ask — there are a few questions. One is, like, wherever there's more interest, there's just going to be more, like, desire to put in the work to get really great. I think where you are, and where many really talented people are, is, like, you know, they have the struggle of having choice. It's not obvious to them what they should do. Because if you said, hey, I'm
  963. going to focus on making as much money as possible, and you applied your brain to that, you'd
  964. probably be pretty successful. You might be extremely successful. And similarly, if you said, hey, I
  965. am going to apply myself 100% to idea innovation, or, you know, public intellectual life. You know,
  966. you yourself could, could become a Richard within a few years, right? Like, Richard came out of
nowhere. Like, Richard before COVID was not on anyone's radar, right? Like, Richard really, you know, rose up pretty fast relative to someone like a Tyler Cowen or something. And so, you're
  969. cursed by that, by the optionality, but the problem there is if you don't go all in on one, you
might not have success in either, right? Because they require intense focus. Now — so interest matters, because it's going to determine how hard you work. But if you assumed, for all intents and purposes, interest was equal, and you just said, hey, let's say there are two universes, two worlds: one in which I spend the next five years — and even after five years you'd, you know, be in your late 20s or whatever, like, you'd still have plenty of time — but the next five years, either focused on entrepreneurship, or, you know, getting wealthy, or on idea work. And you just kind of sketched out what that could look like. And, you know, you know your skills and opportunities better than I do. And if either of them looks like you can make more progress, you can go with that one and add the other later. Like, let's say, for example, you in the next five years have the
level of success that Richard has today — or maybe the level Balaji has, but, you know, putting aside his entrepreneurial accomplishments or something. You're seen as, like, one of the great public intellectuals, as, like, a leading voice on issues that matter. And you have an audience and you have distribution and you have respect, and brilliant people follow you. Well, at that point, you can do a number of things. Certainly you can, you know, start a media organization or have a successful career via media. But just as I was saying earlier in this conversation, distribution is a wedge into other things. Like, if you're also technical and you recruit technical people, well, you know, then you can co-found something, or invest — like, there's a lot of people who use media to become investors, and they do use their distribution. You know, Roon is an example, right? Like, someone who built an audience on Twitter and has leveraged that to get some influence, such that some people — and I don't know, maybe it's an exaggeration — but he's just, yeah.
And for the audience — you might not know who Roon is. We did an episode with him, the fourth episode of this entire podcast, and that will be linked as well. All right. Keep going. Sure. So I think — and I would say the same thing to Dwarkesh — it's like, where can you make the quickest traction? Like, what is the quickest path? And, for — let's say I'm talking to Dwarkesh — it's like, hey, the podcast is, you know, taking off. Like, what if you went all in — like, how far could you go with the podcast? And then it's like, okay,
what are adjacent things you could do from there? It's like — I mean, Tyler Cowen: if Tyler was, you know, younger in his career and more ambitious — or more commercially ambitious — he could start a fund as well. He could — I mean, like, he could be a big-time investor. He has the deep respect of the Valley and of people who matter. He could certainly — I mean, he already does it with his grants. He's a proven talent attractor and selector.
  1001. You know, VCs do get rich if they're successful, and he could be a very successful VC. Because of
his ideas, he's just choosing not to. And so I think it's really the combination of where you're most interested and where you think you can make the most traction. And the thing with you is, you already have some momentum in the idea space. So if you're like, hey, I could really go all out on this for the next few years and really make some traction — whereas if you evaluate the get-wealthy path and it's like, I don't really see a path, or it's not obvious, like, I'd have to, you know, scrounge for a while... You know, that said, I could have said the same thing to, let's say, SBF — you know, before he started FTX or whatever, he had a blog, and that blog was doing pretty well. I could have been saying, hey, why don't you take this blog further? And he, you know, kind of clinically identified, you know, a few opportunities to make money, like misallocations, you know, or just, like, arbitrage opportunities. And now, if SBF had been given that advice in 2004, or some other time period, maybe we would never be talking about SBF, because he would have tried some, you know, other internet thing in which he had no strategic advantage, and there wasn't a way to get $30 billion in three years, because before crypto, there was no way. So timing really matters, too, in terms of, like, where really is the arbitrage opportunity. And so, I mean, the comforting, but also,
  1017. you know, somewhat challenging, you know, TLDR on this is like, it seems like you can be successful
  1018. in either path. And it seems like either path could lead to the other. And so, it's really just a
combination of, like, your assessment of your skills, in terms of where you think you could have the highest leverage, where you're most distinct — and that's on a more granular level, because I haven't worked with you, but, you know, on the surface, you seem like you could do both. So: your assessment of your own talents, your own interests, and then, you know, your assessment of the market opportunities, and timing, in
  1024. correspondence with those. Right. How do you react to that? I think I just believe in base rates too
much, right? Like, here's the case for it, right? So, yeah, technical development — or, like, I kind of separate that off a little bit, right? Basically: frontier tech research, some other kind of entrepreneurship, media, or kind of, like, insider politics. Like, what is the correct ratio of allocation of, like, top-level talent between those, right? And to me, just kind of at the population level, there is a lot of allocation into technical development and entrepreneurship. And, yeah, once again, going back to the quote, right? Like, the potential — it's kind of strange as well, like, the things that you would do on the policy side. Maybe this is also another thing that makes it kind of, like, harder to motivate high-level people, right? If you think that government is intrinsically, or, like, on average — right, if you think government is on average bad — then you know that you're playing a defensive game, right? Like, my goal with, like, the future, you know, machine learning policy think tank is, like: if literally nothing happens, right, we would be, like, celebrating. If, like, no AI regulation happens in the next five years, we will, like, be partying. We will be, you know, we'll be like, this is extraordinarily successful. We have accomplished everything we wanted to do.
Right? And in terms of motivation, I have to admit that, like, that is, you know, a difficult motivator. I've been going through this right now — I've been looking for people to recruit. And, you know, especially if you're someone who has the ability to either do frontier-level research or to be an entrepreneur, right? Like, it's not too appealing, right? I don't think it's necessarily — it's less a status thing than that it is just intrinsically not appealing, right? Like, yeah. So, the question is, right — or, sorry, to finish up on that last point: this just makes, to me at least, my assessment of, like, talent allocation — the base rate, of, like, people who could do both — just, like, significantly misallocated towards the kind of, like, making-money side.
Yeah, like — you know, there's the case that, you know, you want, like, more corruption and more, basically, like, bribery in government, because it, like, allows this to flow more efficiently. I'm not sure the normal version of corruption would actually be successful in incentivizing that, right? But, for example, prediction markets are one way that, like, maybe this becomes better. Yeah, I think that, in the long run,
  1054. I really want to find some way of kind of optimizing the kind of meta-level allocation of talent
between these two areas. But yeah — on the individual level, I think the base rates just make it much more likely that doing something in kind of entrepreneurship or tech is oversaturated. And, yeah, just looking at — I don't know, because, like, even before I had any interest in politics, I was thinking, you know, I'd much rather be a kind of CTO than a CEO, right? I'd much rather be someone who works on the base level of technology. But at the same time, yeah — the thing to weigh is, like, the ratio of what I want to be doing and what I'm motivated to do, combined with talents, against, like, that kind of base-rate misallocation, right? Yeah — the thing is that I'm at, like, 75% confidence — or, sorry, 75% confident — that some kind of policy work is the right idea, but that's still only 75%. I still think thinking about this more would be very valuable — like, literally thinking, you know, what am I going to do for the next five to ten years? Right,
right. Even — let's go with, hypothetically, let's say you're taking the policy work path, and then let's just brainstorm how to do that. I mean, I want to work through that; it's interesting. Are you familiar with Teach for America? I think I remember, like, Andrew Yang talking about it a long time ago. So, Andrew Yang actually started an offshoot, or an organization inspired by Teach for America — it was called Venture for America. Right. Yeah. Teach for America — I'm not surprised that you don't know about it. It used to be much more relevant, like, a decade ago. It's kind of lost its lustre for whatever reason, but when I was in college, I applied to Teach for America, got in, and I was planning on being a teacher there for two years. Interesting. Now, I would have been a special-ed teacher in the Bronx. I have no patience for even amazing people who are underperforming or something — I would have been terrible. And so the question you ask is, how
  1076. did Teach for America convince me to apply, get in, and almost do it? And they convinced a lot of
  1077. people at top schools to do it. It's a combination of, like, talk left, act right. They made it so
prestigious. In turn, it was super selective, so it was a top signal. So the idea was, like, you would go do TFA for two years and then go to, like, Goldman Sachs or whatever, Bain, or — like, it was a career accelerator. And they had all these partnerships, and they got these brilliant people, and then they made them look like heroes. They were like, hey, education's broken — you need to save education. And their marketing, or propaganda, whatever you want to call it, was amazing. And so, like, if you want to shift talent into policy work, like, what's your propaganda, right? Or what is this org's propaganda? Like, it needs — and I think that's
  1085. where, you know, this biologist rate, like, talk left, act right in terms of, like, it needs to be
  1086. seen as, like, more moral and noble and important. And I, you know, you can obviously create that
argument. But then it also needs to be seen, in my opinion, as something that will, like, just be net better for their lives, even if they were non-ideologically motivated or non-morally motivated. And so, that's where partnerships with companies or partnerships with, um, people come in. You know, if I was aiming to do what you wanted to do, I would try to find someone like a Balaji, or someone who's got credibility, and say, hey, can we create this fellowship together? Can we create, you know, X, Y, Z? And, like, let's say your cause was network states or charter cities. Like, Balaji would definitely fund, like, a fellowship or grant program for people. And he's, he's actually doing that, right? And he's, he's, right, right. I'm familiar with CCI. Yeah, he shipped it. CCI is great. Yeah. Yeah. And, and we're glad to have CCI, like, the network state, charter city movement. I mean, it's still super early. But there are, like, hundreds of really talented people working on that who didn't exist prior. So, like, that's, that's a pretty big accomplishment. So, like, um, if you want hundreds of people that didn't exist prior, yeah, I feel like there's a package, there's a bundle of things that, that people need. And, um, you know, being able to explain it to their parents and people, people they went to college with and high school with, like, or even, like, have it on their LinkedIn profile, it needs to be seen as, as prestigious. And I think that's something that people, people who are entering things from a place of truth seeking and integrity and being mission driven, sometimes because they themselves are impervious to some of these prestige and status games relative to others, they don't realize that others aren't as impervious. Right. Right. I think like the biggest, the biggest
  1105. motivator that I've come across so far, both for myself and for like, other like-minded people, I've
  1106. just been workshopping how to tell this story exactly. But, uh, have you ever seen the movie, uh,
20th Century Boys? No, I haven't seen it. Okay. So, like, the plot of the movie is, like, a cult takes over Japan and then eventually the world. And, um, there's a scene in the movie where the cult leader fakes his death and resurrection, or, like, well, technically some other guy who helps play the cult leader is shot instead, but, like, he fakes his resurrection. And, um, what happens is that, like, they're already very famous as the ruling political party of Japan. And, like, the, the citizenry of Japan are, like, packed into the stadium, you know, like, tens of thousands of people. They're all cheering. They put up, like, their, their hand signs, like the cult sign. You know, all the people in the streets are stopping and they're putting up the cult sign. Right. Um, it's just like this feeling of utter doom, of, like, just this complete sentiment of, like, is the world insane?
  1116. Right. I remember like Eric Weinstein talking about a similar, uh, similar experience as well. He
had the, uh, I forget. I'm blanking on the author's name, but he had the author, um, Timur Kuran, uh, Tim, the preference falsification guy. No, no, no, no, uh, of the, of the essay in the New York Times about, uh, not Agnes Callard. Yeah. No, no, uh, this, this was when he read the essay. He read, like, this essay that was written a long time ago, it was, like, on the denial of, um, on the denial of atrocities or something like that. Right. And he had this quote, right, which was something like, um, right there, you've got their attention. Hold them and blow them before, before they shake off, uh, their confusion like a puppy, uh, like a wet puppy, or something like that. Right. I'm forgetting the quote right now, but like, I think that it's
  1125. both true and incredibly powerful to emphasize that that's like, that is the world we live in to
  1126. some degree. Right. And we're going to talk about this later with egalitarianism, but like, it is
  1127. like, it is just true that it's genetically encoded in many people, um, or like, it's an evolved
  1128. pattern of behavior to deny reality in very specific ways that are responsible directly for, you
  1129. know, some of the greatest missteps, the banning of innovative technologies. You can look at
nuclear as, like, the key example here, of just, like, voluntarily creating this poverty and creating this, like, completely unnecessary, um, struggle. And of course, even more in the past, right, with, with communism, um, with, uh, really, like, a long record of these kind of anti-
  1133. prosperity, anti-innovative movements. And to me, like, one really striking example of this was
  1134. GDPR. GDPR is, you know, I tweet this out fairly often, you know, I tweet one of two versions,
  1135. either, you know, the European Union is China with lower IQ, or the European Union is China with
  1136. lower IQ and far worse food. Um, you know, depending on how many people I want to piss off. Um, but
  1137. it is the case. You know, like, controversy about the second part aside, it is the case that the
  1138. European Union is just the less competent version of the Chinese government. The same motivations
  1139. are there. The same motivation for kind of total control, the fear of anything disruptive. It's,
  1140. it's exactly the same kind of psychological pattern. And they are, they have less power specifically
  1141. because they are less competent, um, which, which, you know, might be a good thing in the end, might
  1142. be a good thing, um, especially compared to the circumstances that China had, you know, two or three
years ago. But it is striking, just, especially returning to GDPR, how many people, like, how many, like, normies, you know, like, people who just don't pay attention to politics, how they thought this was, like, a good thing, not realizing that it just crushed, you know, thousands, if not millions, of small businesses, of people who were really on the way up, who were going to have fundamental improvements. And not even just small businesses, right? This was the grounds for Italy banning ChatGPT, like, literally, like, Chinese state behavior. Um, and this is just, you know, this was cheered on, this was celebrated. It's exactly the kind of 20th Century Boys moment, I
  1150. think. The big idea is that like, essentially, you're not living in a world where like the safety of
  1151. your industry is guaranteed. And like empirically, that's been the case, right? It's not like, you
know, it's not 100% of the time that the industry gets regulated out of existence. But it is, like, pretty common, right? If you were a nuclear engineer, you know, in, like, the 50s or 60s, you saw that in, like, real time. And I think that has happened to a few tech people. And that is why, you know, like, as Peter Thiel said, you know, that freedom and democracy are no longer compatible. I don't think that's quite the case. But I do think that the incentive is, is you basically need a lot of people, quite frankly, like people like me, um, who are acting not in their self-interest, in order for freedom and democracy to be compatible. How so? Say that more. Say more about
  1159. that. Because like, right now, acting in my self interest is like starting a tech company and
  1160. becoming, you know, extremely wealthy, right? Certainly, there's a higher chance of becoming
extremely wealthy doing that, even though, like, you know, it's not guaranteed, I might fail for sure. You
  1162. know, I'm keeping that in mind. I'm for sure not like 100% confident that that'll happen. But
  1163. certainly it's a much, much more likely path to wealth than, than, you know, doing machine learning
  1164. policy, right? Like that the incentive is like, the people who hate the market will like go to areas
  1165. outside of the market. And in fact, it will work very hard to crush the market. The people who like
  1166. the market will go into the market and succeed in the market. And so like, like, that's the core
  1167. case, you know, of the free market selecting against the free market. Yes. Yes. So your marketing is
  1168. basically tech needs a defense budget. Tech needs a defense team. Yeah, that's a brilliant way to
  1169. put it. Yeah, yeah, exactly, exactly. And I think that's strong marketing. I think it's interesting
  1170. to look at this in the arc of how Silicon Valley, you know, said broadly, like tech has approached
  1171. politics and kind of its defense in the past. And mostly how it hasn't had to. So let me give a
  1172. brief overview. I mean, basically Silicon Valley in the, you know, late 2000s, you know, with Obama
and the Arab Spring, was the darling of the left. Like, social media had ushered in, you know, all this freedom of speech, which at the time, you know, corresponded with left-wing causes, the Arab Spring being one of the biggest ones. And then of course, you know, Jack Dorsey famously, you know, said stay woke, you know, was a big supporter of Black Lives Matter and DeRay and what was happening in Ferguson in 2014. So, I mean, Silicon Valley was, it was a darling for leftism in the late 2000s, early 2010s. And I saw this transition because, you know, my company Product Hunt is, you know, hyping technology startups, up-and-coming technology startups, and we were a darling. And I saw the mood start to change. And they would use contradictory arguments.
Like, they would say, hey, everything that's on Product Hunt is just a silly app. Like, all these tech people are working on all these silly things that are not important, not serious, and we need, you know, they need to have a bigger impact. But then they would also say at the same time, hey, tech has taken over the world. It's, it's, you know, having a bad impact. It's way too powerful. And then, you know, with Trump, really, like, you know, the perception in many technologists' minds, in many people's minds, elites' minds, is that in the same way social media had elected Obama, it had elected Trump. Now, there were things before Trump that created this rift between Silicon Valley and the DNC, let's just say, that are worth emphasizing, because it wasn't just Trump. It would have
  1189. likely happened regardless. Silicon Valley started to attack traditional American left power
centers, the New York Times, Hollywood. It started to go after academia too. First, it was enabling it, and then it started to replace it, like, Netflix is a very obvious example. It started to undercut their prestige and influence. It pulled away a lot of their top talent. You had people like Larry Summers, Eric Holder, David Plouffe all working for tech companies. It became a more powerful global culture exporter, like, Stanford taking over as the number one school from Harvard, YC becoming like a top school. And, you know, Silicon Valley no longer needed the DNC. They built a network of super wealthy people with an alternate social network and path to power. Rather than working in government, you know, you become a CEO. And so we had this techlash. And what tech did is they responded by apologizing, apologizing, giving money to left-wing causes. They thought that it
would go away. In fact, the critiques got worse and worse. And what happened, I'm fast-forwarding a bunch, but what happened around COVID was you had a contingent of people who said, I'm not apologizing anymore. Actually, I'm, like, directly fighting back. And those were people, to give some examples, like Balaji and like Mike Solana, who early on were saying, hey, like, tech and journalists, while they used to be aligned, they are now, like, two different classes of people. I mean, one, they compete economically, you know, they compete for the same advertising dollars, or they compete for attention. And then two, they're just at odds. And so they were very aggressive. There was this very famous Balaji versus Taylor Lorenz feud, which was very controversial. Mark, was it Mark and Jason? Yeah, he was defending Mark's reputation. And many people within tech were either
critical of people like Balaji or Solana, or were uncertain. But the idea that you would fight back seemed either wrong or uncouth; the people that were attacking tech were doing so in good faith and kind of deserved respect, and actually tech needed to be held accountable by this separate class. And so the idea that tech needed a defense didn't resonate with them. They would say, oh, we're so powerful. Like, we actually, we're too powerful. We need accountability, less so a defense. We need people to attack us. We need people to critique us. It's not attack. It's
  1214. critique. And they're doing so from a place of love as Kara Swisher would say or something. And that
  1215. started to change once a number of CEOs got fired, once a number of regulations started to pass or
threatened to pass, and once San Francisco started to, like, materially deteriorate in a way that was no
longer deniable. And you started to see the ratio significantly change, where people were previously uncertain about supporting someone like Mike Solana, who around COVID maybe had, like, 5,000 Twitter followers. Now he's got, like, 220,000. Now he's fighting back against journalists, against policymakers who are attacking tech. The famous example was Zuck donating 80 million dollars, or whatever amount of millions he donated, to the hospital and then being vilified for it. And people like Mike Solana would go around and say, actually, he's good. Actually, like, it's good he donated 80 million dollars. And also it's good that he invented Facebook. And so you started to have this class of tech defenders. And they did it via media. And they didn't do it for the money, they weren't making a ton of money off it. To your point, they did it outside of the market, but it did support their efforts. I mean, they built an audience off it. You know, Balaji was an investor, Solana had worked for Founders Fund. Now he's
  1227. starting a media company that has raised money. So I think tech appreciates or some elements of tech
  1228. appreciate defense in a way they didn't, you know, 2017, 2016. So I think it's good timing because
  1229. they thought at the time it was uncouth or morally incorrect or something. So it's so interesting.
I'm sorry, go on, go on. Oh, what I would say is, I think, so I think that's a strong marketing push. But then I think it's like you then get into the details of, like, defense from whom and in what area. And if it's on the AI front, you know, I think people need to be more persuaded, I would say, that, you know, woke AI is a big threat relative to just, you know, AI safety in terms of, like, Eliezer Yudkowsky concerns. I think woke AI is, like, less of a threat than, like, them just going after hardware. It's actually pretty similar to, like, what Balaji said, right? It's the pivot from, it's the pivot from, what was it? Wokeism to statism. Yeah. Yeah, like, like, that to me, you know, like, the quote-unquote, you know, disinformation crowd, of course, purveys the worst disinformation out there. They're the ones who are pushing for, essentially, you know, centralized control of machine learning hardware, of essentially TPUs, GPUs, and the like. And that, I think, is the main venue of attack, as well as kind of financial attacks from the FTC. I don't think, I think, I don't think it's quite a distraction, but it's definitely a smaller venue. Wokeness, I'd say, is definitely a smaller venue than these kind of, like, you know, these kind of, like, statist attacks. As, you know, boomer-conservative as it sounds, it's a correct description of what the threat actually
  1244. is. But sorry, go on. And the, what's interesting there is that the group of people that might be
most empathetic to those concerns is actually the crypto slash web3 world, because they've operated since the beginning from an existential fear that the state is going to come down on them. And they are, you know, in many ways competing with state power. So, you know, they know they need a defense. And, you know, to some degree, they've invested in defense, both on the media and on the kind of think tank front. So, I think there are other groups too, but I think it's a strong positioning, and one would just need to get more specific in terms of which, you know, which causes, which segments, and then which methods, right? Because there are media methods, like Mike Solana does, where you just, like, fight fire with fire, and, like, whoever wins the Twitter war, like, wins the elites, basically. Like, just be better on Twitter, like, win the game. And some people do it on Twitter, some people do it on Substack, whatever. And then there are, you know, more policy, you know, ways of making change, as you know and as you're exploring as well. And then, yeah. Right. I mean, like I said, so you had this term, talk left, act right. For the audience, what do
you mean by that? Yeah. Yeah. One second. To describe talk left, act right, you first have to talk about what is left and what is right. And there are a few different ways of defining it. If you define it ideologically, you know, you could use Bryan Caplan's definition that you've used as well, like, the left hates the market, the right hates the left. Or, you know, Michael Malice has this quote: ask a right-wing person if people are equal, they'll say no; ask a left-wing person if people are equal, they'll give you a speech. So it's this idea that, you know, left-wing people favor more equality, and right-wing people favor more hierarchy, or will recognize it. And there are other ways of, you know, slicing it ideologically. You could say the left is all about universalism and all about, you know, universalizing its ideology, and the right is more narrow-minded, more tribalizing. You could say the left is more utopian, in that they believe a better world is possible and thus it's one's duty to make that happen. And the right is perhaps more, you know, constrained, to use Thomas Sowell's, like, constrained versus unconstrained vision
  1269. about what's possible and thus accepting the limitations of what we can actually do. And there's a
  1270. number of ways of slicing it ideologically. But then you say, is it actually an ideology? Because if
you were to say, you know, what does the left believe? And what does the right believe? Like, you know, even as recently as 30 years ago, you might say, oh, you know, the left was anti-immigration, anti-trade, anti-war. And today they seem to be pro all those things, or at least more so in the sense of Ukraine. Like, the ideas flip-flop, you know, and the parties flip-flop on ideas. And so you can say, okay, maybe it's a group of people, and left and right are more about tribal loyalty to that group of people than to a certain set of ideas. That's another way of looking at it. I think there's truth in all these ways of looking at it. But the last way of looking at it, which relates to talk left, act right, is this idea of maybe it's a, well, and before I get to that, like, the cleanest way of thinking about, you know, the group-of-people view is what we saw around COVID with masks, or in COVID in general, how the left flip-flopped so quickly on, you know, whether they were for masks or against masks, et cetera. And this idea that, you know, first it was racist to think that COVID was happening, and then, you know, you were a rube if you didn't think that we had to go and lock down. And so maybe the third way of thinking about it is, like, maybe it's a series of tactics, actually. Like, maybe leftism is a way to sort of rise up within an organization or make change, basically. It's a way of calling for more, you know, respect for the downtrodden, either genuinely or, you know, unwittingly or cynically. But, you know, there's this phrase, of course: if you aren't a liberal when you're young, you have no heart, but if you aren't a conservative when you're older, you have no brain. And part of this can be
  1289. explained by, when you're older, you have more status. You've developed more capital, like actual
  1290. capital, and then career capital, reputation capital, and you have more of a stake in society. But
  1291. when you're young, you don't have that much. And so you want more. And, you know, maybe leftism is
  1292. like a status acquisition tactic, and rightism is a status retention tactic. And so then one has to
ask the question, like, and so the cynical way of describing talk left, act right is basically, like, Harvard, right? Like, Harvard is one of the, you know, most fervent proponents of, you know, diversity, equity, and inclusion, let's just say, or, you know, kind of wokeness, and I'm using Harvard as a metaphor for universities, elite universities. And at the same time, Harvard is the most exclusive place, you know, university in the world, in the sense that anyone in the world who would go to university would pick Harvard as their first university, and Harvard rejects, you know, has, like, the lowest acceptance rate. And they advertise their low acceptance rate. So it's certainly exclusive. And also Harvard, like many universities, has an extreme lack of diversity as it relates to, you know, certain intellectual topics, right? So the talk left, act right is: talk about, sort of, in this case, diversity and inclusion, but then act in a, you know, non-diverse, you know, politically uniform and exclusive way. And, you know, Harvard both has sort of the moral high ground, like, you would think from their language, and the New York Times too, the New York Times advertises itself as, like, the truth. Harvard, you know, advertises itself as, like, a beacon of knowledge, you know, all these amazing things, and, you know, not for the good of itself, and yet it's sitting on, like, a $40 billion, you know, endowment. I mean, there's so many, like, you know, capitalist things about what Harvard is doing. And capitalism in the bad way, like crony capitalism. And there are more
  1309. examples, I shared in my post called the hypocrisy of elites, where the people that were, you know,
  1310. advocating for defund the police more often than not were white people who did not live in high
  1311. crime areas. And so defund the police served as a way for them to signal that they were, you know,
left-wing and thus, you know, more moral and more noble and more caring, but acting right, in the sense that, you know, they, you know, live behind gates or don't live in high-crime areas. And
  1314. you see this actually like in many, many different issues, whether it's about, you know, gifted
  1315. programs in schools or body positivity or, you know, relationships, polyamory, like, or the
  1316. environment, right? Like the people who are spreading the most left-wing, egalitarian messages are
  1317. the wealthiest people who in their own private lives, you know, do send their kids to private
  1318. school. Do work out a ton. Do end up getting married and in monogamous relationships. Do inflict the
largest carbon footprint. And so that is kind of the talk left, act right on an individual level.
  1320. And on an organizational level, it's this idea that you need a mission, you know, the left is, you
  1321. know, one way of saying it is the left is optics, the right is substance. And, you know, if you
  1322. don't have optics that itself is like a lack of substance, like, you need both, right? Like, you
  1323. need a mission that is going to inspire people in a democratic way. And when I say democratic, I
  1324. mean, like, you need mass coordination, right? Certainly to win elections, you need masses to vote.
  1325. And they're likely going to vote for the thing that promises them more stuff or, you know, something
  1326. better. But then also on a corporate level, like, you want to appeal to customers, you want to
  1327. appeal to recruits. And those people need to tell the rest of the world a story about how they are
  1328. making the world a better place. And, you know, saying we're going to make a more efficient
hierarchy, you know, is not as inspiring as we're going to have, you know, equality of opportunity, which of course is a weasel word, because no such thing exists. But it's actually a good example of, you know, that is a left, optical type of thing, but, you know, when done right, it's actually a, you know, a right-wing concept. So that's what I mean by talk left, act right: it's basically appeal to the more reasonable sides of egalitarianism, ones that everyone would get behind, but then also ensure that you are acting in a way that is, you know, going to lead to the most success for your
organization. Right. I think some of that, I think some of that really does kind of show how deep of a hole we're in. Like, do you know who Parrhesia is? Who? Parrhesia? Ives Parr. No. Well, at least that's the name he goes by on the internet. I don't know if that's his actual name. Yeah, he writes this newsletter called Parrhesia. He's kind of in similar circles as Richard Hanania and I, and, like, Ideas Sleep Furiously, this kind of crowd. And he, he, like, calls this circle, like, right-wing rationalism. Right. And his idea, and this is kind of inspired by something Richard Hanania said on my podcast, is that, like, the right wing just needs to focus on, like, factual things that it knows are true. Like, genetic differences and, like, market efficiency. And, like, basically, like, basically just, like, read statistics and, like, and, like, evolutionary psychology. Right. Basically, just, like, pointed facts that, like, the left wing denies. Right. To me, like, this is like
saying, you know, we're going to build an entire movement on that. Like, you know, just imagine the left-wing version of this. Right. The left-wing equivalent of this is, like, the only thing that we are going to run on is that, like, vaccines reduce mortality, and higher carbon emissions are correlated with higher average global temperature. Like, a left wing that is, like, that inert, right, just would not exist. Right. It needs to have the kind of, like, it needs to tell you what to think. It can't, like, it can't just, like, provide evidence. Like, it just would not
  1351. function. That, that to me is like, I mean, like the, the black pill, like, the pessimistic take is
that, like, right-wing rationalism wouldn't work either, right, that right-wing rationalism, you know, like... but at the same time, you know, you know, I'm a con. So, like, there's this broad question as to whether, if you're going to compete with the left, do you do so on leftist terms or tactics, or do you reject the premise entirely, and the leftist tactics? So, what I mean by that is,
let's take math, for example. Like, you know, there, some schools were banning algebra or whatever it is, or, you know, like, restricting people from, like, you know, gifted programs, stuff like that. The left-wing tactic would be to say, no, we need gifted programs, we need to teach kids algebra, because that is the way that people from low-income backgrounds are going to get ahead, and by restricting that, you are getting rid of, what's it called? Like, equality of opportunity. You're reducing, you know, social mobility. That's a left-wing tactic. The right-wing tactic would be to say, actually, like, hierarchy is good. And yeah, people are genetically different, and we should let the, you know, the most brilliant people rise to the top and let the chips fall where they may. It's focusing more on accelerating the top than, you know, bringing up the bottom. Oh, this was like completely different than what I thought you'd meant. We'll put a pin in that, but we can
talk about this right now. I think it's very context dependent. I think a lot of the time as well, it's, like, about salience, right? Like, if you're running an election, you should just draw as much attention to the, to the math topic as possible, because it's a classic wedge issue, right? It unites Republicans, splits Democrats. You know, like, there's not a single Republican who's going to be like, you know, actually, we don't like math, right? But there are actually a lot of Democrats that... you know, like, we mentioned Renée DiResta, right? Renée DiResta supports teaching kids math, right? Yeah, exactly. Yeah. And she does so on leftist grounds. But Richard's idea around, like,
  1373. recognizing genetic differences, I don't think that's going to be very effective. Well, it depends
on what your goals are. But, like, the benefit from a pure tactical perspective, I guess, is you're playing in places where the other side won't play. But that is such a controversial issue, like, it's so anti-leftist. And, you know, we swim in leftist water, we swim in liberal water. And it's, it's, you know, one of our foundational myths is around kind of, you know, moral equality, equality of opportunity, social mobility, the American dream, and genetic differences just have so many implications. Now, I certainly think that, you know, everything should be able to be studied. And we should, like, you know, we shouldn't restrict knowledge as being, you know, beyond the pale, in the way that is happening now. But in terms of an actual, like, platform that is
  1382. going to move people in either the private sector, unless it's, you know, a genetic company, or the
  1383. public sector, I don't see it. Do you see it? What's the argument? Right. Like, as motivation, I
  1384. mean, like, I think to, like, talk about this properly, we kind of have to talk about, like, why
  1385. egalitarianism is motivational. I'll ask you first, like, why do you think egalitarianism motivates
  1386. people? We used to live in societies that were very stratified, and they were deliberately
  1387. stratified. There was very little social mobility. And as a result of that, there was very little
  1388. status anxiety because you knew where you stood. And if you were at a high place, that was because
  1389. you were ordained for that. And if you were at a low place, it wasn't like you failed. And then we
  1390. introduced a much more socially mobile society. And as a result, where you ended up in society was
  1391. up to you and your effort. And so at that point, things became much more high stakes. And so
people's entire concept of self-worth ended up being where they were in society. And so in order to, I'm greatly simplifying, obviously, but in order to make sense of this, or kind of reduce the anxiety that comes with everything being up to you, certain environmental factors were introduced, or emphasized, I should say, so that it's not really up to you, it's up to the environment. That's just easier to take. And so if you're a successful person, you are effectively a threat to people who are not successful, because your success implies that they didn't try hard enough. And so you could say, okay, what we're going to do instead is we're going to introduce genetic or environmental reasons, right? I mentioned the environmental reasons, but, like, if you introduced genetic reasons, well, the problem with that is then how much social mobility really is there? And then you're back in a stratified society, back where you started. Do you want to do that? So, regression to the mean, right? It's not complete, intelligence is not 100% heritable, nor are most traits. Yeah, most people can't really understand the nuance. Most people
think that moral equality should equate with other kinds of equality as well. And the truth is, we do bestow moral significance on people who have been more successful, like, we just celebrate them. We bestow status onto them. Right, but that's because of the genetics denial, right? That's because they believe that, like... this is something that I'm personally very annoyed by, right? Like the conservative, you know, pull-yourself-up-by-the-bootstraps argument, right? Like, that is a form of genetics denial. That is a form of kind of like egalitarianism that, like, people believe; they believe that, like, people are literally created equal.
  1411. Yeah, at the same time, there is a, you know, there are things that are optimal for the individual
and there are things that are optimal for the group, right? And so for society, it might be optimal, or, I'll tell you, maybe not optimal, but it might be more constructive, to, like, not have it super obvious what everyone's IQ is, right? Because let's say, like, there are people who don't have super high IQs who've achieved a lot of things, who've been very accomplished. And if there was a world in which it was known what their IQ was, maybe they just wouldn't have gone for it. And, like, a much more parochial example is entrepreneurship, right? You know, there's this joke: I didn't do this because I thought it was easy, I did it because I thought it would be easy, right? And so, like, entrepreneurs are acting irrationally in many ways. They don't know the likelihood, or if they knew that the likelihood of success was what it actually was, they might not pursue entrepreneurship. And that's what happens when people become more knowledgeable of, of probabilities, they tend to index more, right? Because on an individual level, you'd rather, you know, cap your upside if it caps your
  1424. downside, you know, on average. Now from a societal level, like we benefit from thousands of people
trying to be the next Elon Musk, even if it only means a handful of Elon Musks, because the outliers
  1426. outweigh everything else. And so within entrepreneurship, at least I think there's a benefit to
  1427. certain irrationality, or it's just a lack of understanding of probabilities because, you know, the
  1428. outlier benefits outweigh the costs of people trying. When everyone becomes a rational automaton who
understands probability, they just become, you know, much less dynamic, right? Much less
  1430. willing to take the risks that society needs. And I think you can extrapolate out. I mean, certainly
  1431. there are a ton of costs with the kind of denialism that you're discussing. I'm also elucidating
  1432. that there might be some costs with the opposite of the denialism, with really coming to terms with
what one's odds in society are. And in some ways it cuts against, you know, some of the
  1434. simplifications or myths that tie our fabric together, because this idea of the American dream, of
  1435. social mobility, do we still want that narrative? I think in some ways we do. And how would you
react to that? Right. When it comes to entrepreneurship, I don't think it's, like, it's, like, somewhat g-loaded, but it's not completely g-loaded, right? Like, a study mentioned in Tyler Cowen's book Talent found, I think it was in Sweden, that, like, the average IQ of entrepreneurs is only 130. And of course, you'll have people on either side of that. So I think, like, would it be disincentivizing or would it be incentivizing? I'm not sure. I think, like, you know, the rough, the rough approximation people have of, like, the average entrepreneur's IQ is probably higher than that. Let me give you an example. I mean, I think there's sometimes a tension between truth and social cohesion. And, you know, there are people like Sam Harris, who I, you know, I really like despite his recent efforts, but, like, you know, his book Lying, like, never lie, like, you know, like, truth always, and all of that. Like, one really interesting example, you know, IQ is interesting, another example is, like, dating apps, right? Like, imagine if dating apps released all their data. If all the data was public around, I'm sure they have, like, you know, people get ratings in the system. And let's just say, like, you know, someone is a nine out of 10 and all the matches they get, and someone is a one out of 10, or, not someone, like a third of the country or whatever, just gets, like, zero. Like, imagine the type of inequality that exists on dating apps and imagine knowing how hopeless it really is. Like, I mean, there's already a ton of
  1452. hopelessness to begin with. Like, don't you think that that like information would further
disempower people? Like, wait, so if people knew the correct, like, knew the real ratio of activity on dating apps, they would be less incentivized to what, like, to use dating apps? Like, to put themselves out there? Because, like with entrepreneurship, I
  1456. mean, like, in dating, you have to, you know, you need, well, you only need one success, you know,
  1457. to make a marriage, but you might need to try a lot. And some people might say, oh, you know, my
  1458. odds are really stacked against me. I need to, you know, work that much harder. And other people
  1459. might say, oh, it's totally hopeless. I mean, this is the same thing the left actually does on race,
right? Like, and the right will criticize them for it. They'll say, hey, certain groups have it so stacked against them that there's this actual privilege, you know, that other groups have. And people on the other side will say, hey, by doing this, you are disempowering them. And so there's a question of, like, should we pick the narratives that are most empowering? You know, David Brooks once said something like: on a macro level, everything is environmental, or you should, you know, overweight environment; on a micro level, you should overweight the individual, you know, the individual's opportunity to change one's circumstances. Now, maybe the cost is, hey, you overweighted that and they couldn't change their circumstances, and now they, you know, are upset because of that. But the positive is you've got a bunch of people trying to change their circumstances, and some of them actually do. Right. Like, the bias of optimism,
  1470. I'm, I'm fine with. Right. Like this kind of like, I don't know, like I don't know if I like
specifically myself would kind of engage in the bias towards optimism, right. But, like, then again, I don't think the egalitarian narrative is a bias towards optimism. If anything, like, maybe in a vacuum, or maybe, like, the boomer conservative version is a bias towards optimism, and then it's fine, right. But, like, in practice, that is not the egalitarian narrative, right. Like, you already mentioned this a little bit, but the egalitarian narrative is, like, oh, it's all because of racism, it's all because of, like, capitalist oppression, right. It's, it's not, you know... like, yeah, maybe, maybe in the ideal world, right, we, we do the boomer conservatism thing of, like, yeah, everyone, everyone needs to pull themselves up by the bootstraps. Like, I don't know if they'd, you know, censor genetics research based on that or whatever, but that's not really what the right does,
  1480. at least not now. Right. Sorry. And yeah, like I would not be like completely opposed to, you know,
making, like, the bootstraps narrative the kind of, like, national mythology, right. But that is only as long as it stays there, and it just has not stayed there. It just is so vulnerable. You know, like, when you, when you have, you know, like, when you have that, it is kind of like, if you start with that assumption, then people are like, oh, if, you know, if everyone is born equal, if everyone has an equal chance of succeeding, you know, then why are there group differences? Right. Like, when you start with that, it's, it's kind of like a, like a principle of explosion, right. In math, if you start with any false claim, you can get anywhere. You, you can get to, you know, the claim that, like, one equals two very easily. And, right, like, if there is some kind of stable state
  1489. where, you know, everyone is a kind of like bootstraps conservative libertarian forever, like that
  1490. would be kind of understandable, you know, like from a deontological perspective, maybe I'd still
  1491. oppose it, but you know, I think that we'd have like bigger problems to worry about. But like that
just isn't reality. That just isn't the world we live in. Yeah. Have we ever lived in a society that was, you know, in accordance with all, like, all things true? Like, hasn't every society had its own, its own myths that have helped create, you know, some sort of harmony? And yeah, they are, you know, liable to be warped for nefarious self-seeking ends, but this is the state of society. Like, yeah, for sure, for sure. It's always, you know, it's always incremental, right? It's always marginal. It's always, you know, like, we're not fighting to, kind of, as much as Yarvin would like to, you know, we're not fighting to turn into, like, a kind of startup government. I'm just fighting to, like, have the ML ban, have the machine learning ban, be at least postponed if not averted, right? Hopefully averted. But it's all, yeah, yeah, I agree with you that, like, in practice, you know, it's all
  1501. marginal. It's all about making things slightly better than they used to be. And that's, that's like
  1502. the approach I have to things too. So like, yeah, I don't think we're ever going to, you know, like
  1503. fully or at least in the short term, I don't think we're going to solve the egalitarian problem at
all. Yeah. And there's a level of egalitarianism that seems to be just a fit strategy for people who are trying to pursue, you could call it making a difference, you could call it status seeking. It seems to be a way for them to do that. And, you know, it's kind of like a, like a sorting mechanism though, right? Like, when you do that, you're kind of attracting, in some cases, the wrong people. Like, yeah, like, if I were trying to attract people to government policy, right, I would prefer, you know, like, all attractors of people, all pipelines into government policy, kind of actively selected against the egalitarianism, right. Like, I think that would lead to better outcomes. But yeah. And like certainly there are circumstances where I think that that's true. Yeah. Well, most people tend to optimize for themselves. Yeah. And more so, and this is like the Moloch idea, like, everyone just acting in, you know, accordance with their, you know, own self-interest
  1514. could lead to some great negative things. But I think you're actually very skeptical of that. I
think that, like, people are not really rational. And in a lot of circumstances, in many cases, like, the problem is, like, people who have power and who are rational kind of not going to the full extent of how they could use that power. Right. Like, like, I think that, like, people who have power and kind of abide by basic rationality norms should wield that power more. Do you think that they are optimizing for, like, if they were optimizing for self-preservation, would they be doing, you know, and self-interest, self-benefit, would they be doing something different? Like, are they poorly optimizing even for, like, selfish gains? In many cases, yes. Right. Of course, this is a case-by-case basis. But, for example, like, yeah, for example, you know, like, many governors pre-DeSantis, like, it would have benefited them both kind of electorally and kind of long-term politically for, like, more governors to ban CRT. Right. Yeah. Like, it is
  1525. both a kind of like politically successful move. And it is also like a like a strategically
  1526. successful move. Um, in that you're kind of denying, denying resources from left-wing patronage
networks. Yeah. Like, a lot of the time, right, this is, this is a lot of the kind of, like, new right thing, is that, like, yeah, in terms of, like, efficiency, right, I'm not sure, you know, some of them also, I think, are skeptical of, kind of, like, trade, right. But a lot of the time it's just, like, there is an obvious opportunity for, kind of, for a politician to, like, gain power for, like, the vague right, or, like, not even the right, right. Like, banning CRT, is that really against, you know, kind of, like, Matt Yglesias thought or whatever, right? Like, not the ban, but, like, the action itself, right. Like, so it's, like, beneficial to, like, 70, 80% of the country. And, like, simultaneously, it, it is in their self-interest, right. It actually does help them politically. And they just don't do it. It is kind of what you were talking about, about risk aversion. I think, like, yeah, a lot of right-wing
  1536. politicians are pretty risk averse. I think there's a broader question as to like, you know, should
  1537. you talk left act right or should you talk right act right. I mean, um, I mean, we're kind of
  1538. defining right as the kind of like right-wing rationalist thing that I was talking about earlier
  1539. where you just say true things that are inconvenient. There are also like real right-wing
ideologies, right, like populism and traditionalism, right. Like, this is kind of why, like, a lot of people don't consider me right-wing, right. Like, it's kind of like the Nietzschean unbelievability critique, right. My life is just structured in such a way that it's so difficult to believe in traditionalism. My life is structured in such a way where, like, you know, every day I'm walking around this, like, I'm walking around this, like, honestly, all things considered, pretty nice city, but I'm talking to, like, so many people. Many of them are, you know... like, my fundamental behaviors, like, most of my behaviors, are not oriented towards, like, any kind of rooted tradition, right. Like, this is kind of a reason why I don't necessarily consider myself right-wing, because, like, there are right-wing moral appeals. Those are, like, an actual thing. It's just that they've been kind of erased from a large portion of people's, like, historical memory, like, people who don't, like, actively, you know, research or think about this stuff, or aren't involved in, say, some kind of religious tradition in some way, right. Like, there's a thing of, like, an actual, there's, like, an actual right wing, and it's, like, not just rationalists. I agree, I agree. And in some ways
those are big threats to the actual right wing. Like, I'll give an example. Like, someone like Balaji is very, although he, you know, is very good at optics, he is, he is not egalitarian. He really believes in, you know, merit and hierarchy, and he's also, you know, he's a family man, got a bunch of kids, like, there are certain sympathies that, you know, the right wing would have with him and he would have with the right. But he's also a transhumanist and he's also, like, radically pro-tech. Let's pursue, you know, life extension, the infinite frontier, you know, break up America, I don't want to speak for him in that regard, but, like, in ways that many right-wing people would think of him as, like, a bigger threat to the right as they know it, or the things that they hold dear, you know, God and country, than even, you know, some mainstream or normie Democrats. So yeah, there are certainly, you know, big fissures within the right, as there always have been. Yeah, like, I don't know, I myself would definitely not consider Balaji right-wing. I'm not sure if he would consider himself right-wing. No, I don't think so. I don't consider myself right-wing. Yeah, like, I think, you know, many people, certainly in tech, are, you know, politically homeless, right? Yeah, in many cases, right? Like, I don't know, like, this
  1566. is very funny, like the thing is that like my revealed preferences are right-wing, right? Like I
don't, like, I want to get married early, I don't want to have sex before marriage. Like, my revealed preferences are pretty socially conservative. But I also, I'm pretty convinced by, like, polling data, and, like, the conclusion that I've drawn from polling data is that, you know, there's no public morality; people won't vote for, basically, like, societal parental controls, right? And given some of the abuses that that could end up with, like, that, that's probably a good thing, right? Like, that people in general will not hold themselves to a higher standard of, kind of, social morality. And so, like, generally, like, this is, like, not quite the Curtis Yarvin take, but this is somewhat similar to it, in that, like, most pursuits of social conservatism, and I mean, like, social conservatism as in, like, the traditional version, not, not necessarily, like, banning CRT, right, but, like, you know, like abortion, right? Roe v. Wade is kind of like a great example of this, right? That's just going to piss people off, and that's
  1578. not really going to actually make the country more socially conservative in any kind of meaningful
way. Like, the way I differ from Yarvin is that, like, he thinks that this means that you should, like, you know, he wrote the entire, like, Hobbits and Dark Elves thing, he thinks that you should really, like, basically, like, defer to elites. While I think, I'm more of a kind of, like, I'm more sympathetic to Balaji's idea of, like, basically, like, morality-first network states, right? I think, you know, there should be a very socially conservative, you know, charter city in, like, Utah or something where, you know, you're just not allowed to have abortion. And, you know, if, you know, birth rate trends continue, right, we'll end up with the entire U.S., like, mostly not having abortion, because, you know, with generational selection, most of the people in the future who are having kids will, kind of by definition, be people who have not aborted their kids. I kind of see, like, that vision as, like, a vision of
  1589. social conservatism that I can get behind. But like in terms of like short-term social conservatism,
it does seem like, it does seem like a misevaluation of, like, how fucked we are in the present. The Curtis and Balaji kind of axis is interesting as a way of, you know, showing the different points. I mean, Curtis views Balaji and, you know, any other kind of, you know, non-left-wing, sort of, idea entrepreneur, let's just say, as further empowering the left. And so he sees the Rufo types, and, but even, even Balaji, that way. And so they'd be better off doing nothing, like, you know, winning by losing and waiting for, you know, as Lenin said, like, the conditions for the revolution are not yet present, or something. Whereas Balaji thinks that Curtis is just giving up, and Balaji doesn't want to wait, like, 30 years, you know, and watch the country turn into Brazil or whatever the concern is. Balaji thinks that actually, like, impact can be had, you know, things can be fought. And, you know, Twitter can be taken over. And maybe other things as well. And so I think that's another thing that's interesting, because, like, people will differ on what should be. And they will also differ
  1601. on the tactics in terms of how to, how to get there. Right. Yeah, that's fair. Yeah. I think like
  1602. what's really interesting about these political theories of change is that they're like, they're
  1603. almost inverted in that the libertarians, I think, are more passive. And the populists actually
believe that something can be done. Yeah. Yeah. Like, I remember I saw Saurabh Sharma, a friend of the show... like, there was, like, some kind of new right figure saying something like, would you rather control the New York Times or control the Senate? And then, I don't remember the exact tweet, but Saurabh Sharma, who is, I think, the president of American Moment, which is this vaguely new-right-aligned policy pipeline, he said, definitely the Senate, this isn't 2016, right? So, like, they have this idea, or at least he has this idea, but I think the sentiment is pretty widely shared, that, like, actually, the right wing knows how to use power now, or at least has a better idea of how to use power now. And that actually government, or, like, being in charge of government, does matter. And that it is, it is impactful, and that there can be policy steps that can be done. We already talked about Richard Hanania, right? He sort of, although I'm sure he disagrees with some of Saurabh's policy preferences, thinks that there is definitely policy that can be done, policy that can be implemented, once Republicans have control again, to actually do something about these
  1617. problems. So I think, actually, it's interesting, there were moments of despair, but I think, within
  1618. the new right, I would say that despair is kind of trending downward. So there's increasingly a lot
  1619. of white pills being sold. I agree. I think SF is actually an interesting microcosm there, because
the situation was pretty dire, and it is still dire in many ways, but people were saying, hey, tech is this evil, giant monster taking over the world, and yet it couldn't even sway local elections
  1622. that require a few thousand votes that directly affect its interests. And so there's this great
  1623. contradiction there. And I think, at some point during COVID, it just became too much, and people,
like the All-In people, just said, hey, we can actually get Chesa, the DA, recalled. We can actually get the school board recalled. We can actually, like, use our influence to make a difference. Whereas previously, we thought we were too good for it, or we didn't want to do it. Now it's encroached on our lives in enough material ways, you see all the people who are moving out of San Francisco, that, like, this effort is needed, and it will impact our bottom line. And also it became, like, high status, thanks to people like Mike Solana, who are fighting the fight of ideas. So I think SF is just an example of a situation where people got involved locally, and continue to get involved, and made a difference. And now there's kind of a, you know, whole ecosystem there. And I think you're seeing that sprout up in other places as well. Yeah, actually, you know this better than I do, right: what is the sentiment among, like, actual Bay Area founders and investors? Like, what is their orientation towards politics in the year 2023? Well,
most of them are not political thinkers. I mean, most of them are trying to, you know, run their company and do their job and do it well and have a nice private life. And what happened was sort of the, you know, you may not be interested in politics, but politics is interested in you. Yeah. And politics started infesting all these companies. And it was kind of, you know, ironic and tragic: like, you have these, you know, Indian and Chinese CEOs, who don't know the intricacies of US race relations or US politics, now having to learn them and defend against all sorts of accusations of prejudice, things that, if they were to stick, would hold back their company. And so Silicon Valley is primarily involved in politics to the extent that it impacts Silicon Valley. It's like a one-issue voter. Now, that is both in terms of, like, their ability to run companies, but also in terms of their ability to, like, be seen as good and not get regulated. And for many, you know, people and companies, their approach to not getting regulated is to comply with the regime, or quote-unquote regime. Like, I didn't know, you know, the leopards-eating-faces party would eat my face. Yeah. Yes. Yeah. And so I think that's, like, predominantly, you know, the method of operating. I think, you
know, there is an intellectual class as well within Silicon Valley, even if it's a small minority that cares, you know, that I think is trying to, in an emergent way, sort of create this culture of something that is not woke, but that is not, you know, boomer or Trump either. And so, you know, it doesn't have a name yet. I think Pirate Wires embodies most of it or a lot of it, and I think that's why he's built such a strong audience. I think there's a real question as to, like, electoral politics: will they support Trump? Certainly not. No way. DeSantis? I think it's to be decided. I think, you know, David Sacks has gone in on him, but I think people are still unsure. I mean, DeSantis is not as distasteful, but he's certainly leaning into aggressive culture war, and it's not like he's, like, the techie or, you know, really understands or appreciates tech. So I think it's unclear. But I think, I mean, Silicon Valley, like many fields that have, you know, attracted a lot of really amazing talent because of all the wealth, attracts a lot of agreeable people. Like, in order for companies to scale and get big and have impact, you need a lot of agreeable people. I mean, you need, like, disagreeable people as founders and as, you know, builders, but within the org you need some balance, basically, between disagreeability and agreeability. But, like, by and large, certainly morally, Silicon Valley is very agreeable; it wants to be seen in good standing. I mean, it's your point about ESG. Like, they want to be seen as doing good while making money. And the left is just better at this; egalitarianism is just often a better narrative, one that seems more palatable than, like, pursuing excellence. That's not as much in vogue. Like, people who are for excellence haven't done a great marketing job relative to people promoting egalitarianism. And so, yeah, because, like, chasing
after the real thing involves, like, effort; being excellent is just a much more difficult thing than being egalitarian, right? Especially in an environment like SF, right? Or especially in an environment, in certain companies, where, you know, the baseline is already really high. You know, you can be an egalitarian with anyone, but it's one thing to be excellent at, like, some random public college. It's another thing to be excellent at, like, OpenAI, right? You really have to be, you know, if that's your legitimating narrative, right, you really have to be, you know, on a completely other level. Like, maybe Elon can do that, right? But yeah, like, the higher the bar is, the higher relative to it you have to be, right? It is pretty interesting, though. Like, here is an interesting case where I think, like, immigration maybe marginally solves this problem, right? It might make other things worse, like lockdowns or anti-market sentiment. But I think in a very noticeable way, immigrants kind of understand intuitively, like, both meritocracy and genetic differences much more. Yeah, I do think, like, this is a very strange one because it's
  1684. coded exactly the opposite way, right? Like left wingers supposedly like immigrants, right wingers,
right wingers supposedly dislike immigrants. I don't know. Well, right wingers say, you know, like, we like legal immigrants, we dislike illegal immigrants. Yeah, but in general it's kind of coded as left wing. But in terms of, like, within the tech ecosystem, within who works in tech, I think it's almost, you know, 180 degrees the opposite way. Like, the white people like the egalitarianism, the immigrants like the meritocracy. I think that is generally how it is; I think that is generally the intra-tech scene, right? Yeah, I mean, it is interesting. Like, maybe at the same time, you know, a lot of companies now have Indian CEOs, and, you know, I'm thinking about this on the fly, but those companies are often, like, peacetime companies. I
mean, you could say Satya right now is wartime. And you know, there's anecdotes, but a lot of these immigrants, well, I'm not even sure if they're immigrants, if they're Indian, have just grown up... Yeah, yeah, yeah, even born in America, or went to, like, American-inspired universities. I
mean, there's certainly, like, there is something to the idea that Wesley Yang said about Andrew Yang, which is that, because he is neither white nor black, he is not, like, a threat, or he's not part of... like, there is something there. And one thing that's interesting about Asian people who, like, go to Ivy League schools and are smart enough to understand that they're being systemically discriminated against: in many scenarios, I'm sure you've seen this, they, like, actually support it. And you're just misunderstanding the selection effect here, right? Like, Asians who
  1702. are good at school mostly are just like apolitical, right? Like the selection effect here,
  1703. especially for like egalitarian ideologies, is like Asians who are not good at school and whose
  1704. parents are disappointed in them. Like that is the constituency, that constituency specifically,
right? Like, here is the thing: I disagree with a lot of, like, the mandatory voting proposals, right? But if you had mandatory voting, Asians as a demographic would swing in a very different way. And even in countries like Canada, right, Asians are much more of a swing vote. Yeah, that makes sense. And certainly in SF, they've had a huge impact. I mean, I think
one thing, more broadly in this conversation, is that Elon Musk is, you know, literally as talented as it gets. Like, as credible and as talented as it gets. And even he, in his most recent turn to be more, well, you could call it more right, and there's a number of things you could say about him, but it seems like it's unclear that he's better off for it. Like, I hear he's having a harder time attracting talent than when he was more politically neutral. He certainly, I'm sure, is introducing new talent to the right, I would say, or to anti-left or anti-woke ideas. Yeah, like, Elon Musk is kind of like peak right-wing rationalist, right? Like, he's had, like, IVF children, how many now, 11, you know, by choice. Like, how far apart are Elon Musk and Richard Hanania ideologically, right? Probably not that far. Right. But the question is, like, is he making a sacrifice for doing so? And, you know, if yes, basically, if you want more people to shift ideologically, it's hard to rely on people to make sacrifices, because how's that going to happen at scale, right?
So, yeah, that's fair. At the end of the day, you know, there are things that cannot be accomplished without, like, a shift in the state religion. You know, I did say that I don't reasonably expect to, kind of, overturn the egalitarian conspiracy theory that is, you know, the state religion of the United States. But it would be nice, right? Yeah, I do think what you're saying is right. We don't have too much time, but I can give the brief version of this. I know I did send you this in a write-up, but I actually think that a lot of social traditions are not, like, in favor of egalitarianism, but are rather restricting egalitarianism, right? This is kind of based on a book, Hierarchy in the Forest by Christopher Boehm. I talked about it with Mark... sorry, what did I say, Mark? Rob Henderson, in an earlier episode. Sorry, I'm getting a bit tired. And the thesis of the book is that we had egalitarian societies for much of
  1733. hunter gatherer history, and the way that those egalitarian societies came about is by brutal
murder. So if someone was significantly more talented and attracted significantly more mates, right? You kind of had... the long arc of history was basically, like, incel crime, right? It was basically egalitarian incel crime, you know, killing the guy who was taking the most mates. And this eventually developed into social norms that were, like, a quote-unquote moderate version of this, and that's egalitarianism, where people would mock, they would try to attack the social status of more successful hunters in order to, you know, decrease the likelihood of having to compete with them. And basically this is, like, the earliest manifestation of egalitarian social norms, right? We're going to pretend everyone's equal because, you know, this was actually Malthusian times, pre-industrial times; it was mostly actually a zero-sum resource conflict. And so, you know, the incels get mates, and the more successful hunters don't get, you know, brutally murdered. And this has happened for basically most of human evolutionary time, right? Most of especially pre-civilizational human history. And that, you know, introduced these kind of egalitarian, essentially, like, biological instincts. The moral of the story is, you know... are you there? Yeah, I'm here. Okay, yeah. The moral of the story, you know, is that if we really want to undo the problem, we will need at the very least gene editing. Or, you know, another million years of selection, which many people don't
  1750. think we will have. I'm not sure if we want to get into that rabbit hole. Well, the intersection of
software eating the world and egalitarianism eating the world is very interesting, because what software does is it exacerbates, you know, disparities, because... Yeah, yeah. This is my favorite Balaji quote, right? Like, freedom and inequality are synonyms. Yeah. Right? Like, the more options you give people, the more abilities, the more capacity you give people, the more they're going to be chosen in different ways. Yep. And at the same time, so it exacerbates disparities, it codifies those disparities in, like, legible ways. Right, measures. Yeah, okay. But then it also presents ways to maybe fix them, because it creates, like, control mechanisms. You can see, like, you know, DEI to some degree is this in the labor market, right? Like, you know, more efficient labor markets mean more inequality; you know, the fact that we have labor marketplaces, like, allows us to measure the disparity in really efficient ways. And because, you know, those same companies can be captured by the government in all sorts of different ways, they have to, or they get to, depending on your perspective, enable mechanisms to redistribute or make it more egalitarian at scale. And so software is both an enabler of the inequality, but also an enabler of the thing that can jump in to help fix the inequality. It's kind of like a, you know, whack-a-mole type situation. And, you know, people say that, you
know, transgenderism leads to transhumanism, or could, because, you know, gene editing is something that would be a pariah to the left. But once it's framed as, hey, we can actually make people equal, well, hey, that's a pretty exciting idea for certain, you know, egalitarians. Right. Like Harrison Bergeron. Yes. That would really be a dystopia. Actually, it'd be fine. In that case, we just let China win. China gets to inherit the earth. Well, it wouldn't be fine, but it would not be a dystopia. Like, you know, if a regime wants to use gene editing to make people equal instead of better, then China is the less evil power there. Yeah. I'm going to zoom out, because this is the fundamental question to me,
  1774. at least at the intersection of, you know, egalitarianism and meritocracy. And this is actually
Agnes Callard's question, where she says, you know, morality requires we maintain a safety net at the bottom that catches everyone, but we also need an aspirational target at the top, so as to inspire us to excellence, creativity, accomplishment. So moral worth needs to be free, but also acquired, so as to inspire people to pursue it. And so how we reconcile those contradictions is the task of reconciling, you know, egalitarianism and meritocracy, or getting the best of both. I think Christianity is pretty good. Like, I don't know. Some people on the right think, you know, Christianity is slave morality and it's leading to the level of egalitarianism we have now; you know, it's like a slippery slope argument. My kind of argument to that is, like, the stuff we talked about earlier, right, Hierarchy in the Forest: it kind of always has been egalitarian. In that view, right, Christianity was upstream of the Enlightenment, it was upstream of the Industrial Revolution, right? Like, the argument for Christianity is, you know, basically that you have to go through it in order to reach merit, right? And I still don't think it's that bad of a system for kind of aligning
the moral value at the end of the day, right? Like, I think, you know, it's been around for 2,023 years. I think it could go for 2,023 more, you know; I believe it'll go on for eternity until, you know, we're all... uh, we'll let it happen. But I think just from an empirical perspective as well, you know, Christianity does a good job of doing this. Um, yeah, I do think it's ironic that the thing that replaced, you know, the slave morality of Christianity was, like, more slave morality. Yeah, yeah, yeah. With checks and balances, and it was kind of like a bastardized version of the slave morality parts of Christianity, and maybe there'll be a bastardized version of, um, kind of the more master morality part of Christianity, or the less slave morality part, because... Right, right. This is why I think Nietzsche's historical
retelling is just wrong, right? Like, he's kind of comparing modern mass morality, or not modern as in, like, literally now, but his contemporary mass morality, to the elite morality of days gone by, right? Like, I would wonder what Nietzsche would think of simultaneously having, you know, the worst slave morality, um, but also having, at many elite levels, right, in many elite circles, um, especially, like, the capitalist elite, not necessarily the social or the political elite, but the capitalist elite, where I think his morality, or, like, his kind of integrated morality, is much better adopted than almost anywhere else in history. So yeah, I'd love to see... you know, like, Simone and Malcolm Collins, they've been on this show. They want to create a new religion. Uh, good luck with that. Um, well,
  1807. anything that is a new religion can't call itself a new religion, I think that gives it an attack
vector, like, to the extent that you think... yeah. Um, and also we have separation of church and state. And so if you're a religion, you know, the most effective religions take over the state, right? Or, you know, penetrate the state. So that's a precarious state to be in. Right. We have a
  1811. little less than 30 minutes left, uh, which, which topic do you want to cover? Do you want to cover,
  1812. uh, AI? Do you want to cover crypto? Do you want to cover, um, interest rates? Um, I think crypto
  1813. would be cool. But first I want to expand on my answer. You asked me about SF politics and I said
  1814. they're mostly, you know, self-interested and, and, you know, um, not super electorally focused. But
to the extent that there is a platform, I want to describe it a little bit, um, on both the macro and micro level, in terms of how I see it anyway. I think it's this; I'll just riff for a bit. Maybe it would be something like: it's okay to be an elite. It's okay to
  1818. want to be elite. It's okay to want your children to be an elite. Uh, you know, we're not racist.
  1819. We're not sexist. Uh, you know, capitalism is good actually. Tech is good actually. Crime is bad
  1820. actually. Uh, we shouldn't be forced to hire people who are not qualified. Um, you know, some taxes
are fine, but government is a disaster zone and we shouldn't keep feeding it more money. Uh,
  1822. the schools should teach real topics, you know, like math and not indoctrinate kids. Pro math, yeah.
  1823. Um, and, you know, we need politics out of our companies, like stat, um, of any kind. Um, and at the
  1824. same time, let's not like relitigate things like abortion or gay marriage or immigration, things
that make us seem like bad people to the people that we care about. Um, and in a more micro way, I think, you know, you can look at the philosophies and practices of caring and empathy that dissolve into kind of a veneration of victimhood and infect everyone with resentment and misery, and see that it's a straight downward spiral of, you know, bitterness, and you could say, like, you don't need to live like this. Um, I think, like, you know, we don't need to feel like this; you don't need to feel miserable about
yourself all the time. This is kind of like the Jordan Peterson school of value:
  1832. like, like, like, you can be normal and happy and non-judgmental and productive and satisfied. You
  1833. can treat people in business as individuals and not feel the need to obsessively keep, you know,
  1834. demographic scores or treat people like tokens. Uh, you're not racist. You don't need to think about
race. You can make money. In fact, making money shows you're doing something that other people value. You can donate some of it to help the less fortunate, as much as you want, and
  1837. you can, you know, spend whatever you want to knowing that spending is helping other people provide
  1838. for their families. You can say what you think and if other people don't like it, they can go home
  1839. and be upset, but you don't have to be. Other people can say things that offend you and you can
shrug and move on with your life. You can make mistakes. They can be your fault. You can fix
  1841. them. You can enjoy the spoils of your work. You can work hard and outcompete others and, and, and
  1842. achieve great heights and not feel guilty about it. You can also choose to live a calmer life if you
  1843. want to. I mean, the most ironic thing is that the, uh, you know, one of the most controversial
topics in tech Twitter is, you know, how hard should you work? Um, which just shows, like, the core of the effort divide. Anyways, I'm kind of rambling a bit, but those are some of both, I think, the macro, like, political takes, and also some of the micro, like, you know, how one should live one's life, or how we should approach one's life, that I think come out of a, you know, sort of technologist-builder mindset. Right. It's interesting that you use that term, because I actually had a conversation with, uh, a friend about builder versus founder. To me,
  1850. like, I kind of have a negative taste in my mouth when I think of like the people who call
themselves builders. Like, I'm thinking of basically the 2018-ish hackathon scene, of which I was pretty skeptical. I actually think, and this maybe is also a controversial take, but I think monetization and, like, bounties and, uh, crypto have been a net, like, extremely net positive influence on the hackathon scene. Um, people don't remember that, like, the pre-crypto hackathon scene was just enormously busywork. And, you know, even if the crypto stuff is, like, infrastructure that's not all that technically complex and, you know, not scientifically revolutionary in any way or form, at least it's worth something to someone, right? Instead of literally being, you know, like, side projects that are not actually functional in doing anything, that make someone's resume look slightly more impressive. Yeah. You know a lot more
  1860. than me on the, uh, on the hackathon front. I do think, okay, I did not expect that. I do think
builders, well, hackathons are more of a younger man's game. That's fair, that's fair. Um, I think what builder enables is just a wider... like, if you haven't started a company, what do you call yourself? Okay, that's fair. Yeah, that's fair. People use the term operator, but that's, like, not that compelling. So the more charitable view of builder is just that
  1865. it's more comprehensive to include people who do great work, but haven't really founded the
  1866. companies. Sure. Yeah. Um, so one of the, one of the topics on this list reads crypto as ESG for
  1867. libertarians. Uh, I'm sure, uh, we have not, you know, we, we have not pissed off all groups
  1868. equally. So, uh, for the sake of equity, let's talk about crypto for the sake of intellectual
diversity in being annoyed at the takes in this podcast. Let's talk about crypto. Uh, first of all, how much of crypto was a zero-interest-rate phenomenon? All? None? Um, well, it seems like a lot of crypto was, um, you know, accelerated by easy money. Um, and so, you know, the industry has, um, contracted, um, but, you know, we're not going to have high interest rates forever, and, um, good times will come back. So certainly a lot of the speculation was fueled by zero interest rates, um, but also, you know, this woman, Carlota
  1875. Perez has a great, um, you know, uh, study of how different technological revolutions, um, go
  1876. through cycles. And one of those cycles happens with initial mania and a bubble that leads a ton of,
  1877. you know, people to spend all this money on all these projects, some of which go nowhere, but others
  1878. of which become like the critical infrastructure for, um, the next hype cycle. And so right now,
  1879. there's a crypto winter in terms of, um, you know, capital invested and in terms of certain projects
that relied on zero interest rates or, you know, high yields. At the same time, there's, you know, a lot more purists, and there's a lot more developer activity than prior. So, um, that's how I'd respond there. Yeah, the cynical take is that, like, AI was right there, right? It was lying there, you know, and it just didn't have one very effective proof of concept yet. And then, like, the main difference between, kind of, AI state of the art in, what was it, summer 2022, and AI state of the art now is, like, pretty negligible, but it was mostly just, first of all, releasing stuff to the public, and second of all, having a really nice, clean proof of concept in ChatGPT.
  1888. Yeah, absolutely. I mean, seeing, you know, there were countless people who switched from Web3 to
AI, you know, just like that. Um, so certainly it was right under our nose, and the combination of, you know, the markets tanking, and thus, you know, financial, like, fintech, tanking alongside of it, crypto included, also enabled, um, or, you know, accelerated AI's mind share. But I think the ESG analogy doesn't hold as much for me, because ESG to me sounds like a way for finance people to seem moral or pure, um, but it's not really rigorous; they're not true believers. Or, if they are true believers, ESG is not built on, you know, interesting intellectual capital, whereas I think crypto, um, does have a much richer intellectual substrate to it. And it's also worth, you know, calling out that there are many different substrates, right? Like, the Bitcoin community, um, you know, builds off of the Austrian economics tradition, you know, some libertarianism, and it's all about kind of de-politicizing finance from governments; they're, you know, separating money and state, right? Whereas the Ethereum tribe, instead of focusing on de-politicizing from the government, they're de-politicizing from big corporations, right? And so, um, you know, the Facebooks and the other centralized, um, you know, powers of the world that are in the private sector. And that's, like, a cursory reading. But they are true ideological believers who are trying to manifest the world. It's, like, a much more practical EA, I think, or, like, much more technical, much more practical and experimental version of having impact. So, um, I see it as truly ideological people trying to make a difference. The technology is not nearly there to the same degree that AI is, um, certainly; AI is where, like, major technological breakthroughs have been. Right. Like, Peter Thiel famously said,
you know, like, crypto is communist... or, no, AI is communist, crypto is libertarian. If crypto were not libertarian, right, if the main, you know, marketing around crypto was, like, CBDCs, central bank digital currencies or whatever, right, and not, like, the Bitcoin intellectual tradition, what do you think that would have done to investment in crypto? Yeah, I mean, it's a good example of talking left, acting right. Like, in terms of, like, you know, it's a way of making a lot of money and it's a way of, um, you know, trying to do good at the same time. And, you know, uh, Daniel Gross said on a podcast recently, like, uh, you know, fire can be used for arson, and it can be used for cooking. And so, you know, crypto can be used for libertarian ends, and certainly the people who are working on it are trying to do that, but it can also be used for, you know, state-control ends. And, um, you know, same with AI there, right? And so it would be hilariously ironic if, um, crypto was largely used for, um, you know, authoritarian ends, and maybe the opposite with, uh, with AI. Yeah, that's what Sam Hammond
  1921. thinks will happen. Um, yeah. Uh, I think that what's, what's very interesting is that like, the
  1922. narrative of profitability around crypto became like the actual narrative, like, the narrative of
  1923. like making things that were not profitable before, like NFTs were kind of based on this, right?
  1924. We're going to like finally make art profitable, right? We're going to, we're going to supercharge
  1925. monetization, right? The narrative for making things profitable became what was profitable, right?
  1926. Like, so, so when you have these kind of like true believers, when you have like libertarian true
  1927. believers, they induce kind of like a market shift in the same way that like political true
believers induce, like, a political shift, right? Like... that sounded, in my head, like an awfully Balaji-like sentence, and I tried to make it less Balaji-like, and it only became more Balaji-like of a sentence. Man, I think, I mean, crypto that's non-ideological maybe just looks like fintech, right? Yeah, there's, you know, a lot of capital in fintech, but they're just non-ideological about it. I mean, there's some, you know, types of companies that are trying to expand access and have, you know, strong optics and a strong, kind of, more egalitarian mission, but most of it is just really practical people being like, hey, we're just going to focus on making as much money as possible. And so, yeah, maybe that's what it'll look like. Yeah, and I should say, you know, I went to ETHDenver, I respect the crypto people. I think I had a tweet that was something like, crypto is the most high-trust movement I've ever seen. Everyone talks about being low trust, but really I could leave my laptop anywhere, and it would still be there in a day. Yeah, very high-trust community, and I am pro high-trust communities, so it's definitely great. Okay, we'll return a little bit to some
of the earlier topics and ask for some advice for young people. This has been a very fun topic, I think, for, like, revealing how people think in at least a less ideological way. But what do you think are good steps for young people in terms of dating? Good question. Well, I think I'm only
going to speak to men here, because I don't know the women's side as well. One thing I think to really appreciate about dating for men is that it gets a lot better as you get older, and I think it's hard to fully appreciate that. And you know, you can accelerate that, but basically, the more you have to offer to the world, the more successful you are, the more you're going to be an attractive person. And so you can think of it sort of like, you know, product and marketing: you could try to focus on marketing, like your clothes and, you know, certain tics and stuff like that, and that's important, like, you should do that, marketing is really important. But, you know, what you have to offer the world, i.e., like, the product, and that's not just your career, but career is really important, it's also, like, your sense of integrity, you know, your values, like, that is... you know, marketing bad products is never going to work. And you know, the challenge of a lot of dating advice is it's really short-term, it's really marketing-focused. So I mean, I think, you know, Naval even has this line, it's like, you know, if you want to marry an incredible person, be an incredible person. And so I think, you know, with dating, it's worth often, like, recognizing what is the long-term goal. The long-term goal is presumably an amazing partnership, you know, like marriage; that could happen early on if you find the right person, it doesn't have to happen, like, right away. And the more that you, like, work on yourself, the easier it will be. And I guess I'm implicitly talking to someone early on, because you're asking for advice, and, like, maybe it's not coming super easily. And so, you know, keep working on yourself, be patient, and great things will come,
  1963. at the same time, like, I think friendship is a great way for a relationship to form, like, even
  1964. knowing that you're looking for a long-term partnership, I think separates you from a lot of other
  1965. men who are not looking for that, and so getting to a place where you are looking for that I think
  1966. is more likely to lead you to happiness than kind of engaging in fuckery, so I would probably get
more serious early on. I would surround yourself with the men that you want to be like, and so I would encourage you to look at your circle of people, and look at the influences that you have, you know, online as well, and ask, like, are these the men that I want to be? Do they have great partners, do they have great, you know, relationships? And, you know, it's much easier to change your environment than your insides; we tend to become the things that surround us and shape us. So I would focus on that, as well as the personal growth. Any reactions to
that? Right. I'm wondering, like, how much of that do you think is becoming more valuable? For some people, I think that's definitely the case, right? Like, a young person with a startup versus a young person with a, you know, multi-million-dollar company: definitely a huge change in value, right? I'm not sure if that's true for most, though. I'm not sure if the relative advantage in dating markets for most men is because of increases in their value versus just, like, true relative decreases in women's value, just in terms of, like, fertility, right? Like, that's the cynical take on it, is that, like, the market dynamics are just true, or, like, the market dynamics reflect something real, but the thing that is real is that, for example, if you want to have kids, right, your prospects are just worse, right? I agree. I mean, I'm in support of, you know, finding someone early if you can. I think just a reality that men may not understand is that women are less likely to date down, and to date down in all areas. So, like, a woman who has a more successful career than you do, and it's not limited to career, but that's one big element of it, is probably not going to date you. And so, the more successful you are, the wider your pool is, basically. And that's why, like, younger men are disadvantaged relative to older men, among other reasons. But I do agree with you that, yeah, the change on the women's side is more significant, and of course that impacts, you know,
male options. Right, and the second thing is, like, finding someone at all, right? I think, like, especially in similar or mutual circles, you know, that share either idea space or, like, startup space, right, there's a pretty skewed sex ratio there. Yeah, I mean, I know it's super generic, it doesn't work for everybody. Like, I do think
  1993. dating your best friend, or dating someone who could be a best friend, I think is pretty good
  1994. advice. Like, you know, no individual person is going to be everything to everyone, and there's
  1995. going to be, you know, as unromantic as it sounds, like sacrifices on some dimension. But if you
have a best friend, like, someone you truly, you know, would be super close with even if you weren't dating, because you just respect them so much, you appreciate them so much, you could handle the
  1998. highs and lows with them. You know, that, that feels like a pretty good thing to optimize for, and
  1999. if you have the opposite feeling, like, I don't know how sustainable that dynamic is, you know,
going through... like, thinking about who you can have a fun weekend with, or who you can have a fun year with, is very different from who you can have, you know, a family with, and, you know, build a life with. And I think we don't really fully appreciate that, like, you know, when we're young. Right. Good advice. Last question of the show, always the last question of the show. I'm sure you've
prepared some very good answers to this. What is something that has too much chaos and needs more
order, and something that has too much order and needs more chaos? It's a shame that, given how much I enjoy the show, I should have had an answer prepared for it. Let me think for a minute. Yeah, yeah, it's hard because we talked about so much. Yeah, usually I want something that we haven't talked about yet. I'm sure there are many examples in what we have talked about. I really liked what you were saying about how in order to protect the market, you need to, like, exit the market, or the direction you were going with that. And I would love to see much more experimentation on that axis.
  2011. So I would love to see tech get more engaged with politics, with policy, with culture, with
  2012. education, things that don't, like, make them rich right away necessarily, but, you know, protect
  2013. the broader ecosystem and hopefully will present, you know, more options for them down the road. And
so I would love to see more... I don't know if chaos is the right word, but certainly experimentation. Like, I love that, you know, Bari Weiss and Joe Lonsdale just said, hey, screw it, let's make a new university. And the team there that did that, like, I'd love to see much more of that. And I think when something seems too ordered, or too regulatory, or too whatever, blah, not worth our time, like, that's where we need to see more experimentation, because, you know, otherwise that order, that bureaucracy will just become even more cemented. And then in terms of effective libertarianism.
Yes. And then in terms of what has too much chaos and needs more order, I actually think, you know, one challenge in the tech community is that there are all these companies competing with each other, like Facebook and Apple and Google and, you know, Microsoft and now OpenAI. Like, it's actually hard to have, like, tech as a class act together, because those companies are trying to kill each other. And so I think we need to do a better job; you know, Balaji has talked about, like, a NATO for CEOs before. Like, I think we need more organization around, like, collective tech lobbying, so to speak. Some class solidarity. Yes, exactly. I'm sure this is wonderful. That's a pretty good note
  2027. to end on. Anything else you'd like to add before the end of the show? Of these four hours? I
enjoyed this conversation. I'm excited to see what you decide to do next. I appreciate that you, you know, shared the conversation that we had with your audience, and I hope you continue to do so. People who've made it this long have a strong connection to you, and I think there will be a number of people who you'll be able to recruit for whatever you do because of it. So excited to see you continue to follow your path. Yeah, that'll be awesome. And I'll just leave this as a note for the ending: this was, for me, one of the most enjoyable episodes so far. I had a few podcasts that I think were still very informative and very interesting, but for me were personally, like, pretty rough, and, like, I felt like I made some mistakes. But this one was just very thoroughly enjoyable the entire way through. So yeah, thanks for coming on. It was great. Awesome. Thanks, Brian. I hope you enjoyed my conversation
  2038. with Eric Torenberg. If you'd like to help us out, the number one thing you can do is to let a
  2039. friend know either in person or online. It's the best way to help the show. And hopefully you'll
have a friend who's either interested in the same topics or has the same habits, and not only are you helping us out, but you're also helping your friend find something interesting and hopefully
  2042. enjoyable as well. You can also help us out by leaving a five-star review on any podcast app,
  2043. suggesting future guests in the comments, subscribing to my substack, which is linked below. And if
  2044. you want to catch another great episode next week, subscribing to the podcast as well, once again,
  2045. on any podcast app. If you do that, then you'll get another great episode next Monday. See you then.