- When I was getting started in tech, you know, 12 years ago, you didn't really see 20-year-olds,
- 22-year-olds coming out of nowhere and being able to build audience, you know, social credibility
- from people who matter. You kind of had to get attention in different ways, and often it was
- by building something really interesting. One thing I think to really appreciate about dating for
- men is that it gets a lot better as you get older and I think it's hard to fully appreciate that.
- You asked me about SF politics, and I said, mostly, you know, I'm interested to the extent that
- there is a platform, and I want to describe it a little bit. It's something like: it's okay to be an
- elite. It's okay to want to be elite. It's okay to want your children to be elite. Hi, welcome.
- Welcome. This is the From the New World podcast. Today is an episode I've been hoping to
- bring to you for a long time. I'm speaking with Erik Torenberg, the founder of On Deck and Village
- Global and, as we'll discuss in the episode, first employee at Product Hunt. He's also started three
- new podcasts, Upstream, Cognitive Revolution, and Moment of Zen, on which I've appeared as a guest.
- We discuss a variety of topics. Really, you'll see why I see Erik as one of the most insightful public
- commentators, and also someone who gives good advice on many of the topic areas
- that both I and hopefully all of you are interested in. We discuss his early career, venture
- capital, Silicon Valley as a growing media power, particularly the All-In podcast, the ideas of
- Curtis Yarvin, investing for public impact, social desirability bias, elite theory, his idea of
- talk left and act right, and egalitarianism, artificial intelligence, and dating and advice for
- young people. Really, we've hit almost all of the From the New World bullet points that you're familiar
- with. So, it's a great episode. It's almost four hours and I really enjoyed all of it. So, without
- further ado, here's Erik Torenberg. So, you told me you were just coming in from another podcast.
- What was the most interesting thing you talked about there? I recorded two podcasts, so I will give
- you two different bullet points. So, one I did with someone in the creator space, her name is Steph
- Smith, and we were talking about the Mr. Beastification of creators, where basically creators are
- treating themselves as entrepreneurs in the sense that they're not just focused on how do they build
- the biggest content machine, but they're focused on how do they build their content as a wedge to
- get distribution, to then build products, to then sell to their audience. So, Mr. Beast has his
- company Feastables, which made 100 million in revenue already, where he's just selling, I believe
- it's burgers, maybe it's chocolates. So, it's kind of like more basic products, but Mr. Beast just
- raised money from investors at over a billion dollar valuation. And so, the question is, in order
- to do that, investors need to believe he has the potential to 100x their money. So, can Mr. Beast
- build or partner with people who build software products? And it kind of turns the sort of venture
- creation model on its head, where people typically focused on building products first, and then
- finding distribution. But now, in partnership with creators, companies can get distribution first,
- and then build products on top of that. So, that was one conversation. And then the second
- conversation was with Mike Solana for Moment of Zen, and we were comparing the response to AI
- with the response to crypto. Because, for the past few years, the mainstream
- response to crypto is, hey, this doesn't really do anything. What's the use case? This is all hype, et
- cetera, or worse, fraud. And the response to AI is kind of the opposite. Like, this does too much.
- This is going to take over. This is way too powerful. And another difference is that a lot of the
- critics in AI are technologists. So, the call is coming from inside the house, so to speak, where
- there's kind of an intra-tech debate as to whether we should be accelerators or pausers or doomers,
- whatever you want to call them. So, those were two conversations. Right. It's interesting, because,
- I mean, especially now, I'm trying to really circulate through the Beltway, through DC. And
- that's not the picture I get about AI policy there. You know, with AI policy, on one hand, there are
- some people who are worried about really just fitting it into the same old political shoeboxes. You
- know, what if the AI says, you know, racist facts? And then there are people who are, once again,
- fitting it into their old shoeboxes, like, oh, what does this mean for great power competition? And
- actually, that one I'm a little bit more amenable to, both in terms of the kind of people, who I find
- more respectable, and because I think it's a much more serious issue. Like, China is a big deal, but
- is it, you know, the entirety of the deal when it comes to AI? Is it really the kind of Schelling
- point that everyone should be focusing on? I don't think so either. So I think it's the case where it
- might be true that among the people who are building the technology, this is the kind of mainstream
- discussion. It's like, oh, is AI going to create an apocalypse? Are we going to get an out-of-control,
- misaligned intelligence? But I think when it comes to the question of whether AI is,
- or whether and how AI is regulated, there'll be much more of an influence from the kind of, like,
- academic, you know, quote-unquote woke types, more so probably at the FTC. And then you'll get, you
- know, I'm not sure if they're still considered right-wing anymore, but at least people who you
- used to consider right-wing: national security hawks at, yeah, at the Department of Defense. Where have
- you netted out in terms of your interests with AI? Because, you know, early on, you were, you know,
- writing about kind of the excesses, you know, the woke excesses of ChatGPT, etc. And you
- haven't been as vocal about it since. I'm curious how you've evolved your thinking, or as sort of the
- chessboard has evolved, too. Yeah, it's kind of been focused around what I just said, right? I can, and
- I do say, like, I do like talking to the people who listen to the podcast or who read my writing. I'm
- very good friends with them. But at the end of the day, there's kind of, like, two things that I could
- do, right? I could spend more time doing basically public conversations or public writing, or, like,
- private conversations and private writing. And at least from my current point of view, I think the
- latter is just, you know, thousands of times more effective. Like, literally, you know, one second
- spent, you know, talking to, say, like, a national security contact, right, is more likely to have
- influence than, like, one hour writing a blog post. And, you know, there's like a mean,
- right? I'm going to still be working on the newsletter, and working on a very long article,
- actually specifically about the kind of AGI effective altruist concern and my skepticism: why
- machine learning progress is not going to be kind of continually exponential, is not going to
- continue at the rate that it is for much longer, because of many specific technical factors. We can
- talk about this a bit if you'd like, but I'd really like to dive into, actually, some of the stuff that
- you've been working on as well. Sure. Yeah, I'll let you go. Right, yeah, I just want to make sure,
- because some people are really interested in the AGI arguments and then some
- people aren't, right? So you've been a very successful investor already. You have multiple companies
- as well. And now you've started a podcast. It seems to me like the opposite trajectory of
- what a lot of people might do, where they will start a podcast, gather some kind of social media
- following, and leverage that kind of contact. But like you said earlier,
- get the distribution first and then try to build a startup afterwards. So why
- do you think, you know, the best use of your time is doing, sorry, not a startup, doing media,
- starting three podcasts? Do you think that? And if
- so, why? Yeah, it's interesting. When I was getting started in tech, you know, 12 years ago, you
- didn't really see 20 year olds, 22 year olds coming out of nowhere and being able to build audience
- and, you know, social credibility from people who matter the way that, you know, you're able to
- today, the way that Dwarkesh is able to today, the way that a number of people are able to today.
- And so you kind of had to get attention in different ways. And often it was by building something
- really interesting. And the thing that I got a little bit of attention for, and I was, you know, the
- first employee, founding team, not the CEO, was Product Hunt. And so that's what put me on the map, in
- the sense that you've gotten on the map via your podcast and via your writing. And I used
- Product Hunt to, you know, be able to help entrepreneurs, basically, because we could help them get
- customers. You know, Product Hunt is a discovery platform for startups. It's almost the
- equivalent of being a journalist, in that you get to give people traffic, but without any of the kind
- of, you know, emotional loading or negative connotations of being a journalist,
- only the positive parts. And I would say it's closer to something like Expedia, right? Where, like,
- Expedia ranks hotels and flights, Product Hunt ranks, you know, software startups. Yes, yes, and
- highlights them. And I used that to build a network of entrepreneurs. I then started investing
- in those entrepreneurs that were doing well on Product Hunt, and using that as a way to get
- on cap tables. And that's, you know, when startups raise money from investors, the
- collection of investors is known as the cap table. Right. So you basically get in the door, you're
- investing early in some tech startups. Yes. If you're in the music world, it's like A&R
- for a record label, you know, you're discovering artists, you're discovering startups. And some
- investment firms would give me money to invest on their behalf, because I had the access and because
- I had the relationships. And so my advice to anyone who wants to be an investor is to find a way
- to build a deal flow machine that makes them money. Because if they can build a deal flow machine
- that makes them money, they can sustain themselves; the "makes them money" part, it pays them a salary.
- And they can get access to deals, which they can use the money to invest in. It used to be that you had
- to work your way up the totem pole of a VC firm, which could take 10, 20 years. But because of
- these new ways of building media, whether it's a newsletter where you're an expert on a
- different category, or a podcast, or kind of a, you know, community ranking
- or rating site like we built with Product Hunt, it's easier to get in the door and connect directly to
- entrepreneurs, and investors will really respect that. And so what I'm doing now is basically
- realizing how helpful it was to have built Product Hunt for my investing career, how helpful it was to
- have built On Deck for my investing career. And you can think of On Deck as, for people who don't
- know, it helps people find co-founders. So if Product Hunt helped startups find customers, get
- distribution, get traffic, On Deck helped them find early hires and co-founders. And so I'm interested
- in building more tools like On Deck and Product Hunt to have an investing advantage, as we will get
- into venture capital in a bit. But, you know, money is green and money is a commodity. And
- there's thousands of investors out there all competing for the best entrepreneurs. And venture is
- different than other asset classes in that the entrepreneur picks the investor. And so you have to
- be, you have to bring something to the table. And so if you don't bring, you know, 20 years, 30
- years of operating experience at the highest level, which I didn't when I started, I have a little
- bit now, but certainly not to what I just described, you need to have actual products or services
- that help entrepreneurs at a meaningful scale. I don't want to ramble too much
- about VC. But one thing. Don't worry, this is a pro-rant podcast. Yes. I'm sure a lot of
- people in my audience want the insight about VC. Yeah. So I am building a podcast network and
- also a newsletter network, because of basically this trend that I mentioned: you know, companies
- are being built audience-first. And so I want to build different
- media properties in different categories, for different positions. And I want them to be able to
- access this distribution, you know, these customers that are otherwise
- hard to reach, and then invest in companies that sell to them, or build companies that sell to those
- customers. So right now I'm starting kind of more, you know, general interest, more high level,
- to build the audience. But I'm going to launch a series of more dedicated,
- vertical-specific podcasts. My first one is an AI one, but also position-based podcasts. You
- know, we've seen this in the sports world: there are these media conglomerates, like The Ringer
- or Barstool or The Athletic. And tech media hasn't really innovated much. There've been a few
- individuals. Lenny Rachitsky, who's got a product management newsletter. Harry Stebbings has a VC
- podcast, and he's only like 25 years old and he's got $400 million under management just because
- he's built an amazing VC podcast. Packy McCormick is another example. And so I believe
- that there are more of these people out there who are practitioners, who are experts, who can make
- more money doing what Lenny, Packy, and Harry do than they do at their sales job, or their
- engineer job, or whatever job they currently have, and have a more interesting life. And so I wanted
- to aggregate those people, help fund them, get them off the ground and build a collective of them.
- And in some way, it also disintermediates the journalists, too, because these people are actual
- experts. And it's expert-to-expert sort of, you know, content, as opposed to just a, you know,
- general middleman who's not an expert at all, and who is often, you know, in a different class and
- has counter-incentives in some ways. It's very funny, I remember talking about this
- with Jonathan Rauch. He kind of describes journalists, and I think, like, this is actually accurate
- to some degree. He compares kind of like journalism and social media, and describes journalism as
- being, he doesn't use this word, but this is how I interpret it, a kind of
- elite: basically elite consumers of news who have norms around
- filtering. I think this is, in fact, a filter on low status, but I think it correctly
- filters out a lot of incorrect information, with some biases, of course, of information that they
- fail to filter out. But, like, when I was speaking with him, I actually had the
- same understanding of kind of tech media, right? That a lot of tech media is kind of insular.
- And it's insular in a similar way, I think, to how kind of legacy media is insular, in that
- you have these concentric circles: you have the general public, who have certain
- predispositions that lead to bad media patterns. And then the New York Times comes up, and, like,
- Richard Hanania kind of convinced me of this, right, our mutual friend. He's convinced me that,
- you know, there are a lot of virtues in the New York Times, Jonathan Rauch as well: there are a lot of
- virtues in kind of circumscribing this smaller circle in which you have to observe these norms.
- And then tech media, I think, does kind of the same thing in creating an even smaller
- circle. Like, not a lot of people, I think, can really listen to, like, a podcast with
- Nathan Labenz, Cognitive Revolution, and really get everything, right? Maybe some people can listen
- to it and still enjoy it and get a lot of things. But to me, it feels very insidery in a
- way that I think is, like, generally positive in terms of its effect on the content there. But, like,
- the reason I'm saying this is it all leads to this one big trade-off, right, which is kind of
- private power, or elite power, versus public power, right? On one hand, you know,
- if you're going for Mr. Beast, then you're creating this kind of public, really, like, everyman appeal,
- or you're going, like, Cognitive Revolution, or you're going even further than that, I don't
- know. Actually, I'm not completely sure on a tech example. Or, like,
- a machine learning example is just, like, papers, right? Like you're publishing at NeurIPS
- or something, right? Where would you put your project around that? Like, is it closer to the elite
- end or is it closer to, like, the public end? I would say most of it is closer to the elite end, if
- we're talking about, like, you know, how technical versus accessible. But I mean, the best
- things, you know, All-In, for example, the tech podcast, is appreciated by insiders
- and it's broad, meaning it's, like, deeply popular. It's one of the most popular tech podcasts in
- the world; I think it's number one or two. And so I think, I mean, in some sense, like, you know,
- I wonder if it's a false trade-off, if All-In and others are able to achieve both. Like, is that
- really it? Like, Lex is another example, right? Like, I mean, All-In is truly
- insidery in that they get in the weeds and stuff; Lex is more like, what's the meaning of life and
- stuff. Yeah. All-In, I think, is, like, a better example because, right, I listen to it and I see
- it as, you know, like, I often have to follow up a lot, especially with a lot of the financial stuff,
- I'm less experienced with that, and just almost everything that Friedberg mentions with, like, the
- biotech, I'm always, you know, doing follow-up Google searches. To me, it feels like a very, yeah,
- that, that I think is actually a spectacular example. What do you think is the secret of their
- success? Because I do feel it as, like, a, it definitely gives very strong kind of elite vibes to me
- and, like, not just to me, but to a lot of other people who I know. But it also, like you said,
- has, like, just empirically, it has kind of mass market appeal, right? I think there's a few things
- going on there. First off, they're undoubtedly successful and they're undoubtedly, like they're more
- successful and more intelligent than other people doing tech podcasts or tech journalism. And then
- they're also good friends slash have good banter. And that's entertaining. And you get a window into
- their, you know, their personal lives a bit, which are also very entertaining. The personal kind of
- like, you know, how they spend their wealth. So it's always interesting. Like, hearing, you know,
- centimillionaires talk to each other kind of candidly is not something that most people get to hear.
- But then they're also, you know, quite good at describing current events in ways that are
- easy, easy-ish to understand. So you feel like you're getting smarter listening to it. You're
- enjoying it. But then also, and I think one big thing is, they have
- moved the Overton window pretty significantly. And they're not, like, a right-wing podcast
- or anything. But, to give you an example of the rest of tech media: TechCrunch's main
- podcast is called Equity. Like, that just gives you a sense. Is that a pun on, you know, like, raising
- equity? I think they mean it in the reducing-disparities sense. That is the number one interview
- question, and I'm only half teasing, you know, to ask people: what does equity mean? It's either reducing
- disparity or a stake in companies. And what they emphasize kind of tells you what you
- need to know. And, like, Chamath, you know, like a year ago or something, said, like, you know, equity
- is problematic or whatever, like, I don't believe in equity or whatever. And, like, you
- know, it's interesting, because you came into this world in the last few years, relatively
- recently. It would have been interesting for you to come in, like, 2014 or 2015; every
- podcast episode you have would have gotten you canceled. I'm teasing, but I say that to say that the
- Overton window was just so different. There was just such a strident, like narrow window of what was
- acceptable. And to question equity would have been insane. And I think a lot of tech people, there
- was just this wave of massive preference falsification, because people saw what was happening in
- their companies, which were either getting kind of destroyed by activists, or, you know, they saw
- what was happening in San Francisco, if they lived in San Francisco, and, like, okay, that's getting
- destroyed. And they wanted to push back. But anytime they did push back, they were met with, oh, are
- you a Trump supporter? And faced with the option of, like, you know, being on their knees, you
- know, like, apologizing to leftists, or being seen as a, you know, normie, you know, rude Trump
- sympathizer, they would rather be on their knees. And once that kind of went away, once
- the specter of Trump went away, basically it became okay. There was just this wave of massive
- preference falsification that All-In helped kind of pierce, where they were just saying common sense
- things. And a number of people were like, I actually believe them. And so it just gave them a new
- voice. I think Mike Solana has done a fantastic job of this as well, of not being, like, overtly
- right-wing, but just being common sense, being funny, being smarter, like, just
- better on every dimension. And I think that's why both Solana and Pirate Wires and All-In have just
- blown up: because they offer an alternative perspective that just resonates more with tech people
- than the one that journalists have, because they are tech people. Right. What do you think of the
- marketplace of ideas hypothesis, right? The idea being that, you know, if you just let
- people discuss, the best ideas will ultimately win out. It does not seem that the best ideas, or it
- does not seem that the truest ideas, win out. You had Curtis on your show and he has written and
- talked a lot about that. I mean, there are a lot of untrue ideas that, you know, seem to pass on for
- many, many generations. So I think the marketplace of ideas maybe makes sense for, like, fit ideas,
- ideas that just have some sort of, you know, unfair advantage. But it seems like, and this is
- what I would say to, you know, the Jonathan Rauchs of the world, you know, he wrote Kindly Inquisitors
- in 1992, and I think that was a really seminal book. That's, like, the best case for free speech,
- the best case for classical liberalism, and the best case for the marketplace of ideas. But it feels like
- it's turned out that, you know, when you're advocating for free speech, when you're not going to censor
- anyone, you're kind of putting a hand behind your back. And when someone sees that you
- play by those rules, they will, you know, leverage those rules against you. They will, you know,
- censor; they will encourage free speech when it supports their aims. But when they're using
- illiberal methods, and, you know, the only way to fight them is to use illiberal methods
- back, they will use your liberalism against you. So, I mean,
- there's a couple of different responses to it. I don't think true ideas win, I think fit ideas win.
- And, you know, if someone plays with a hand behind their back, they're often going to lose.
- And that's what happens to classical liberals, it's what happens to libertarians, it's what happens to
- many people in the gray tribe who are not really willing to fight. Yeah, it's
- interesting to me because at least my interpretation of the past few years is that as kind of the
- limits of censorship have really been reached by, you know, the vaguely left, I
- wouldn't really even call them left-wing at this point, but, you know, people concerned with
- quote-unquote disinformation. As the limits have been reached, they've kind of actually improved
- their arguments. Like, for example, just Bidenism over Clintonism, right? Bidenism is, like, a genuine
- innovation, both in terms of affect, and a lot of right-wingers who listen to this podcast underrate
- this. A lot of normal people, even, like, center-right people, you know, really, you
- know, like the Biden vibe. They like the Biden kind of, almost, like, appeasement, right? That's what
- it kind of seems like, right? Not in terms of foreign policy, but in terms of, like, you know, I hear
- you guys. Like, really, like, maybe still disagreeing with right-wing voters he might meet on the
- streets, but basically saying, you know, that he wants to live in the same country as people who
- disagree with him. Now, you might argue, like, it's fair to argue, that that's not actually
- reflected in his policies, but as a kind of, like, political figure, right? I think people
- under-analyze Bidenism, right? People are still analyzing, like, Trumpism. Like, I think
- there's a reason Biden won. And I do think Biden won. And, like, it's actually pretty obvious, at
- least, like, the places to get started in thinking about Bidenism. And so, like, to return to the root
- topic, the reason why I bring this up is, there's this quote from The Last Psychiatrist,
- another kind of online blogger, who said knowledge is a defense against not having power.
- So, like, people who don't have power, they make, like, logical arguments, because that's basically the
- only thing that they have. And I think this is, like, a very good description of EA, of effective
- altruism. But yeah, I do think that's, like, a better predictor of where good arguments
- will emerge from than just the mainstream marketplace of ideas hypothesis. So where will they
- emerge from? They'll emerge from the places that have reached basically the limit of what they can do
- with power, right? Basically, if I've done all I can with hard power, I'll do what I can with soft power.
- This is the idea. Yeah, a few responses. So one, I think right-wing people have to be surprised
- with Biden on a few, not just vibe, you know, topics, but also actual policies, like in terms of his
- hard line against China and some other things. It's funny, right? Like, people who are more
- right-wing than me, they want to be, you know, tough on China. Actually, I think that the trade war
- stuff is probably self-destructive and has worsened inflation. But it's interesting. Yeah, like,
- people who are, you know, right enough to support, like, or, like, people who support basically the
- Trump policy on China, like, Biden has emulated a lot of that. Yeah, I remember Curtis said before Trump,
- before 2020, that he wanted Biden to win, because, you know, the way he evaluates candidates is
- whether they help his friends and punish his enemies. And he saw Trump as helping
- his enemies and hurting his friends. Which is to say that even if you had Trump's goals and aims,
- or sympathized with them, you'd prefer Biden. And at the time, people, you know, people were very
- skeptical of Curtis's claim, like, oh, it sounds clever and cute, but, you know, winning by losing
- doesn't seem to make much sense. But if you're, you know, anti-woke or right-wing to some degree,
- you have to admit that the culture has radically shifted in your favor since Trump got out. And so
- it is interesting. Like, if you think, as Curtis does, that the president has
- very little power and, you know, a fraction of the power that they should have, then maybe, you
- know, that view keeps making the argument that, hey, let's get more Bidens: people who are
- effectively, you know, ineffectual, but don't present the same boogeyman to the left that enabled
- kind of excess leftism to expand significantly, as we saw in 2016 to 2020. Here's a question I want to
- ask. So how much of the, you know, post-Trump new right do you think is, like, downstream of
- Curtis? Do you think it agrees with him on his major precepts? I think, you know, there's the
- quote about Ayn Rand, that her heroes are fake, but her
- villains are real. I think similarly, like, Curtis's solutions are fake, but his
- identification of the problems is real, like, his analyses are real. Like, I think he's convinced a
- lot of libertarians of why they're losing, to become more right-wing. I think
- he's given people a mental model for governance, for how governments work that makes sense. I mean,
- I think he's been very influential in helping tech people just understand politics and power and the
- kind of, you know, combination of corporate and public power, in terms of, like,
- crowning winners and cementing, you know, incumbents. And I think Twitter is this kind of great
- culmination of it, where it's a thing that Curtis has, you know, wanted for a long time, which
- is someone to take over an existing entity. In Curtis's case, it's a government; in
- Elon's case, it was Twitter. And just say, hey, you know, there's a new sheriff in town. And actually,
- you're not verified, New York Times. And actually, Dogecoin is going to be the, like, just
- basically, shit all over the symbols that they hold dear. And, you know, I think to some
- degree he's done the playbook; to some degree he's probably botched it significantly. But, you
- know, does that happen in an era without Curtis? I don't think so. You know, I think that, you know,
- does Elon know who Curtis is? I don't even know. He certainly hasn't read Curtis, probably.
- But I think it's been filtered through a number of people. So, yeah, I do think Curtis is pretty
- influential. I do think he's not getting credit for that, probably because people don't
- want to publicly associate with him. But now Curtis has drastically cleaned up his
- image. And I think he's done a fantastic job in the past, like, a couple of years. So he's changed
- things for himself. He's not courting controversy in the same way that he was perhaps in the past
- decade. But I think you do see a new tech right. It doesn't use the word "right."
- And it shouldn't. And it truly is not, you know, right-wing in the sense
- that it's pro-choice, it's pro-gay marriage, it's probably pro-immigration. So I think on
- meaningful, you know, policy decisions it differs significantly. Is "libertarians who care about power"
- a good label? It's not a bad label. I'd first say "gray tribe" or something, because
- it's not libertarian. Like, they actually want to use the state. You know, it's
- like, even Tyler Cowen, like, he's not a libertarian. He's a state capacity libertarian. What's the
- difference between state capacity libertarianism and populist nationalism? I think it's, like, two things.
- I mean, like, I'm not, you know, I don't want to speak for Tyler, but I will do so anyway. I don't
- know, this is not necessarily representative of what he actually thinks, but it's my best guess
- at what he thinks, I'll say that, right? Like, he would distance himself from kind of the
- new right, or from kind of nationalist populism, in that he basically thinks the populists
- are far too skeptical and far too antagonistic towards people who are left-wing. He will
- compromise much more. I mean, Tyler does. And I would think that's the main difference. Yeah.
- I think it's mostly a brand. Like, you know, conservatives have a bad brand. Populist nationalism has
- a bad brand. It has a low-IQ brand. It has a kind of reactionary brand. I mean, just look at Richard
- Hanania's disgust towards them. And so the gray tribe is trying to be
- something that's different. It's a little bit, you know, like, one way of putting it, it's the midwit
- meme, and they're on the other side: the high-IQ side that kind of has some similar
- conclusions to the common-sense, lower-IQ side, but just doesn't want to be associated with them. So
- they needed a new word. And I think this is where libertarians used to be, but I think
- the main difference here is that they realize just kind of, like, how infeasible, you know, and
- unlikely libertarianism is, and how we actually do need a functioning government. We need state
- capacity. We need a government to get out of the way in a bunch of different places, but we also
- need it to do the things that it does well, like, you know, handle crime. So yeah, I would say
- more gray tribe that understands power. Right. I would say that, like, a big, um, a thing that
- libertarians don't get credit for is that they kind of had the political theory right. Like, this is
- just public choice theory, right? Yeah, concentrated benefits, dispersed costs. It's very
- much, that's actually quite similar to the Burnham understanding of political
- theory, right? James Burnham, someone who's quoted a lot nowadays, especially in those kind of, like,
- gray tribe circles. But what's most interesting, I think, is that, like, this is
- the central question to, yeah, this I think is actually the central question of what should the
- strategy be. And, like, the thing that actually separates, you know, the new right and
- libertarians is, like, Democrat exceptionalism versus Republican exceptionalism, right? Like,
- the libertarian view is that, you know, Democrats are just much better at creating people who would
- actually be willing to work for the state. They're much more, basically, like, predisposed, for
- geographic and psychological reasons. And, you know, the Republican equivalent of Lina Khan is not
- going to spend a decade toiling away to be, you know, FTC chair. He's going to become a very
- successful entrepreneur, and maybe that's better for society even, right? But it means that they're
- not going to have as much power, and the best thing you can do is just destroy the federal
- bureaucracies and defund them. Or, like, the new right view is that actually this is
- due to specific strategic decisions: the reason why the right loses in bureaucracies is because it
- does not have basically a talent pipeline for creating people who would actually take
- these positions, or for, you know, finding them, recruiting them, helping them along their journey.
- And yeah, to me, it's not one or the other, right? Both of these things matter to
- some extent. But what do you think? Where do you fall on the kind of spectrum there? Well, firstly,
- let me just say, you know, you mentioned effective altruism earlier, and effective altruism
- is getting bashed a lot these days. It's a punching bag on Antonio's and my podcast. What you're
- seeing the regime press do to Eliezer Yudkowsky is just so sad. Like, you know, I am one of the
- people who makes very strong arguments against the kind of AI doom things. And, like, man, it still
- feels, it almost feels like a false victory, you know, to see, really, like, basically,
- slander coming from, you know, Vox and from Time and whatever, right? Or not Time, Time actually
- published Yudkowsky. But, like, a lot of mainstream press or, you know, regime press are really just, I
- think on this issue in particular, it just, like, hits their sweet spot of, like, oh, we're just going
- to, like, drop all pretenses of being intellectually honest and caring about the facts. Like, they
- could have written, like, an intelligent, very long article,
- like, specifically because this is a difficult problem to argue about, you know, putting a lot of time
- into writing a clean article. Like, if you're the New York Times, if you're, like, supposedly, like, an
- educated, intelligent person, like, that's what you would do. And instead, no, they're just writing,
- really, like... I think this is the affect of this podcast that I think makes it special:
- that, like, Richard Hanania, like, has this, he describes left-wing ideology as kind of, like,
- a disgust at, like, right-wing aesthetics or right-wing preferences. Yeah. And, like, I
- think this podcast is kind of, like, the nested version of this, right, where it's, like, not
- just simply disgust, right? But that is the vibe, or, like, contempt, contempt at, like,
- this, like, not quite egalitarian, but, like, status-oriented way of approaching things. Like,
- if you think of, like, truth and whatever we're comparing it to,
- there's maybe, you know, like, a 10% correlation with whatever the, you know, whatever the
- conservatives are doing, like, really, like, the normie, like, the boomer cons
- are doing. And then, like, maybe there's, like, a 20% correlation with whatever the New York Times is
- doing. And, you know, looking at this, it's just, like, 20%, really? And you guys, like, dare to call
- yourselves, you guys really dare to, like, want to censor? It's just so childish and lacking
- in foresight. Where did I cut you off? No, go ahead. Let me say a couple of things. So first, and we
- can return to this, I just want to say, you know, effective altruism has been a punching bag,
- but there's another universe where the markets didn't tank, and SBF was still, you know, worth $30
- billion, or whatever FTX was worth, and EA had tens of billions of dollars under its direction, and
- could actually accumulate some real political power. Let's say, you know, Biden wins the next
- election, and SBF's the biggest donor, and he's, you know, he's where Thiel was in 2016, he's putting
- all EA people in government, although, you know, Thiel probably could have done a lot more of that,
- but there probably weren't that many people to put in. And so EA is actually an interesting example,
- because it's both a punching bag, but it's also like, they got close to something. They got
- closer to something than I think libertarians have ever gotten. Like EA, no, no, go
- for it. I don't see any picture of DC where you can, like, honestly, like, my first instinct
- was, like, do you actually believe this? Like, I don't, no, especially since Democratic policy is so
- inert. And, like, well, I guess EA was basically Democratic policy.
- Like, I guess what I'm saying is, I see EA the movement getting power in that world. I don't
- see, like, um, EA pushing back on any core, you know, democratic principles, or, you know, DNC
- principles, but I see them getting their pet causes in, you know, animal rights, I don't know,
- like, things that the DNC doesn't really care about, or is like, sure, have this. Like, this is a
- hypothetical, but I mean, we can also bet on the current case, right? I would bet, you know, pretty
- insane odds that, you know, whenever the Democrats want to do some kind of
- enforcement regulation on AI, it will be on, like, quote-unquote, like, equity grounds instead of on,
- um, EA grounds, right? Like, for every staffer who's kind of convinced by
- the AI doom arguments, there are at least 10 who are convinced by, like, the disinformation argument,
- and, like, another 10 who are convinced by, like, the equity argument. Like, just on an
- empirical level, I just don't believe this is true. We're saying different things. I agree with you,
- and everything you just said. I don't think Eliezer Yudkowsky, although we'll get to him
- in a minute, would be, like, you know, having some role. You know, like, yeah, he's just too smart.
- Like, this is the main thing, right? Like, when Hanania calls, like, the
- Republican Party, like, the stupid party, like, here's the thing: the Democratic Party is,
- like, impermeable to smart people. Like, its entire kind of, like,
- credentialization and routinization system is, like, a system specifically filtering out, like,
- people who would actually be, people exactly like, like, Eliezer Yudkowsky,
- actually. This is the thing: like, the regime press specifically, as, like, a mechanism, if
- we were to construct, like, an archetype of the person that the regime press is meant to politically
- assassinate, it would be Eliezer Yudkowsky, this kind of, like, extremely smart, obviously, like, I
- disagree with him on technical issues, but, like, passionate, you know, focused, and honestly, like,
- somewhat autistic person. You know, like, the Democratic Party is kind of anti-intelligence in that
- way, and the Republican Party is, like, maybe, you know, like, I kind of agree with, or, like, I think
- there is empirical evidence that, you know, like, the average, like, the median Republican voter is,
- like, slightly lower IQ, and that's fine. But, like, the elite levels of the Republican Party
- contain, you know, many people, even in our, like, mutual circles, right, contain many people who
- are, you know, these exceptional people, and it is just so much more permeable. Like, this is the
- reason why I just, like, completely disagree with this characterization, like, that the Democratic
- Party is a party that cannot be smart. Well, the Republican Party is, like, a party of, like, high
- variance. And to me, like, you're familiar with, you know, like, the founder
- theory, you know, like, Samo Burja's stuff, right? Like, to me, if you are a smart
- person who wants, like, some kind of career, like, let's say you just have, like, no kind of,
- like, aesthetic preference, right? I think it would be, like, abundantly clear that the Republican
- Party is kind of, like, the home for where you would go for that. A couple of
- responses there. One is, I think, EA would end up looking more like, like, think more Bill Gates
- than Eliezer Yudkowsky, right? Like, Bill Gates, he's very smart. Sure. You know, I've been in rooms
- with Bill Gates, and he's talking to founders in different industries, and he knows
- the industries better than the founders. Like, so, it would be this corporatized, you know,
- DNC-sanitized version. I think what EA had, meaning EA, in my opinion, had transcended Eliezer, like,
- in a bad way, had transcended Yudkowsky. I mean, Richard Hanania wrote about this, right? He said EA
- is, like, at a fork in the road, and they could, you know, go woke, or, you know, they must be
- anti-woke or die, right? Yes. And I think they had already, I think Richard Hanania was, you know,
- posting that, like, hopefully, but I think that decision had already been made. And I think, you
- know, short of SBF having, you know, an Enron-plus-Madoff-level, you know, extinction, I think EA
- was on the path to kind of DNC establishment-level politics. Nothing to do with
- Yudkowsky, EA in name only. It's like, you know, talk about it. Yeah. Yeah. Although I
- think it actually does go deeper than that. Like, I was arguing about this with Roko. Like, EA is
- fundamentally based on the harm principle. It's fundamentally based on, like, J.S. Mill, you
- know, like, how do we stop people from being harmed? And I think that kind of, like, neurotic affect
- is intrinsically left-wing. Like, maybe it's smarter, right? Like, I agree, or, like, I
- definitely would agree that it's, like, much less dumb and not subject to kind of, like, the specific,
- you know, conspiracy theories that the left-wing establishment believes. Yep. But I still think it's
- intrinsically left-wing. In the same way that, kind of, like, you know, like, I don't know, it's hard
- coming up with a version of this with the right, because, like, a lot of the right is also kind of
- intrinsically left-wing. Like, Christianity is sort of intrinsically left-wing. But, um, yeah, in
- the same way that, like, Nietzsche, this is a good example. Like, Nietzsche, you know, like, would not
- necessarily, you know, support either party, I think I talked with Brett Anderson about this, would
- not necessarily support either party today, but I think, like, that kind of affect is
- intrinsically right-wing. But, sorry, go on. One thing we have to wrestle with, or any person has to
- wrestle with is that, you know, most competent people, whether they're running big corporations, if
- they're, you know, technical engineers, if they are, you know, running hedge funds or very
- successful in finance, um, most of them are like, left-wing. There are some that are disagreeable
- and just seeking truth above all else, but most people at that level tend to be left-wing.
- And not just, like, in the last five years, although it's obviously been more so lately, like,
- there's been such a polarization. But in general, they tend to be more left.
- And so, to your point earlier about it filtering out smart people, I think it filters out, like,
- the disagreeable, truth-seeking-above-all-else types. But that is separate from, like, deep competence
- and even, like, technical competence. And so, um, those things... I think you can filter... No, I don't
- mean that, like, the Democratic Party says, you know, like, if you're a smart person, don't vote for
- us. But if you want to have, like, the maximum impact as, like, a smart person, if you want to
- enter and change the direction of the Democratic Party versus the Republican Party, it's
- overwhelmingly the Republican Party. Like, for example, just do the pair trade: like, Peter
- Thiel versus, like, Bill Gates. Right, like, Peter Thiel, who is
- significantly less wealthy than, uh, Bill Gates. Yep. And I think, like, this is true
- intra-Democratic Party, too, right? The Democratic Party is much more, uh, amenable to, kind of,
- cultural shifts than to Bill Gates. Like, just look at, like, how much, like, Bill Gates really wanted
- to talk about, like, basically, like, the global poverty stuff, right? He was, yeah, he was kind of,
- like, a proto-EA, um, or in the same direction, or, like, some of the pandemic preparedness stuff.
- And we can look at the results for themselves: how much, you know, pandemic preparedness stuff did,
- uh, Bill Gates actually do, right? Like, next to none. Yep. Right? Like, in practice, that
- amounted to, like, next to no impact, in fact, maybe even negative impact, on the US's
- pandemic policy. Um, which, I mean, like, I don't, you know, I'm not one of those people who are
- angry at Bill Gates for trying. But, like, you know, maybe donate to Republicans next time,
- successful once you have that power. Um, and so. Yes, exactly. This is exactly what I mean. This is
- the way that, like, the way to win within, like, the democratic party. And like, this is true within
- the Republican Party elections specifically as well, right? In both parties, the way to win in
- elections is to be like, dumber than you actually are. But like, in, like, the way to win, like,
- marginally, in like, democratic policy is also to be dumber than, than you actually are, right? In
- terms of the race stuff, cert, for sure, in terms of being anti-market. Um, and this isn't to say
- that, like, there aren't some people who can win despite their kind of intelligence and despite the
- fact that their policy preferences are intelligent, right? Like Tyler Cowan, um, as we're clients,
- been one of the best faith actors, he's been more amenable to basically, like, oh, a lot of the
- Yimbi stuff is a great example, right? I'm not saying that it's like completely futile. I'm just
- saying, like, just do the pair trade here, right? It's not, it's not that you can't accomplish
- anything within the democratic party. That's the much stronger position than I actually believe.
- It's that just like, basically, like, the impact ratio is just going to be so much higher in the
- Republican party. Like, like, the Republican party is so much more open to, like, good, new and
- unique ideas. It's, um, there's a greenfield there, for sure. And that is, we did bring two things
- to the table. One is, you can compare Tyler Cowen and Richard Hanania and their relative impact at
- the moment, as part of a broader conversation of how truthful one should be. Because, I mean,
- Tyler is a, you know, epic intellectual. He's also much more Straussian and much more careful. Or
- at least, that's my read as someone who's an admirer of his and also, you know, he's been great to me
- and is a friend. And then Richard Hanania is much more truthful, almost to the point of, you know,
- is he seeking controversy sometimes? Like, I can rely on Tyler to maybe hold back, and
- Richard to maybe lean in, like, a bit more. I mean, Richard has said this publicly, right? He said
- this on Twitter, that he kind of has, like, a reverse reaction to audience capture. Like, the
- more his audience is, like, made up of a certain group of people, the more he notices, like, the
- stupid things that they believe. Right. Yeah. Um, and then, in terms of the Republican thing, I
- mean, it's interesting because Vivek, who's now running for president, I just had him on my podcast
- and I'm going to release it soon. And it's going to blow my audience's mind, meaning
- they're going to be like, wow, who is this guy? He's really controversial. Vivek is really
- interesting because he is, you know, certainly an elite. He's been a very successful entrepreneur.
- He went to Harvard, Yale. He's very smart. Um, he's, um, an Indian guy. Um, and he is almost, like,
- he's trying to, like, up-level the MAGA stuff, basically. I mean, he's not saying the election
- was stolen, but he is, like, going all in on their issues: he's pro-life, he's, like, anti-climate. I
- mean, he's leaning into culture war issues and taking the right-wing side, like, to the nth degree.
- And it's interesting to see how he's going to be perceived, because Trump did that. But Trump
- was way more of them. And, of course, people would say, oh, Trump's a billionaire or whatever,
- you know, or he's not a billionaire. But, like, how was he, you know, speaking to mainstream America?
- Well, he had their affect. He had their, like, uh, crassness. He was a man of the
- people in vibe and style. And it just happens, when you're, like, a real estate, you know, person, you
- do actually communicate with, um, and work with, you know, normal people. And you build up a, uh,
- you know, ability to connect with them. And to your point about Biden, like, Biden has done that
- too, in his way. And so, but Vivek has not done that, right? Vivek has been on an elite track, um,
- for his entire life. Now, he's from Ohio and, you know, I don't know him super well. So maybe
- I'm underrating something. But Vivek comes across as slick, um, whereas Trump came off as, like,
- much more smooth. And, you know, where people might resonate with Vivek on a policy issue, or
- they're glad that he's fighting for them. I wonder if there's going to be enough emotional
- resonance. But last thing I'll say about Vivek is he's just so interesting because he didn't need to
- do this. Like, he could have just been a successful entrepreneur and investor. Like, you see Joe
- Lonsdale, you see Peter Thiel, right? Like, you know, these people don't get into politics directly.
- They get into it indirectly, right? They start companies that sell to the government and
- solve the problems. They get their friends in office, you know, people they think are
- credible, um, in, you know, the case of Peter Thiel, JD Vance. Um, and they start, like, policy
- orgs, in the case of Joe Lonsdale. Vivek could have done that. But Vivek instead decided to be a
- culture warrior himself. Now, he's so wealthy, he doesn't need to, like, make more money. But it's
- interesting to see, in the next few years, which path is going to have the most success. And it does
- bring it full circle: like, Vivek is optimizing for distribution first. Like, he's trying to build
- almost, like, a Tucker Carlson-level audience. And then what's he going to do with that? Like, man,
- you know, so I don't think he wants to, like, this is what's interesting, right? Like, I don't think
- Vivek, like, I don't know him personally. Um, but, like, you can kind of see Vivek
- as, like, the Andrew Yang of 2024, right? Like, someone who's obviously way too smart for his party,
- way too, like, nonpartisan for his party. Um, and this isn't to say that Vivek isn't, you
- know, a real Republican or something like that. I think he is a Republican, but he's
- nowhere near, like, he doesn't want to lean into kind of, like,
- the culture war fights. I think he's much more technocratic than that, just from, like, what he's said
- publicly, right? And on podcasts and stuff like that. Um, no, I think, I think Vivek is upping his
- tone. So, I mean, to your main points, I think there's some ways in which he overlaps with
- Andrew Yang: both, uh, you know, very, very smart; both, I think it's actually key that they're
- Asian, you know, they're not white or Black. Um, that was Wesley Yang's, like, uh, you know, reason
- why Andrew Yang could win, because he could be a unifier. Um, and they're
- both entrepreneurs and, you know, respected by entrepreneurs. But I think there is a, um, key
- difference. I think Andrew Yang did have more of a man-of-the-people vibe, and Andrew Yang, to your
- point, was nonpartisan. I mean, Vivek is, you know, he's trying to, like, end affirmative
- action. He's trying to, you know, end the climate religion. Like, Vivek is becoming very aggressive in
- his tone on certain issues that are even to the right of other, um, you know, right-wing
- people. Now, he's not going to say the election was stolen. He's not going to say, you know, like, ban
- all immigration. Like, he's too smart for certain dumb issues. But where the midwit meme, like,
- makes sense, you know, and maybe it's not affirmative action, maybe it's on some other things,
- like, Vivek is going all in. So, um, in some ways, it seems like a key difference. Right. Like, I
- think, you know, practically, this will be done, if not by law, because there's a chance that, or
- sorry, if it's not done by, like, Supreme Court ruling, because I think there's a nonzero
- chance that it's done by Supreme Court ruling. But I think there's
- just so much energy, whoever is elected as the next candidate, even Trump, as kind of, like, you
- know, incompetent as I think that Trump is. I think that, here's the funny thing,
- right? Like, the Republicans talk about, you know, like, the deep state. Like, the Republican
- version of the deep state, whatever that is, is actually, like, I think, kind of a hero in this
- circumstance, in that there's a critical mass of think tankers and staffers
- on the Republican side such that affirmative action is going to get severely
- curtailed. Like, it's not, like, literally inevitable, but, you know, I
- would put it very high in the probability distribution. So yeah, like, Vivek is kind of,
- it's interesting, because I think he has, I think, first of all, I think he's, like,
- read Yarvin, and he's read Burnham almost certainly, right? I think he's actually, like, referenced
- Burnham in some podcast interview, although I might be misremembering that. Yeah, he knows,
- you know, he knows it's not really a democracy. So like, why are you running for president if you
- know, it's not really a democracy, because there are people in think tanks, because there are
- people, policy staffers, congressmen, senators who who'll notice me, because I'm running for
- president, and I'm, you know, actually having some, some rise in polling, I think like that's the
- answer. Maybe I'm projecting a little too much, once again, I don't know Vivek personally, but like
- But that's my best guess. Well, the... I don't think he's actually trying to win the presidency.
- Sorry, go on. Well, yeah, it's interesting, because with Andrew Yang, it's like: okay, you run for
- president, and then you lose, and then you're a media entrepreneur, or, you know, a media person,
- and that's a pretty nice life. But, I mean, Andrew Yang was an accomplished entrepreneur in his
- own right, and Vivek is on another level of accomplishment. And so, like a Joe Lonsdale or a Peter
- Thiel... Vivek is very smart and accomplished; I'd put him in maybe a similar camp, if Vivek had
- kept on the tech path for another two decades. Like, those people don't want to be media people.
- So, meaning, if Vivek loses, having Andrew Yang's career is probably not what he wants. And so I
- think he wants to win. And I think he thinks he's maybe going to win at some point in the future.
- You know, not now, but maybe eight years from now, twelve years from now, I don't know. But I
- think he's going for it. And to your point, I mean, it's an example of a really brilliant person
- saying, hey, I'm just going to take a chance at reforming this party. And I mean, I tend to
- sympathize with the critique, it's kind of a horseshoe theory, of why no one should want Trump in
- office. Because if you hate what he stands for, you certainly don't want him in office. And if you
- actually are sympathetic with what he stands for, well, it seems like if there's any repeat of
- 2016 to 2020, you're just going to lose on all the issues that you care about. And so to the
- extent that there's some grand unification in, you know, not wanting Trump to win, then the more
- smart people who challenge him, the better.
- I think a bet on Trump is, like... this is not originally my argument, and I'm not sure if the
- people who made this argument to me would want me to give their names, but some pretty, like, new
- right people basically said that a bet on Trump is, you know, basically a bet on the new right
- deep state, right? It's basically a bet that Trump knows who his friends are, and that those
- people are going to be the people who actually care about political power. Where were they in 2016
- or 2018? Yeah. Yeah. Like, this whole movement escalated after that. And, like, I don't actually
- believe this, right? I think the best predictor of future action is past action. But, you know,
- the probability of... I call this the 'this time we'll get it' argument. I don't think it's zero.
- I think it's low, but I don't think it's zero. But at least that's the consideration that those
- people would make. Yeah. It is interesting. I mean, to support Trump, in the most charitable view,
- there's such a sacrifice in terms of status, in terms of being seen in good standing with
- employers. Like, have you ever read The Flight 93 Election by Michael Anton? I have, a long time
- ago.
- It was basically the punchline that, like, the country is dying and this is all the marbles:
- like, if Republicans don't win, it's just so over, as they say. Right, right. The idea being
- that, yeah, basically with the left-wing kind of racial scapegoating that they were basically
- planning to undertake... Yeah. And I think this is actually... I was skeptical of the take then,
- but I think it's actually kind of aged better. The idea that, you know, if they don't stop Trump,
- then many previously neutral elements of government, of private life, of, I mean, certainly
- academia at this point, right, that part I think is undeniable, will be weaponized permanently
- against conservatives. I think life for conservatives has gotten better from 2020 to 2023 than it
- was from 2016 to 2020. Now, not in all areas, like what they're trying to do in K through 12, or
- with China. Like, you know, the left had a ton of soft power, and, like, hidden hard power. And
- now they're losing a lot of soft power, and they have more hard power, but it's constrained. And,
- yeah, I mean, they've been losing a lot of the cultural momentum that certain movements had. And
- the lack of a boogeyman on the right has allowed the more moderate left people to have more
- power, hence, like, Bidenism. So, I mean, yeah.
- Yeah. Yeah. Like, my counterargument to the Michael Anton thing is that Trump won and it still
- happened. Yeah. Right. Exactly. Like, it happened even more. You can't do counterfactuals, but,
- yeah. Like, there was a flight, but Trump did not successfully storm the cockpit. It didn't work.
- It was not a solution. Yeah. I still think that's kind of the best counterargument, but that was
- not the argument back then. Right. Yeah. Or, like, some people for sure believed that, I don't
- know. But most people were saying, you know, this wouldn't actually happen; that was the line of
- attack on Anton. And I think in hindsight, it's funny: Trump was both much more incompetent than
- we expected, and Anton was more right than we expected, at least on the half that said this is
- going to happen. Not necessarily that Trump would succeed in stopping it. But, you know, to his
- credit, he says that, right? That's a crucial point of the essay: we don't know whether it would
- work. And it didn't.
- Yeah. I'm intrigued by this question that you've been asking, which is: how do you get really
- talented people into government? How do you redirect a lot of talent to serve? And, you know, you
- have to make it high status for them... No, not high status. High, like, work environment. Like,
- here's the thing. I have this quote that I've said on Twitter a few times, but I don't think I've
- said on my podcast, which is: the free market selects against the free market. Because when you
- have a free market, it creates just such awesome opportunities for talent that... like, I had a
- math Olympiad, computer science Olympiad background, and all the most brilliant people from that
- world are working in tech now, right? They're working technical jobs. Yep. They're not involved
- in politics. They're not gaining any sort of power or connection or network. They're working on
- important technical problems. You know, there's an exceptional number of them at OpenAI
- specifically, on the technical teams. And in fact, you can even argue that they've contributed a
- lot more to society by doing so. But it means that those are exactly the people who are not vying
- for power. Yep.
- And of course, in order to keep running this prosperous system that allows them to work so
- happily, you have to... you know, I'm sure there were a lot of brilliant nuclear engineers who,
- upon the construction of the Nuclear Regulatory Commission, were condemned to basically being
- irrelevant, or to working in, like, purely theoretical physics and never applying it, right? That
- is a possible path for machine learning today, right? And so, me specifically, I've had to make
- this decision, right? Do I want to make money, or do I want... Like, this specific fact, the fact
- that I just talked about, the free market selecting against the free market, is specifically the
- reason why I'm at least mostly convinced that I do want to do something involving policy in the
- future.
- Well, I'll state it a bit crudely, or simplified, which is: you have two options. One of them is
- work at OpenAI, work at big tech, whatever. The other one is to work, to use your words, against
- the regime. One option can help make you rich, and is seen as, you know, more noble or something
- by a number of college grads, which a lot of people care about, and by more people in, at least,
- society. The other path has probably less economic upside, certainly less economic upside, and,
- depending on which circles they're in, perhaps less approval from a certain mainstream, you know,
- elite crowd. But what it does have is a certain integrity, an intellectual integrity, to the
- extent that they are truth seeking. I think OpenAI engineers are really truth seeking, though. I
- don't think that's an advantage. But they're selectively truth seeking, right? Sure. No, like,
- they can be honest about their political beliefs a lot of the time too. Like, I don't know, maybe
- if they were very, very based, you know, they wouldn't be. Yeah. I mean, most of them are, like,
- normie, apolitical. You know, a lot of them support Andrew Yang. Yeah. Or they support, you know,
- a Nikki Haley or whatever, right? Like, Chamath supports Nikki Haley, right? All of them are kind
- of apolitical, or, like, centrists, or center-libertarians.
- I guess what I'm saying is that in order to get people on this path, because it's not going to be
- as economically competitive, you have to compete on some other axis. And let's just say, for
- example, that Elon Musk said: hey, I, Elon Musk, have said that we need a balanced government; I
- tweeted about that; that's why I said you should vote Republican in the last election. And I,
- Elon, acknowledge that the Republican party is totally broken. Hey, Democrats, we agree with that
- too. And we need to fix it. And so I, Elon, am going to have, like, a Thiel Fellowship for policy.
- It's going to be the Elon Fellowship. Twenty people are going to get selected, two thousand are
- going to apply, and, you know, I'll pay you a decent salary. It's not an OpenAI-level salary and
- equity, but you'll be fine. And it'll be a two-year thing, a three-year thing, and then I'm going
- to help you start a company or something. A program like that would get people flooding in in
- droves, because they're not sacrificing their career upside, and that's, like, long-term status
- upside. In fact, maybe they're accelerating it. And then you would get people applying who aren't
- even truth seeking, who are actually just, like, career climbers, and maybe they're so competent
- that you actually want them there. And so I feel like that's the kind of direction that things
- need to move in if you want to shape where people go.
- And I think the same thing applies in higher ed too. I know you wanted to chat about that. Like,
- you can have something like UATX, which I know you went to, or, you know, attended some summer
- class or something, and which I really admire; you know, I'm friends with Joe and Bari and these
- people. But I think you're not going to get people who are avoiding Harvard or Stanford and
- picking UATX over them, because it's sacrificing career upside, and the best people aren't going
- to do that, or don't want to do that. There would be some who are so intellectually pure that
- they're willing to sacrifice some upside. And, you know, UATX, I talked to Joe about this. He
- said he's going to make it such that they're not sacrificing, by getting companies like Tesla and
- SpaceX to, you know, guarantee jobs and stuff like that. So you can mitigate it. But an
- organization that has a much bigger chance of competing with Harvard and Stanford is the Thiel
- Fellowship. Because that is more prestige: one, because it borrows Thiel's name, and it's just
- extremely selective, and they advertise how selective they are, in the same way that Harvard
- does. And two, because they have a track record; the people who've gone through are incredible.
- Now, if Elon tomorrow also similarly said, hey, we're starting the Elon, you know, degree, and
- it's competing with Harvard and Stanford, we have two thousand people, and we have signed up all
- these companies that have promised to hire from here, I think he too could compete. But you need
- to borrow some level of prestige, either from individuals or from corporations, and use that to
- create a very selective program that gets the best people, and, most importantly, excludes
- everybody else, so that people are making the Pareto-optimal, non-ideological decision when
- they're choosing between Harvard and whatever competitor. Until you're creating something that is
- just long-term better for them on the career and status and prestige axes, you're not going to
- compete for the students that are chasing the most prestige, i.e. the most status seeking, and
- often the most competent and talented.
- Right. Yeah, it has to be kind of philanthropic, but that's how the incentive problem gets
- squared. Yeah, I think that makes sense. It doesn't have to be philanthropic, though. Like, Elon
- could be doing it as a for-profit venture; he could charge $40,000, he could charge the same
- price. It just has to be extremely selective and has to be associated with some level of
- prestige, right? Like, we have university competitors: UATX; Minerva, which I think is trying to
- do this to some degree, and Benchmark funded them, so they have some Silicon Valley pedigree. You
- know, there's the, the university... is that an Austen Allred thing? No, he's doing Lambda. Okay.
- Which is interesting. I mean, he's not going after the most brilliant students. He's going after
- people who are trying to learn how to code. Yeah, like the marginal software engineer. Yeah. But
- he did a good job of borrowing prestige in order to become the definitive bootcamp.
- But, I mean, I think we're doing such a bad job in general, if we care about getting the top
- students, or even, like, directionally the top students, out of Harvard or Stanford over the next
- ten years. Like, it doesn't seem like we're making any real progress toward doing that. And you
- ask Silicon Valley, like, what are they doing about that? They're like, well, we think the
- education model is broken, so we're just going to wait until the next innovation happens, you
- know, the next, like, platform shift or something, because we don't want to do all the dirty work
- of creating a new university. And this is why, actually, Thiel looked into this. He looked into,
- you know, creating a university, like a real university that offered degrees and, you know,
- competed head to head. And he said it would just become a copy; it's just too much... whatever,
- he didn't want to compete in the same way. And I think that's a shame, because the Thiel
- Fellowship has had a tremendous impact on the people that went through it. It had a tremendous
- impact in terms of shifting the conversation. Like, when I was in college in 2008, 2010, saying
- that college was, like, sort of a joke or a cartel or all these things was pretty controversial.
- Like, Thiel's fellowship was extremely controversial. And now it's, like, fact. Like, even people
- who previously would have thought it was a joke, or beyond the pale, now agree with it. And yet,
- there's, like, zero change. And so, he's definitely shifted the conversation. But I think we need
- people to really, like, understand how these institutions are competed with. And it's not just
- through, like, a better product, meaning a better education, or a better network, necessarily;
- it's the whole package. And that whole package includes a more prestigious option.
- Right. I think a big problem here, honestly, with universities specifically, is that if you're
- trying to basically convince people to allocate their resources inefficiently, you have to do one
- of two things, right? You either have to pay them the requisite amount, so that they're being
- compensated for allocating their resources, and then it becomes actually optimal. Or, you know,
- you have to convince them to dislike the market for some reason. Or not actually to dislike it,
- but to say: okay, here, I'm going to do this instead. And, like, I am still pro-market in almost
- all areas. It's just that you need some people to exit the market, or at least for that not to be
- the main thing that they're working on, in order to actually defend it in the first place, right?
- Like, you know, we could have taken even just 20% of the top nuclear engineering talent, and if
- we could have gotten them to basically work in DC and make sure that nuclear technologies don't
- get banned, then we would have a much more thriving nuclear engineering sector. So, yeah, this is
- the problem: the free market does not select for the free market existing. Let me give an
- interesting example. And, like, the biggest problem is that in the policy space, because of a
- selection effect, left-wingers have an intrinsic advantage. Like, people who hate markets just
- have an intrinsic advantage, because if you hate markets, where are you going to go work? I'm
- sorry, go on.
- There was a famous conversation that happened between a very ambitious, very smart person and
- someone high up in effective altruism, where he said: hey, I'm really interested in effective
- altruism. How can I have the biggest impact on effective altruism? And the person high up, it
- might have been Will MacAskill, I don't know who it was, said: get rich, and come back and give
- money, and make an impact that way. And that person, very famously, was Sam Bankman-Fried. And he
- followed the formula, and it worked, short of, you know, the massive, unparalleled global fraud.
- He transformed the EA community for a few years, and the community's prospects of making a
- difference. And so, you know, if you're thinking about, hey, how do you maximize your impact over
- the next twenty years, or, you know, whatever period of time you want to optimize your impact
- over, you're probably asking yourself the question: yeah, like, do I go into policy, or do I get
- rich first, be successful first? And it seems that, unfortunately, the way the world works is,
- like, once you're really successful, once you have a successful startup or you're a successful
- investor or whatever it is, you not only have money that can then, you know, direct other
- people's time and start organizations and stuff like that, but you also have a level of
- credibility and prestige that can even further shape where, you know, labor and capital go. And
- so, now, I don't know how you think about it for yourself, but that's one framework for thinking
- about it.
- Yeah, I think that makes sense. The counterpoint... there's this quote, right: there are, like,
- personal billionaires, and then there are, I forget what it was, like, manager billionaires or
- something like that, right? Like, the idea is, the city of San Francisco has, like, billions of
- dollars being managed. Obviously, states have more than billions of dollars being managed. And
- that is all being directed, and a lot of it is just under the discretion of whoever is the
- executive, right? Like, just what can be done with executive orders, right? Obviously, becoming
- president is quite difficult. But you have all these positions which are, you know, in effect,
- billionaires: people who control billions of dollars, and in fact maybe have even more
- discretion, or at least are not punished in spending that money in the same way that you would be
- punished, for example, for selling stock, where, you know, the stock price would drop. So, like,
- this is true in some cases, I'm sure of that, right? It might have been true, you know, if Sam
- Bankman-Fried was running, you know, an actual profitable company, that he would have done more
- by doing that instead of by, you know, influencing some kind of policy. I think it's definitely
- true if you want to influence the Democratic Party, because of, once again, how anti-intelligence
- they are. Both in terms of, like, denying it... like, I think this is the reason, right? Like,
- there is an actual kind of virtue ethics here, where the lack of valuing intelligence
- philosophically leads to a lack of valuing intelligence practically. Yeah.
- Let me counter what I just said, and I do believe this: every movement... you know, EA needed
- Will MacAskill, and I might be pronouncing Will's name wrong, but it needed a Will, and it needed
- a Dustin Moskovitz and a Sam Bankman-Fried. Like, you need the actual capital, and you need kind
- of moral or intellectual capital. And Will was an idea entrepreneur, and he was an amazing
- aggregator of talent and capital, and Will could only do that by having spent, you know, a decade
- or more really immersed in ideas, and also really immersed in kind of, you know, local community
- organizing, so to speak, or some of this political work, even. So Will was very successful in
- terms of his, like, movement impact, more so than he would have been if he had tried to be an
- entrepreneur, because it's so hard to be an SBF or a Dustin Moskovitz; there's a lot of luck
- involved. At the same time, you know, it's hard to be a Will too. But if people have that kind
- of, you know, unique prowess with ideas... And I remember having a conversation with Balaji as he
- was thinking about his next thing, because Balaji is both an idea entrepreneur and an actual
- entrepreneur. But one thing he said to me, which I don't think reveals anything in confidence,
- is: you know, I, Balaji, am a competent entrepreneur. I helped start Counsyl, which sold for $300
- million; I was CTO of Coinbase. But there are lots of companies and entrepreneurs out there, and
- I, Balaji, you know, I'm not Elon Musk. But in terms of idea entrepreneurship within technology,
- there's actually not that many. And so, yes, it is interesting. And Balaji moved to Singapore...
- well, I'll take that out, sorry, Balaji. You know, wait, I think you said that publicly. Okay, so
- maybe we can leave it in: Balaji left the country. I say that to say that he hasn't seen most
- people in person since COVID, and yet he's become way more influential than he was prior. And
- that's just, that's based on ideas. That's based on, you know, publishing The Network State.
- That's based on, you know, wading into the discourse. And so, if you're good at ideas, there's a
- lot of power in that, so I don't mean to undermine that. But one has to go all in, and one has
- to, you know, put their time in. Now, when you talk about doing policy stuff, you just have to
- find the right medium, and maybe it is podcasting and newsletters plus, like, some version of
- community organizing or movement building. But, you know, those are a couple of examples of
- people who've done that well. Which puts, you know, Dwarkesh in the same position, right? Like,
- he's also brilliant. He came on your podcast; audience, you know, go check that out. And he can
- start a company, or he can keep pursuing his intellectual work, which, like yours, seems to be
- really resonating, and seems to be pretty differentiated and pretty novel. I mean, you guys are
- both in your early 20s, and some of the best idea entrepreneurs in Silicon Valley already, which
- just shows the opportunity.
- Right, that's interesting, because I think, like, that phrase is really good. Because, yeah, I
- was actually... okay, maybe I shouldn't, I won't mention who I was talking to, but I was talking
- to someone, and I basically said: like, I consider myself to be, like, a very poor writer, and
- not, like, that amazing of a podcaster either, in terms of, like, charisma, and in terms of,
- like, really creating a kind of feeling of, like, relatability, of interest. I think I've just,
- kind of, speculated; I've kind of, like, bet on ideas that have become much bigger. Right? And
- that's been the lane: like, okay, if you're listening to the From the New World podcast, if
- you're subscribed to the newsletter, you won't get, like, the most compelling paragraph about a
- new idea that will matter a lot to you in, like, five years, or at least will matter a lot to a
- lot of people in five years, but you will get that idea, right? You will get that idea in some
- form. And I think that's, like... that is the draw, I think, for a lot of people, including
- people who have talked to you about it.
- Right. I mean, just to spend another minute on that: I mean, if you look at Richard Hanania's
- work, like, what is he most known for? What is his great intellectual contribution? From my
- perspective, it's a few, and, you know, he's got books and stuff, so I don't mean to undermine
- them, but it's a few blog posts. It's, you know, wokeness is civil rights law. It's the
- liberals/conservatives analysis: liberals read, conservatives watch TV. Yeah, it's classic. It's
- some of the stuff on gender in terms of free speech, like, how gender impacts organizations; I'm
- being a little vague, but you can go read it. And I think the headline on that was something
- like, the free marketplace of ideas favors women's tears. Yes. And then also, you know, separate
- from that, is just kind of his sense of humor on Twitter, or his antics on Twitter, as some might
- say. And, you know, I think one question I have for you is: are you also someone who's going to
- write kind of seminal, you know, blog posts that will explain a concept that people didn't know
- how to understand? I mean, I think you were the only one on the wokeness-and-AI front, for
- example. You know, like, this woman Renee, I forget the last name, DiResta, has built a whole
- career on kind of, like, riding the misinformation wave. You know, she's from the other side, of
- course, on the left, via the social media stuff. But to the extent that AI and censorship are
- going to be a major issue, it's going to require someone really technical to figure that out, who
- also understands some of the politics stuff. So that, I mean, that's an interesting angle. Yeah,
- I think it's interesting just to think about, and we don't need to spend too much time on it, but
- if you really take the public intellectual path, like, what is the way to break out in a way
- that, you know, Richard and a couple of these others have not done?
- Yeah, I think, like... I don't know, that's been an interesting thing for me. Because, you know,
- I really am kind of a true believer in elite theory, or public choice; like, this is a point that
- I've made a lot, right? They're pretty similar; it's almost the same thing. But, yeah, a real
- believer in elite theory, a real believer that it matters. Like, you found out about me by
- listening to this podcast, right? And, like, I think there are many such cases. And, once again,
- to go back to the beginning, I do think there are trade-offs. I think there are pretty strong
- trade-offs between public and elite appeal. Like, I think that if I had, basically, like, a
- clickbait thumbnail and headline on every single one of my episodes, right... like, this was
- something that I actually discussed with a different podcaster. He suggested, you know, putting
- in clips of, like, news articles or whatever, right? And, like, I think, from a pure kind of
- growth perspective, that makes sense. That's true, right? And it would also make sense, you know,
- not to do, like, four-hour podcasts. But on the other hand, I think that it's actually crucial
- that you have some of these signals that basically say: actually, you know, this is not a normal
- podcast. Yeah. Right? Like, I think that kind of, like, counter-signalling actually really
- matters. Like, I think it kind of matters, or I think it matters a lot, that the From the New
- World podcast rarely touches on kind of, like, first-order culture war issues, and preferentially
- touches on, like, second-order culture war issues, right? Like, a good example of, like, a
- second-order culture war issue is Richard Hanania's writing on, like, affirmative action, right?
- Like, not just saying, you know, like, oh, they're going after your kids or whatever, right? But
- saying, like, okay, and this doesn't even, like, necessarily mean you have an underlying policy
- disagreement, right? It's like: here is why, you know, affirmative action is so influential in
- each of these companies; it is due to these laws in specific; you should repeal these laws in
- specific. Right? Like, I think that that's both the proper context in terms of, like, what I
- actually want to do, right, in terms of what I actually care about and want to focus on, but it's
- also, like, the proper context for a kind of, like, recruiting, or, like... yeah, attracting,
- that's a better word: attracting a kind of very interested audience, who's just much more likely
- to actually do things in the future.
- Yeah, it's interesting. I think there are trade-offs to everything. And I think that path that
- you just outlined, it's kind of, like, you know, in a different way, but, like, the Curtis path,
- right? Like, he is, you know, sort of undesirable to a normie audience, and that attracts a
- certain level of die-hard followership. And, you know, he's not the person named, but his ideas
- influence, you know, someone like a Vance, or someone like a Thiel, or even Elon, even if
- indirectly. And that is impact, and that is power. At the same time, there's always a question
- of: did this person succeed because of the decisions they made, or in spite of certain decisions
- they made? And, you know, someone like Balaji, maybe, is a bit of both, in that, you know, he's
- both an idea entrepreneur and kind of, like, a good actual entrepreneur slash, like, community
- organizer. Or, like... yeah, he was elite much more before he was, like, publicly well known.
- Yes. And I guess what I'm saying is, it's easy... it's a potential cope, this idea that, you
- know, I can't have my ideas spread widely because it would sacrifice some of the main points of
- why I even do this. Because, like we were talking about with All-In before, like, some things are
- able to be mainstream, and also appeal to elites. Now, you know, it's certainly watered down.
- It's certainly... although in their case I actually don't think they're watering down, but it's
- watered down relative to, like, a purist, or someone, you know, someone who spends all their time
- thinking about certain things. And, you know, some people, like, because they don't want to be
- seen with the normies, you know, sort of turn their noses up at it, but it certainly has a bigger
- impact. I mean, I think you need everything, but what I was saying is, yeah, I wouldn't rule out
- the more accessible route. Like, I don't think Richard Hanania has sacrificed a ton by, you know,
- growing his audience, you know, 10x in the past couple of years. And, you know, if Richard
- Hanania grows his audience 10x again... like, now, he might be audience-captured, to your point
- earlier, in kind of a different kind of way, where instead of catering to his audience, he starts
- to, like, hate his audience. But I just wouldn't rule that out, is what I'm saying. I think there
- could be successful models in both, you know, deliberately niche ways and also in ways that
- transcend that need.
- Yeah, I think the trade-off is a lot simpler than that; like, this might have been my fault,
- that I was miscommunicating it, right? But the practical trade-off is, like: I can go to the DC
- meetup, or I can write another article, right? And maybe the time scales on that aren't quite
- right, but it's, like, literally a time trade-off in terms of elite versus public influence.
- Yeah. Right? Like, literally, you know, the same time slot. Yeah. So, like, I'm wondering, where
- do you think the most value is generated? Right, I know you talked about this already, that it's
- kind of saturation-dependent, right? Whether we have a lot of idea entrepreneurs, or whether it's
- idea entrepreneurs versus actual entrepreneurs. Like, to me, this is actually something I think
- about a lot, in terms of just speculating on the kind of, like, talent metagame, right? Like,
- something that pushed me, and the people in my audience don't know this: I was, like, really
- interested in machine learning in, like, 2018, 2019, right? And so was, like, the rest of the
- entire computer science Olympiad scene. And I just saw so many people, especially so many people
- who I personally respect, and, you know, knew personally, just extraordinary people, going into
- machine learning. And it was like, man, do I really want to be the n-plus-first machine learning
- engineer? Like, how impactful, how much will that actually matter, as opposed to doing literally
- anything else? And, you know, I've kind of returned to that indirectly over the past year or so.
- But I think that philosophy still kind of applies, right? Yes. Now, applying that to here, maybe
- this is a controversial take, but I think, like, the current environment of public intellectuals
- is very good. Like, there are a lot of very good public intellectuals, like Balaji, like Richard,
- like Scott Alexander, like Curtis, of all kinds of differing ideologies. And, like, in my
- experience, the quality of a well-known public intellectual is, like, higher than the quality of
- a DC lobbyist. I'll throw in Ezra Klein and Noah Smith just to get some more diversity. Sure,
- yeah. I endorse that. Here's the way I would look at it.
- I mean, there are a few different dimensions to the question. Because in some ways, yeah, there
- are a lot of great entrepreneurs, and there are a lot of great idea entrepreneurs. And yet, at
- the same time, there's a shortage: there's only one Elon Musk, or, you know, only a few people on
- that level. And there are only a few, you know, top accounts like Richard Hanania's. And so, at
- the top level, you could always have more. So a question I'd ask... there are a few questions.
- And one is: wherever there's more interest, there's just going to be more desire to put in the
- work to get really great. I think the position you're in, and many really talented people are in,
- is that they have the struggle of having choice. It's not obvious to them what they should do.
- Because if you said, hey, I'm going to focus on making as much money as possible, and you applied
- your brain to that, you'd probably be pretty successful. You might be extremely successful. And
- similarly, if you said, hey, I am going to apply myself 100% to idea innovation, or, you know,
- public intellectual life, you yourself could become a Richard within a few years, right? Like,
- Richard came out of nowhere. Richard before COVID was not on anyone's radar, right? Richard rose
- up pretty fast, relative to someone like a Tyler Cowen. And so you're cursed by that, by the
- optionality. But the problem there is, if you don't go all in on one, you might not have success
- in either, right? Because they require intense focus. Now, interest matters, because it's going
- to determine how hard you work. But if you assumed, for all intents and purposes, that interest
- was equal, and you just said, hey, let's say there are two universes, two worlds, in which I
- spend the next five years, and even after five years you'd be in your late 20s or whatever, you'd
- still have plenty of time, but the next five years either focused on entrepreneurship, on, you
- know, getting wealthy, or on idea work, and you just kind of sketched out what each could look
- like... And, you know, you know your skills and opportunities better than I do. And if one of
- them looks like you can make more progress than the other... Like, let's say, for example, that
- in the next five years you have the level of success that Richard has today. You're seen as one
- of the great public intellectuals, or maybe what Balaji has, putting aside his entrepreneurial
- accomplishments or something. You're seen as a leading voice on issues that matter. And you have
- an audience, and you have distribution, and you have respect, and brilliant people follow you.
- Well, at that point, you can do a number of things. Certainly you can start a media organization,
- or have a successful career via media. But just as I was saying earlier in this conversation,
- distribution is a wedge into other things. Like, if you're also technical and you recruit
- technical people, well, then you can co-found something, or invest; like, there are a lot of
- people who use media to become investors, and they do use their distribution. Roon is an example,
- right? Like, someone who built an audience on Twitter and has leveraged that to get some
- influence, such that some people... and I don't know, maybe it's an exaggeration. But yeah. And
- for the audience, you might not know who Roon is. We did an episode with him, the fourth episode
- of this entire podcast, and that will be linked as well. All right, keep going. Sure.
- So I think, and I would say the same thing to Dwarkesh, it's like: where can you make the
- quickest traction? Like, what is the quickest path? And, let's say I'm talking to Dwarkesh, it's
- like: hey, the podcast is taking off. What if you went all in on it? Like, how far could you go
- with the podcast? And then, okay, what are adjacent things you could do from there? It's like, I
- mean, Tyler Cowen, if he was younger in his career and more ambitious, or more commercially
- ambitious, he could start a fund as well. I mean, he could be a big-time investor. He has the
- deep respect of the Valley and of people who matter. He certainly could; I mean, he already does
- it with his grants; he's a proven talent attractor and selector. You know, VCs do get rich if
- they're successful, and he could be a very successful VC because of his ideas. He's just choosing
- not to. And so I think it's really the combination of: where are you most interested, and where
- do you think you can make the most traction? And the thing with you is, you already have some
- momentum in the idea space. So if you're like, hey, I could really go all out on this for the
- next few years and really make some traction, whereas if you evaluate the get-wealthy path and
- it's like, I don't really see a path, or it's not obvious, like, I'd have to scrounge for a
- while... You know, that said, I could have said the same thing to, let's say, SBF before he
- started FTX or whatever. He had a blog, and that blog was doing pretty well. I could have been
- saying, hey, why don't you take this blog further? And instead he, you know, kind of clinically
- identified a few opportunities to make money, like misallocations, or just, like, arbitrage
- opportunities. And if SBF had been given that advice in 2004, or some other time period, maybe we
- would never be talking about SBF, because he would have tried some other internet thing in which
- he had no strategic advantage, and there wasn't a way to get to $30 billion in three years;
- before crypto, there was no way. So timing really matters too, in terms of where the arbitrage
- opportunity really is. And so the comforting, but also somewhat challenging, TLDR on this is: it
- seems like you can be successful on either path. And it seems like either path could lead to the
- other. And so it's really just a combination of your assessment of your skills, in terms of where
- you think you could have the highest leverage, where you're most distinct, and that's on a more
- granular level, because I haven't worked with you, but, you know, on the surface you seem like
- you could do both. So: your assessment of your own talents, your own interests, and then your
- assessment of the market opportunities, and timing, in correspondence with those. Right. How do
- you react to that?
- I think I just believe in base rates too much, right? Like, here's the case for it. So, yeah,
- technical development, or, I kind of separate that off a little bit, right? Basically: frontier
- tech research, some other kind of entrepreneurship, media, or kind of, like, insider politics.
- Like, what is the correct ratio of allocation of top-level talent between those, right? And to
- me, just kind of at the population level, there is a lot of allocation into technical development
- and entrepreneurship. And, yeah, like, once again, going back to the quote... It's kind of
- strange as well, the things that you would do on the policy side. Maybe this is also another
- thing that makes it kind of, like, easier to motivate by leveling with people, right? If you
- think that government is intrinsically, or, like, on average bad, then you know that you're
- playing a defensive game, right? Like, my goal with, like, a future machine learning policy think
- tank is: if literally nothing happens, we would be, like, celebrating. If no AI regulation
- happens in the next five years, we will, like, be partying. We will be like, this is
- extraordinarily successful; we have accomplished everything we wanted to do. Right? And in terms
- of motivation, I have to admit that that is a difficult motivator. I've been going through this
- right now; I've been looking for people to recruit. And, you know, especially if you're someone
- who has the ability to either do frontier-level research or to be an entrepreneur, right, it's
- not too appealing. And I don't think it's necessarily, like, a status thing so much as it is just
- intrinsically not appealing, right? So the question is... or, sorry, to finish up on that last
- point: this just makes my assessment of talent allocation, of, like, people who could do both,
- that the base rate is just significantly misallocated towards the kind of making-money side.
- Yeah. Like, there's the case that, you know, you want more corruption and, basically, more
- bribery in government, because it, like, allows things to flow more efficiently. I'm not sure the
- normal version of corruption would actually be successful in incentivizing that, right? But,
- like, for example, prediction markets are one way that maybe this becomes better. Yeah, I think
- that, in the long run, I really want to find some way of kind of optimizing the kind of
- meta-level allocation of talent between these two areas. But, yeah, on the individual level, I
- think the base rates just make it much more likely that doing something in kind of
- entrepreneurship or tech is oversaturated. And, yeah, just looking at... I don't know, because,
- like, even before I had any interest in politics, I was thinking, you know, I'd much rather be a
- kind of CTO than a CEO, right? I'd much rather be someone who works on the base level of
- technology. But at the same time, yeah, the thing to weigh is the ratio of what I want to be
- doing, and what I'm motivated to do, combined with my talents, against that kind of base-rate
- misallocation, right? Yeah, the thing is that I'm, like, 75% confident that some kind of policy
- work is the right idea. But that's still only 75%. I still think thinking about this more would
- be very valuable; like, literally thinking, you know, what am I going to do for the next five to
- ten years? Right, right.
- Even, let's go with, hypothetically, let's say you're taking the policy work path, and then
- let's just brainstorm how to do that. I mean, I want to work through that; it's interesting. Are
- you familiar with Teach for America? I think I remember, like, Andrew Yang talking about it a
- long time ago. So, Andrew Yang actually started an offshoot, or an organization inspired by Teach
- for America, called Venture for America. Right. Yeah. Teach for America, I'm not surprised that
- you don't know about it. It used to be much more relevant, like a decade ago; it's kind of lost
- its luster for whatever reason. But when I was in college, like, I applied to Teach for America,
- got in, and I was planning on being a teacher for two years. Interesting. And I would have been a
- special ed teacher in the Bronx. I have no patience for even amazing people who are
- underperforming or something; I would have been terrible. And so the question you ask is, how did
- Teach for America convince me to apply, get in, and almost do it? And they convinced a lot of
- people at top schools to do it. It's a combination of, like, talk left, act right. They made it
- so prestigious. In turn, it was super selective, so it was a top signal. So the idea was, like,
- you would go do TFA for two years and then go to, like, Goldman Sachs or whatever, Bain; like, it
- was a career accelerator. And they had all these partnerships, and they got these brilliant
- people. And then they made them look like heroes. They were like, hey, education's broken. You
- need to save education. And their marketing, or propaganda, whatever you want to call it, was
- amazing. And so, like, if you want to shift talent into policy work, what's your propaganda,
- right? Or what is this org's propaganda? And I think that's where, you know, this Balaji phrase,
- talk left, act right, comes in. Like, it needs to be seen as, like, more moral and noble and
- important. And, you know, you can obviously create that argument. But it also needs to be seen,
- in my opinion, as something that will, like, just be net better for their lives, even if they
- were non-ideologically motivated or non-morally motivated. And so, that's where partnerships with
- companies, or partnerships with people, come in. Like, if I was aiming to do what you want to do,
- I would try to find someone like a Balaji, or someone who's got credibility, and say, hey, can we
- create this fellowship together? Can we create, you know, X, Y, and Z? Like, if, let's say, your
- cause was network states or charter cities, Balaji would definitely fund, like, a fellowship or
- grant program for people. And he's actually doing that, right? Right, right. I'm familiar with
- CCI. Yeah, he shipped it. CCI is great. Yeah. Yeah. And with CCI, like, the network state,
- charter city movement, I mean, it's still super early, but there are, like, hundreds of really
- talented people working on that who didn't exist prior. So, like, that's a pretty big
- accomplishment. So, like, if you want hundreds of people like that who didn't exist prior, yeah,
- I feel like there's a package, there's a bundle of things that people need. And, you know, being
- able to explain it to their parents and the people they went to college with and high school
- with, or even, like, have it on their LinkedIn profile: it needs to be seen as prestigious. And I
- think that's something that people who are entering things from a place of truth seeking and
- integrity and mission sometimes miss, because they themselves are impervious to some of these
- prestige and status games relative to others, and they don't realize that others aren't as
- impervious. Right. Right.
- I think, like, the biggest motivator that I've come across so far, both for myself and for,
- like, other like-minded people... I've just been workshopping how to tell this story exactly.
- But, have you ever seen the movie 20th Century Boys? No, I haven't seen it. Okay. So, like, the
- plot of the movie is that a cult takes over Japan and then eventually the world. And there's a
- scene in the movie where the cult leader fakes his death and resurrection, or, well, technically
- some other guy who helps lead the cult is shot instead, but, like, he fakes his resurrection. And
- what happens is that, like, they're already very famous as the ruling political party of Japan.
- And, like, the citizenry of Japan are packed into the stadium, you know, like tens of thousands
- of people. They're all cheering. They put up, like, their hand signs, like the cult sign. You
- know, all the people in the streets are stopping and they're putting up the cult sign. Right.
- It's just, like, this feeling of utter doom, of, like, just complete... like, is the world
- insane? Right. I remember, like, Eric Weinstein talking about a similar experience as well. He
- had... I forget, I'm blanking on the author's name, but he had the author... Timur Kuran? The
- preference falsification... No, no, no. Of the essay in the New York Times about... not Agnes
- Callard. Yeah. No, no. This was when he read the essay, like, an essay that was written a long
- time ago about the kind of, like... it was on the denial of atrocities or something like that,
- right? And he had this quote, right, which was something like: right there, you've got their
- attention. Hold them and blow them before they shake off their confusion like a wet puppy, or
- something like that. Right. I'm forgetting the quote right now, but, like, I think that it's both
- true and incredibly powerful to emphasize that that is the world we live in, to some degree.
- some degree. Right. And we're going to talk about this later with egalitarianism, but like, it is
- like, it is just true that it's genetically encoded in many people, um, or like, it's an evolved
- pattern of behavior to deny reality in very specific ways that are responsible directly for, you
- know, some of the greatest missteps, the banning of innovative technologies. You can look at
- NewClear as like the key example here of just creating of like voluntarily creating this poverty and
- creating this like completely unnecessary, um, struggle. And of course, even more in the past,
- right, with, with communism, um, with, uh, really like a long record of these kind of of anti-
- prosperity, anti-innovative movements. And to me, like, one really striking example of this was
- GDPR. GDPR is, you know, I tweet this out fairly often, you know, I tweet one of two versions,
- either, you know, the European Union is China with lower IQ, or the European Union is China with
- lower IQ and far worse food. Um, you know, depending on how many people I want to piss off. Um, but
- it is the case. You know, like, controversy about the second part aside, it is the case that the
- European Union is just the less competent version of the Chinese government. The same motivations
- are there. The same motivation for kind of total control, the fear of anything disruptive. It's,
- it's exactly the same kind of psychological pattern. And they are, they have less power specifically
- because they are less competent, um, which, which, you know, might be a good thing in the end, might
- be a good thing, um, especially compared to the circumstances that China had, you know, two or three
- years ago. But it is striking, especially returning to GDPR, how many people, like normies, you
- know, people who just don't pay attention to politics, thought this was a good thing,
- not realizing that it just crushed, you know, thousands, if not
- millions of small businesses, of people who were really on the way up, who were going to make
- fundamental improvements. And not even just small businesses, right? This was the grounds for Italy
- banning ChatGPT, like literally Chinese-state behavior. Um, and this is just, you know,
- this was cheered on, this was celebrated. It's exactly the kind of 20th Century Boys moment, I
- think. The big idea is that, essentially, you're not living in a world where the safety of
- your industry is guaranteed. And empirically, that's been the case, right? It's not like, you
- know, it's not 100% of the time that an industry gets regulated out of existence, but it is pretty
- common, right? If you were a nuclear engineer, you know, in like the 50s or 60s, you saw that in
- real time. And I think that has happened to a few tech people. And that is why, you know, as Peter
- Thiel said, it was like, freedom and democracy are no longer
- compatible. I don't think that's quite the case. But I do think that the incentive is, you basically
- need a lot of people, quite frankly, like people like me, um, who are acting not in their self-
- interest, in order for freedom and democracy to be compatible. How so? Say more about
- that. Because like, right now, acting in my self interest is like starting a tech company and
- becoming, you know, extremely wealthy, right? Certainly, there's a higher chance of becoming
- extremely wealthy doing that, even though, you know, it's not guaranteed; I might fail, for sure. You
- know, I'm keeping that in mind. I'm for sure not like 100% confident that that'll happen. But
- certainly it's a much, much more likely path to wealth than, than, you know, doing machine learning
- policy, right? Like, the incentive is: the people who hate the market will go to areas
- outside of the market, and in fact they will work very hard to crush the market. The people who like
- the market will go into the market and succeed in the market. And so, like, that's the core
- case, you know, of the free market selecting against the free market. Yes. Yes. So your marketing is
- basically tech needs a defense budget. Tech needs a defense team. Yeah, that's a brilliant way to
- put it. Yeah, yeah, exactly, exactly. And I think that's strong marketing. I think it's interesting
- to look at this in the arc of how Silicon Valley, you know, said broadly, like tech has approached
- politics and kind of its defense in the past. And mostly how it hasn't had to. So let me give a
- brief overview. I mean, basically Silicon Valley in the, you know, late 2000s, you know, with Obama
- and the Arab spring was the darling of the left. Like social media had ushered in, you know, all
- this freedom of speech, which at the time, you know, corresponded with left-wing causes,
- Arab Spring being one of the biggest ones. And then of course, you know, Jack Dorsey
- famously, you know, said stay woke, was a big supporter of Black Lives Matter and
- DeRay and what was happening in Ferguson in 2014. So I mean, Silicon Valley was a darling for
- leftism in the late 2000s, early 2010s. And I saw this transition because, you know, my
- company Product Hunt is, you know, hyping up-and-coming technology startups, and we
- were a darling. And I saw the mood start to change. And they would use contradictory arguments.
- Like they would say, hey, everything that's on Product Hunt is just a silly app. Like, all these
- tech people are working on all these silly things that are not important, not serious. And we need
- to, you know, they need to have a bigger impact. But then they would also say at the same time, hey,
- tech is taking over the world. It's, you know, having a bad impact. It's way too powerful. And
- then, you know, Trump. Really, the perception in many people's minds, elites' minds, is
- that in the same way social media had elected Obama, it had elected
- Trump. Now, there were things before Trump that created this rift between Silicon Valley and the
- DNC, let's just say, that are worth emphasizing, because it wasn't just Trump. It would have
- likely happened regardless. Silicon Valley started to attack traditional American left power
- centers, the New York Times, Hollywood; it started to go after academia too. First, it was enabling
- them. And then it started to replace them, like Netflix is a very obvious example. It started to
- undercut their prestige and influence. It pulled away a lot of their top talent. You had people like
- Larry Summers, Eric Holder, David Plouffe, all working for tech companies. It became a more powerful
- global culture
- exporter, like Stanford taking over as number one school from Harvard, YC becoming like a top
- school. And, you know, Silicon Valley no longer needed the DNC. They built a network of super
- wealthy people with an alternate social network and path to power. Rather than working in
- government, you know, you'd become a CEO. And so we had this techlash. And what tech did is they
- responded by apologizing, giving money to leftist causes. They thought that it
- would go away. In fact, the critiques got worse and worse. And what happened around, I'm fast-
- forwarding a bunch, but what happened around COVID was you had a contingent of people who said, I'm
- not apologizing anymore. Actually, I'm directly fighting back. And those were people, to give
- some examples, like Balaji and Mike Solana, who early on were saying, hey, tech and
- journalists, while they used to be aligned, are now two different classes of people who,
- one, compete economically, you know, they compete for the same advertising dollars or they
- compete for attention. And then two, they're just at odds. And so they were very aggressive. There
- was this very famous Balaji–Taylor Lorenz feud, which was very controversial. Marc, was it Marc
- and Jason? Yeah, he was defending Marc's reputation. And then many people within tech were either
- critical of people like Balaji or Solana, or were uncertain. But the idea that you would fight back
- seemed either wrong or uncouth, that the people that were attacking tech were doing so in good faith
- and kind of deserved respect and that actually tech needed to be held accountable by this separate
- class. And so the idea that tech needed a defense didn't resonate with them. They would say,
- oh, we're so powerful. Like, actually, we're too powerful. We need accountability more than
- we need defense. We need people to attack us. We need people to critique us. It's not attack, it's
- critique. And they're doing so from a place of love as Kara Swisher would say or something. And that
- started to change once a number of CEOs got fired, once a number of regulations started to pass or
- threaten to pass. And once San Francisco started to like materially deteriorate in a way that was no
- longer deniable. And you started to see the ratio significantly change where when people were
- uncertain about supporting someone like Mike Solana, who around COVID maybe had like 5,000 Twitter
- followers. Now he's got like 220,000. Now fighting back against journalists, against policymakers
- who are attacking. The famous example was Zuck donating 80 million dollars, or whatever amount of
- millions he donated, to the hospital, and then being vilified for it. And people like Mike Solana
- would go around and say, actually, he's good. Actually, it's good he donated 80 million dollars. And
- also it's good that he invented Facebook. And so you started to have this class of tech defenders.
- And they did it via media, and they weren't making a ton of money off it. To your
- point, they did it outside of the market, but it did support their efforts. I mean, they built an
- audience off it. You know, Balaji was an investor, Solana had worked for Founders Fund. Now he's
- starting a media company that has raised money. So I think tech appreciates or some elements of tech
- appreciate defense in a way they didn't in, you know, 2017, 2016. So I think it's good timing,
- because at the time they thought it was uncouth or morally incorrect or something. So it's so
- interesting.
- I'm sorry, go on, go on. Oh, what I would say is, I think that's a strong marketing push. But then
- I think it's like, you then get into the details of defense from who and in what area. And if
- it's on the AI front, you know, I think people need to be more persuaded, I would say, that,
- you know, woke AI is a big threat relative to just, you know, AI safety in terms of, like,
- Eliezer Yudkowsky concerns. I think woke AI is less of a threat than, like, them just going
- after hardware. It's actually pretty similar to what Balaji said, right? It's the pivot from,
- what was it, wokeism to statism. Yeah. Yeah, like that to me, you
- know, like the quote-unquote disinformation crowd, of course, purveys the
- worst disinformation out there. They're the ones who are pushing for, essentially,
- centralized control of machine learning hardware, of essentially TPUs, GPUs and the like. And
- there, I think, is the main venue of attack, as well as kind of financial attacks from the FTC.
- I don't think it's quite a distraction, but it's definitely a smaller venue. Wokeness
- is definitely a smaller venue than this kind of statism. As, you know, boomer-
- conservative as it sounds, it's a correct description of what the threat actually
- is. But sorry, go on. And, what's interesting there is that the group of people that might be
- most empathetic to those concerns is actually the crypto slash web3 world, because they've
- operated since the beginning from an existential fear that the state is going to come down on them.
- And they are, you know, in many ways competing with state power. So, you know, they know they
- need a defense. And, you know, to some degree, they they've invested in defense both on the media
- and on the kind of think tank front. So, I think there are other groups too, but I think it's a
- strong positioning and one would just need to get more specific in terms of which, you know, which
- causes which segments and then which methods, right? Because there are there are media methods like
- Mike Solana does, where you just fight fire with fire, and whoever wins the Twitter war, like
- wins the elites, basically; like, just be better on Twitter, win the game. And
- some people do it on Twitter, some people do it on Substack, whatever. And then there are, you know,
- more policy, you know, ways of making changes, as you know and as you're exploring as well. And then
- yeah. Right. I mean, like I said, so you had this term talk left act right. For the audience, what
- do you mean by that? Yeah. Yeah. One second. To describe talk left act right, you first have to
- talk about what is left and what is right. And there are a few different ways of defining it. If you
- define it ideologically, you know, you could use Bryan Caplan's definition that you've used as
- well: the left hates the market, the right hates the left. Or, you know, Michael Malice has this
- quote: ask a right-wing person if people are equal, they'll say no; ask a left-wing person if
- people are equal, they'll give you a speech. So it's this idea that, you know, left-wing people
- favor more equality, and right-wing people favor more hierarchy, or will recognize it. And
- so there are other ways of, you know, slicing it ideologically. You could say the left is all about
- universalism, all about universalizing its ideology, and the right is more narrow,
- more tribal. You could say the left is more utopian, in that they believe a
- better world is possible, and thus it's one's duty to make that happen, and the right is perhaps
- more, you know, constrained, to use Thomas Sowell's constrained vision,
- about what's possible, and thus accepting the limitations of what we can actually do. And there are
- a number of ways of slicing it ideologically. But then you say, is it actually an ideology? Because
- if you were to say, you know, what does the left believe and what does the right believe? Like, you
- know, even as recently as 30 years ago, you might have said, oh, you know, the left was
- anti-immigration, anti-trade, anti-war. And today they seem to be pro all those things, at least in
- the sense of Ukraine. Like, the ideas flip-flop, you know, and the parties flip-flop on ideas. And so
- you can say, okay, maybe it's a group of people, and left and right are more about
- tribal loyalty to that group of people than to a certain set of ideas. That's another way of
- looking at it. I think there's truth in all these ways of looking at it. But the last way of looking
- at it, which relates to talk left act right, is this idea of, maybe it's a, well, before I get to
- that, the cleanest way of thinking about, you know, the group-of-people view is what we saw
- with COVID, with masks basically, or COVID in general, how the left flip-flopped so quickly
- on, you know, whether they were for masks or against masks, et cetera. And this idea that, you know,
- first it was racist to think that COVID was happening, and then you were a rube if
- you didn't think that we had to go and lock down. And so maybe the third way of thinking about it
- is, like, maybe it's a series of tactics, actually. Like, maybe leftism is a way to sort of rise
- up within an organization or make change, basically. It's a way of calling for more, you know,
- respect for the downtrodden, either genuinely or, you know, unwittingly or cynically. But, you know,
- there's this phrase, of course: if you aren't a liberal when you're young, you have no heart, but
- if you aren't a conservative when you're older, you have no brain. And part of this can be
- explained by, when you're older, you have more status. You've developed more capital, like actual
- capital, and then career capital, reputation capital, and you have more of a stake in society. But
- when you're young, you don't have that much. And so you want more. And, you know, maybe leftism is
- like a status acquisition tactic, and rightism is a status retention tactic. And so then one has to
- ask the question, like, and so the cynical way of saying talk left act right is basically
- Harvard, right? Like Harvard is one of the, you know, most fervent proponents of, you know,
- diversity, equity, and inclusion, let's just say, or, you know, kind of wokeness, and I'm using
- Harvard as a metaphor for universities, elite universities. And at the same time, Harvard is the
- most exclusive university in the world, in the sense that almost anyone in the world who would go
- to university would pick Harvard as their first choice, and Harvard, you know, has the
- lowest acceptance rate. And they advertise their low acceptance rate. So they're certainly
- exclusive. And also Harvard, like many universities, has an extreme lack of diversity as it relates
- to, you know, certain intellectual topics, right? So the talk left act right is: talk about, in
- this case, diversity and inclusion, but then act in a, you know, non-diverse, politically
- uniform, and exclusive way. And, you know, Harvard has sort of the moral high ground, you would
- think from their language, and the New York Times too; the New York Times advertises itself as a
- beacon of truth, Harvard advertises itself as a beacon of knowledge, you know, all these
- amazing things, for the good of society. And yet it's sitting on a $40
- billion endowment. I mean, there are so many, like, you know, capitalist things about what
- Harvard is doing. And capitalism in the bad way, like crony capitalism. And there are more
- examples, I shared in my post called the hypocrisy of elites, where the people that were, you know,
- advocating for defund the police more often than not were white people who did not live in high
- crime areas. And so defund the police served as a way for them to signal that they were, you know,
- left-wing, and thus, you know, more moral and more noble and more caring, while acting right, in the
- sense that, you know, they live behind gates or don't live in high-crime areas. And
- you see this actually like in many, many different issues, whether it's about, you know, gifted
- programs in schools or body positivity or, you know, relationships, polyamory, like, or the
- environment, right? Like the people who are spreading the most left-wing, egalitarian messages are
- the wealthiest people who in their own private lives, you know, do send their kids to private
- school. Do work out a ton. Do end up getting married and in monogamous relationships. Do inflict the
- largest carbon footprint. And so that is kind of the the talk left act right on an individual level.
- And on an organizational level, it's this idea that you need a mission, you know, the left is, you
- know, one way of saying it is the left is optics, the right is substance. And, you know, if you
- don't have optics that itself is like a lack of substance, like, you need both, right? Like, you
- need a mission that is going to inspire people in a democratic way. And when I say democratic, I
- mean, like, you need mass coordination, right? Certainly to win elections, you need masses to vote.
- And they're likely going to vote for the thing that promises them more stuff or, you know, something
- better. But then also on a corporate level, like, you want to appeal to customers, you want to
- appeal to recruits. And those people need to tell the rest of the world a story about how they are
- making the world a better place. And, you know, saying we're going to make a more efficient
- hierarchy, you know, is not as inspiring as we're going to have, you know, equality of opportunity,
- which of course is a weasel word, because no such thing exists. But it's actually a good example of,
- you know, a left-optics type thing that, you know, when done right, is actually a, you
- know, a right wing concept. So that's what I mean when talk left act right is basically appeal to
- the more reasonable sides of egalitarianism, ones that everyone would get behind. But then also
- ensure that you are acting in a way that is, you know, going to lead to the most success for your
- organization. Right. I think some of that really does kind of show how deep of
- a hole we're in. Like, do you know who Parrhesia is? Who? Parrhesia? No. Well, at least
- that's the name he goes by on the internet. I don't know if that's his actual name. Yeah, he writes
- this newsletter called Parrhesia. He's kind of in similar circles as Richard Hanania and me, and,
- like, this kind of crowd. And he calls this circle right-wing rationalism.
- Right. And his idea, and this is kind of inspired by something Richard Hanania said on my podcast,
- is that the right wing just needs to focus on factual things that it knows are true, like
- genetic differences and market efficiency, and basically just, like,
- reading statistics and, like, evolutionary psychology. Right. Basically, just
- pointing at facts that the left wing denies. Right. To me, like, this is like
- saying, you know, we're going to build an entire movement on that. Like, just
- imagine the left-wing version of this. Right. The left-wing equivalent of this is, like, the only
- thing that we are going to run on is that, like, vaccines reduce mortality, higher carbon emissions
- are correlated with higher average global temperature. Like, a left wing that is that
- inert, right, just would not exist. Right. It needs to, kind of, tell you
- what to think. It can't just provide evidence. Like, it just would not
- function. That, to me, is, I mean, the black pill, the pessimistic take, is
- that right-wing rationalism wouldn't work either. Right. But at the same time, you
- know. So there's this broad
- question as to whether, if you're going to compete with the left, do you do so on leftist terms or
- tactics, or do you reject the premise entirely? And on the leftist tactics here, what I mean is,
- let's take math, for example. Like, you know, there were some schools banning algebra or whatever it
- is or, you know, like, restricting people, like, you know, gifted programs, stuff like that. The
- left-wing tactic would be to say, no, we need gifted programs, we need to teach kids algebra,
- because that is the way that people from low-income backgrounds are going to get ahead, and by
- restricting that, you are getting rid of, what's it called, equality of opportunity. You're
- reducing, you know, social mobility. That's a left wing tactic. The right wing tactic would be to
- say, actually, like, hierarchy is good. And yeah, people are genetically different. And we should
- let the, you know, the most brilliant people rise to the top and let the chips fall where they may.
- It's focusing more on accelerating the top than, you know, bringing up the bottom. Oh, this was
- completely different from what I thought you meant. We'll put a pin in that, but we can
- talk about this right now. I think it's very context-dependent. I think a lot of the time as well,
- it's about salience, right? Like, if you're running an election, you should just draw as much
- attention to the math topic as possible, because it's a classic wedge issue, right? It unites
- Republicans, split Democrats, you know, like, there's not a single Republican who's going to be
- like, you know, actually, we don't like math, right? But there are actually a lot of Democrats who
- are. You know, we mentioned Renée DiResta, right? Renée DiResta supports teaching kids math, right?
- Yeah, exactly. Yeah. And she does so on leftist grounds. But Richard's idea around, like,
- recognizing genetic differences, I don't think that's going to be very effective. Well, it depends
- on what your goals are, but like, that is, you know, the benefit from a pure tactical perspective, I
- guess you're playing in places where the other side won't play. But that is such a controversial
- issue, like, it's so anti-leftist. And, you know, we swim in leftist water, we swim in
- liberal water. And, you know, one of our foundational myths is around kind of, you know,
- moral equality, equality of opportunity, social mobility, the American dream, and genetic
- differences just have so many implications. Now, I certainly think that, you know, everything
- should be able to
- be studied. And, you know, we shouldn't restrict knowledge as being, you know,
- beyond the pale, in the way that is happening now. But in terms of an actual, like, platform that is
- going to move people in either the private sector, unless it's, you know, a genetic company, or the
- public sector, I don't see it. Do you see it? What's the argument? Right. Like, as motivation, I
- mean, like, I think to, like, talk about this properly, we kind of have to talk about, like, why
- egalitarianism is motivational. I'll ask you first, like, why do you think egalitarianism motivates
- people? We used to live in societies that were very stratified, and they were deliberately
- stratified. There was very little social mobility. And as a result of that, there was very little
- status anxiety because you knew where you stood. And if you were at a high place, that was because
- you were ordained for that. And if you were at a low place, it wasn't like you failed. And then we
- introduced a much more socially mobile society. And as a result, where you ended up in society was
- up to you and your effort. And so at that point, things became much more high stakes. And so
- people's entire concept of self-worth ended up being tied to where they were in society. And so in
- order to, I'm greatly simplifying, obviously, but in order to make sense of this, or kind of reduce
- the anxiety that comes with everything being up to you, certain environmental explanations were
- introduced, or emphasized, I should say, so that it's not really up to you, it's
- up to the environment. That's just easier to take. And so if you're a successful person, you are
- effectively a threat to people who are not successful because your success implies that they didn't
- try hard enough. And so you could say, okay, what we're going to do instead is we're going to
- introduce genetic or environmental reasons, right? I mentioned the environmental reasons, but if
- you introduced genetic reasons, well, the problem with that is, then how much social mobility
- really is there? And then you're back in a stratified society, back where you started, and you don't
- want to
- do that. So, regression to the mean, right? It's not complete; intelligence is not 100%
- heritable, nor are most traits.
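- (To put a number on that regression-to-the-mean point: the standard quantitative-genetics
- approximation, sketched below with purely illustrative values, not figures cited in the episode.)

```latex
% Regression to the mean under incomplete heritability (standard approximation).
% h^2 = narrow-sense heritability; the 0.6 and the IQ figures are illustrative assumptions.
\[
  \mathbb{E}\left[ z_{\text{child}} \right] \;=\; h^{2}\, z_{\text{midparent}}
\]
% Example: h^2 = 0.6, midparent IQ 130 (z = +2 at SD 15):
% E[child IQ] = 100 + 0.6 * 30 = 118 -- exceptional parents, less exceptional children on average.
```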
- Yeah, most people can't really understand the nuance. Most people think that moral equality should
- equate with other kinds of equality as well. And the truth is, we do bestow moral significance on
- people who have been more successful; like, we just celebrate them. We bestow status onto them.
- Right, but that's because of the genetic denialism, right? That's
- because they believe that first. It's like, this is something that I'm personally very
- annoyed by, right? Like the conservative, you know, pull-yourself-up-by-the-bootstraps
- argument, right? That is a form of genetic denialism. That is a form of, kind of,
- egalitarianism that people believe; they believe that people are literally created equal.
- Yeah, at the same time, there is a, you know, there are things that are optimal for the individual
- and there are things that are optimal for the group, right? And so for society, it might be
- optimal, or I'll tell you some reasons it's not optimal, but it might be more
- constructive, to, like, not have it be super obvious what everyone's IQ is, right? Because let's say
- there are people who don't have super high IQs who've achieved a lot of things, who've been very
- accomplished. And if there was a world in which it was known what their IQ was, maybe they just
- wouldn't have gone for it. And a much more colloquial example is entrepreneurship, right? If,
- you know, there's this joke: I didn't do this because it was
- easy; I did it because I thought it would be easy, right? And so, like, entrepreneurs
- are acting irrationally in many ways. They don't know the likelihood, or if they knew that the
- likelihood of success was what it actually is, they might not pursue entrepreneurship. And
- that's what happens when people become more knowledgeable of probabilities: they tend to index
- more, right? Because on an individual level, you'd rather, you know, cap your upside if it caps your
- downside, you know, on average. Now from a societal level, we benefit from thousands of people
- trying to be the next Elon Musk, even if it only yields a handful of Elon Musks, because the
- outliers outweigh everything else. And so within entrepreneurship, at least, I think there's a
- benefit to a certain irrationality, or just a lack of understanding of probabilities, because, you
- know, the outlier benefits outweigh the costs of people trying.
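- (A toy simulation of that asymmetry, with entirely made-up numbers; the only point is that the
- median venture loses while the sum, which society keeps, is dominated by the outliers.)

```python
import random

random.seed(0)

N = 100_000        # hypothetical population of would-be founders
P_WIN = 0.001      # assumed odds of an outlier success
WIN = 1_000_000    # assumed outlier payoff (arbitrary units)
LOSS = -10         # assumed cost of a failed attempt
SAFE = 50          # assumed payoff of the safe, indexed career

ventures = [WIN if random.random() < P_WIN else LOSS for _ in range(N)]

# The median founder loses, which is what a downside-capping individual reacts to.
print("median venture outcome:", sorted(ventures)[N // 2], "vs. safe career:", SAFE)

# But society pockets the sum, and the rare outliers dominate it.
print("society if everyone ventures:", sum(ventures))
print("society if everyone plays safe:", N * SAFE)
```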
- When everyone becomes a rational automaton who understands probability, they just become, you know,
- much less dynamic, right? Much less willing to take the risks that society needs. And I think you
- can extrapolate out. I mean, certainly there are a ton of costs with the kind of denialism that
- you're discussing. I'm also elucidating that there might be some costs with the opposite of the
- denialism, with really coming to terms with what one's odds in society are. And in some ways it
- cuts against the, you know, some of the
- simplifications or myths that tie our fabric together, because this idea of the American dream, of
- social mobility, do we still want that narrative? I think in some ways we do. And how would you
- react to that? Right. When it comes to entrepreneurship, it's somewhat
- g-loaded, but it's not completely g-loaded, right? Like, a study mentioned in Tyler Cowen's book
- Talent, I think it was in Sweden, found that the average IQ of entrepreneurs is only around 130.
- And of course, you'll have people on both sides of that. So would it
- be disincentivizing or would it be incentivizing? I'm not sure. I think, you know,
- the rough approximation people have of the average entrepreneur's IQ is probably
- higher than that. Let me give you an example. I mean, I think sometimes there's tension
- between truth and social cohesion. And you know, there are people like Sam Harris, who I, you know,
- really like despite his recent efforts, but, you know, his book Lying: like, never lie,
- truth always, and all of that. So what's a really interesting example? You
- know, IQ is interesting. Another example is dating apps, right? Imagine if dating apps
- released all their data, if all the data was public. I'm sure they have, like, you know,
- ratings of people in the system. And let's just say, you know, someone is a nine out of ten with
- all the matches they get, and someone is a one out of ten, and someone, like a third of the country
- or whatever, just gets zero. Imagine the type of inequality that exists on dating apps,
- and imagine knowing how hopeless it really is.
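- (A toy illustration of that kind of skew; every number here is invented, and the mechanism is just
- matches concentrating on the top-rated profiles.)

```python
import random

random.seed(0)

PROFILES = 1_000   # hypothetical user base
MATCHES = 20_000   # hypothetical total matches to distribute

# Invented desirability scores; raising to a power concentrates attention at the top.
scores = [random.random() ** 4 for _ in range(PROFILES)]

# Each match goes to a profile with probability proportional to its score.
counts = [0] * PROFILES
for winner in random.choices(range(PROFILES), weights=scores, k=MATCHES):
    counts[winner] += 1

counts.sort(reverse=True)
print("top 10% of profiles take:", round(100 * sum(counts[: PROFILES // 10]) / MATCHES), "% of matches")
print("profiles with zero matches:", round(100 * sum(c == 0 for c in counts) / PROFILES), "%")
```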
- I mean, there's already a ton of incel hopelessness to begin with. Like, don't you think that that
- information would further disempower people? Wait, so if people knew the correct, like, knew the
- ratio of activity on dating apps, they would be less incentivized to what, like, to use dating apps?
- To put themselves out there? Like, to put themselves out there. Because, like with entrepreneurship,
- I
- mean, like, in dating, you have to, you know, you need, well, you only need one success, you know,
- to make a marriage, but you might need to try a lot. And some people might say, oh, you know, my
- odds are really stacked against me. I need to, you know, work that much harder. And other people
- might say, oh, it's totally hopeless. I mean, this is the same thing the left actually does on race,
- right? Like, in the right, we'll criticize them for it. They'll say, hey, certain groups have it so
- stacked against them that there's this actual privilege, you know, the other groups have. And people
- on the other side will say, hey, by doing this, you are disempowering them. And so there's a
- question of, like, should we pick the narratives that are most empowering? You know, David Brooks
- once said something like: on a macro level, everything is environmental, or you
- should, you know, overweight environment; on a micro level, you should overweight the
- individual, you know, nurture, like the individual's opportunity to change one's circumstances.
- Now, maybe the cost is, hey, you overweighted that and they couldn't change their circumstances,
- and now they, you know, are upset because of that. But the positive is you've got a bunch of people
- trying to change their circumstances, and some of them actually do. Right. Like, the bias of
- optimism,
- I'm fine with. Right. Like, I don't know if I,
- specifically, myself, would engage in the bias towards optimism, right? But again, I
- don't think the egalitarian narrative is a bias towards optimism. If anything, maybe in
- a vacuum, or maybe the boomer conservative version, is a bias towards optimism, and then it's
- fine, right? But in practice, that is not the egalitarian narrative. Right. Like, you already
- mentioned this a little bit, but the egalitarian narrative is like, oh, it's all because of racism.
- It's all because of like capitalist oppression. Right. It's, it's not, you know, like yeah, maybe
- in the ideal world, right, we do the boomer conservatism thing of, like, yeah,
- everyone needs to pull themselves up by their bootstraps. Like, I don't know if they'd, you
- know, censor genetics research based on that or whatever, but that's not really what the right
- does, at least not now. Right. Sorry. And yeah, I would not be completely opposed to, you know,
- making like the bootstraps narrative the kind of like national mythology. Right. But that is just as
- long as it stays there, but it just has not stayed there. It just is so vulnerable. You know, like
- when you have that, it is kind of like, if you
- start with that assumption, people are like, oh, if, you know, if everyone is born equal, if
- everyone has an equal chance of succeeding, you know, then why are there group differences? Right.
- When you start with that, it's kind of like the principle of explosion, right? In
- math, if you start with any false claim, you can get anywhere. You can get to, you know, the
- claim that one equals two very easily.
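- (That's a real result in classical logic, not just a metaphor; a minimal derivation, with Q an
- arbitrary claim.)

```latex
% Principle of explosion (ex falso quodlibet): from P and not-P, any Q follows.
\begin{align*}
  1.~ & P              && \text{premise} \\
  2.~ & \lnot P        && \text{premise} \\
  3.~ & P \lor Q       && \text{from 1, disjunction introduction} \\
  4.~ & Q              && \text{from 2 and 3, disjunctive syllogism}
\end{align*}
% The classic "1 = 2" parlor proof is the same move: hide one false step
% (dividing by a - b when a = b, i.e., dividing by zero) and anything follows.
```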
- And, right, if there were some kind of stable state where, you know, everyone is a kind of
- bootstraps conservative libertarian forever, that would be kind of understandable. You know, from a
- deontological perspective, maybe I'd still oppose it, but, you know, I think that we'd have bigger
- problems to worry about. But that just isn't reality. That just isn't the world we live in. Yeah.
- Have we ever lived in a society that was, you know, in accordance with all things true? Like, every
- society has had its own myths that have helped create, you know, some sort of harmony. And yeah,
- they are, you know, liable to be warped for nefarious, self-seeking ends, but this is the state of
- society. Like, yeah, for
- sure, for sure. It's always, you know, it's always incremental, right? It's always marginal. You
- know, we're not fighting to, as much as Yarvin would like, you know,
- we're not fighting to turn into a kind of startup government. I'm just fighting to have
- the ML ban, the machine learning ban, be at least postponed if not averted, right? Hopefully
- averted. But it's all, yeah, yeah, I agree with you that like in practice, you know, it's all
- marginal. It's all about making things slightly better than they used to be. And that's, that's like
- the approach I have to things too. So like, yeah, I don't think we're ever going to, you know, like
- fully or at least in the short term, I don't think we're going to solve the egalitarian problem at
- all. Yeah. And there's a level of egalitarianism that seems to be just a fit strategy for people who
- are trying to pursue, you could call it making a difference, you could call it status seeking. It
- seems to be a way for them to do that. And, you know, it's kind of like a sorting mechanism though,
- right? Like, when you do that, you're kind of attracting, in some cases, the wrong people. Like,
- yeah, if I were trying to attract people to government policy, right, I
- would prefer, you know, that all attractors of people, all pipelines into government policy, kind
- of actively selected against egalitarianism. Right. Like, I think that would lead to better
- outcomes. But yeah. And like certainly there are circumstances where I think that that's true. Yeah.
- Well, most people tend to optimize for themselves. Yeah. And more so, and this is like the Moloch
- idea, like everyone just acting in their own, you know, own self-interest
- could lead to some great negative things. But I think you're actually very skeptical of that. I
- think that, like, people are not really rational. And in a lot of circumstances, in many cases, the
- problem is, like, people who have power and who are rational not going to the full extent of how
- they could use that power. Right. Like, I
- think that people who have power and kind of abide by basic rationality norms should wield that
- power more. Do you think that they are optimizing for, like, if they were optimizing for self-
- preservation, and self-interest, self-benefit, would they be doing
- something different? Like, are they poorly optimizing even for selfish gains? In many cases,
- yes. Right. Of course, this is a case-by-case basis. But for example, you
- know, many governors pre-DeSantis, it would have benefited them both electorally and
- long-term politically to move against CRT. Right. Yeah. Like, it is
- both a kind of politically successful move, and it is also a strategically
- successful move, um, in that you're denying resources to left-wing patronage
- networks. Yeah. Like, a lot of the time, right, this is a lot of the kind of new
- right thing, in that, yeah, in terms of efficiency, right, I'm not sure; you know, some of
- them are also skeptical of, kind of, trade, right. But a lot of the time it's just that
- there is an obvious opportunity for a politician to gain power for the vague
- right, or not even the right, right. Like, banning CRT, is that really against, you know, kind of
- Matt Yglesias thought or whatever, right. Like, not the ban, but the action itself, right. Like,
- so it's beneficial to, like, 70, 80% of the country. And simultaneously, it
- is in their self-interest, right. It actually does help them politically. And they just don't do it.
- It is kind of what you're talking about, risk aversion. I think, yeah, a lot of right-wing
- politicians are pretty risk averse. I think there's a broader question as to like, you know, should
- you talk left act right or should you talk right act right. I mean, um, I mean, we're kind of
- defining right as the kind of like right-wing rationalist thing that I was talking about earlier
- where you just say true things that are inconvenient. There are also like real right-wing
- ideologies, right, like populism and traditionalism. Like, this is kind of why a lot
- of people don't consider me right-wing, right. It's kind of like the Nietzschean unbelievability
- critique, right. My life is just structured in such a way that it's so difficult to believe in
- traditionalism. My life is structured in such a way where, like, you know, every day I'm walking
- around this, honestly, all things considered, pretty nice city,
- and I'm talking to so many people. And, you know, my fundamental behaviors,
- like most of my behaviors, are not oriented towards any kind of rooted tradition, right.
- This is kind of a reason why I don't necessarily consider myself right-wing: because there are
- right-wing moral appeals. Those are an actual thing. It's just that they've been kind of erased
- from a large portion of people's, like, historical memory, like people who don't actively, you
- know, research or think about this stuff, or aren't involved in some kind of religious tradition in
- some way, right. Like, there's an actual right wing,
- and it's not just rationalists. I agree, I agree. And in some ways
- those are big threats to the actual right wing. Like, I'll give an example, like Balaji:
- although he, you know, is very good at optics, he is not egalitarian. He really believes
- in, you know, merit and hierarchy, and he's also, you know, a family man, got a bunch of kids;
- there are certain sympathies that, you know, the right wing would have with him and he would have
- with the right. But he's also a transhumanist and he's also radically pro-tech: let's pursue, you
- know, life extension, the infinite frontier, you know, maybe even breaking up America, I may not be
- fair in that regard, but in ways that many right-wing people would think of him as a bigger threat
- to the right as they know it, or the things that they hold dear, you know, God and country, than
- even, you know, some mainstream or normie Democrats. So yeah, there are certainly big fissures
- within the right, as there always have been. Yeah, like, I don't know, I myself would definitely
- not consider Balaji right-wing. I'm not sure if he would consider himself right-wing. No, I don't
- think so. I don't consider myself right-wing. Yeah, like, I think, you know, many people, certainly
- in tech, are, you know, politically homeless, right? Yeah, in many cases, right? Like, I don't know,
- this is very funny: the thing is that my revealed preferences are right-wing, right? Like, I
- want to get married early, I don't want to have sex before marriage. Like, my
- revealed preferences are pretty socially conservative. But I'm also pretty convinced by, like,
- polling data, and the conclusion that I've drawn from polling data is that, you know, there's no
- public morality; people won't vote for, basically, societal parental
- controls, right? And with some of the abuses that that could end up with, that's probably
- a good thing, right? That people in general will not hold themselves to a higher standard of
- kind of social morality. And so, like, this is not quite the
- Curtis Yarvin take, but this is somewhat similar to it, in that most pursuits of social
- conservatism, and I mean social conservatism as in the traditional version, not
- necessarily banning CRT, right, but, you know, like abortion, right? Roe v. Wade is kind
- of a great example of this, right? That's just going to piss people off, and that's
- not really going to actually make the country more socially conservative in any kind of meaningful
- way. The way I differ from Yarvin is that he thinks that this means that you should,
- you know, he wrote the entire Hobbits and Dark Elves thing, he thinks that you should,
- basically, defer to elites. While I think that it's more of a, I'm more
- sympathetic to Balaji's idea of, basically, morality-first network
- states, right? I think, you know, there should be a very socially conservative, you know, charter
- city in, like, Utah or something, where, you know, you're just not allowed to have an abortion.
- And, you know, if birth rates continue, right, we'll end up with the entire U.S. mostly not having
- abortion, because, you know, with generational selection, most of the
- people in the future who are having kids will, kind of by definition,
- be people who have not aborted their kids. I kind of see that vision as a vision of
- social conservatism that I can get behind. But in terms of short-term social conservatism,
- it does seem like a misevaluation of how fucked we are in the present. The
- Curtis–Balaji kind of axis is interesting as a way of, you know, showing the different points. I
- mean, Curtis views Balaji and, you know, any other kind of, you know, non-left-wing, sort of, idea
- entrepreneur, let's just say, as further empowering the left. And so he sees the Rufo types that
- way, but even, even Balaji. And so they'd be better off doing nothing, like, you know, winning by
- losing and waiting for, you know, as Lenin said, the conditions for the revolution are not yet
- present, or something. Whereas Balaji thinks that Curtis is just giving up, and Balaji doesn't want
- to wait, like, 30 years, you know, and watch the country turn into Brazil or whatever the concern
- is; Balaji thinks that actually impact can be had, you know, things can be fought. And, you know,
- Twitter can be taken over. And maybe other things as well. And so I think that's another thing
- that's interesting is because like people will differ on what should be. And they will also differ
- on the tactics in terms of how to, how to get there. Right. Yeah, that's fair. Yeah. I think like
- what's really interesting about these political theories of change is that they're like, they're
- almost inverted in that the libertarians, I think, are more passive. And the populists actually
- believe that something can be done. Yeah. Yeah. Like, I remember I saw Saurabh Sharma, a friend of
- the show; there was some kind of new right figure saying something like, would you
- rather control the New York Times or control the
- Senate? And I don't remember the exact tweet, but Saurabh Sharma, who is, I
- think, the president of American Moment, which is this vaguely new-right-aligned policy pipeline,
- he said: definitely the Senate, this isn't
- 2016, right? So they have this idea, or at least he has this idea, but I think the sentiment
- is pretty widely shared that like, actually, the right wing knows how to use power now, or at least
- has a better idea of how to use power now. And then actually governments, or like being in charge of
- government does matter. And that it is, it is impactful and that there can be policy steps that can
- be done. We already talked about Richard Hanania, right? Although I'm sure he disagrees
- with some of Saurabh's policy preferences, he thinks there is definitely policy that can be done,
- policy that can be implemented, once Republicans have control again, to actually do something
- about these problems. So I think, actually, it's interesting, there were moments of despair, but,
- within
- the new right, I would say that despair is kind of trending downward. So there's increasingly a lot
- of white pills being sold. I agree. I think SF is actually an interesting microcosm there, because
- the situation was pretty dire, and it is still dire in many ways, but people were saying, hey, tech
- is this evil or giant monster taking over the world, and yet it couldn't even sway local elections
- that require a few thousand votes that directly affect its interests. And so there's this great
- contradiction there. And I think, at some point during COVID, it just became too much, and people,
- like the All-In people, just said, hey, we can actually get Chesa Boudin, the DA, recalled. We can
- actually get the school board recalled. We can actually, using our influence, make a difference.
- Whereas previously, we thought we were too good for it, or we didn't want to do it. Now it's just
- encroached on our lives in enough material ways, you see all the people who are moving out of San
- Francisco, that this effort is needed, and it will impact our bottom line. And also it became
- high-status, thanks to people like Mike Solana, who are fighting the fight of ideas. So I think
- SF is just an example of a situation where people got involved locally, and continue to get
- involved, and made a difference. And now there's kind of a, you know, whole ecosystem there. And I
- think you're seeing that sprout up in other places as well. Yeah, at the end, like, actually,
- you know, you know this better than I do, right? What is the sentiment among, actually, like, Bay
- Area founders and investors? Like, what is their orientation towards politics in the year 2023?
- Well,
- most of them are not political thinkers. I mean, most of them are trying to you know, run their
- company and do their job and do it well and have a nice private life. And what happened was sort of
- the, you know, you may not be interested in politics, but politics is interested in you. Yeah. And
- politics started invading all these companies. And it was kind of, you know, ironic and tragic:
- you have these, you know, Indian and Chinese CEOs, who don't
- know the intricacies of US race relations or US politics, now having to defend against all
- sorts of accusations of prejudice, things that would
- hold back their company. And so Silicon Valley is primarily involved in
- politics to the extent that it impacts Silicon Valley. It's like their one voter issue. Now that is
- both in terms of like their ability to run companies, but also in terms of their ability to like be
- seen as good and not get regulated. And for many, you know, people and companies, their approach to
- not getting regulated is to comply with the regime, or quote-unquote regime.
- Like, you know, "I didn't know the Leopards Eating Faces Party would eat my face." Yeah. Yes.
- Yeah. And so I think that's predominantly, you know, the method of operating. I think, you
- know, there is an intellectual class as well within Silicon Valley, even if it's a small minority
- that cares, you know, that I think is trying to, in an emergent way, sort of create this culture of
- something that is not, you know, that is not woke, but that is not, you know, boomer or Trump
- either. And so, you know, it doesn't have a name yet. I think Pirate Wires embodies most of it or a
- lot of it. And I think that's why he's built such a strong audience. I think there's a real question
- as to, like, there's electoral politics, like, will they support Trump?
- Certainly not. No way. DeSantis? I think it's to be decided. I think, you know, David Sacks
- has gone all in. But I think people are still unsure. I mean, DeSantis is not as distasteful, but
- he's certainly leaning into the culture war aggressively, and it's not like he's a techie or, you
- know, really understands or appreciates tech. So I think it's unclear, but I think, I
- mean, Silicon Valley, like many fields that have, you know, attracted a lot of really amazing
- talent because of all the wealth, attracts a lot of agreeable people. Like, in
- order for companies to scale and get big and have impact, you need a lot of agreeable people. I
- mean, you need disagreeable people as founders and as, like, you know, builders within the
- org; you need some balance, basically, between disagreeability and agreeability. But on a
- large dimension, certainly morally, Silicon Valley is very agreeable: it wants to be seen
- in good standing. I mean, it's to your point about ESG. Like, they want to be seen as doing well and
- doing good while making money. And the left is just better; egalitarianism is just often a
- better narrative, one that seems more palatable than, like, pursuing excellence.
- Excellence is not as much in vogue. Like, people who are for excellence haven't done a
- great marketing job relative to people promoting egalitarianism. And so, yeah, because chasing
- after the real thing involves effort; being excellent is
- just a much more difficult thing than being egalitarian, right? Especially in an
- environment like SF, right? Or especially in an environment in certain companies where, you know,
- the baseline is already really high. You know, you can be an egalitarian
- anywhere, but it's one thing to be excellent at, like, you know,
- some random public college. It's another thing to be excellent at, like, OpenAI, right? You
- really have to be, you know, if that's your legitimating narrative, right, you really have to be,
- you know, on a completely other level. Like, maybe Elon can do that, right? But yeah,
- kind of, the higher the bar is, the higher, relative to it,
- you have to be, right? It is pretty interesting though. Like, here is an
- interesting case where I think immigration maybe marginally solves this problem, right? It
- might make other things worse, like lockdowns, anti-market sentiments. But I think, in a very
- noticeable way, immigrants kind of understand intuitively both meritocracy and genetic
- differences much more. Yeah, I do think, like, this is a very strange one, because it's
- coded exactly the opposite way, right? Like left wingers supposedly like immigrants, right wingers,
- right wingers supposedly dislike immigrants. I don't know. Well, right wingers say, you know, like
- we like legal immigrants, we dislike illegal immigrants. Yeah, but in general, it's kind of coded as
- left wing, but I think the, in terms of like within the tech ecosystem, within like what works in
- tech, I think it's almost, you know, 180 degrees the opposite way. Like the white people like the
- egalitarianism, the immigrants like the meritocracy. I think that is generally how it is, like, I
- think that is generally the intra-tech scene, right? Yeah, I mean, it is interesting. Like,
- maybe at the same time, like, you know, a lot of companies now have Indian CEOs, and, you know,
- I'm thinking about this on the fly, but those companies are often, like, peacetime companies. I
- mean, you could say Satya right now is wartime. And, you know, there's anecdotes, but a lot of
- these immigrants, well, I'm not even sure if they're immigrants, if they're Indians who just, like,
- grew up, yeah, yeah, even born in America, or went to, like, American-inspired universities. I
- mean, there's certainly, like, there is something to the idea that Wesley Yang said about Andrew
- Yang, which is that because he is neither white nor black, he is not, like, a threat; there is
- something there. And one thing that's interesting about Asian people who go to Ivy
- League schools and are smart enough to understand that they're being systemically
- discriminated against, and yet in many scenarios, I'm sure you've seen this, actually support
- it. I think you're just misunderstanding the selection effect here, right? Like, Asians who
- are good at school are mostly just apolitical, right? The selection effect here,
- especially for, like, egalitarian ideologies, is Asians who are not good at school and whose
- parents are disappointed in them. That is the constituency, that constituency specifically,
- right? Like, here's the thing: I disagree with a lot of, like, the mandatory voting
- things, right? But if you had mandatory voting, Asians as a demographic would go
- a very different way. And even in countries like Canada, right, Asians are much more of a
- swing vote. Yeah, that makes sense. And certainly in SF, they've had a huge impact. I mean, I think
- one thing, more broadly in this conversation: Elon Musk is literally as credible and as talented
- as it gets. And even he, in his most recent turn to be more, well, you could call it more right,
- there are a number of things you could say about him, but it seems like it's unclear that he's
- better off for it. I hear he's having a harder time attracting talent than when he was more
- politically neutral. He certainly, I'm sure, is upgrading, introducing new talent to the right, I
- would say, or to anti-left or anti-woke ideas. Yeah, Elon Musk is kind of like peak right-wing
- rationalist, right? He's had IVF children, how many now, 11? It's a choice he's made. Like, how
- far apart are Elon Musk and Richard Hanania ideologically? Probably not that far. Right. But the
- question is, is he taking a sacrifice for doing so? And if yes, basically, if you want more people
- to shift ideologically, it's hard to rely on people making sacrifices, because how is that going
- to happen at scale, right?
- So, yeah, that's fair. At the end of the day, there are things that cannot be accomplished
- without a shift in the state religion. Like I did say, I don't reasonably expect to overturn the
- egalitarian conspiracy theory that is, you know, the state religion of the United States. But it
- would be nice, right? Yeah, I do think what you're saying is right. We don't have too much time,
- but I can give the brief version of this. I know I sent you this in a write-up, but I actually
- think that a lot of social traditions are not in favor of egalitarianism, but are rather
- restricting egalitarianism, right? This is based on a book, Hierarchy in the Forest by Christopher
- Boehm. I talked about it with Rob Henderson, not earlier in this episode, but in an earlier
- episode; sorry, I'm getting a bit tired. The thesis of the book is that we had egalitarian
- societies for much of
- hunter-gatherer history, and the way those egalitarian societies came about is by brutal murder.
- If someone was significantly more talented and attracted significantly more mates, right, the
- long arc of history was basically incel crime: egalitarian incel crime, killing the guy who was
- taking the most mates. This eventually developed into social norms that were a quote-unquote
- moderate version of this, and that's egalitarianism, where people would mock, would try to attack
- the social status of more successful hunters in order to decrease the likelihood of having to
- compete with them. This is basically the earliest manifestation of egalitarian social norms:
- we're going to pretend everyone's equal because, in Malthusian, pre-industrial times, it was
- mostly actually a zero-sum resource conflict. And so the incels get mates, and the more
- successful hunters don't get, you know, brutally murdered. This happened for most of human
- evolutionary time, especially most of pre-civilizational human history, and that introduced these
- egalitarian, essentially biological, instincts. The moral of the story is... are you there? Yeah,
- I'm here. Okay, yeah. The moral of the story is that if we really want to undo the problem, we
- will need at the very least gene editing, or, you know, another million years of selection, which
- many people don't think we will have. I'm not sure if we want to get into that rabbit hole. Well,
- the intersection of
- software eating the world and egalitarianism eating the world is very interesting, because what
- software does is exacerbate disparities... Yeah, this is my favorite Balaji quote, right? Freedom
- and inequality are synonyms. Right? The more options you give people, the more abilities, the
- more capacity you give people, the more they're going to be chosen in different ways. Yep. And at
- the same time, so it exacerbates disparities, and it codifies those disparities in legible ways.
- Right, measures them. Yeah, okay. But then it also presents ways to maybe fix them, because it
- creates control mechanisms. You can see that DEI, to some degree, is this in the labor market:
- more efficient labor markets mean more inequality, and the fact that we have labor marketplaces
- allows us to measure the disparity in really efficient ways. And because those same companies can
- be captured by the government in all sorts of different ways, they have to, or they get to,
- depending on your perspective, enable mechanisms to redistribute or make it more egalitarian at
- scale. And so software is both an enabler of the inequality, but also an enabler of the thing
- that can jump in to help fix the inequality. It's kind of a whack-a-mole type situation. And, you
- know, people say that transgenderism leads to transhumanism, or could, because gene editing is
- something that would be a pariah on the left. But once it's in the hands of, hey, we can actually
- make people equal, well, hey, that's a pretty exciting idea for certain egalitarians. Right, like
- Harrison Bergeron. Yes. That would really be a dystopia. Actually, it'd be fine. In that case, we
- just let China win; China gets to inherit the earth. Well, it wouldn't be fine, but it would not
- be a dystopia. If a regime wants to use gene editing to make people equal instead of better, then
- China is the less evil power there. Yeah. I'm going to zoom out, because this is the fundamental
- question to me,
- at least at the intersection of egalitarianism and meritocracy. And this is actually Agnes
- Callard's question, where she says morality requires we maintain a safety net at the bottom that
- catches everyone, but we also need an aspirational target at the top, so as to inspire us to
- excellence, creativity, accomplishment. So moral worth needs to be free, but also acquired, so as
- to inspire people to pursue it. And how we reconcile those contradictions is the task for how we
- reconcile egalitarianism and meritocracy, or get the best of both. I think Christianity is pretty
- good. Like, I don't know. Some people on the right think
- that Christianity is slave morality and is leading to the level of egalitarianism we have now;
- it's a slippery slope argument. My kind of answer to that is the stuff we talked about earlier,
- Hierarchy in the Forest: it kind of always has been egalitarian. In that view, Christianity was
- upstream of the enlightenment, it was upstream of the industrial revolution, right? The argument
- for Christianity is basically that you have to go through it in order to reach merit, right? And
- I still don't think it's that bad of a system for aligning moral value at the end of the day.
- It's been around for 2,023 years; I think it could go for 2,023 more. But just from an empirical
- perspective as well, Christianity does a good job of doing this. Yeah, I do think it's ironic
- that the thing that replaced the slave morality of Christianity was more slave morality. Yeah,
- yeah. I.e., checks and balances, and it was kind of a bastardized version of the slave morality
- parts of Christianity, and maybe there'll be a bastardized version of the more master morality
- part of Christianity, or the less slave morality part, because... Right, right. This is why I
- think Nietzsche's historical
- retelling is just wrong, right? He's comparing not modern mass morality, like literally now, but
- his contemporary mass morality to the elite morality of days gone by. I would wonder what
- Nietzsche would think of simultaneously having the worst slave morality, but also having, at many
- elite levels, in many elite circles, especially the capitalist elite, not necessarily the social
- or political elite, his kind of integrated morality much better adopted than almost anywhere else
- in history. So yeah. You know, people like Simone and Malcolm Collins, they've been on this show;
- they want to create a new religion. Good luck with that. Well, anything that is a new religion
- can't call itself a new religion; I think that gives it an attack vector. And also we have
- separation of church and state, and the most effective religions take over the state, right? Or
- penetrate the state. So that counts against a new religion. Right. We have a
- little less than 30 minutes left. Which topic do you want to cover? Do you want to cover AI?
- Crypto? Interest rates? I think crypto would be cool. But first I want to expand on my answer.
- You asked me about SF politics, and I said they're mostly self-interested and not super
- electorally focused. But to the extent that there is a platform, I want to describe it a little
- bit, on both the macro and micro level, in terms of how I see it anyway. This is just a riff, but
- maybe it would be something like: it's okay to be an elite. It's okay to want to be an elite.
- It's okay to want your children to be elites. We're not racist. We're not sexist. Capitalism is
- good, actually. Tech is good, actually. Crime is bad, actually. We shouldn't be forced to hire
- people who are not qualified. Some taxes are fine, but government is a disaster zone and we
- shouldn't keep feeding it more money. The schools should teach real topics, like math, and not
- indoctrinate kids. Pro math, yeah. And we need politics out of our companies, stat, of any kind.
- And at the same time, let's not relitigate things like abortion or gay marriage or immigration,
- things that make us seem like bad people to the people that we care about. And on a more
- micro level, you can look at the philosophies and practices of caring and empathy that dissolve
- into a kind of veneration of victimhood, and infect everyone with resentment and misery, and see
- that it's a straight downward spiral of bitterness, and you could say: you don't need to live
- like this. We don't need to feel like this; you don't need to feel miserable about yourself all
- the time. This is kind of like the Jordan Peterson school of value: you can be normal and happy
- and non-judgmental and productive and satisfied. You can treat people in business as individuals
- and not feel the need to obsessively keep demographic scores or treat people like tokens. You're
- not racist; you don't need to think about race. You can make money; in fact, making money shows
- you're doing something that other people value. You can donate some of it to help the less
- fortunate, or as much as you want, and you can spend whatever you want knowing that spending is
- helping other people provide for their families. You can say what you think, and if other people
- don't like it, they can go home and be upset, but you don't have to be. Other people can say
- things that offend you, and you can shrug and move on with your life. You can make mistakes; they
- can be your fault; you can fix them. You can enjoy the spoils of your work. You can work hard and
- outcompete others and achieve great heights and not feel guilty about it. You can also choose to
- live a calmer life if you want to. I mean, the most ironic thing is that one of the most
- controversial topics on tech Twitter is how hard you should work, which just shows the core of
- the effort divide. Anyways, I'm kind of rambling a bit, but those are some of both the macro
- political takes and some of the micro, how one should live one's life or how we should approach
- one's life, that I think come out of a sort of technologist builder mindset. Right. It's
- interesting that you use that term, because I
- actually had a conversation with a friend about builder versus founder. To me, I kind of have a
- bad taste in my mouth when I think of the people who call themselves builders. I'm thinking of
- basically the 2018-ish hackathon scene, which I was pretty skeptical of. I actually think, and
- this maybe is also a controversial take, that monetization and bounties and crypto have been an
- extremely net positive influence on the hackathon scene. People don't remember that the
- pre-crypto hackathon scene was just enormous busywork. And even if the crypto stuff is
- infrastructure that's not all that technically complex, and not scientifically revolutionary in
- any way, shape, or form, at least it's worth something to someone, right? Instead of literally
- being side projects that are not actually functional at doing anything, that just make someone's
- resume look slightly more impressive. Yeah. You know a lot more than me on the hackathon front.
- Okay, I did not expect that. I do think, well, hackathons are more of a younger man's game.
- That's fair, that's fair. I think what builder enables is just a wider net: if you haven't
- started a company, what do you call yourself? Okay, that's fair. People use the term operator,
- but that's not that compelling. So the more charitable view of builder is just that it's more
- comprehensive, to include people who do great work but haven't really founded companies. Sure.
- Yeah. So one of the topics on this list reads crypto as ESG for
- libertarians. I'm sure we have not pissed off all groups equally, so for the sake of equity, and
- for the sake of intellectual diversity in being annoyed at the takes on this podcast, let's talk
- about crypto. First of all, how much of crypto was a zero interest rate phenomenon? All? None?
- Well, it seems like a lot of crypto was accelerated by easy money, and so the industry has
- contracted. But we're not going to have high interest rates forever, and good times will come
- back. So certainly a lot of the speculation was fueled by zero interest rates. But also, Carlota
- Perez has a great study of how different technological revolutions go through cycles, and one of
- those cycles happens with an initial mania and a bubble that leads a ton of people to spend all
- this money on all these projects, some of which go nowhere, but others of which become the
- critical infrastructure for the next hype cycle. And so right now there's a crypto winter in
- terms of capital invested and in terms of certain projects that relied on zero interest rates or
- high yields. At the same time, there are a lot more purists, and there's a lot more developer
- activity than prior. So that's how I'd respond there. Yeah, the cynical take is that AI was right
- there, right?
- It was lying there, and it just didn't have one very effective proof of concept yet. The main
- difference between AI state of the art in, what was it, summer 2022, and AI state of the art now
- is pretty negligible; it was mostly just, first of all, releasing stuff to the public, and second
- of all, having a really nice, clean proof of concept in ChatGPT. Yeah, absolutely. I mean, there
- were countless people who switched from Web3 to AI, just like that. So certainly it was right
- under our nose, and the combination of the markets tanking, and thus fintech, inclusive of
- crypto, tanking alongside them, also enabled, or accelerated, AI's mind share. But the ESG
- analogy doesn't hold as much for me, because ESG to me sounds like a way for finance people to
- seem moral or pure, but it's not really rigorous; they're not true believers. Or, if they are
- true believers, ESG is not built on interesting intellectual capital, whereas I think crypto does
- have a much richer intellectual substrate to it. And it's also worth calling out that there are
- many different substrates, right? The Bitcoin community builds off of the Austrian economics
- tradition, some libertarianism, and it's all about de-politicizing finance from governments,
- separating money and state, right? Whereas the Ethereum tribe, instead of focusing on
- de-politicizing from the government, is de-politicizing from big corporations: the Facebooks and
- the other centralized powers of the world that are in the private sector. And that's a cursory
- reading, but they are true ideological believers who are trying to manifest that world. It's like
- a much more practical EA, I think, or a much more technical, practical, and experimental version
- of having impact. So I see it as truly ideological people trying to make a difference. The
- technology is not nearly there to the same degree that AI is, certainly; AI is there the way
- major technological breakthroughs have been. Right. Peter Thiel famously said, you know, AI is
- communist, crypto is libertarian. If crypto were not libertarian, if the main marketing around
- crypto were CBDCs, central bank digital currencies, or whatever, and not the Bitcoin intellectual
- tradition, what effect do you think that would have had on investment in crypto? Yeah, I
- mean, it's a good example of talking left, acting right. It's a way of making a lot of money,
- and it's a way of trying to do good at the same time. And as Daniel Gross said on a podcast
- recently, fire can be used for arson, and it can be used for cooking. And so crypto can be used
- for libertarian ends, and certainly the people who are working on it are trying to do that, but
- it can also be used for state-control ends. And same with AI there, right? And so it would be
- hilariously ironic if crypto were largely used for authoritarian ends, and maybe the opposite
- with AI. Yeah, that's what Sam Hammond thinks will happen. Yeah. I think what's very interesting
- is that the narrative of profitability around crypto became the actual narrative. The narrative
- of making things profitable that were not profitable before, like NFTs, was kind of based on
- this, right? We're going to finally make art profitable; we're going to supercharge monetization.
- The narrative for making things profitable became what was profitable, right? So when you have
- these libertarian true believers, they induce a market shift in the same way that political true
- believers induce a political shift. In my head, that sounded like an awfully Balaji-like
- sentence, and I tried to make it less Balaji-like, and it only became more Balaji-like of a
- sentence. Man, I mean, crypto that's non-ideological maybe
- just looks like fintech, right? Yeah, there's a lot of capital in fintech, but they're just
- non-ideological about it. I mean, there are some types of companies that are trying to expand
- access and have strong optics and a more egalitarian mission, but most of it is just really
- practical people being like, hey, we're just going to focus on making as much money as possible.
- So, yeah, maybe that's what it'll look like. Yeah, and I should say, I went to ETHDenver; I
- respect the crypto people. I think I had a tweet that was something like: crypto is the most
- tsundere movement I've ever seen. Everyone loves acting low trust, but really I could leave my
- laptop anywhere and it would still be there in a day. Yeah, very high trust community, and I am
- pro high-trust communities, so it's definitely great. Okay, we'll return a little bit to some of
- the earlier topics and ask for some advice for young people. This has been a very fun topic for
- revealing how people think in at least a less ideological way. What do you think are good steps
- for young people in terms of dating? Good question. Well, I think I'm only
- going to speak to men here, because I don't know the women's side as well. One thing to really
- appreciate about dating for men is that it gets a lot better as you get older, and I think it's
- hard to fully appreciate that. You can accelerate that, but basically, the more you have to offer
- to the world, the more successful you are, the more you're going to be an attractive person. You
- can think of it as product and marketing: you could try to focus on marketing, like your clothes
- and certain tricks and stuff like that, and that's important, you should do that, marketing is
- really important. But what you have to offer the world, i.e. the product, and that's not just
- your career, though career is really important, it's also your sense of integrity, your values.
- Marketing a bad product is never going to work, and the challenge with a lot of dating advice is
- that it's really short-term, really marketing-focused. I mean, Naval even has this line: if you
- want to marry an incredible person, be an incredible person. And so I
- think with dating it's worth recognizing what the long-term goal is. The long-term goal is
- presumably an amazing partnership, like marriage. That could happen early on if you find the
- right person; it doesn't have to happen right away. And the more that you work on yourself, the
- easier it will be. I guess I'm implicitly talking to someone early on, because you're asking for
- advice and, like, maybe it's not coming super easily, and so: keep working on yourself, be
- patient, and great things will come. At the same time, I think friendship is a great way for a
- relationship to form. Even knowing that you're looking for a long-term partnership, I think,
- separates you from a lot of other men who are not looking for that, and so getting to a place
- where you are looking for that is more likely to lead you to happiness than kind of engaging in
- fuckery. So I would probably get more serious early on, and I would surround yourself with the
- men that you want to be. I would encourage you to look at your circle of people, and look at the
- influences that you have online as well, and ask: are these the men that I want to be? Do they
- have great partners, great relationships? It's much easier to change your environment than your
- insides; we tend to become the things that shape us, that surround us. So I would focus on that
- as well as the personal growth. Any reactions to
- that? Right, I'm wondering: how much of that is men becoming more valuable? For some people, I
- think that's definitely the case, right? A young person with a startup versus a young person with
- a multi-million dollar company: definitely a huge change in value. But I'm not sure if that's
- true for most men. I'm not sure if the relative advantage in dating markets for most men comes
- from increases in their value versus just true relative decreases in women's value, in terms of
- fertility, right? That's a cynical take on it: the market dynamics reflect something real, but
- the thing that is real is that, for example, if you want to have kids, your prospects are just
- worse, right? I
- agree. I mean, I'm in support of finding someone early if you can. I think just a reality that
- men may not understand is that women are less likely to date down, and date down in all areas.
- So a woman who has a more successful career than you do, and it's not limited to career, but
- that's one big element of it, is probably not going to date you. And so the more successful you
- are, the wider your pool is, basically. And that's why younger men are disadvantaged relative to
- older men, among other reasons. But I do agree with you that, relatively, yeah, the change for
- women is more significant, and of course that impacts male options. Right. And the second thing
- is finding someone at all, right? Especially among people in similar or mutual circles, you know,
- who share either idea space or startup space, there's a pretty big sex ratio there. Yeah, I mean,
- I know it's super generic, it doesn't work for everybody, but I do think
- dating your best friend, or dating someone who could be a best friend, is pretty good advice. No
- individual person is going to be everything to everyone, and there are going to be, as unromantic
- as it sounds, sacrifices on some dimension. But if you have someone you would truly be super
- close with even if you weren't dating, because you just respect them so much, you appreciate them
- so much, you could handle the highs and lows with them, that feels like a pretty good thing to
- optimize for. And if you have the opposite feeling, I don't know how sustainable that dynamic is.
- Thinking about who you can have a fun weekend with, or who you can have a fun year with, is very
- different from who you can have a family with and build a life with. And I think we don't really
- fully appreciate that when we're young. Right. Good advice. Last question of the show, always the
- last question of the show. I'm sure you've
- prepared some very good answers to this. What is something that has too much chaos and needs more
- order, and something that has too much order and needs more chaos? It's a shame, given how much I
- enjoy the show, that I don't have an answer prepared. Let me think for a minute. Yeah, it's hard
- because we talked about so much. Yeah, usually I want something that we haven't talked about yet,
- but I'm sure there are many examples among what we have talked about. I really like what you were
- saying about how, in order to protect the market, you need to exit the market, or at least
- directionally where you were going with that. And I would love to see much more experimentation
- on that axis.
- So I would love to see tech get more engaged with politics, with policy, with culture, with
- education, things that don't make them rich right away necessarily, but protect the broader
- ecosystem and hopefully will present more options for them down the road. And so I would love to
- see more, chaos is kind of the right word, but certainly experimentation. I love that Bari Weiss
- and Joe Lonsdale just said, hey, screw it, we're making a new university, and the team there did
- that. I'd love to see much more of that. And I think when something seems too ordered, or too
- regulatory, or too whatever, blah, not worth our time, that's where we need to see more
- experimentation, because otherwise that order, that bureaucracy, will just become even more
- cemented. Effective libertarianism. Yes. And then in terms of what has chaos and needs more
- order, I actually think one challenge in the tech community is that all these companies are
- competing with each other, like Facebook and Apple and Google and Microsoft and now OpenAI, so
- it's actually hard to have tech as a class act together, because those companies are trying to
- kill each other. And so I think we need to do a better job. Balaji has talked about, like, a NATO
- for CEOs before; I think we need more organization around collective tech lobbying, so to speak.
- Some class solidarity. Yes, exactly. Sure. This has been wonderful. That's a pretty good note
- to end on. Anything else you'd like to add before the end of the show, of these four hours? I
- enjoyed this conversation. I'm excited to see what you decide to do next. I appreciate that you
- shared the conversation that we had with your audience, and I hope you continue to do so. People
- who've made it this long have a strong connection to you, and I think there will be a number of
- people who you'll be able to recruit for whatever you do because of it. So excited to see you
- continue to follow your path. Yeah, that'll be awesome. And I'll just leave this as a note for
- the ending: this was by far one of the most enjoyable episodes so far. I've had a few podcasts
- that I think were still very informative and very interesting, but for me were personally pretty
- rough, where I felt like I made some mistakes. But this one was just very thoroughly enjoyable
- the entire way through. So yeah, thanks for coming on. It was great. Awesome. Thanks, Brian. I
- hope you enjoyed my conversation
- with Eric Torenberg. If you'd like to help us out, the number one thing you can do is to let a
- friend know, either in person or online. It's the best way to help the show, and hopefully you
- have a friend who's interested in the same topics or has the same habits, so not only are you
- helping us out, but you're also helping your friend find something interesting and hopefully
- enjoyable as well. You can also help us out by leaving a five-star review on any podcast app,
- suggesting future guests in the comments, subscribing to my Substack, which is linked below, and,
- if you want to catch another great episode next week, subscribing to the podcast as well, once
- again, on any podcast app. If you do that, then you'll get another great episode next Monday.
- See you then.