
Marc Andreessen on AI, Tech, Censorship, and Dining with Trump AI transcript and summary - episode of podcast Honestly with Bari Weiss



Episode: Marc Andreessen on AI, Tech, Censorship, and Dining with Trump


Author: The Free Press
Duration: 01:45:35

Episode Shownotes

Democrats once seemed to have a monopoly on Silicon Valley. Perhaps you remember when Elon Musk bought Twitter and posted pictures of cabinets at the old office filled with “#StayWoke” T-shirts. But just as the country is realigning itself along new ideological and political lines, so is the tech capital of the world. In 2024, many of the Valley’s biggest tech titans came out with their unabashed support for Donald Trump. There was, of course, Elon Musk. . . but also WhatsApp co-founder Jan Koum; Cameron and Tyler Winklevoss, who run the cryptocurrency exchange Gemini; VCs such as Shaun Maguire, David Sacks, and Chamath Palihapitiya; Palantir co-founder Joe Lonsdale; Oculus and Anduril founder Palmer Luckey; hedge fund manager Bill Ackman; and today’s Honestly guest, one of the world’s most influential investors and the man responsible for bringing the internet to the masses—Marc Andreessen.

Marc’s history with politics is a long one—but it was always with the Democrats. He supported Democrats including Bill Clinton in 1996, Al Gore in 2000, and John Kerry in 2004. He endorsed Barack Obama in 2008 and then Hillary Clinton in 2016. But over the summer, he announced that he was going to endorse and donate to Trump. Public records show that Marc donated at least $4.5 million to pro-Trump super PACs. Why? Because he believed that the Biden administration had, as he tells us in this conversation, “seething contempt” for tech, and that this election was existential for AI, crypto, and start-ups in America.

Marc got his start as the co-creator of Mosaic, the first widely used web browser, which is said to have launched the internet boom. He then co-founded Netscape, which became the most popular web browser in the ’90s, and sold it to AOL in 1999 for $4.2 billion. He later became an angel investor and board member at Facebook. And in 2006, when everyone told Mark Zuckerberg to sell Facebook to Yahoo for $1 billion, Marc was the only voice saying: don’t. (Today, Facebook has a market cap of $1.4 trillion.) He now runs a venture capital firm with Ben Horowitz, where they invest in small start-ups that they think have potential to become billion-dollar unicorns.
And their track record is pretty spot-on: They invested in Airbnb, Coinbase, Instagram, Instacart, Pinterest, Slack, Reddit, Lyft, and Oculus—to name a few of the unicorns. (And for full disclosure: Marc and his wife were small seed investors in The Free Press.)

Marc has built a reputation as someone who can recognize “the next big thing” in tech and, more broadly, in our lives. He has been called the “chief ideologist of the Silicon Valley elite,” a “cultural tastemaker,” and even “Silicon Valley’s resident philosopher-king.”

Today, Bari and Marc discuss his reasons for supporting Trump—and the vibe shift in Silicon Valley; why he thinks we’ve been living under soft authoritarianism over the last decade and why it’s finally cracking; why he’s so confident in Elon Musk and his band of counter-elites; how President Biden tried to kill tech and control AI; why he thinks AI censorship is “a million times more dangerous” than social media censorship; why technologists are the ones to restore American greatness; what Trump serves for dinner; why Marc has spent about half his time at Mar-a-Lago since November 5; and why he thinks it’s morning in America.

If you liked what you heard from Honestly, the best way to support us is to go to TheFP.com and become a Free Press subscriber today. Learn more about your ad choices. Visit megaphone.fm/adchoices

Summary

In this episode of 'Honestly with Bari Weiss,' Marc Andreessen discusses his endorsement of Donald Trump, reflecting a broader political shift in Silicon Valley and expressing concerns over 'soft authoritarianism' affecting the tech landscape. He critiques the Biden administration's policies on technology, particularly in AI and crypto, and emphasizes the necessity of innovation. Andreessen argues that technologists hold a crucial role in restoring American greatness and laments the dangers of censorship, particularly regarding AI. The conversation also touches upon the emergence of counter-elites like Elon Musk and the need for a balance between regulation and innovation in the tech industry.


Full Transcript

00:00:00 Speaker_00
Today's episode is brought to you by the Foundation for Individual Rights and Expression, or FIRE. FIRE believes that free speech is the foundation of a free society. This freedom is fundamental.

00:00:12 Speaker_00
It drives scientific progress, entrepreneurial growth, artistic expression, civic participation, and so much more. But free speech rights don't protect themselves, and that's where FIRE comes in.

00:00:25 Speaker_00
Proudly nonpartisan, they defend free speech and the First Amendment where it's needed most, on campus, in the courtroom, and throughout our culture.

00:00:33 Speaker_00
If you believe in that fight, and if you believe in the principles of free speech, consider joining FIRE with a gift before the end of the year. Your donation will help FIRE continue their critical work, and it's tax deductible.

00:00:45 Speaker_00
Visit thefire.org slash donate today to make your gift and join the free speech movement. From the Free Press, this is Honestly, and I'm Bari Weiss. Democrats, until very recently, seemed to have a monopoly on Silicon Valley.

00:01:02 Speaker_00
Perhaps you remember when Elon Musk bought Twitter and posted pictures, I saw them in real life, of cabinet after cabinet of t-shirts that just said, stay woke. In many ways, that was emblematic of an entire culture.

00:01:17 Speaker_00
But just as the country is realigning itself along new ideological and political lines, so is the tech capital of the world. In 2024, several of Silicon Valley's biggest tech titans came out with their unabashed support for Donald Trump.

00:01:33 Speaker_00
There was, of course, Elon Musk, but also WhatsApp co-founder Jan Koum, Cameron and Tyler Winklevoss, who run the cryptocurrency exchange Gemini, VCs like Shaun Maguire, David Sacks, and Chamath Palihapitiya, Palantir co-founder Joe Lonsdale, Oculus and Anduril founder Palmer Luckey, hedge fund manager Bill Ackman, and my guest today, one of the most influential investors in Silicon Valley, and the man largely responsible for bringing the internet to the masses,

00:02:03 Speaker_00
Marc Andreessen. Marc was always with the Democrats. He supported Bill Clinton in 1996, Al Gore in 2000, John Kerry in 2004. He endorsed Obama in 2008, and then Hillary in 2016.

00:02:19 Speaker_00
But over the summer, a few days after the assassination attempt on Trump in Butler, PA, Marc hosted a podcast with his VC partner, Ben Horowitz, and announced that both of them were going to endorse and donate to President Trump.

00:02:33 Speaker_00
Public records show that Marc donated at least $4.5 million to pro-Trump super PACs. Why?

00:02:40 Speaker_00
Because he believed, as he tells me, that the Biden administration had, quote, seething contempt for tech, and that this election was existential for AI, crypto, and startups in America.

00:02:56 Speaker_00
Marc Andreessen, if you haven't heard of him, got his start as the co-creator of Mosaic, the first widely used web browser, which is said to have launched the internet boom.

00:03:06 Speaker_00
He then co-founded Netscape, which became the most popular web browser in the 1990s, and he sold it to AOL in 1999 for $4.2 billion.

00:03:18 Speaker_00
Marc now runs a venture capital firm with Ben Horowitz, where they invest in startups that they think have potential to become billion-dollar unicorns. Their track record is pretty impressive.

00:03:29 Speaker_00
A16Z invested in Airbnb, Coinbase, Instagram, Instacart, Pinterest, Slack, Reddit, Lyft, and Oculus, to name just a few. And for full disclosure, Marc and his wife Laura were small seed investors in The Free Press.

00:03:47 Speaker_00
And yes, we do expect our name to be alongside those other companies sometime soon.

00:03:52 Speaker_00
Great investors are able to see around bends, and Marc has built a reputation as someone who can do just that, who can recognize the next big thing in tech and really more broadly in our lives.

00:04:04 Speaker_00
He has been called the chief ideologist of the Silicon Valley elite, a cultural tastemaker, and even Silicon Valley's resident philosopher king.

00:04:14 Speaker_00
Today, I sit down with Marc to understand his reasons for supporting Trump and the broader vibe shift in Silicon Valley. We talk about why he thinks we've been living under soft authoritarianism over the last decade and how we get out of it.

00:04:29 Speaker_00
We talk about why he's so confident in Elon Musk and his band of counter elites, how Biden tried to kill tech and control AI, why he thinks AI censorship is a million times more dangerous than social media censorship.

00:04:44 Speaker_00
why technologists, in his view, are the ones to restore American greatness, what Trump serves for dinner, why Marc has apparently spent half of his time in Palm Beach since November 5th, and why he thinks it's morning in America. Stay with us.

00:05:06 Speaker_00
Today's episode was made possible by Ground News. America's trust in the media has been on a long decline, especially in the last few years. If you listen to this show, you know it's something we care about and talk about a lot.

00:05:19 Speaker_00
The legacy press often has its own agenda, which leads, and we've seen this many times, to biased coverage, public polarization, and ideological bubbles that reinforce readers' opinions rather than challenging them.

00:05:32 Speaker_00
That's why Ground News is so important. Their app and their website allow us to access the world's news in one place, so you can compare coverage with context behind each source.

00:05:44 Speaker_00
Reading the news this way helps you see discrepancies on how certain topics are covered or not covered, so that you can think critically about what you read and make up your own mind.

00:05:54 Speaker_00
Check it out at groundnews.com slash honestly to get 50% off the Ground News Vantage plan for unlimited access. Ground News is subscriber funded. By subscribing, you're supporting transparency in media and our work in the meantime.

00:07:14 Speaker_00
Marc Andreessen, welcome to Honestly.

00:07:16 Speaker_04
It's great to be here.

00:07:17 Speaker_00
I'm really happy to have you. I have to say, I have never seen you with more of a pep in your step and more of a kind of perma smile on your face than I have over the last four weeks in public.

00:07:27 Speaker_00
And I think that that's because Donald Trump won the election. And I want to understand, what about Trump's win to you felt so fundamentally important for America?

00:07:39 Speaker_04
Yeah, so I'd start by saying, first of all, it's morning in America. So I am very happy. I do think the analogy for what's happening right now is 1980. But look, I would start by saying it is not just Donald Trump. Trump's victory is part of it.

00:07:51 Speaker_04
But I think below that are two other things. So one is just in the election, generally, there was, as you know, a dramatic shift to the right across broad swaths of the population, by the way, including in California.

00:08:01 Speaker_04
And one of the stories that hasn't fully been, I think, realized is the extent to which counties, districts in California actually went red this time, including some that hadn't been red in, I think, 30 years.

00:08:10 Speaker_04
And then the other is just like, even in places like San Francisco, there was a pretty significant vote shift. And then the youth vote, you know, is I think maybe the single biggest part of what happened, which is, you know, the kids are changing.

00:08:19 Speaker_04
And then I would say even beyond that, even beyond the partisan politics of it, it feels like the last decade has been a very emotionally dark and repressive time.

00:08:28 Speaker_04
You know, in many ways, Silicon Valley was on the vanguard of, you know, what you might call sort of a soft authoritarian, you know, kind of social revolution starting about 10 years ago, 12 years ago.

00:08:37 Speaker_04
And that soft repressive authoritarianism, you know, had a real negative impact on basically my whole world, the tech industry, the country. And, you know, we'll see. It certainly feels like that's cracked.

00:08:46 Speaker_04
You know, I get all these little data points on kind of things that are changing from people in all these different industries. And like, there's just so many little micro stories right now of like movie projects that were dead in the water

00:08:54 Speaker_04
in October that all of a sudden have come alive, or people who now think that they can write the book that they never thought they'd be able to write, and, or, you know, comedians who think they can, you know, tell jokes that they couldn't tell.

00:09:03 Speaker_04
So there are all these little sparks, you know, kind of firing all over the place.

00:09:06 Speaker_04
You know, people basically poking their head out of the frozen tundra of the, of the culture and sort of realizing that it's actually okay to like laugh and play and have fun and build things, do things, hire on merit, you know, celebrate success.

00:09:17 Speaker_04
You know, and then underneath that is sort of, you know, fundamentally be proud of the country, be happy about the country, be patriotic.

00:09:22 Speaker_00
There were many people broadly in your world, I think Elon Musk most obviously, who repeated again and again in the weeks leading up to the election that if Donald Trump didn't win, if the right didn't come to power, this was going to be the last American election.

00:09:38 Speaker_00
Did you buy that?

00:09:39 Speaker_04
You know, both sides have kind of intense views on that, and the other side obviously has very intense views on that also. You know, maybe I have a little bit more faith in the system. Maybe I shouldn't, but maybe I do.

00:09:50 Speaker_04
So my model for the world we're in is just different. I don't think we're in the world where there's like sudden dramatic change. Like, I don't think we're in the world where it's 1917 and Lenin shows up.

00:09:58 Speaker_04
You know, politics 50, 100, 200 years ago, when there used to be the dramatic breakpoints, you had these populations that were primed for physical violence. You know, they were primed to go into the street on a moment's notice and like start killing.

00:10:09 Speaker_04
I tend to think we're in a world in which we kind of do that, but we do that online. It's a virtual Cold War, not a physical hot war.

00:10:15 Speaker_04
And so, you know, people beat the hell out of each other, but they do it on X and, you know, they kind of take their anger out that way. And look, that's why I say it's sort of like a soft authoritarianism.

00:10:22 Speaker_04
We don't live in the world of like jackbooted thugs, you know, searching for Anne Frank or something like that. We live in the world of, you know, you can't say that. And if you do say that, you're going to be completely obliterated.

00:10:32 Speaker_04
reputationally and economically, and you're going to lose your friends and family. And so it's been an intense situation.

00:10:36 Speaker_04
I think that American politics and culture will continue to be intense, but it's sort of a soft form of authoritarianism and repression, not a dramatic, physical, hard breakpoint one, I don't think.

00:10:46 Speaker_00
Let's just spend a second for people and put some meat on the bones of this sort of like soft authoritarianism, soft totalitarianism you're describing.

00:10:54 Speaker_00
So I think what you mean when you say that in the realm of academia, right, I can think of certain particular flashpoints like Nicholas Christakis at Yale, I believe it was in 2014, getting actually surrounded by a mob of hysterical students who were overwhelmingly offended by the idea that his wife had the gall to send out an email

00:11:15 Speaker_00
saying that students should be able to choose their own Halloween costumes.

00:11:18 Speaker_00
You know, in my own experience at The New York Times, this looked like James Bennet and Jim Dao getting drummed out of their jobs or reassigned for having the temerity to publish an op-ed by Senator Tom Cotton.

00:11:30 Speaker_00
Why did this worldview have so much purchase and why was it able to conquer so much territory and so many institutions so broadly and effectively?

00:11:42 Speaker_04
Yeah, I think there's two reasons. One is, look, it's sort of the fundamental impulse of what you just might say is sort of the political cultural left, which is basically that society is unfair and unequal.

00:11:50 Speaker_04
And so we must therefore commit ourselves to whatever level of social revolution is required to get to total fairness and equality.

00:11:56 Speaker_04
And, you know, that's an impulse that you can trace it back maybe to the French revolution, maybe to the American revolution.

00:12:01 Speaker_04
You can certainly trace it back to the Fabians, you know, in the UK in the 1880s, you can trace it to, you know, Marxism, the communist revolution had that in spades.

00:12:09 Speaker_04
There's this great book called Red Decade, which is sort of the chronicle of basically the communist revolution that sort of happened in the cultural and intellectual elites in America in the 1920s and 1930s.

00:12:18 Speaker_04
The movie Oppenheimer actually created quite a vivid portrayal of what life was like, in that case, at UC Berkeley in the 1930s. There was a joke in the movie, which is, you know, well, is Oppenheimer a communist?

00:12:28 Speaker_04
And somebody laughs and says, yeah, like half the faculty at Berkeley are communist. And if anything, that understated it, right? It was probably a hundred percent. So, you know, we, we had our version of that in the twenties and thirties.

00:12:37 Speaker_04
We had our version of that in the sixties and seventies with the cultural revolution of that period.

00:12:41 Speaker_04
We had a version of it, by the way, when I was in college in the late eighties, early nineties, you know, with so-called political correctness and multiculturalism, you know, look like I think a lot of people would believe that there's a lot of good that has come along the way.

00:12:52 Speaker_04
And that you could argue that like a lot of what we call social progress has happened. And I don't even want to second guess that, but. it does kind of arrive in these very intense waves.

00:12:58 Speaker_04
And then, you know, like the other part of it is it's an instrument of power, right? It's a little bit like the ring of power.

00:13:03 Speaker_04
Like if you have the ability to destroy somebody by calling them racist or calling them sexist or accusing them of one of the many other thought crimes, are people going to wield the ring of power responsibly at all times?

00:13:12 Speaker_04
And of course the answer is no. Like, what's the one thing we know for sure? You know, power corrupts. The whole point of the Ring of Power in the Tolkien stories was, like, you can't wield it without getting corrupted.

00:13:23 Speaker_04
Like, of course it's going to turn evil. Of course it's going to get used. Because it's just simply too powerful, right? Of course it's going to get used in ways that people will later regret.

00:13:30 Speaker_00
I guess I wonder, when did you start to update your mental model of what was going on?

00:13:36 Speaker_00
And maybe another way of asking that is, there have been, I think, a bunch of signal events over the past 10 years that have served to either shift or fully radicalize people's views. What were those galvanizing moments?

00:13:49 Speaker_00
What were those signal events for you?

00:13:51 Speaker_04
So I would say the sort of thing of like, wow, this is weird. And then it keeps happening was sort of call it 2012 to 2015. We were investors in this company, GitHub, which was one of the companies that went hyper woke super early.

00:14:04 Speaker_04
And that place got completely surreal from very early on. But the really big turning point moments for me were actually what happened on the other side, what happened on the right.

00:14:12 Speaker_04
And specifically, I was completely floored that Trump got nominated in 2015, and I didn't understand it at all. And then, like, I was completely shocked, you know, times 10, that he won the general election in 2016.

00:14:22 Speaker_04
And what happened was, sort of the woke stuff, the Silicon Valley stuff, the social media stuff, the censorship, which we'll talk about,

00:14:29 Speaker_04
Those things, like, 10x'd when Trump got nominated. And then when he won, they 10x'd again. And then when the Steele dossier came out, they 10x'd again. And then when Charlottesville happened, they 10x'd again. And then when impeachment happened, they 10x'd again. And then 2020, COVID, they 10x'd again. And then BLM, they 10x'd. So it was like this incredible ramping up of emotion and drama and change. And that's when, psychologically, I was like, all right, I need to take a step back here, because I don't understand what's happening on the left, because I don't understand these purges

00:14:56 Speaker_04
And then I was like, I don't understand Trump getting elected at all. And I felt very disoriented because my personal story is I grew up in rural Wisconsin, you know, which is now staunch Trump country.

00:15:05 Speaker_04
So kind of my people historically, farmer world was in for Trump early and hard. And I had just like lost touch with the culture and didn't understand what was happening in that part of the world.

00:15:14 Speaker_04
But I spent, I would say, those five years basically confused. I tried deliberately to kind of reset my own psychology. I was just like, look, I don't understand the country. I don't understand what's happening either side.

00:15:22 Speaker_04
I need to really go think about this hard. And I need to read a lot. And I need to go like back in history, right? And I had to basically completely rebuild my worldview. And that took six years.

00:15:31 Speaker_04
And that's while we're basically fighting fires the entire time, trying to help our companies kind of navigate, you know, and trying to get the tech industry to kind of navigate through this.

00:15:38 Speaker_00
What did you read in that six-year period that most helped you understand both sort of like the mode of the right and the mode of the left?

00:15:47 Speaker_04
James Burnham is, I think, you know, super helpful on these topics for people who don't know who he is. He was one of the smartest political thinkers, political scientists, philosophers of the 20th century on American politics.

00:15:57 Speaker_04
And he had this particular life story, which is he was actually a full on communist revolutionary activist. He was a personal friend of Leon Trotsky in the 1920s, 1930s.

00:16:04 Speaker_04
And then he broke from communism in the forties and he went hard to the right and he actually became a co-founder of National Review.

00:16:10 Speaker_04
with William F. Buckley, but he wrote these two books in the 1940s during the heart of kind of the big three-way battle between communism, fascism, and liberalism was kind of raging in the world.

00:16:20 Speaker_04
One's called The Managerial Revolution, and it's basically, you know, these movements have real differences, obviously, those three big movements, but there is something in common, which he called managerialism, which is sort of the establishment of this expert class. The communists called it the vanguard class; you know, in America we call it academics, experts, and fact-checkers. And it's sort of, it's the establishment of kind of this class of expert technocrats

00:16:40 Speaker_04
who are assumed to be able to steer society in healthy and beneficial ways, and then often lead you in very bad directions. And so that turns out to be a helpful framework for thinking about our time.

00:16:49 Speaker_04
And then he wrote this other book called The Machiavellians, where he sort of looks at politics structurally as opposed to ideologically.

00:16:55 Speaker_04
One of the ideas from this school of political philosophy is basically what he calls the iron law of oligarchy, which basically is democracy is never actually a thing. There's no actual system of democracy.

00:17:05 Speaker_04
You always end up with a small minority in charge of a large majority. in basically every society in human history. You always have a small elite minority in charge of a large majority.

00:17:13 Speaker_04
And the reason is because small elites can organize and large majorities cannot. And by the way, this is built into the American system, right? Where it's the classic thing, we're actually not a democracy, we're a republic, right?

00:17:22 Speaker_04
And so the founders understood the iron law of oligarchy and they knew that if we were just like a straight, you know, direct democracy, it would end up collapsing and it would lead to something very bad.

00:17:31 Speaker_04
And so that's why we have this intermediate layer of Congress and Senate representing us. Every experiment in direct democracy in the history of the world has ended in disaster. Basically, everywhere it's tried, direct democracy fails.

00:17:42 Speaker_04
By the way, direct democracy is a huge problem right now in California, where we have this ballot system, referendum system, in which you just get the craziest stuff that comes in through the referendum system with no buffer at all.

00:17:54 Speaker_04
And so anyway, the conclusion, like, for example, from the Machiavellians is, OK, it doesn't matter what you think democracy should be.

00:18:00 Speaker_04
Any form of democracy that you're going to have is going to have an elite class that is going to be running things. And that's just going to be a structural reality.

00:18:07 Speaker_04
And that elite class is either going to be good and beneficial and have the best interests of the population in mind, or it's not.

00:18:12 Speaker_04
But to kind of pretend that they're just kind of arbitrarily voted in and out, that people are in charge, is just a myth.

00:18:16 Speaker_00
So are we living in a democracy in America or an oligarchy?

00:18:19 Speaker_04
An oligarchy. So this is the iron law of oligarchy. We are always and ever living in an oligarchy. Every society in history has been an oligarchy of some kind. By the way, this view has been very common for a very long time in the American left.

00:18:31 Speaker_04
People like Noam Chomsky, who've made their entire political careers talking about the sort of presumptive business oligarchy that must run everything, and then how they work through their allies in the press to basically manufacture consent.

00:18:41 Speaker_04
People like me look at at least what's happening today and kind of say, actually, now the shoe's on the other foot.

00:18:46 Speaker_04
The oligarchy, as expressed by the heads of the major corporations, the heads of the bureaucracies, are no longer business oligarchs for the most part. They're kind of these academic press bureaucratic elites.

00:18:56 Speaker_04
But anyway, Burnham is a very good framework for thinking about that.

00:18:58 Speaker_00
Just to kind of like extend this idea, let's assume Burnham's right. Everything's an oligarchy and that the election on November 5th was sort of the voting out by the American public.

00:19:11 Speaker_00
or at least a giant middle finger to the old elite, the old guard, the old oligarchy, and perhaps the ushering in of a new oligarchy. In other words, is part of the way to understand this election, the triumph or maybe the emergent

00:19:27 Speaker_00
victory, let's say, of a new counter-elite, if they can actually wield power and, you know, recreate themselves.

00:19:34 Speaker_00
And are you, Marc Andreessen, you know, someone who has supported Clinton in 1996, Gore in 2000, John Kerry in 2004, Obama in 2008, Hillary in 2016.

00:19:45 Speaker_00
Are you a turncoat or are you someone who saw the corruption of the old elite and decided to sort of switch into a new counter elite?

00:19:54 Speaker_04
Yeah, so yes, both of those, yes. Depending on your point of view, right?

00:19:59 Speaker_00
Explain it from yourself, if you can, and also what causes someone to turn against their... Because you and Reid Hoffman, Mark Cuban, you're all like pedigreed in much of the same way, but only some of you sort of have gone in the other direction.

00:20:14 Speaker_00
So let's talk about you and then the phenomenon more broadly.

00:20:16 Speaker_04
So basically, the way I would describe it, politically, socially, culturally, I'm a child of the 1990s, California in the 1990s specifically.

00:20:23 Speaker_04
Basically, in the 1990s, for somebody like me, tech founder, kind of coming up, educated in American universities, beneficiary of everything from federal research funding to student loan programs and so forth, kind of got in position to be able to start a tech company that became successful.

00:20:35 Speaker_04
There was basically something that nobody ever wrote down, but everybody understood, which I call the deal. The deal was somebody like me basically could start a company. You could invent a new technology.

00:20:44 Speaker_04
In this case, web browsers and all the other things that Netscape did. Everybody would think that that was great. And by the way, we got glowing press coverage. Everybody loved us. It was great. And then you could go public. You could make a lot of money.

00:20:54 Speaker_04
That was great. You would pay your taxes, and that was fine. That was obviously a key part of the social compact. And then at the end of your career, you would be left with this giant pot of money.

00:21:02 Speaker_04
And then what you would do is donate it to philanthropy that washes away all of your sins, you know, reclassifies you as from a sort of, you know, suspect business mogul to a, you know, virtuous philanthropist.

00:21:11 Speaker_04
And that's the arc and it's all great and wonderful.

00:21:13 Speaker_04
You get honorary degrees at all the universities, you get invited to all the great parties, you get invited to Davos, you get invited to Aspen, you get to come in and, you know, sit with the New York Times.

00:21:22 Speaker_04
editorial board, the dinner parties are spectacular. You know, there's all these incredibly important, like wonderful people. And then, you know, you get married and you and your spouse have the exact same politics and off that way you go.

00:21:31 Speaker_00
And then you sign the giving pledge and everything's awesome.

00:21:33 Speaker_04
You sign the giving pledge and everything is great. And then Warren Buffett talks about how great you are and it's just like, it's all like wonderful.

00:21:38 Speaker_04
And basically what happened, what I experienced was they, the people in charge of all this basically broke the deal in basically every way that you possibly can.

00:21:47 Speaker_04
So basically every single thing I just said for the last decade has been now held to be presumptively evil. Everything from just the whole idea that there are certain people who merit a greater economic outcome than others is itself evil.

00:21:59 Speaker_04
Technology, of course, is held to be presumptively evil. Tech companies are held to be presumptively evil. Tech people are held to be this evil class. Anybody who's rich is evil.

00:22:07 Speaker_04
And then the thing that really shook me was when I realized that philanthropy was being redefined as evil. And so this very specific thing that happened there was my friend Mark Zuckerberg and his wife

00:22:16 Speaker_04
did this thing where they announced and committed that 99% of the money in their ownership in Facebook was going to go to philanthropic causes.

00:22:23 Speaker_04
By the way, non-political philanthropic causes, the big mission of what they call the Chan Zuckerberg Initiative is literally to cure all disease in the next 100 years, right? So imagine $200 billion or whatever going to curing all disease.

00:22:33 Speaker_04
And they just got hammered with criticism and attacks on that. And the line of argument was literally, oh, they're just slimy rich people, and they're only doing it for the tax break.

00:22:42 Speaker_04
which is like a basic mathematical problem, which is you don't give away 99% of your money for a tax break.

00:22:48 Speaker_04
But what I realized is there's this real sentiment now, which is philanthropy is evil because the morally correct thing to do is for the government to do all this. And so what they should be doing is they should be getting taxed at 99%.

00:22:58 Speaker_04
The government should take the money and then the government should allocate the money. And that's a level of statism and kind of bordering on communism that is sort of unfortunately kind of infected the culture right now.

00:23:06 Speaker_04
So basically it's like, okay, every single part of that deal no longer works. And so it's basically the entire thing has been jettisoned off the airlock.

00:23:12 Speaker_04
And so it just, you know, for me, it then raised the question of like, okay, if none of that is true, then what world am I living in? What role do I play?

00:23:17 Speaker_00
How much was sort of going on just beneath the surface over the past few years? I thought about this a lot on July 13th, the day Trump got shot.

00:23:27 Speaker_00
I remember scrolling through X and seeing all of these people who I sort of privately knew from WhatsApp groups and Signal groups. that they were Trump-curious or had moved rightward. But in public, they were performing something very, very different.

00:23:42 Speaker_00
All of a sudden, we're reposting this iconic image of Trump bloodied on his ear with the raised fist, some of them explicitly endorsing him, but it was all sort of like a tacit endorsement or saying, you know, we're on this side.

00:23:55 Speaker_00
Explain to me, Mark, how big of a group was that? When did you start understanding that there were a lot of closeted people? And were you surprised by what you saw

00:24:05 Speaker_00
In the sort of like mass coming out of the closet, I think you would describe it as on the 13th.

00:24:10 Speaker_04
The easy version of the story, I think the easy answer is there was a lot of what was called preference falsification happening.

00:24:17 Speaker_04
And so there were some number of true believers on the blue side and then there were some number of true believers on the red side.

00:24:22 Speaker_04
But the true believers on the red side were lying in public about what they believed because of the threat of social sanction and career obliteration and losing their friends and family and everything else.

00:24:30 Speaker_00
And just for the listeners, that's what preference falsification means.

00:24:32 Speaker_00
Sort of everyone who listens will be familiar with this phenomenon of like there are certain things that you'll say to your spouse alone in bed at night that you would never dare to say in the office or on X.

00:24:43 Speaker_04
Exactly. And in fact, there's actually two parts to preference falsification. There's not saying the things that you believe. And then there's being forced to say things you don't believe. Right.

00:24:52 Speaker_04
And of course, there's this famous essay by Václav Havel, where he talks about the power of the powerless.

00:24:56 Speaker_04
And he talks about the role of the slogan in like the communist, you know, behind the Iron Curtain, he talked about the role of the slogan, you know, the... It was the green grocer putting in his window, workers of the world unite.

00:25:06 Speaker_04
Workers of the world unite. And he does this whole thing where he's like, nobody believes that the workers of the world are uniting. Nobody believes.

00:25:12 Speaker_00
But why is the greengrocer putting it in his window?

00:25:14 Speaker_04
Exactly. Right. And so that's the other side of it is people saying things that they don't believe. And, you know, these are both bad. Arguably, the second one is worse because you're forcing somebody to say something they don't believe, which is a

00:25:23 Speaker_04
at least in the communist system, that's what broke the spirit, right? Which is, okay, if you're being forced to lie at that point, you've like completely lost yourself. You're drenched in so much internal shame that you're never gonna recover.

00:25:32 Speaker_04
Timur Kuran, who sort of is the best author on this topic with his book, Private Truths, Public Lies.

00:25:37 Speaker_04
He went on X the other day and he said, the preference cascade is unwinding so fast now that we're now gonna start to see people who are gonna claim that they voted for Trump, even though they didn't, right? Like it's gonna go the other way.

00:25:47 Speaker_04
I don't know if that'll happen or not, but he's the expert and he thinks it will. I think there's a darker truth, though, which I think has, if anything, more to do with it, which I think there are a lot of people who just don't have strong beliefs.

00:25:57 Speaker_04
By the way, they may think they do, but what they actually have is they're basically just, they're in a social context where everybody around them either believes the same thing or is saying that they believe the same thing or hiding their true beliefs.

00:26:07 Speaker_04
And so it's go along, get along. You just didn't have to put a lot of thought into it. And especially like in the business world, which I know the best, most people are just busy at work trying to get their work done.

00:26:16 Speaker_04
Most people are just trying to get promoted and get a raise and then they go home at night and they're just trying to take care of their families and they're trying to take care of their kids and you know, whatever.

00:26:22 Speaker_04
There's a leak in the roof and they're trying to deal with that. And so even in the elite classes, most people don't have these super strong concrete views.

00:26:28 Speaker_04
And so if the entire momentum of society is heading one direction, it's just the most natural thing in the world to go along with it. And then when this kind of preference unwind happens, when the cascade tips,

00:26:37 Speaker_04
tilts in the other direction, then it may cascade hard in the other direction. And I think that has as much to do with it. Oh, and then back to Trump getting shot.

00:26:44 Speaker_04
So Trump getting shot, you know, by the way, there may be a gender thing here, but I'll just tell you, like, I think it was hard for any man, I won't speak, I don't say this to imply women think differently. I can just only speak for myself as a man.

00:26:56 Speaker_04
I think it was hard for any man to see somebody get shot in the head, bleeding, not knowing how badly he's been injured. The shots are still coming in.

00:27:04 Speaker_04
The Secret Service comes in, shoves you to the ground, is dragging you off the stage, and your response is to break free of the Secret Service and stand up and expose yourself to ongoing gunfire, with literally people being shot in the bleachers right behind you, and screaming, and chaos, and blood, and all of it, and to actually expose yourself to that level of physical danger and put your fist up in the air and say, fight, fight, fight.

00:27:23 Speaker_04
Like, we don't see physical bravery like that. Like, I'm sure they saw it in World War II. I'm sure you see it in wartime. But, like, we don't see that in our lives.

00:27:31 Speaker_04
And so I think it was one of those just such shocking, positively shocking moments that it was this thing where it's just like, all right, if we're not going to stand up for that, like, what are we all doing? And so I think that made a big impact.

00:27:44 Speaker_00
You were one of the people that tweeted that image on that day.

00:27:46 Speaker_04
Yeah.

00:27:47 Speaker_00
Were you nervous to do that?

00:27:49 Speaker_04
You know, look, it's just like, it's the United States of America. We've all been told for 10 years that we're all racist and sexist and horrible and the country's horrible and colonialism and on and on.

00:27:57 Speaker_04
It's just like all horrible and tech is horrible and the whole thing is horrible. And here's this guy bleeding out of his head who's standing up and he's literally, I mean, the iconography is amazing.

00:28:04 Speaker_04
He's literally wearing a red, white and blue suit and tie with the American flag behind him. By the way, we knew at that point that we were gonna come out for him. Ben and I had already decided at that time.

00:28:14 Speaker_04
We had dinner with him eight days before that. And we had a great, it was wonderful, it was an incredible conversation. And then that happened the following Saturday.

00:28:22 Speaker_04
And then we were headed in actually to record our podcast where we were going to endorse him on that Monday.

00:28:26 Speaker_04
And then of course, the other, of course, important thing about that moment is that's the moment when Elon stepped up and said, yep, I'm for him.

00:28:32 Speaker_04
That was the big moment in our industry where the most important iconic figure in tech by far, you know, did that in that moment. And so that was the real shift.

00:28:41 Speaker_00
Peter Thiel got up at the RNC in 2016 and endorsed Trump. that didn't have the same effect of sort of giving people cover or shifting the tide in the way that Elon did. How do you understand that?

00:28:54 Speaker_04
I mean, again, I think like, I was just speaking for myself, like I didn't have anywhere near the depth of knowledge or awareness or had like, I hadn't really thought about any of it.

00:29:03 Speaker_04
And so what Peter did just seemed so completely offbeat from, by the way, he was like, I think in tech he was the only one. I think he was the only public one, yeah. The only public one, right.

00:29:11 Speaker_04
And by the way, Peter is a really good friend of mine and I talked to him a lot during that time period and he and I were on the Facebook board together at that point.

00:29:17 Speaker_04
And this actually led to a big blowup on the Facebook board that became public, when one of the other board members attacked Peter for it in an email that then leaked. So like, you know, there was high drama around this.

00:29:25 Speaker_04
And by the way, Mark Zuckerberg showed at that time actually a lot of bravery himself. There was tremendous pressure on Mark to kick Peter off the board for that.

00:29:31 Speaker_04
And Mark really stood up for him and said, no, we precisely want him to stay on the board because he thinks differently than everybody else. After Trump won, I asked him like, what do you think happened? And Peter made this really interesting observation.

00:29:39 Speaker_04
This is like December of 16 or something. So the country still doesn't know how to think about it all. And he's like, I have a sneaking suspicion that Trump won because people are tired of political correctness.

00:29:48 Speaker_04
And I think Peter was sensing everything that followed. Peter was dialed in on that. And again, not just the political correctness, but I think what Peter meant by that in retrospect was just, again, this sort of deeper thing in the country.

00:29:59 Speaker_04
which is the majority of people in the country are just not willing to live under this kind of soft, basically repressive, negative, hostile, emotional, hysterical repression.

00:30:07 Speaker_04
And maybe the people writ large in the country didn't realize that fully in 2016 or 2020, but they certainly realize it today. And so I think Peter, in retrospect, was sort of predicting what actually just happened in this election, with a lag.

00:30:19 Speaker_00
I want to ask you a few questions sort of about the political dynamics in Silicon Valley, because there's a lot being written about the vibe shift happening.

00:30:27 Speaker_00
And I guess I wonder a little bit if that's overstated, because the average worker at Meta or Google, I imagine, has not changed their mind post-November 5th.

00:30:37 Speaker_00
Like, these are still extraordinarily progressive people that are dominating the most, you know, the biggest behemoths in the tech industry. How does the dynamic break down politically?

00:30:47 Speaker_04
Yeah, so to start with, I would say this has been a misperception, and I believe actually fostered by the press for a long time, a misperception that Silicon Valley is some haven of libertarian ideologues, lunatics.

00:30:57 Speaker_04
Silicon Valley, my entire life, I got here in the 90s, but certainly back to the 80s, and I'm sure probably the 70s and 60s too, is overwhelmingly just straight up Democrat. And you see that if you look at the voting.

00:31:07 Speaker_00
I mean, it's basically like the faculty of an Ivy League university, maybe slightly more balanced, but barely.

00:31:13 Speaker_04
Yeah, you have companies where literally the donations are 99 to 1. And by the way, that is reflective of, right, if you took the top 200 executives at these big companies, that's roughly how it would break down.

00:31:22 Speaker_04
So overwhelmingly, the elites in Silicon Valley and California are still exactly where they were. They haven't changed. So look, the vast majority of the CEOs, executives, founders of the big tech companies are still exactly where they were.

00:31:33 Speaker_04
The vast majority of the employee bases, especially you would say in the creative fields, you go into any of the front end designers, developers, the people who build consumer apps, all that stuff. You know, it's going to be very left-wing.

00:31:44 Speaker_04
And it's the same thing in the Fortune 500. It's the same thing in the universities. It's the same thing in the press. It's the same thing in the big bureaucracies. You know, it is the generalized phenomenon, and so the value is just part of that.

00:31:53 Speaker_04
The elite, counter-elite thing is real. You know, again, Elon being the peak of the counter-elite has had a big impact. David Sacks had a big impact. Joe Lonsdale has had a big impact.

00:32:01 Speaker_04
So I think what you have is you have this elite monoculture that's sort of very comfortably in power. And then you have this counter elite that is, starting with Elon, is kind of coming in on top.

00:32:09 Speaker_04
And not to take huge credit for this myself, but I think a bunch of the people in the counter elite are like clearly more successful, clearly more productive, better results than most people in the elite.

00:32:19 Speaker_04
Burnham's framework on this is he calls it the circulation of elites. And it's basically the idea of you have an elite class and then you have a group of up-and-comers who are like aspirants to the elite class.

00:32:28 Speaker_04
And what the elite class does to retain its own status is it invites the aspirants into itself. So you get pulled into the elite and you're in and you're being praised and everybody loves you. And like, it's just wonderful.

00:32:40 Speaker_04
And, and you're on the side of goodness and justice and, you know, freedom and equality and great. You're doing so many great things for black people. Right. It's just like, it's just amazing. Right.

00:32:49 Speaker_04
that lasts for as long as it lasts until at some point there's a counter-elite, there's a set of people who ideally are even better, are even more accomplished, are even higher status elites who basically come in and basically say, no, like, this is not right.

00:33:02 Speaker_04
This is not how this should work. This has gone corrupt.

00:33:04 Speaker_00
How does that group of counter elite, like I'm sure you've noticed, there's just such sycophantic behavior going on right now towards particular members at the vanguard of that counter elite of people sort of just really brown nosing and like trying to get in good.

00:33:19 Speaker_04
There's been so much brown nosing in the other direction for my entire life.

00:33:23 Speaker_00
As someone who just doesn't like brown nosing, like, how do you avoid the same ditto head behavior of everyone sitting around the table, nodding along to whatever, you know, Tom Friedman wrote about Israel in the Middle East in the New York Times that day?

00:33:39 Speaker_00
How do you not replicate that same mirror image behavior on the other side? Or is that unavoidable, according to you and Burnham?

00:33:45 Speaker_04
You know, I'm getting all these calls now, and I'm not like hanging out with Trump every night, although Elon is, but I'm not.

00:33:50 Speaker_04
But, you know, I'm connected enough where I'm getting all these calls now from people where it's like, you know, they were fully certified members of the woke vanguard, you know, executives at these companies, fully committed to every possible cause you can imagine, who all of a sudden are like, oh.

00:34:02 Speaker_04
You know, I'm starting to rethink things, and oh, by the way, I've got an idea for, you know, Elon, or I've got an idea for DOGE, or I've got an idea for... Right, like, who in their right mind would criticize Elon Musk right now?

00:34:11 Speaker_00
Not just because he's so powerful, but, you know, we know what he does to people who disagree with him. So, like, how do you avoid that? How do you create an anti-fragile counter-elite, is maybe the right way to phrase it?

00:34:22 Speaker_04
So I think part of the answer is you don't. I mean, what do you know what happens when people get around power or they think they can get proximity to power?

00:34:27 Speaker_04
Like there's always a Sun King, there's always a court, there's always a power center, and people are always gonna try to get to it, right? And so that's a permanent state of affairs.

00:34:34 Speaker_04
I think my main reaction to what you're saying is actually though that it's still so strong in the other direction, right? Like at least sitting here today, it's still the case that 90 plus percent of the people in tech are on the other side.

00:34:43 Speaker_04
It's still the case that 90 plus percent of the Fortune 500, 90 plus percent of the universities, 90 plus percent of the experts, 90 plus percent of the bureaucrats. Now, the thing you know is they didn't do that by, like, building rockets, right?

00:34:55 Speaker_04
They didn't do that by building things that, like, matter. They did that by basically being part of the government complex.

00:34:59 Speaker_04
And so, overwhelmingly, the power structure in the country, overwhelmingly, the elites in the country, overwhelming — by the way, it's the same thing. It's like Trump's billionaires.

00:35:08 Speaker_04
Billionaires overwhelmingly went for Harris over Trump, like, 100 to 1. Of course.

00:35:13 Speaker_00
That's the elite that just lost.

00:35:14 Speaker_04
No, I know, I know, I know, but my point is like, I don't think that's changed that much. And look, maybe that just means that they continue to lose and maybe that's great.

00:35:21 Speaker_04
And I'm not saying that the new elite or counter elite should aspire to be like that. It's just like of all the problems that we have today, like I think that that is, I'm not worried about that right now.

00:35:29 Speaker_04
Look, I think the other thing is it is, I think just indisputably true that one side went in super hardcore for censorship over the last 15 years. And I think that actually was both damaging to the country. And I think it was also damaging to itself.

00:35:41 Speaker_04
as a movement, because I think basically, like the whole of the Democratic Party writ large or the left, sort of that side of things decided that all dissent, counter-argument, objection to their points of view was gonna be hate speech, misinformation, and was to be completely dismissed, and then for the people who said those things to be ostracized.

00:35:57 Speaker_04
By the way, and again, I saw this happen. I saw it happen right up front, and I was in the meetings where it was first formulated.

00:36:01 Speaker_00
What do you mean you were in the meeting?

00:36:03 Speaker_04
I was in the meeting at Facebook where we defined the terms hate speech and misinformation for the first time.

00:36:09 Speaker_00
When was that and who defined them? 2013, give or take. Why were they defined and who defined them?

00:36:15 Speaker_04
Well, let me back up all the way. No social media company can function without having some level of censorship.

00:36:21 Speaker_04
And the reason is because you're going to have things in terrorist recruitment, incitement to violence, child pornography, frauds and scams. You cannot have those things. They're illegal. They're actually not covered under the First Amendment.

00:36:31 Speaker_04
There's no protection for them. You cannot have them. Your service will fail. It's horrible. And so you're always going to have some censorship function. And these companies all had functions to do that.

00:36:39 Speaker_04
And then it's like, from there, it's like, okay, hate speech. And then it's like, all right, well, you can't have the N word, right? And it's like, all right, fair enough. You can't have the N word, I got it, right? But then you know what happens.

00:36:47 Speaker_04
Then it's like straight down the slippery slope. And before you know it, you're censoring the claims that COVID was a lab leak. Like, it's just a straight, it's the ring of power. It's a straight slide.

00:36:55 Speaker_04
And so I was in the meetings at a bunch of these companies, including Facebook, where it's like, okay, we need to define hate speech. We need to define misinformation, right?

00:37:03 Speaker_04
And of course, hate speech gets defined as, you know, statements that make people uncomfortable. And of course, the immediate objection is like, well, the concept of hate speech makes me uncomfortable.

00:37:12 Speaker_00
Did you say that in the meeting?

00:37:13 Speaker_04
Of course. Yes, I did. I said like, look, obviously this is going to get blown completely out of proportion. This could now apply to everybody.

00:37:18 Speaker_04
And so we are now going to empower a commissar class, right, of professional activists who are now going to be able to apply this to fight all their political battles and to vanquish their political. Of course they're going to do that.

00:37:27 Speaker_04
And this is the thing. It's just like, you know, in America, the way we experience politics is that what everybody feels about their most important political issue is that it's not a political issue. They feel like it's a moral issue. Right?

00:37:37 Speaker_04
And we heard this in these companies all the time, starting around this time, which is this, you know, you don't want any politics at work. Well, this isn't a political issue. This is a moral issue. Right?

00:37:44 Speaker_04
And the minute you have a moral trump card, and then the ability to censor people based on the morality, of course, it's going to get used for everything under the sun, including politics. So there was that.

00:37:51 Speaker_04
And then the misinformation thing was early on was not that big of a deal. And I could tell you where that came from, but it was not that big of a deal. It was really the Russiagate

00:37:59 Speaker_04
whatever you want to call it, conspiracy hoax thing that since got, I think, completely debunked.

00:38:04 Speaker_04
But it was really this theory that Trump only got elected because he was a Russian asset and that Putin, as Hillary Clinton put it, Donald Trump is the only president because Vladimir Putin hacked Facebook, right?

00:38:12 Speaker_04
And so basically, nobody would ever vote for Donald Trump out of their own free will. It therefore had to be literally a Russian misinformation, active measures campaign. Right. Therefore, you must censor misinformation.

00:38:22 Speaker_04
And again, it's the ring of power. The minute you have that, you censor everything else. But that actually predated Trump. That all started in like 2013.

00:38:28 Speaker_00
Looking back, Mark, do you feel like you could have played it differently? Like, is there is there anything you regret about that? Should you have resigned from the board, for example?

00:38:36 Speaker_04
This is always the problem that I think people have, by the way, in government, as well as they have in companies, which is if you resign, you're gone, right? If you resign, you lose the information flow, right?

00:38:44 Speaker_04
And so if your job is to understand what's happening in the world, which is fundamentally my day job, you just lose all that context.

00:38:49 Speaker_04
And then number two, it's like, all right, to be part of an organization, should the requirement be that you agree with every single thing they do? And of course not, right? So there's basically three modes of how you resolve arguments in companies.

00:38:59 Speaker_04
You can agree and commit. That's the easy one. You can disagree and quit, or you can disagree and commit. What's the line between that and complicity with things that are really wrong? You know, that's a question of conscience.

00:39:09 Speaker_04
You know, I think everybody has to answer that kind of in their own soul. You know, I would say because I was in there the whole time and I saw the development, and by the way, not just Facebook, I saw this happen.

00:39:17 Speaker_04
You know, I was in these conversations at many other companies and I was talking to many of the founders. I knew the Twitter guys really well. I was an angel investor. I knew them when they were the free speech wing of the free speech party.

00:39:27 Speaker_04
And then I knew them when they became, you know, one of the wokest companies in the world. And so I saw that happening.

00:39:32 Speaker_04
By the way, this all has informed, number one, my firm participated in Elon's buyout of Twitter, which now became X. And then also we are the major outside investor in Substack.

00:39:41 Speaker_04
And I would say one of the things I'm proudest of is being able to take the learnings from these other companies that kind of couldn't figure out how to deal with this upfront because they didn't know these issues were coming.

00:39:49 Speaker_04
And I think the Substack guys have done an amazing job building these questions and how to deal with them into the company from the very beginning.

00:39:56 Speaker_04
And Substack has been far more resilient to these kinds of attacks and far more pro-free speech than basically any other de novo company in the last 20 years.

00:40:04 Speaker_04
And I think the tide is turning a bit on this and we'll see what happens to the big companies, but yeah, that's where it ended up.

00:40:10 Speaker_00
I want to ask you something just about Ben Horowitz before we move on, who's your partner, and you've built this unbelievable firm with him. He endorsed Trump with you over the summer.

00:40:20 Speaker_00
Then, a little while later, he talked about how he was going to give money to the Kamala Harris campaign. A lot of people saw that and thought this was a hedge, basically, for your firm, A16Z or Andreessen Horowitz. Was it?

00:40:34 Speaker_04
Yeah, oh yeah, yes, it was a hedge. It was 100% a hedge. So it was a textbook definition of a hedge. So the context for that is the Biden administration was really bad. I would say seething contempt was their attitude towards the American tech industry.

00:40:50 Speaker_04
They went out of their way to damage it as much as they could. By the way, there was actually news in the press that they basically destroyed the career of the CEO of Intel.

00:40:58 Speaker_00
Just to back up for the listener.

00:41:00 Speaker_04
Yeah.

00:41:01 Speaker_00
Let's talk about why the Biden administration had, I think you just said, seething contempt for tech and the way that that played out.

00:41:09 Speaker_00
Because for most civilians like me, the last group that maybe I would think of as being victim to the excesses of maybe the totalitarianism of the left or the excesses of the Biden White House would be the kind of companies that you invest in.

00:41:26 Speaker_00
So, like, tell us what this actually practically looks like for a person that's coming to this pretty cold.

00:41:32 Speaker_04
Basically, Silicon Valley and Washington have had a partnership going back literally 100 years.

00:41:36 Speaker_04
Silicon Valley, Northern California, was the home of the invention of everything from guided missiles and radar, you know, pre-World War II and then leading into the computer age and the microchip and everything that followed.

00:41:47 Speaker_04
And then, obviously, the Internet, you know, was this payoff of this 40-year program by the government to fund these technologies, you know, that I was a beneficiary of.

00:41:54 Speaker_04
And then in the Clinton-Gore years, there was like tremendous partnership between DC and the Valley. And then, by the way, in the first Obama term, there was like tremendous back and forth.

00:42:03 Speaker_04
You know, for example, a lot of Silicon Valley people remember the famous Obamacare website when it broke, like a lot of the best and brightest in Silicon Valley just put their hands up and said, we're going to go to Washington and fix it.

00:42:11 Speaker_04
And that created something called the U.S. Digital Service, where they now, for 15 years, have had really bright valley people who go into the White House and work on all these tech topics. You know, the military runs on American technology, right?

00:42:20 Speaker_04
So, like, this dynamic was, like, super healthy and productive under both Democratic and Republican administrations, basically, for my entire career, up until this one, up until the Biden administration.

00:42:29 Speaker_00
And, by the way, this took us totally by surprise, but... Well, you just don't think about, like, Joe Biden and his administration as being particularly passionate about this issue. So, like, what was driving it? Who was driving it?

00:42:39 Speaker_04
You know, look, and I would say, like, when we endorsed Trump, we only did so on the basis of, like, tech policy, and we're specifically weighing in on, like, tech business economics, you know, when we think about these things.

00:42:47 Speaker_04
But if you ask anybody who has a problem with what the Biden administration does, you have this incredible dichotomy between what seems like a reasonable, moderate, centrist, thoughtful president who's been a senator forever and a pillar of the old Democratic establishment, and then you have this, like, incredibly radicalized

00:43:02 Speaker_04
set of policies with this young staff that just is like out for blood on all these different fronts. And they just adopted these very, very radical positions aimed squarely at damaging us as much as they possibly could.

00:43:14 Speaker_04
And that played out specifically for us in three areas that caused us to endorse Trump. One was crypto, where they just declared war and tried to kill the entire industry and drive it offshore.

00:43:24 Speaker_04
Number two was AI, where I became very scared earlier this year that they were going to do the same thing to AI that they did to crypto.

00:43:30 Speaker_04
And then third is what seems like an esoteric topic, but turns out to be, I think, very important, which is this concept of unrealized capital gains tax.

00:43:37 Speaker_04
So taxing private companies, which by definition will also ultimately mean homeowners and everybody else who owns a private asset, small business people, and basically destroying the ability to have small businesses, to have home ownership, and to have tech startups

00:43:50 Speaker_04
through this sort of structural change to taxes called unrealized capital gains. The crypto war, we were on the receiving end for four years. It was incredibly brutal, incredibly destructive. AI, we had meetings in D.C.

00:44:01 Speaker_04
in May where we talked to them about this, and the meetings were absolutely horrifying, and we came out basically deciding we had to endorse Trump.

00:44:07 Speaker_00
Add a little color to "absolutely horrifying." What did you hear in those meetings?

00:44:12 Speaker_04
They said, look, AI is a technology, basically, that the government is going to completely control. This is not going to be a startup thing. They actually said flat out to us, don't do AI startups. Like, don't fund AI startups.

00:44:20 Speaker_04
It's not something that we're going to allow to happen. They're not going to be allowed to exist. There's no point. They basically said AI is going to be a game of two or three big companies working closely with the government.

00:44:29 Speaker_04
And we're going to basically wrap them in a, you know, I'm paraphrasing, but we're going to basically wrap them in a government cocoon. We're going to protect them from competition. We're going to control them. We're going to dictate what they do.

00:44:39 Speaker_04
And then I said, I don't understand how you're going to lock this down so much because the math for AI is out there and it's being taught everywhere.

00:44:45 Speaker_04
And they literally said, well, during the Cold War, we classified entire areas of physics and took them out of the research community. And entire branches of physics basically went dark and didn't proceed.

00:44:56 Speaker_04
And that if we decide we need to, we're going to do the same thing to the math underneath AI. And I said, I've just learned two very important things.

00:45:04 Speaker_04
Because I wasn't aware of the former and I wasn't aware that you were, you know, even conceiving of doing it to the latter.

00:45:08 Speaker_04
And so they basically just said, yeah, we're going to, look, we're going to take total control of the entire thing and just don't start startups.

00:45:12 Speaker_00
And Marc, what was the steel man, for the listener? Like, what was their argument?

00:45:16 Speaker_04
So this gets into this whole, like, all these debates around, like, AI safety, AI policy. So there's sort of several dimensions on it, and I'll do my best to steel man it.

00:45:22 Speaker_04
So one is just, like, to the extent that this stuff is relevant to the military, which it is, if you draw an analogy between AI and autonomous weapons being, like, the new thing that's going to determine who wins and loses wars,

00:45:33 Speaker_04
then you draw an analogy to the, in the Cold War, that was nuclear power and that was the atomic bomb. And the steel man would be the federal government didn't let startups go out and build atomic bombs.

00:45:41 Speaker_04
You had, you know, the Manhattan Project and everything was classified. And, you know, at least according to them, they classified down to the level of actual mathematics and, you know, they tightly controlled everything.

00:45:50 Speaker_04
And that, and look, you know, that determined a lot of the, you know, the shape of the world, right? That's part one. And then look, I think part two is there's the social control aspect to it.

00:45:58 Speaker_04
which is where the censorship stuff comes right back, which is the exact same dynamic we've had with social media censorship and how it's basically been weaponized and how government became entwined with social media censorship, which was one of the real scandals of the last decade and a real problem, like a real constitutional problem.

00:46:12 Speaker_04
That is happening at like hyperspeed and AI. And these are the same people who have been using social media censorship against their political enemies.

00:46:19 Speaker_04
These are the same people who have been doing debanking against their political enemies and they want to use AI the same way.

00:46:24 Speaker_04
And then look, I think the third is, I think this generation of Democrats, the ones in the White House under Biden, they became very anti-capitalist. They wanted to go back to much more of a centralized, controlled, planned economy.

00:46:34 Speaker_04
And you saw that in many aspects of their policy. But I think, quite frankly, they think that the idea that the private sector plays an important role is not high up on their priority list.

00:46:41 Speaker_04
And they think generally companies are bad and capitalism is bad. and entrepreneurs are bad, and they've said that a thousand different ways, and they demonize entrepreneurs as much as they can.

00:46:50 Speaker_04
And then the tax policy, up to including proposing a tax policy that would just destroy the process of private company creation and destroy venture capital. I mean, I don't think there's been an administration this radical on economic and tech policy.

00:46:59 Speaker_04
I don't think in like ever, like a hundred years. Communists a hundred years ago loved technology. They loved industrialization.

00:47:06 Speaker_04
Like communists a hundred years ago wanted industrialization because they knew that industrialization and technological advance made everybody's lives better. They just wanted to seize it all once it was built.

00:47:14 Speaker_04
These people didn't even want it to get built. And so something went horribly, horribly wrong in this administration and in this movement.

00:47:21 Speaker_04
And by the way, let me just add one more thing, which is, look, this is not what you would consider to be mainstream Democratic policy thinking up until, you know, whatever, 2020. You know, the Clinton people think this is insane.

00:47:31 Speaker_04
The Obama first term people think this is insane. The moderate Democrats today think this is insane. And by the way, it's not me saying this. You've got people like Ritchie Torres now saying this in public, like this is wrong.

00:47:39 Speaker_04
Like this is clearly like off the deep end. And, you know, the Democratic Party right now is going through a soul-searching process to try to figure out what it's going to do in the wake of this loss.

00:47:47 Speaker_04
And I would say I am cautiously optimistic that the smart, moderate Democrats are going to figure out that these are unnecessary fights. Like, there's just no reason to have these fights. There's no reason to have this approach.

00:47:58 Speaker_04
It doesn't have anything to do with the historical basis of the party. It doesn't have anything to do with what people think that they're voting for. It doesn't have anything to do with the ability to, you know, take care of poor people.

00:48:05 Speaker_04
It doesn't have anything to do with the ability to have progressive social policy. It's like just this extreme level of anti-business, anti-tech animus.

00:48:11 Speaker_04
And they should just simply let it go and reestablish historically close ties and move on with things and start winning elections. And I hope they reach the right conclusions.

00:48:21 Speaker_00
One of the things that the Biden administration did is that they aggressively went after Google, Amazon and Meta over these antitrust laws. And I think across the political spectrum, there's sort of a growing resentment of what you call big tech.

00:48:35 Speaker_00
I think what other people just call tech's power, their size, arguably their profitability. J.D. Vance, the incoming vice president, supports a big tech antitrust crackdown. Where do you agree with Vance, and where do you disagree?

00:48:48 Speaker_00
In other words, where should we be reining in some of these big tech companies in protecting consumers, and where has this been an overreach in your view?

00:48:58 Speaker_04
Yeah, so start by saying we differentiate between what we call big tech and what we call little tech.

00:49:02 Speaker_04
Because big tech are the companies that have made it, and the companies that have some level of market power such that at least people are accusing them of being monopolies or cartels or oligopolies.

00:49:13 Speaker_04
These kind of big words that mean kind of very large amounts of market power. Those are household names. By definition, if you're big tech, you're a household name. We then define what we call little tech, and little tech is startups. Right.

00:49:21 Speaker_04
And so it's new companies that have this aspiration to become big companies. There is this sort of funny cycle of life aspect to it, which is all the little companies start out wanting to be big companies, right?

00:49:31 Speaker_04
You have your existing big tech companies, and then you have these little tech startups. And then when they, most of them fail, but when they succeed, they become big tech. And then basically the cycle repeats.

00:49:39 Speaker_04
And then the role of the venture capital firm is to fund each new generation of little tech company.

00:49:44 Speaker_04
And so what we end up doing, like most of our day job, is to fund companies that are trying to grow up, to take out the existing big tech companies and replace them.

00:49:51 Speaker_04
And so the question is the question you asked, which is what's the right policy to deal with the consequences of having, at the end of the day, big companies with lots of concentrated market power.

00:50:00 Speaker_04
I think what I would say is in the last decade, I would say both sides of the political spectrum have really decided they really hate big tech, but for, I would say, diametrically opposed reasons. So the left hates big tech for several reasons.

00:50:12 Speaker_04
One is they just hate, to the extent that they hate capitalism, to the extent that they hate companies, to the extent that they hate outsized economic success,

00:50:20 Speaker_04
to start with, and then to the extent that they blame tech for getting Trump elected specifically, and they blame tech for kind of enabling the rise of, you know, let's call it populist right-wing politics.

00:50:29 Speaker_04
Therefore, like, intense pressure, you know, on these big tech companies from the left for all the censorship and debanking. The right hates big tech because of the censorship and the debanking.

00:50:38 Speaker_04
The right has a very particular bone to pick, and the bone to pick is you people at the big tech companies have been censoring and deplatforming Republicans now for over a decade.

00:50:46 Speaker_04
And by the way, you know, some of that is clearly true, and then there's a lot of gray area, and then, you know, there may be some claims that aren't true, but the level of mistrust is so high, and at least the Republicans believe that the level of deception and lying is so high that, you know, there's been like a complete breakdown of relationship.

00:51:00 Speaker_04
The populist right has a little bit of just a general edge also against big companies in general. There's almost like a right-wing neo-progressivism on that, and that's where you kind of hear

00:51:10 Speaker_04
And I think, frankly, that has something to do with electoral politics, which is the union vote has really started to shift.

00:51:15 Speaker_04
And so I think that there are some people on the right who think that if they come down harder on big companies generally, they'll be able to get more of the union vote, which, by the way, might be true. So that's like the new tinge.

00:51:24 Speaker_04
But I think most of it is anger at big tech and anger at the censorship and debanking. Look, I don't even really want to weigh in.

00:51:31 Speaker_04
I mean, on the specific politics of antitrust, part of that is because I'm involved in a bunch of these companies, and I don't want to say things that are going to show up in court.

00:51:40 Speaker_04
But also, look, for me, this is a legitimate, time-honored tension in the American system that goes all the way back to Teddy Roosevelt. And I think many smart people have worked on this for a very long time.

00:51:50 Speaker_04
There are very well-thought-through theories on many sides of this. There are many legal precedents on this. Clearly there's a level of big company behavior that goes too far.

00:51:59 Speaker_04
Clearly also government involvement in business often has unanticipated side effects. These are very complex and difficult questions. I think these are questions that need to be litigated both literally and philosophically.

00:52:10 Speaker_04
Again, this is a case where the last administration was so aggressive on this and so out for blood that I think in many ways they suppress the actual question.

00:52:18 Speaker_04
I would expect in the next decade the actual question will probably be confronted more directly.

00:52:22 Speaker_00
Let me ask you then another question about the relationship between government and tech. I think you can very plausibly make the argument that government walked so tech could run.

00:52:31 Speaker_00
In other words, they invested in the original internet, which made your career. They funded GPS. They gave loans to Tesla to keep Tesla afloat.

00:52:39 Speaker_00
They fund the California public university system, which arguably gives you employees and also founders to invest in. Essentially, they kind of like built

00:52:48 Speaker_00
or at least tilled the soil to create a very, very rich environment through which all of these companies could grow. And now a lot of people in that environment are now turning around and saying, government out of tech.

00:53:03 Speaker_00
How do you respond to that line of criticism?

00:53:05 Speaker_04
Yeah, so this goes to, I would say this is like a broader version of what I was saying about the deal.

00:53:09 Speaker_04
The deal for many decades was, precisely as you said, the government invested deeply in R&D, the government invested deeply into infrastructure, the government invested deeply, by the way, procurement, you know, the DOD was the early customer for a lot of these things that made them possible.

00:53:22 Speaker_04
The government invested deeply into education, like all that is 100% true. By the way, for a very long time, all of that had bipartisan support.

00:53:30 Speaker_04
Republicans thought that that was great because that, to your point, created the seedbed for companies coming out the other side and the overall success of the American experiment, the entire complex of government and private action that makes America succeed.

00:53:41 Speaker_04
People on the left thought that it was great because, of course, education is fantastic. It's new to have a left-wing movement that's anti-technology. That's actually a new thing.

00:53:49 Speaker_04
Left-wing movements historically were pro-technology, not anti-technology, because they believed in progress, right? Technology quite literally represents progress. And technology represents, in particular, economic progress, right?

00:53:58 Speaker_04
And if you're trying to actually lift everybody up economically, you need technology, because you need economic growth, you need new devices, you need washing machines, you need all these things. you know, you need electric power, right?

00:54:06 Speaker_04
I mean, a huge focus of the Roosevelt administration was the deployment of electric power everywhere.

00:54:10 Speaker_04
Just like, by the way, at least in theory, a huge focus of the Obama administration was the deployment of broadband internet everywhere, even though, you know, maybe not so much in practice, but at least in theory, right?

00:54:18 Speaker_04
They had, you know, they were spending a lot of money on that. And so historically, this has been the, you might call this the deal with a capital D, like just like, of course, this is how it works.

00:54:26 Speaker_04
The success of American industry, the success of American business, the success of American tech, has been viewed as good for America by both sides. It's a new phenomenon to have this very sharp anti-capitalist thing.

00:54:38 Speaker_04
And I think that's the reaction, at least for people like me. If we went back to the Clinton-Gore attitude on this, that would be 100% fine. And they were, at the time, plenty progressive. They had progressive tax policy.

00:54:48 Speaker_04
They had progressive social policies. Same thing for the first Obama term. And so I think we actually got kind of off the rails as opposed to there being some deeper underlying issue.

00:54:56 Speaker_00
One of the things that flipped you out over the past few years was this phenomenon that you spoke about recently on Joe Rogan that you've referenced a few times already in this conversation, and that's the phenomenon of debanking.

00:55:08 Speaker_00
You said on Rogan that roughly 30 founders of crypto companies and other companies were sort of quietly debanked during the Biden administration.

00:55:17 Speaker_00
And debanking, which is essentially having your bank accounts, your ability to transact closed or blocked, happens on both sides of the political spectrum.

00:55:26 Speaker_00
We have this really definitive piece by Rupa Subramanya about how it has happened both to Muslim business owners, who banks and companies like Stripe were concerned could be tied to terror, and also to people like Melania Trump.

00:55:40 Speaker_00
And I guess I wonder, hearing you talk about that on Rogan and then subsequently what's happened is that a lot of these founders have come out and said, hey, I am Spartacus, it happened to me. Why did these people not say anything at the time?

00:55:55 Speaker_04
So I would say some people did speak up. And actually, we talked about it. Actually, I'll give you an example. Ben and I actually talked about it specifically in the podcast where we explained why we were endorsing Trump.

00:56:03 Speaker_04
And it basically came and went without people noticing. And maybe that's just because we were not interesting enough, but we actually were talking about it. It just didn't take. I think it's just one of those things.

00:56:12 Speaker_04
They seem like one-offs when they're happening. And then it seems like if you say there's something systematic happening, it sounds like a conspiracy theory. That's part of it. And then look, the other thing is retaliation.

00:56:21 Speaker_04
I think especially a lot of small business people who have been subjected to this or small tech founders, I think a lot of them were hoping that it would just be a problem with one bank.

00:56:29 Speaker_04
And if they talked about it publicly, it might become a problem much more broadly. Also, another part of it is family members. Melania Trump in her new book actually says that she and her son, Barron, were both debanked.

00:56:38 Speaker_04
And of the many things that you could say about Trump, we know for a fact that Melania and Barron had nothing to do with, you know, anything that people are mad about.

00:56:44 Speaker_04
And so, you know, the idea that going after somebody's wife and son is just like, you know, I mean, there you're in like Stalin level territory. And so I think people were scared about that. By the way, it's not just your bank account.

00:56:54 Speaker_04
You know, there's also been cases where you can't get insurance, can't get a car loan. I mean, there's lots of cases you get kicked off the Internet.

00:56:59 Speaker_00
Do you think that there should be, you know, there's no right to banking, obviously, in the Constitution. Do you think this is something that the next administration should pass laws to protect?

00:57:07 Speaker_04
So this is what I mean by soft authoritarianism. It's not the knock on the door at four in the morning, you get dragged off to the basement in some horrible building and beaten. That's not, for the most part, what happens.

00:57:16 Speaker_04
What happens is, oops, sorry about your bank account. Oops, sorry about your home loan. Oops, sorry about your insurance. Oops, sorry about your trucker license.

00:57:23 Speaker_04
When the state turns on you, they have all of these ways where somebody can press a button in a well-lit office somewhere and ruin your life.

00:57:29 Speaker_04
The thing that you mentioned, though, about there's no right in the Constitution of banking the way there is to free speech, that actually is not completely true.

00:57:34 Speaker_04
And I'm not a lawyer on this, and so I won't get the details right, but the Constitution and the Bill of Rights are quite clear that the rights that you have do not all need to be enumerated to be rights, and that the government cannot arbitrarily suppress your rights even if they're not enumerated in the Constitution.

00:57:47 Speaker_04
So, like, for example, the government cannot act on you without due process to take away, for example, your bank account. They're not allowed to.

00:57:53 Speaker_04
There's no right to the bank account in the Constitution, but it does say the government cannot act on you in that way without due process.

00:57:59 Speaker_04
And so I actually think that there actually is a pretty good case that a lot of this behavior is illegal, quite possibly criminally illegal. There are federal criminal statutes around denial of rights and conspiracies to deny rights.

00:58:09 Speaker_04
And at least I think there's a reasonable take that if there's been a conspiracy to deny somebody bank accounts without due process, and specifically that the government has been involved in that, it very well might be criminal felony behavior.

00:58:19 Speaker_04
In terms of what the new administration will do, a lot of people in the first Trump administration have told me that they have been through this themselves.

00:58:26 Speaker_04
So that in the wake of the first Trump administration, they were variously debanked or unable to get various kinds of, you know, insurance or home loans or other things.

00:58:33 Speaker_04
And so I think that there's a bunch of people who were in the 45 administration who I think are, you know, take that quite seriously. And that has happened to a lot of their friends and allies over the last decade. I don't want to speak for them.

00:58:41 Speaker_04
I don't know what they're going to do. I know that they're keenly aware of this. And they certainly now have the power, I would say, number one, to discover what's actually been happening.

00:58:48 Speaker_04
Because there's a big discovery component of this, because what happens in the shadows in Washington is always not easy to see from the outside, but they can now go find out.

00:58:55 Speaker_04
And then number two, if they decide there are cases, they certainly are in a position to bring them.

00:58:59 Speaker_00
Okay. Well, speaking of sort of the next administration and the government, some have reported that you are considering, in the running, I don't know what the right word is, to join DOGE. Is this true at all?

00:59:10 Speaker_04
I'm an unpaid volunteer.

00:59:12 Speaker_00
You're an unpaid intern for the new Department of Government Efficiency.

00:59:16 Speaker_04
Yes. What do you think it's going to do? There's basically two big parts to it.

00:59:19 Speaker_04
One is they're going to do a top-to-bottom review of government spending, and they're going to go cut as much cost as they possibly can, and they have a whole theory and strategy on that.

00:59:26 Speaker_04
And then in conjunction with that and related to that, they're going to do the same thing for regulations. So they're going to basically do a top-to-bottom review of what we call the regulatory state or the administrative state.

00:59:37 Speaker_04
And the connective tissue there that they now talk about in public is, I think, actually quite important, which is, like, a lot of the reaction to DOGE from, like, institutional Washington is like, well, that's impossible. You can't do that.

00:59:46 Speaker_04
There are all these laws and statutes and regulations. There's this thing. There's this thing. Do you know the term impoundment? Yeah. So impoundment normally means, as a civilian, it normally means, like, your car gets impounded.

00:59:56 Speaker_04
There's a different use of the word in the context of government spending. Impoundment is the principle in law today that the president has to go to Congress to get money for a program. The president is not allowed to either overspend or underspend.

01:00:11 Speaker_04
So the president is obligated to spend the money whether he thinks it's a good idea or not.

01:00:15 Speaker_00
That's insane.

01:00:16 Speaker_04
On every government program. And so the president literally, like legally quote unquote today, can't actually save money. He can't say, I'm just not going to spend on this. He has to spend the money.

01:00:25 Speaker_04
So this is the kind of constitutional question, like we've been, we've been living under a regime in which there's a lot of these things that have been taken for granted that are probably not constitutional.

01:00:32 Speaker_04
So like that, that's one of the things that I'm sure is going to get looked at.

01:00:35 Speaker_00
Just because you're brilliant in one area doesn't mean you're brilliant in another.

01:00:38 Speaker_00
You know, like, there's certain people I'm thinking of right now who are just absolutely brilliant, obviously, in tech and entrepreneurship and have, like, the most asinine ideas about foreign policy I've ever heard in my life.

01:00:48 Speaker_00
What makes you confident that this is the right role for these two men and their various unpaid interns?

01:00:56 Speaker_04
Well, I mean, the first question is, how good do you think the experts are?

01:00:59 Speaker_00
Very bad.

01:01:00 Speaker_04
Yes.

01:01:03 Speaker_00
But just to push back, you can agree that the experts are bad, but believe that you still need experts. In other words, I think the current elite is bad, but I'm still going to be skeptical of the counter elite. Do you disagree with that?

01:01:17 Speaker_04
Obviously, the point in general terms is correct. And obviously, this will always be a question of concern. And look, you could even say it has nothing to do with one kind of expert or another.

01:01:24 Speaker_04
It's just these are complex systems, right, with tremendous consequences to the decisions and how many people from any background are qualified to make these calls.

01:01:31 Speaker_00
Like, you can agree Fauci abused his power and still believe that we need public health officials that deserve our trust. Oh, yeah, of course. Oh, 100%. Then we don't want to live in a country where there is no coherent public health authority.

01:01:47 Speaker_04
Yeah, 100%. But look, what built the government we have today is in the 1930s, it was Roosevelt, like, dramatically transformed the government.

01:01:52 Speaker_04
And if you recall what Roosevelt did, which was widely lauded at the time and since, is he put out the call for basically all of the smart young people in the country to put their hands up and volunteer. It was the Y Combinator of its time, right?

01:02:03 Speaker_04
The federal government was the Y Combinator of 1933.

01:02:05 Speaker_00
The New Deal, baby, yes.

01:02:06 Speaker_04
The New Deal, right?

01:02:07 Speaker_04
And literally what they did is Roosevelt basically delegated to Harry Hopkins, like, you got to build this new thing and build all these programs and agencies and do it quickly because of the Great Depression and then World War II.

01:02:15 Speaker_04
And we don't have the people to do it today because the government is just tiny as compared to what it needs to be. And so you put out the call and the call is every smart 25, 30-year-old

01:02:24 Speaker_04
who's good at anything, you bring him to Washington, and you create an agency, and you put him in charge, and you have him go do it. And it's like, you know, Harry, you go electrify the Tennessee Valley.

01:02:31 Speaker_04
You know, and Mary, you go, you know, create the art grants program, right? And, you know, Ken, you go build naval warships.

01:02:37 Speaker_04
And these were all people who would be the exact same kinds of people that we have who come into Y Combinator and come into our firm and pitch and start companies. They're smart, young athletes.

01:02:46 Speaker_04
You know, they haven't spent 30 years studying government. They haven't been, you know, government bureaucrats forever. They don't come from a long history of government service, but they're very smart. They're very well educated.

01:02:54 Speaker_04
They're very good systems thinkers, right? What you really want is somebody who can wrap their head around a complex system. They're really good at understanding incentives. They're really good at understanding how to build things.

01:03:02 Speaker_04
They're really understanding how to demand high levels of performance. One of the reasons why FDR is such a legend is because the FDR era federal government was actually high performance. It actually worked, right?

01:03:12 Speaker_04
Like it did electrify the Tennessee Valley. The Biden administration did not build electric charging stations. And then it goes back to your question. It's like, okay, who do you assume is going to do a better job building electric charging stations?

01:03:24 Speaker_04
you know, the mayor of South Bend, Indiana, or the guy who built the electric car, right? So it's definitely a change from how DC has been operating.

01:03:30 Speaker_04
It's definitely a change from the kinds of people who have been recruited to run these things for the last 40 years. It's hard for me to believe that they're going to do a worse job.

01:03:37 Speaker_04
It's quite straightforward for me to believe that they're going to do a better job.

01:03:39 Speaker_00
But I think there's an argument to be made that you want Elon Musk catching rocket ships with chopsticks. You want him building electric cars. Do we want him, you know, meeting with

01:03:51 Speaker_00
foreign leaders of adversary nations and sort of, like, freelancing on that front. And it seems to be kind of a boundaryless situation right now.

01:03:58 Speaker_04
Yeah. Look, the counter argument is number one, Elon decided that he had to do this because if not, he wasn't going to be able to launch rockets.

01:04:03 Speaker_00
Decided he needed to endorse Trump and get fully behind this, this movement. Okay.

01:04:08 Speaker_04
Yeah, yeah, yeah. Like literally what was happening was it was the FAA was trying to prevent him from launching rockets and it was taking longer to get the paperwork approved to launch a rocket than it was taking to build the rocket.

01:04:16 Speaker_04
And so that's just like, clearly that should not be the case.

01:04:18 Speaker_04
And he's talked at length about all the crazy stories about the, you know, having to kidnap the seal and, you know, strap it to the board and see if, you know, the rocket launch is going to interfere with its mating impulse.

01:04:26 Speaker_04
Just this like lunacy just spiraled out of control, regulatory thing that's been happening. And so he, he concluded that he had to do it. And then look on the foreign policy stuff.

01:04:33 Speaker_04
Here's what I would just say is like Elon already, before any of the political stuff, Elon was already an integral part of both the national defense system and then also our allied defense system.

01:04:41 Speaker_04
You know, in Ukraine, the Ukrainian military runs on Starlink because the Russians have the cyber warfare capabilities to take out all the other communication systems.

01:04:49 Speaker_04
And so Starlink is the only communication system that the Ukrainian military can run on. It's the only one. And by the way, SpaceX has been in a pitched battle with the Russian cyber

01:04:57 Speaker_04
offensive operators to keep trying to take Starlink down precisely because it's what the Ukrainian military is running on. And so, yeah, like, he's already in the middle of this. He's been a vendor to the U.S.

01:05:06 Speaker_04
national security state for, you know, for a very long time. So he was already in the middle of it. He's just in the middle of it in a more direct way now.

01:05:12 Speaker_00
I want to go back briefly to the dinner that you mentioned with Donald Trump in the days before he got shot and before you endorsed him. You said it was a wonderful dinner.

01:05:20 Speaker_00
Tell us about what you heard there that made you feel comforted and maybe even excited and enthusiastic. And also, have you been down to Mar-a-Lago since the election?

01:05:30 Speaker_04
So on the second question, yes.

01:05:32 Speaker_00
How many times?

01:05:34 Speaker_04
A fair amount. Maybe half my time down there since the election. Wow. In and around there. Yeah, I mean, look, I'm not claiming to be in the middle of all the decision making, but I've been trying to help in as many ways as I can.

01:05:46 Speaker_04
Trump brings out a lot of feelings in a lot of people. People have very strong views. And then there are many political topics that we're very deliberately not weighing in on. And so I'm not Mr. Foreign Policy or Mr. Abortion Policy or guns.

01:05:56 Speaker_04
I'm not an expert on those things. When I talk about these things, it's around, as I said, tech policy, business, economics, and then the health of the country, the success of the country.

01:06:06 Speaker_04
I mean, so I guess the color that I give you is, I mean, it's everybody says this who meets with him, but like, he's an incredible host. So we met with him at Bedminster Golf Club in New Jersey, which is like, you know, absolutely beautiful.

01:06:16 Speaker_04
You know, we had a great time. We stayed overnight. You know, he loves being the host. He loves being surrounded by his friends and family and, you know, grandkids and, you know, the members of his various clubs.

01:06:24 Speaker_04
It was also really interesting to kind of watch him at work, which is he treats everybody the same and he talks to everybody.

01:06:29 Speaker_04
This is, I think, one of his real unappreciated strengths that people didn't get for a long time, which is he will happily talk to distinguished visitors about, like, you know, who the vice president should be, and then he'll ask the caddy, right?

01:06:39 Speaker_04
And it's been painted in a negative way, but there's a real strength to it, which is, like, he really talks to regular people, like, a lot. He's in that mode kind of all the time, talking to everybody.

01:06:47 Speaker_04
His thing with us basically was like, look, I just want America to win. You guys are in tech. I don't know much about tech, but I don't need to because you guys know a lot about it. You guys should go build tech companies.

01:06:55 Speaker_04
The American tech companies should win. American tech companies should be the winning companies. We should beat China. We should export. We should make the products the world wants. Our economy should be growing a lot faster.

01:07:05 Speaker_04
We should be creating a lot more jobs. Everybody in America who wants a good job should have one, and that will be the result of American companies succeeding. And I want America to win.

01:07:14 Speaker_04
And I want us to sell our products not just in America, but all over the world. He said on trade, he said, I'm painted as Mr. Anti-trade, but he said, look, I want a level playing field. I want all markets to be open to American companies.

01:07:24 Speaker_04
A lot of these countries claim to be pro-free trade, but they actually block their markets from American products. There are not many American cars on the road in Japan. Nobody's using Facebook in China. Right.

01:07:35 Speaker_04
And so he's like, OK, when these countries aren't fair, I'm going to I'm going to basically make them be fair. And so I'm going to go take that on directly. But the goal with it is for America to win.

01:07:44 Speaker_04
And so, you know, and most of the discussion is just around that.

01:07:46 Speaker_00
Half your time at Mar-a-Lago, what are you weighing in on? What are the kind of meetings that you're taking, having, helping on?

01:07:53 Speaker_04
I've been involved in some of the interviewing process for some of the officials coming in. The caliber of a lot of the people that I've met has been very high.

01:08:00 Speaker_04
A lot of the appointments that have come down in the last two weeks of sort of the next level down staff I think have been very impressive people. And I think that flow of talent seems very strong.

01:08:10 Speaker_04
I would say there was this fear: if qualified people were leery about working for Trump 45, are they going to be even more leery about working for Trump 47?

01:08:16 Speaker_04
And I think the opposite has happened, which is I think the flow of qualified people from outside the system in now is actually much stronger. I mean, they're just moving incredibly fast.

01:08:25 Speaker_04
You know, it's all in preparation for actually taking office on January 20th. So we still have a ways to go, but they're certainly going to hit the ground running on inauguration day.

01:08:31 Speaker_00
Feels like a honeymoon stage right now. How do you anticipate Donald Trump or the Trump administration breaking your heart?

01:08:41 Speaker_04
So I would say the following, which is, I hope the Biden administration was just uniquely bad. Like, I genuinely hope that's the case. I genuinely hope the next Democratic administration is going to be much more sensible than that.

01:08:51 Speaker_04
That would please me to no end. By the way, at my firm, we actually support a lot of Democrats as well as Republicans. We plan to continue doing that.

01:08:57 Speaker_04
So I hope we're going to be dealing with a revitalized and sensible, you know, centrist, moderate, pro-American-dynamism Democratic Party. You know, look, I don't know. I have no idea. I learned from 2015 and 2016 to not try to predict.

01:09:11 Speaker_04
And so I'm, you know, very focused on trying to help as much as I can, trying to help get smart people in. I'm very focused on, you know, trying to help, you know, on any of the technical topics.

01:09:18 Speaker_04
But, you know, it's going to be a, I mean, the American government is complicated and things are going to happen. And I no longer feel that I can predict, but I think it's, we needed it. Like it had to change.

01:09:27 Speaker_04
What we had in the last four years could not continue.

01:09:37 Speaker_00
After the break, why Marc has hope for America, despite once telling me that to build a media company, I needed to buy a used mimeograph machine. Stay with us.

01:09:55 Speaker_03
The Credit Card Competition Act would help small business owners like Raymond. We asked Raymond why the Credit Card Competition Act matters to him.

01:10:02 Speaker_02
I'm Raymond Huff. I run Russell's Convenience in Denver, Colorado. I've run this business for more than 30 years, but keeping it going is a challenge. One of the biggest reasons I've found is the credit card swipe fees we're forced to pay.

01:10:14 Speaker_02
That's because the credit card companies fix prices. It goes against the free market that made our economy great. The Credit Card Competition Act would ensure we have basic competition. It's one of the few things in Washington that both sides agree on.

01:10:31 Speaker_02
Please ask your member of Congress to pass the Credit Card Competition Act. Small businesses and my customers need it now.

01:10:38 Speaker_03
For more information on how the Credit Card Competition Act will help American consumers save money, visit merchantspaymentscoalition.com and contact your member of Congress today. Paid for by the Merchants Payments Coalition.

01:10:49 Speaker_03
Not authorized by any candidate or candidates committee. merchantspaymentscoalition.com

01:10:55 Speaker_00
This show is sponsored by BetterHelp. Have there been times when you feel like you couldn't be your full self, like you were hiding behind some kind of mask? October, of course, is the season for wearing masks and costumes.

01:11:07 Speaker_00
But some people feel like they wear a mask and hide more often than they want to, at work, in social settings, around their families.

01:11:14 Speaker_00
Therapy can help all of us accept all parts of ourselves, so you can take off the mask, because masks, they should be for Halloween, not for your emotions. If you're thinking about starting therapy, consider giving BetterHelp a try.

01:11:27 Speaker_00
It's entirely online, and it's designed to be convenient, flexible, and suited to your schedule. Just fill out a brief questionnaire to get matched with a licensed therapist. You can switch therapists at any time for no additional charge.

01:11:40 Speaker_00
Take off the mask with BetterHelp. Visit betterhelp.com slash honestly today to get 10% off your first month. That's BetterHelp, B-E-T-T-E-R-H-E-L-P dot com slash honestly.

01:11:57 Speaker_00
The first ever piece I wrote for the Free Press, which was then called Common Sense, was a piece called The Great Unraveling.

01:12:03 Speaker_00
And it was sort of about my thinking out loud about whether or not I would be able to build a new journalistic institution at the height of what felt like this soft totalitarianism, you know, that had certainly been enforced in the press.

01:12:18 Speaker_00
And in it, I reference a bigwig in Silicon Valley, which is you.

01:12:23 Speaker_00
And I don't know if you remember this, but I basically say to you, how is it possible to build something new that isn't vulnerable to being purged or compromised or otherwise demolished by the forces of Facebook and Twitter and Apple and Google and Amazon?

01:12:36 Speaker_00
And the thing you texted back to me was used mimeograph machines.

01:12:42 Speaker_00
OK, and at the time you were also urging me and other people to like buy hard copies of encyclopedias and buy hard copies of books we cared about, not on the assumption that it's very nice to have a library, but on the assumption that online they were all going to get re-edited to suit the political tastes of wokeness.

01:13:01 Speaker_00
Now, on the one hand, that's like a very bleak diagnosis, right? You're basically saying that building on the Internet would mean submitting to this kind of political ideology and sort of bending a knee to it.

01:13:16 Speaker_00
And yet, you're the guy that wrote the Techno-Optimist Manifesto, this 5,000-word essay that I highly commend to everyone to listen to, arguing basically that technological progress is the key to solving humanity's greatest challenges, from poverty to climate change, and that we should prioritize it at all costs.

01:13:35 Speaker_00
I want to read just a few lines from that essay. Our civilization was built on technology. Our civilization is built on technology.

01:13:43 Speaker_00
Technology is the glory of human ambition and achievement, the spearhead of progress, and the realization of our potential. For hundreds of years, you write, we properly glorified this, until recently. But I am here to bring the good news.

01:13:56 Speaker_00
We can advance to a far superior way of living and of being. We have the tools, the systems, the ideas. We have the will. It is time, once again, to raise the technology flag. It is time to be techno-optimists. And I wanted to ask you, which is it?

01:14:11 Speaker_00
Because some days it seems like you wake up on the side of the bed and take what, you know, people on the Internet call the black pill, you know, the pill of nihilism and despair. And other days you wake up and it's, you know, the white pill.

01:14:23 Speaker_00
It's hope. It's all good things lie ahead. And I'm sure in your mind, these things are not an obvious contradiction, but maybe somehow fit together. So I'd love if you can explain how they do fit together. And also, if you can kind of lay out

01:14:38 Speaker_00
Maybe a vision for us, the vision of where we need the mimeograph machine to get reliable information, and maybe the vision where we have an open and free internet. Which one it seems like you're feeling more these days.

01:14:52 Speaker_04
Yeah, so look, I think I've just been through the same thing you've been through and a lot of people have been through who are listening, which is it felt like everything just got increasingly repressive from 2013 through to maybe 2022.

01:15:05 Speaker_04
And then, you know, it feels like in the last two years, a counter movement has been building, you know, among other things, this election. It does feel like a new spirit is alive in the land, at least right now, which I hope will continue.

01:15:15 Speaker_04
Maybe the way to describe it the best is look, but my day job is a venture capitalist, you know, tech entrepreneur. The thing you do when you're an entrepreneur VC is you're trying to find these, I call them like sparks from the future.

01:15:25 Speaker_04
Like you're trying to find these little sort of indicators, these little flashes of like, okay, there's a thing that's not like a big thing yet, but like, it's interesting. And if it happens another thousand times, or if a thousand people pick it up.

01:15:37 Speaker_04
And then another thousand people after that, and then a million people, and then a billion people, it might become a big thing. And that's what the internet felt like in 1992. And that's what social media felt like in 2007.

01:15:47 Speaker_04
And that's what AI and the tech industry felt like two years ago. So our day job is quite literally to try to find these things. And in tech, these things are some combination of technological change and then social change.

01:15:58 Speaker_04
I got a lot of things wrong in the last 15 years. I didn't understand what was happening. Having said that, I saw the early sparks of woke and all this other stuff early on, from 2013, 2014, for sure, before a lot of other people did.

01:16:10 Speaker_04
And then, you know, look, I felt and saw the counter-revolt phenomenon that you're talking about. Part of it's just exposure to kids. The day job is meeting with lots of really sharp 22-year-olds.

01:16:18 Speaker_04
And starting in around 2021, 2022, I started meeting a completely different kind of college graduate. A 22-year-old graduating in 2022 was, by definition, born in 2000.

01:16:29 Speaker_04
And so they were in high school and college and they got like the full blast of woke for 12 years straight. If you're, you know, a white male or an Asian, you're just like the worst possible person you can imagine.

01:16:42 Speaker_04
You know, the entire education system got weaponized against you. And so I started to see this new generation of kids coming out and they're just like, they're just like not having it anymore. I'll give you an example.

01:16:50 Speaker_04
I have a friend who has a teenage son who's 15 going on 16. His father is Jewish and his mother is Japanese. And so he is just screwed. He's an incredibly thoughtful kid. He's incredibly, by the way, he's like math Olympiad.

01:17:03 Speaker_04
He's like a super genius, incredibly bright kid, wonderful kid. And it just every day at school is just constant, you're an evil person. And his reaction to that is he is now to the right of Attila the Hun.

01:17:14 Speaker_04
He has read every piece of like dissident right-wing literature you can imagine. And he's just not having it. He's leaving that world behind. He's going to build his own world.

01:17:21 Speaker_04
And I don't know what he's, you know, who knows what he's going to do, but like, he's going to build something amazing. And there are a lot of kids like him. And so I started meeting kids like that kind of in this period.

01:17:28 Speaker_04
And, you know, you could kind of feel just people getting exhausted. And, you know, look, these social revolutions, you know, they, they curdle, you know, communism curdled at the end and the, you know, the sexual revolution curdled at the end.

01:17:37 Speaker_04
They reach a point where they go from being the renegade exciting thing to being just like incredibly depressing and corrupt. And so anyway, that's the white pill, which is the time has come for a very different outlook.

01:17:46 Speaker_04
And that's what I've been trying to put on paper.

01:17:48 Speaker_00
But the used mimeograph machine, as funny as it is, still feels relevant.

01:17:53 Speaker_00
In other words, even looking at the Wikipedia entry for Zionism, I'm like, well, we need to bring back used mimeograph machines, or the printing press at least, because history is getting revised and rewritten and lied about in the most astounding of ways.

01:18:10 Speaker_00
And we're not even living in an age yet where the kind of politics we saw exhibited from Gemini, for example, are like the overlay of everything that we're searching for on the Internet.

01:18:22 Speaker_00
How do we avoid a fate five years from now where the politics that you've sort of been describing in this conversation and that I've encountered in my own way are not like embedded, woven into the very fabric of the Internet through these AI programs?

01:18:41 Speaker_04
So I think the way to think about it is there's the cultural movements, which we've been discussing, and then there's institutional control.

01:18:47 Speaker_04
And, you know, what we have is we have, you know, as you well know, the big institutions are fully under woke control. Like, they just are. And that's true across the board. It's true of the Fortune 500. It's true of big tech. They're just under control.

01:18:57 Speaker_04
Like, Elon demonstrated how hard it is to break them out of control, which is what he did at Twitter, which nobody else has done with any of these things. You know, and with all the consequences that came from that.

01:19:04 Speaker_04
And so that illustrates the level of control that these things are under. And, you know, look, the used mimeograph thing was always half a joke, half serious. It's still half serious. You know, the censorship machine continues to run at a lot of companies.

01:19:14 Speaker_04
The debanking machine continues to run. You know, is the machine going to unwind or not? I don't know. Is it going to unwind on its own? I don't know. Again, there have been these little sparks. I'll give you one little spark. So,

01:19:24 Speaker_04
The COVID lab leak hypothesis, which was basically branded misinformation right in the beginning and was, you know, broadly censored on social media. If you remember, there was a moment where all of a sudden you were allowed to talk about that.

01:19:34 Speaker_04
And it was the moment when Jon Stewart went on the Colbert show and he did this like eight minute segment where he pointed out, he's like, it literally, it cannot be a coincidence that you had the Wuhan Institute of like, you know, viral bat viruses.

01:19:44 Speaker_00
Like chocolate and Hershey, PA. I remember the whole thing.

01:19:47 Speaker_04
Exactly, right? And anyway, the point is, I was in a discussion at one of the big tech companies, one of the big internet companies, where the discussion was like, oh, did you see the Jon Stewart thing? Oh, ha ha, that was really funny.

01:19:56 Speaker_04
Oh, OK, I guess we should stop censoring the lab leak theory now. Ha ha. You know, yes, now would be the time to stop censoring that. And literally, they stopped censoring it that day.

01:20:04 Speaker_00
Is that supposed to make me feel good?

01:20:06 Speaker_04
Well, this is what I mean. On the one hand, you're horrified. On the one hand, you're in used mimeograph machine territory because you're like, oh my God, they were censoring it up until that point.

01:20:15 Speaker_04
On the other hand, the minute it became socially permissible to not have it be censored, which Jon Stewart, like the one really amazing thing he's done in the last decade is that moment.

01:20:23 Speaker_04
All of a sudden, like, again, spark from the future, all of a sudden it's a de-censoring. And I'll just tell you, like, I have not seen many de-censoring moments. There have not been that many of those, but like, that was a big one.

01:20:31 Speaker_04
Point being, like, if the cultural shift continues, you know, then maybe things really loosen up.

01:20:35 Speaker_04
The other thing goes to what we were talking about before about the new administration, which is, you know, look, Elon with the Twitter files did a privatized version of what now needs to happen broadly.

01:20:43 Speaker_04
We, the American population, need to find out what's been happening all this time, and specifically about this intertwining of government pressure with censorship and debanking. Like, we really need to know what happened there.

01:20:52 Speaker_04
And then there need to be consequences. Like I said, these consequences need to unfold. Let's put it that way. And like I said, I don't know what this new administration will do.

01:20:58 Speaker_00
What do you mean by that? What do you mean consequences need to unfold?

01:21:01 Speaker_04
Well, to start with, like, I think there were crimes committed. The government is not allowed to fund private organizations and then direct them to do things that it itself is constitutionally not allowed to do.

01:21:11 Speaker_04
That's like as big an offense and an actual federal crime in a lot of cases as when the government does things itself. And you know, it's indisputable that that has been happening, the sort of censorship complex that you've written about at length.

01:21:20 Speaker_04
Like that's 100% what's been happening. The government and its ancillary arms are also not allowed to pressure private companies in this way. Like there's all these things that are just simply not allowed.

01:21:29 Speaker_04
And a lot of them are actually, if a prosecutor wants to pursue them, there are federal criminal statutes that certainly apply that carry jail sentences, right? Like constitutional rights are a big deal. Like there's real teeth behind these laws.

01:21:39 Speaker_04
And so like, if there were laws broken, like that needs to be flushed out. And then we need to figure out what the safeguards need to be so that this doesn't happen again. Like, it can't happen again.

01:21:47 Speaker_04
You can't have government funding, like private censorship bureaus. It can't be allowed.

01:21:50 Speaker_00
So, in other words, like, funding an NGO or a think tank in order to pressure a corporation to do something so you don't, as the federal government, pressure Facebook yourself?

01:22:00 Speaker_04
Yeah, exactly. Now, to be clear, what was happening was both the government was applying inappropriate and, I think, illegal pressure itself, for sure, and then, in addition, they were funding a lot of private organizations.

01:22:10 Speaker_04
And I just, you know, to their enormous shame, you know, I'm next door to Stanford.

01:22:12 Speaker_04
Stanford had one of these operations, the Stanford Internet Observatory, and it's funded by the federal government to implement a censorship policy on the Silicon Valley companies. And it's just like, no, that's not legal.

01:22:20 Speaker_04
And it's not even like an ambiguous question. You're just simply not allowed to do that. And so, like, that stuff needs to be flushed out, and then we need to figure out how to prevent it from happening again.

01:22:28 Speaker_00
Is the other thing that needs to happen sort of robbing these institutions of philanthropic dollars? Like, I know your family has been very involved. You just referenced Stanford.

01:22:38 Speaker_00
At Stanford, do you wish you could claw that money back, given the way they've spent it?

01:22:43 Speaker_04
Yeah, so I mean, look, so I don't want to speak for my family. A lot of this was my father-in-law who passed away, so a lot of this was in an earlier era. Having said that, there's no question you're right.

01:22:52 Speaker_04
A lot of these universities, they run on development dollars, they run on philanthropic contributions, and then they run on an enormous amount of federal money.

01:22:59 Speaker_04
You know, Jay Bhattacharya from Stanford has now been nominated to be the new head of NIH. There are many, many, many criticisms of the NIH and the national health research funding complex that you could make.

01:23:08 Speaker_04
Jay has made many of those himself. One of those is that NIH grants come with overhead money.

01:23:13 Speaker_04
So when the government issues money to a biomedical researcher at Stanford to try to cure cancer, they give additional money to Stanford University that Stanford University uses for overhead, which is to say, essentially, whatever it wants.

01:23:23 Speaker_00
The bureaucracy.

01:23:24 Speaker_04
Yeah, right. Yeah, the bureaucracy. And then if you look at the headcount at places like Stanford, they now have more administrators than they have students, as is true in many, many, you know, bureaucratic institutions.

01:23:32 Speaker_00
It's true at Yale, it's true at lots of these elite schools.

01:23:34 Speaker_04
Yeah, there's been this massive bloat. And then, you know, as you well know, you have this extremely activated, you know, ideological, you know, activist class that's sort of conjoined administrators, faculty, and students.

01:23:44 Speaker_04
And then they go out and they sort of mount all these political crusades, you know, basically on the back of, literally, in part, federal research dollars. That can't happen, like that's not allowed. Another one is foundations.

01:23:54 Speaker_04
The number of foundations that are tax-exempt because they're assumed to be philanthropic good causes that have become politically active and have tripped over, I think, the line of what you're allowed to do as a tax-exempt organization.

01:24:04 Speaker_04
I think a lot of the big foundations are out of compliance with that. Like that should not be, you know, many, many, many examples of this.

01:24:10 Speaker_04
And as I said, at least now there's the opportunity to really dig into this and find out what happened and figure out how to have it not happen again.

01:24:17 Speaker_00
If there is a single household name that most Americans know when it comes to AI, it's ChatGPT, which is owned by OpenAI, which is valued at $157 billion and was founded by Sam Altman, who's perhaps the biggest founder advocate for AI regulation.

01:24:33 Speaker_00
There's a tremendous amount of sort of back and forth and drama around OpenAI. It started as a nonprofit. Now it isn't. Sam was ousted. Then he came back. Elon was mad. Elon went on Tucker and said OpenAI was training the AI to lie.

01:24:46 Speaker_00
Can you explain the situation there and explain to the listener how we should be thinking about AI regulation?

01:24:53 Speaker_04
Yeah, so to start with, Sam was a co-founder, but Elon was also a co-founder. So just for full credit and context, Elon was the original funder of OpenAI.

01:25:00 Speaker_04
And he and Sam worked together, but like Elon was definitely either a or the founder and was certainly a magnet for the early talent and as well as the funder. So Elon has been involved there from the beginning.

01:25:10 Speaker_04
The umbrella thing that I think has been happening is social media went on this arc that I've described from like 2013 to today where it became a censorship machine. AI has gone on a hyper-accelerated version of that arc.

01:25:22 Speaker_04
It happened basically right up front. It took time for social media to become a censorship machine. It happened on AI right from the beginning.

01:25:28 Speaker_04
It happened on AI right from the beginning because the AI companies learned from the experience of the social media companies, and they just said, well, if we're going to end up building a censorship machine over a decade, we might as well just do it up front.

01:25:37 Speaker_00
So Marc, are you saying it's intentional or it's just learning off of information? It's 100% intentional.

01:25:41 Speaker_04
Yeah. That's how you get black George Washington at Google is because there's an override in the system that basically says everybody has to be black. Boom. Right?

01:25:49 Speaker_04
Like there are squads, there are large sets of people in these companies that determine these policies and write them down and encode them into these systems. So overwhelmingly what people experience is intentional.

01:25:58 Speaker_04
Like there's just no question about that. And they have these groups with this very Orwellian name called trust and safety, right? You know, mistrust and no safety, you know, that do this.

01:26:08 Speaker_04
And at a lot of these companies, it's common to have as many people on the trust and safety side deciding on all these policies as there are engineers building these systems.

01:26:14 Speaker_04
And then actually what happens, right, is you're an AI company and you know, you need to build this kind of machine, this kind of censorship machine. And so you go hire the people out of social media who built the machine at social media.

01:26:23 Speaker_04
So in a lot of cases, it's literally the same people doing it in AI. And so these companies were born, well, they were born as censorship machines. They exhibit all of that behavior. And again, here Elon is like the big break in that.

01:26:35 Speaker_04
And the big companies, they're doing it for the same reason the social media companies did. They're doing it because either they've already gotten pressure or they know they're going to get pressure. And so you might as well just do it.

01:26:44 Speaker_04
Most of the people who work at these companies agree with that side of things. And so they think that they need it.

01:26:49 Speaker_04
I was in a discussion with a professor who is at one of the big companies, and I was like, you know, it's crazy to have these AIs give you these patronizing moral lectures all the time when you ask questions.

01:26:57 Speaker_04
And this professor said, well, you know, my undergrads actually really like that. Like, that's how they expect technology to work, right? And I'm, you know, and I'm like screaming inside my own head.

01:27:06 Speaker_04
Like, what the hell? It's like baked into the culture, right?

01:27:09 Speaker_04
A lot of these people are newly arrived from these universities. They're fully, you know, indoctrinated, and so they're very sheltered, isolated people. They're, you know, kind of people who spend all their time in a computer lab, so they haven't had a reason to think this stuff out more broadly.

01:27:19 Speaker_04
So it's the same thing. My concern is that the censorship and political control of AI is like a thousand times more dangerous than censorship and political control of social media, and maybe a million times more dangerous.

01:27:30 Speaker_04
Social media censorship and political control is very dangerous, but at least it's only like people talking to each other and communicating. The thing with AI is I think AI is gonna be the control layer for everything in the future, right?

01:27:41 Speaker_04
So I think AI is gonna be the control layer for how the healthcare system works. I think it's gonna be the control layer for how the education system works. It's gonna be the control layer for how the government works, right?

01:27:49 Speaker_04
And so in the future, when you deal with a healthcare system or with the education system or with the government, you're gonna be dealing with an AI.

01:27:55 Speaker_04
And so if that AI is, call it what you want, woke, biased, censored, you know, politically controlled, you are in like a hyper Orwellian China-style social credit system nightmare. Right, absolute nightmare.

01:28:08 Speaker_04
And this goes directly to Elon's argument, which is at the core of this, what you have to do is you have to train the AI to lie.

01:28:13 Speaker_04
It's just like, if you wanted to create the ultimate dystopian world, you'd have a world where everything is controlled by an AI that's been programmed to lie. It's hard to imagine things getting worse than that.

01:28:22 Speaker_04
And this hasn't rolled all the way out yet, because AI is still new, and it's not being put in control of everything. But this is where things are headed. And so it's, I think, vitally important that this not happen.

01:28:32 Speaker_04
My hope is that the culture changes, the case studies, all the stuff that this new administration might do, my hope is that this all gets kind of peeled back and thrown out into the sunlight, and people writ large come to really understand this, and people don't stand for it, and this doesn't happen.

01:28:48 Speaker_04
But like, this has to be fought. Like, this will happen by default unless people fight it.

01:28:52 Speaker_00
Are the incumbents pushing for regulation basically in order to protect themselves?

01:28:56 Speaker_04
Yeah, that's right. And this sounds like a conspiracy theory, but it's not even a conspiracy. It's just simply something that happens. And it's a concept that's well known and understood in economics called regulatory capture.

01:29:07 Speaker_04
And what happens with regulatory capture is a big company goes to Washington and basically intertwines itself with the government to establish a wall of regulation that new companies cannot comply with.

01:29:18 Speaker_04
You can't afford to comply with, it's too complicated, it's too hard to get to from a standing start.

01:29:21 Speaker_04
And then what happens is the big company now basically has a permanent government-supported monopoly, and then the government then gets to control that company. I mean, the banks are the case study of this.

01:29:30 Speaker_04
You know, the banks got giant, they got too big to fail. We had all this banking reform that was supposed to solve the too-big-to-fail problem. After that, the big banks got much, much larger, and the number of banks shrank dramatically.

01:29:40 Speaker_04
And so the banks have achieved regulatory capture. The banks control the regulators. The regulators control the banks. And by the way, this is why debanking happens, and this is why it works.

01:29:48 Speaker_04
It's because you have the company intertwined with the government. And there's no question the big internet companies, the big tech companies, try to do the same thing. This is the standard logical play for big companies.

01:29:58 Speaker_04
It happens, I think, in every industry sector and it's happening in tech just the same. And specifically with AI, this was actually not that relevant a concept to like computers and chips and stuff like that.

01:30:07 Speaker_04
It didn't really matter that much because there weren't really any political implications to, like, chips, at least at the time. But this for sure is a conscious strategy on the part of at least several of the big AI companies, and it's very dangerous.

01:30:20 Speaker_00
I want to just connect it to what's at stake beyond the freedom to know information, which is sort of America's role as the world's sole superpower, and that now sort of being up for grabs.

01:30:31 Speaker_00
At the end of the Cold War, America had the military, the economic, the technological prowess, and you've made the case that our tech supremacy sort of allowed for that dominance to follow. Are we in a position right now to continue to dominate?

01:30:49 Speaker_00
How close are we in the sort of race with China on AI, on chips, on any number of things that we could talk about?

01:30:56 Speaker_04
This has been one of the weird kind of things in D.C. for the last five years, which is, as many of the conversations we've had today, sort of assumes that the U.S. is in a vacuum, right?

01:31:05 Speaker_04
But then there's this whole other conversation, which is exactly that, which is the China situation. And at least I and many others now believe that we're in some new version of a Cold War 2.0 with China.

01:31:13 Speaker_04
And, you know, it has, you know, certainly big economic dimensions. It has big cultural, political dimensions, you know, ideological dimensions. And then there is a big military dimension to it.

01:31:22 Speaker_04
And, you know, as we sit here today, there's been tension building in the Taiwan Strait for the last, you know, several years with, you know, at least a lot of American military planners thinking that at some point China's gonna move and try to take Taiwan.

01:31:34 Speaker_04
And then at least in theory, America is committed to defend Taiwan. And so you have the preconditions for what might be an actual live military conflict between the two current superpowers.

01:31:41 Speaker_04
And then as a backdrop, you have that simultaneous with the technology of war itself changing, right?

01:31:45 Speaker_04
And this is where AI and autonomy are now central to how the US Department of Defense is planning to fight wars in the future and central to how the Chinese military is planning to fight wars in the future.

01:31:53 Speaker_04
And so it's this multifaceted level of sort of competition and rivalry expanding out globally. Hopefully this remains a Cold War and, you know, something that sort of just bubbles along. But, you know, it's certainly possible that there's a hot war.

01:32:07 Speaker_04
You know, look, the good and the bad is sort of, I think, fairly obvious, which is America has all of the advantages and disadvantages of a decentralized system. We do capitalism, with, you know, some exceptions in exceptional times.

01:32:18 Speaker_04
Generally speaking, we do capitalism pretty well: free markets, startups, entrepreneurship, technological development, free speech, in theory. We have an open society, at least that's what we're supposed to have.

01:32:28 Speaker_04
And China has much more of a totalitarian dictatorial control system inside their country. They actually have been moving against capitalism over the last five years.

01:32:35 Speaker_04
And so they've actually been quite punitive to their own tech companies, as an example, their own tech founders. And they've driven a lot of their best and brightest actually out of the country.

01:32:42 Speaker_04
So we have all the advantages of our dynamic system, but they have the advantages of tremendous resources, tremendous control. They can dictate what their companies do. They can dictate what new technologies get developed.

01:32:52 Speaker_04
They can dictate how those get adopted and used throughout their society. And then they have this very explicit strategy to take their technology and their culture and their ideology global.

01:33:02 Speaker_04
And when their companies go to market globally, they go with their government. And their government is able to provide all kinds of carrots and sticks for other countries to adopt their technology. And our government has been trying to kill us.

01:33:14 Speaker_04
So it's the opposite with us. We're doing it with the active opposition of our own government. So China's got lots of really smart people. They're doing very well in AI. They are, I would say, ahead of us in robotics.

01:33:24 Speaker_04
They are probably neck and neck in AI right now, and they are, you know, absolutely determined to win these competitions.

01:33:30 Speaker_04
It is the most rational thing in the world for me to, you know, going back to what President Trump said at dinner with us, like, the most rational thing in the world to say America needs to win this fight.

01:33:37 Speaker_04
Like, America needs to be the dominant technology superpower.

01:33:40 Speaker_00
When you write, and you've written, we believe that deceleration of AI will cost lives. Deaths that were preventable by the AI that was prevented from existing is a form of murder. Are you thinking about China when you're writing that?

01:33:55 Speaker_00
Are you thinking about the way that AI, five, 10 years from now, could save us in terms of detecting cancer? Is it both? Like, that line has stayed with me ever since you've written it.

01:34:05 Speaker_04
Yeah, the easiest domestic example is healthcare. And at least everybody in the biotech world thinks that AI is going to sharply accelerate the process of curing disease. And so that's an obvious case study there.

01:34:15 Speaker_04
The global version of this is warfare, which is, you know, how have societies like ours fought wars in the past? You know, we send a tremendous number of soldiers into battle, and they get killed, right? What is the future of warfare going to be?

01:34:25 Speaker_04
It seems like it's very clearly going to be based on autonomy, which is AI, right? And you're going to have everything from fighter jets to submarines and everything else, tanks that are unmanned.

01:34:34 Speaker_04
Yeah, and then you see, by the way, the battlefield in Ukraine.

01:34:36 Speaker_04
You know, if you have technological superiority on the battlefield in Ukraine, it means that you have these first-person drones and you've got these, you know, automated piloting systems and you are dropping in, you know, on top of a tank.

01:34:46 Speaker_04
You know, the tank costs millions of dollars to build. You drop a $300, you know, drone with some Semtex on it and blow up the tank. It's a great illustration: you want to be on the side of technology.

01:34:53 Speaker_04
October 7th, you know, Israel had a technologically powered defensive line between Israel and Gaza that broke down. You know, one lesson from that is technology by itself won't save you, which, you know, the Chinese have taken to heart.

01:35:04 Speaker_04
But the other answer to that is that the technology needs to get a lot better.

01:35:07 Speaker_04
The next time something like that happens, God forbid, there ought to be a nationwide drone response system that's able to put an armed response, an armed, AI-automated response under human control, but like within seconds. Let me give you an example.

01:35:21 Speaker_04
So my partner, Ben, has funded a pilot program in Las Vegas. The Las Vegas PD now has the ability to put an unarmed drone with a high-resolution camera on site at the location of any 911 call, any burglary alarm, or any gunshot.

01:35:33 Speaker_04
In Las Vegas, they're able to put a drone on site in 90 seconds.

01:35:36 Speaker_00
Wow.

01:35:36 Speaker_04
Right? And that has two consequences. Number one, if something goes wrong, if there's a break-in or whatever, there's something happening, a carjacking or whatever, number one, you're going to have instant response. You're going to be able to see it.

01:35:47 Speaker_04
By the way, the drone can then follow the perpetrator. So you can't get away from these things. And then that means that the police officers are not in danger of going in blind, not knowing what's happening.

01:35:56 Speaker_04
When police encounters go bad and somebody gets shot and killed, a lot of it is because you have a live police officer under stress in a situation that's unclear where they don't know what they're going into and something goes wrong. That's one.

01:36:06 Speaker_04
But two, the deterrent effect. If you know that in Las Vegas, if you shoot somebody or break into a 7-Eleven at two in the morning, you're going to get caught by a drone, you're not going to do it.

01:36:14 Speaker_04
And so these are like just like examples right now of the kind of technologies that could be deployed. And again, these are like, I'm talking about unarmed drones, but just like eyes on massive deterrent to crime.

01:36:24 Speaker_04
And again, China is like racing ahead on every front. So China has a two-part strategy. One is they take everything we do. Right.

01:36:30 Speaker_01
And steal it.

01:36:31 Speaker_04
and steal it, or, you know, get access to it. And then two is they have a lot of their own smart people, right? And then they heavily support them from the government. And so they're racing ahead.

01:36:39 Speaker_04
They, by some measures, already have the best open-source AI models. You know, one of the things the Biden administration is trying to do is kill open-source AI. And if the U.S. is not the leader in open-source AI, then China is.

01:36:49 Speaker_04
Then the entire world's technology infrastructure is running on Chinese AI. That seems like it would be a bad idea. You know, the Biden FAA tried, successfully in large part, to kill the American drone industry.

01:36:58 Speaker_04
As a consequence, the Chinese own the global drone industry. You know, most drones in use by the American military today are made in China. Every single one of those is a potential weapon to be used against our soldiers. Right?

01:37:10 Speaker_04
So, like, my view is just sort of patently obvious that, like, geopolitically, we have to win this fight.

01:37:16 Speaker_00
You love tech. You probably love tech more than any other person I've ever spoken to in my life. I've wanted to ask you, what are the blind spots? Because here's one that I see.

01:37:26 Speaker_00
I think that tech and progress sort of devoid or unmoored from a deeper moral worldview, or maybe you want to call that God, can quickly sort of devolve into what I think of as like idol worship, worship specifically of intelligence.

01:37:43 Speaker_00
in a way that I think can dip into something very scary.

01:37:48 Speaker_00
Like if you believe in the good of intelligence, which I do, but without any belief in a view of sort of like fundamental human dignity and equality, a person with 170 IQ becomes much more valuable than a person with Down's.

01:38:01 Speaker_00
How do you think about that problem?

01:38:04 Speaker_04
Yeah, so what I would say is I think technology changes society. And this goes all the way back to everything from the invention of fire and everything that followed. And there's a long history to this. Technology reorders power and status in society.

01:38:14 Speaker_04
It changes how society operates. It kind of has to, right? Because if it's going to have an effect, it's going to change how things are done. That's going to change society.

01:38:20 Speaker_04
And then, yeah, there are these two basically critiques of technology that kind of flow from that observation.

01:38:24 Speaker_04
One is what I would call the left-wing critique, which is interesting for a little bit and then rapidly becomes uninteresting because it ends up being the same critique, which basically is inequality.

01:38:32 Speaker_04
you know, technology drives inequality, inequality is bad, therefore the need for communism, or as they say these days, fully automated luxury communism.

01:38:38 Speaker_04
I think it's the title of the book, you know, but it's fundamentally, it's an economic power zero sum sort of communist neo-Marxist kind of argument, fair enough. And then there's what I would call the right wing critiques of technology.

01:38:47 Speaker_04
And I think yours was an example of that, which is it's like, okay, fundamental questions of the human spirit and fundamental questions of humanity and fundamental questions of tradition and fundamental questions of social organization.

01:38:59 Speaker_04
fundamental questions of cultural change and, you know, losing important things. I find those critiques much more interesting. And quite frankly, they're harder to answer.

01:39:08 Speaker_04
And look, in the last, you know, hundred years, you know, the march of technology, you know, when we say technology, we, you know, it's often we view technology and progress as synonyms, but like that also means like technology is sort of, has come hand in hand with, you know, secularization.

01:39:21 Speaker_04
you know, has come hand in hand with, by the way, with the exaltation of experts.

01:39:25 Speaker_04
I think these are, quite frankly, the questions that should be the center of the debate, as opposed to a lot of the other stuff that, you know, at least has been dominating the discourse for the last decade.

01:39:34 Speaker_04
My claim, you know, that I wrote in my manifesto, my claim is these are all important and valid questions. But, however, the thing you need to always kind of reorient yourself to is,

01:39:43 Speaker_04
questions of like the human spirit and the social animal and organization of society and religion, like these are deep kind of permanent questions, you know, that go back many thousands of years. And I don't know that they ever get solved.

01:39:53 Speaker_04
It may be that these are the questions we need to think about the most, talk about the most.

01:39:57 Speaker_04
Are we likely to do a better job figuring those things out in a world in which we have higher levels of material prosperity or lower levels of material prosperity?

01:40:05 Speaker_04
And I think the easy, caricatured form of the argument is technology changes society, changing society is bad, therefore we need to dial back on technology, kind of the Luddite argument.

01:40:14 Speaker_04
The problem with that is you're deliberately then pulling yourself back from economic growth, you're pulling yourself back from growth of standards of living, material prosperity, and then you just have to imagine that, okay, now we're in a better position to answer deep questions of the human soul, because we're poorer.

01:40:28 Speaker_04
I think that's unlikely to be the case. I would be more optimistic that as we get to higher and higher levels of material welfare, we actually have more time

01:40:35 Speaker_04
Right, and a larger opportunity to address these deeper underlying questions, and so that's the claim I would make.

01:40:40 Speaker_00
Marc Andreessen, are you ready for a lightning round?

01:40:44 Speaker_04
I'll do my best.

01:40:45 Speaker_00
Okay, best thing about the creation of the internet?

01:40:48 Speaker_04
Oh, just the explosion of culture. Creativity.

01:40:52 Speaker_00
Worst thing about the creation of the internet?

01:40:55 Speaker_04
The extent to which it drives monoculture and groupthink, but I think that's not the main thing that it does.

01:41:01 Speaker_00
Best bet that you've made as a VC?

01:41:04 Speaker_04
Oh, I have, I will not know for 30 years. Do you know the worst? No. I mean, you know, like the thing with VC is you can only lose 1x your money. Although we've done plenty of that.

01:41:16 Speaker_00
You have fuck you money, Marc Andreessen. How are you spending it?

01:41:20 Speaker_04
You don't. Nobody, nobody actually has that. Cause if you, if you like, you end up responsible for other people. And so the concept doesn't work unless you literally drop out. It doesn't work. So how do I spend money? Mostly to buy time.

01:41:33 Speaker_00
Elon Musk spent $44 billion on Twitter. Was that a good use of funds?

01:41:39 Speaker_04
Well, we're in it. We're part of the syndicate. And I would say we have confidence that in the long run that will prove to be a great investment.

01:41:45 Speaker_00
Do you think that his decision to buy Twitter determined the election in some key way?

01:41:51 Speaker_04
Probably, yeah.

01:41:52 Speaker_00
Can California be saved? It's up to California. Signal or WhatsApp?

01:41:57 Speaker_04
Oh, equal opportunity, both. I literally have the window side by side.

01:42:04 Speaker_00
Is there any Democrat you would vote for?

01:42:07 Speaker_04
Oh, sure. I voted for many Democrats. I mean, a current favorite, as I mentioned, Ritchie Torres, who I think is a new leader of his party, which we're big supporters of.

01:42:14 Speaker_00
Do you believe in God? I'm not sure. What's the most important book you've read in the past decade?

01:42:22 Speaker_04
I'd say probably The Machiavellians, in terms of usefulness. I don't know if it's the most important, but the most useful.

01:42:32 Speaker_00
What did Trump serve at dinner at Bedminster?

01:42:34 Speaker_04
Oh, he said, he said, what do you guys want to eat? And I just, I, for some reason, I was just like, I know exactly what to say. And I'm like, meat, I want meat. And so he literally ordered every meat dish.

01:42:47 Speaker_04
By the way, he ordered every meat dish and nothing else. There were no sides? There were no sides. He asked Ben, there were no sides. It was all meat and it was glorious. There was so much meat. I don't think there was room on the table for sides.

01:42:59 Speaker_04
Were there drinks or no alcohol? It was Diet Coke. He mainlines Diet Coke, and I was mainlining it right next to him.

01:43:05 Speaker_00
Tomorrow, you wake up and you're the DNC chair. What's the first thing you would do?

01:43:09 Speaker_04
The DNC chair. I mean, the DNC chair, it has to be candidate recruitment. Like, you got to get to work in candidate recruitment, I think, at this point.

01:43:17 Speaker_00
Who do you think the Democrats will run in 2028?

01:43:19 Speaker_04
I hope it's a Ritchie Torres or somebody like that. And then, yeah, they have other choices that, you know, would be doubling down on what they had.

01:43:26 Speaker_00
In 1996, Ben Horowitz, your business partner, criticized an interview you gave, and you quipped back this: "Next time, do the fucking interview yourself, fuck you." But the relationship endured, and you guys are still running the company together.

01:43:41 Speaker_00
What's your advice for partnerships in business?

01:43:44 Speaker_04
Find somebody who can tolerate a lot of your bullshit.

01:43:47 Speaker_00
Your son is now eight years old. He's too young, arguably, to be on social media, but will you let him when he's older?

01:43:59 Speaker_04
To be decided, having said that, there is so much that he is already learning from online resources. You have to, I think, have a very nuanced view of this.

01:44:09 Speaker_00
What's something you believe that most people disagree with?

01:44:12 Speaker_04
Oh, look, that there is tremendous reason for optimism, that America's best days are ahead of us, that the world is going to get much better, that kids coming out of school now are much sharper and more capable than my generation.

01:44:23 Speaker_04
There is no reason that this can't be a golden age.

01:44:26 Speaker_00
You've spent time with Trump, as we've talked about here. You've noted that he's very focused on America winning. What does America winning mean to you? Four years from now, how will we know if America has won?

01:44:37 Speaker_04
Oh, I mean, there will be economic indicators. There will be, you know, foreign competition indicators, you know, reduction of war, you know, economic growth will be two big ones.

01:44:45 Speaker_04
And then a lot of this is, I think, national spirit and psychology, animal spirits, enthusiasm, positivity, excitement.

01:44:55 Speaker_00
Marc Andreessen, thank you so much for joining me.

01:44:57 Speaker_04
Thank you, Bari.

01:45:01 Speaker_00
Thanks for listening. If you liked this episode, if it made you think differently about AI policy, crypto, or perhaps even Trump's election, well, use it to have an honest conversation of your own by sharing it with your friends and family.

01:45:14 Speaker_00
Last but not least, if you wanna support Honestly, there is just one way to do it. It's by going to the Free Press' website and becoming a subscriber today. If you haven't heard, we have a goal and it's to get to a million subscribers.

01:45:29 Speaker_00
by Christmas or by New Year's. We are very, very close. So sign up. You don't need to pay anything. Go sign up at TheFP.com and help us get to a million by 2025. See you next time.