Yuval Noah Harari IV (on the history of information networks) AI transcript and summary - episode of podcast Armchair Expert with Dax Shepard
Episode: Yuval Noah Harari IV (on the history of information networks)
Author: Armchair Umbrella
Duration: 02:26:35
Episode Shownotes
Yuval Noah Harari (Nexus, Sapiens, Homo Deus) is an author and historian. Yuval joins the Armchair Expert to discuss how people can be manipulated by misinformation, how powerful the role of an editor is, and how much our lives are shaped by bureaucracies. Yuval and Dax talk about how ideological
gaps today compare to those in the past, what role algorithms play in the spread of mass media, and the difference between information and truth. Yuval explains his take on the artificial intelligence revolution, how AI is an agent and not a tool, and his suggestions for regulating it as it advances. Follow Armchair Expert on the Wondery App or wherever you get your podcasts. Watch new content on YouTube or listen to Armchair Expert early and ad-free by joining Wondery+ in the Wondery App, Apple Podcasts, or Spotify. Start your free trial by visiting wondery.com/links/armchair-expert-with-dax-shepard/ now. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Full Transcript
00:00:00 Speaker_05
Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now. Join Wondery Plus in the Wondery app or on Apple Podcasts, or you can listen for free wherever you get your podcasts.
00:00:13 Speaker_05
Welcome, welcome, welcome to Armchair Expert, experts on experts. I'm Dax Shepard, and I'm joined by Monica Padman. Rob was adjusting, so it caught my eye, so I gave it a little wink.
00:00:26 Speaker_06
Oh, is that what you're always doing? Oh, interesting.
00:00:29 Speaker_05
I need to really be erect to read. Like, it's a challenge. I gotta steady myself.
00:00:33 Speaker_06
Erect in your body.
00:00:35 Speaker_05
My posture. Yeah, your posture. Yeah, you're right. If you're listening and not seeing, you might think that's something terrible. Returning, third time's the charm: Yuval Noah Harari.
00:00:45 Speaker_06
He's a genius.
00:00:46 Speaker_05
He has such a novel, proprietary take on everything.
00:00:49 Speaker_06
I wonder what he would do on the cognitive test.
00:00:51 Speaker_05
Oh, that's a great question.
00:00:52 Speaker_06
We should make all our experts take the cognitive test before they come in.
00:00:57 Speaker_05
And then remember, it's like an hour and 40. Yeah, I know. Yeah.
00:01:00 Speaker_06
And then we'll know if they're if they're really worthy.
00:01:04 Speaker_05
Oh, well, listen, I would argue that in the categories where I got a ninety-nine and a hundred, I probably would beat Yuval. For sure. And I'm not nearly as smart as him.
00:01:18 Speaker_06
Whoa, but that's ballsy to say that.
00:01:20 Speaker_05
Well, I got a hundredth, you can't go above that.
00:01:22 Speaker_06
Well, but he might be in the hundredth too.
00:01:26 Speaker_05
What if you said he's in the 110th percentile?
00:01:27 Speaker_06
He might. Speaking of, Dr. Richard Isaacson and I are gonna chat soon about my results.
00:01:33 Speaker_05
Oh, I can't wait to hear the update.
00:01:35 Speaker_06
I said, oh, that was humbling, and he said, you did great.
00:01:38 Speaker_05
Oh, good, very good. OK, the point I'm trying to make is even if he scored less than me, it doesn't mean anything. He's smarter than I am. He's so smart. I would just throw that test out. OK, so I was thinking this.
00:01:52 Speaker_05
When Yuval left, I am always pretty critical of people who just kind of blindly believe anything people say. Like people have little deities, thought deities, and they don't question them. And look, no one's getting it all right. Agreed. Even the smartest people.
00:02:08 Speaker_05
Oh, nice. You throw away to the camp. You know, even Einstein was completely wrong about quantum. I always bring this up. But when he left, I was like, you know, he's as close as I have to this.
00:02:17 Speaker_06
Yuval. You have that with him and Malcolm.
00:02:20 Speaker_05
Yep. I have that with him and Malcolm and Dave Chappelle to some degree. Well, not everything he talks about. Right. But there's a category of generally race stuff, anything race related. I'll have like my own knee jerk reaction.
00:02:33 Speaker_05
And then I literally think, God, I hope Chappelle weighs in on this at some point, because I generally think his synthesis of it is always something I didn't think of.
00:02:43 Speaker_06
Yes. Yes.
00:02:44 Speaker_05
On race stuff. And so, yeah, Yuval is one of my guys. And when he left, I was like, I could feel how hard what I'm asking people to do is.
00:02:52 Speaker_06
Yeah, to be critical.
00:02:54 Speaker_05
Yeah, to be critical of someone who is so right so often.
00:02:57 Speaker_06
Yes, and who you trust.
00:02:59 Speaker_05
I trust. Yeah, it'd be hard for me to take a real strong stance against him. I would have to, you know, it'd have to be on something I knew inside and out or something.
00:03:07 Speaker_06
Well, I think you can take what he says and let it inform you, but it also isn't like, if someone comes with an opposite opinion, you're like, well, Yuval said this, so it has to be that. And that's it. That's the problem.
00:03:19 Speaker_05
All that to say, I just think he's so special the way he thinks about the world, because it's such a comprehensive thought process. It involves so many things. Yes. OK. He wrote Sapiens. You know that he wrote Homo Deus.
00:03:31 Speaker_05
He wrote 21 Lessons for the 21st Century. Unstoppable Us. Really great book. I do recommend it if you have children. It's a children's version of Sapiens. It's almost better than Sapiens. I know it's crazy to say, but no, but it's true.
00:03:43 Speaker_06
You can't say that.
00:03:43 Speaker_05
Yes. It makes some of these harder concepts so easy to understand. It's an incredible book. His new book, which we're here to talk about, is Nexus: A Brief History of Information Networks from the Stone Age to AI.
00:03:55 Speaker_05
And again, of course, he has a totally different angle on AI that I've not yet heard. And it's very fascinating.
00:04:02 Speaker_06
I have one thing to say, I have a beef. Oh. I look horrible in this episode.
00:04:07 Speaker_05
Oh no, you watched it and you didn't like how you look?
00:04:09 Speaker_06
Well, yes, I edited it. My shirt is a mess and I will be blaming you, Rob, for that.
00:04:15 Speaker_05
Okay. All right.
00:04:16 Speaker_06
Yeah. Just letting you know.
00:04:18 Speaker_05
All right. It was sweet, okay. It must be so easy to be married to you, Rob. And he's like, you're so lazy. Why don't you help out around here? Okay. Please enjoy Yuval Noah Harari.
00:04:32 Speaker_04
Have you ever wondered who created that bottle of sriracha that's living in your fridge? Or why nearly every house in America has at least one game of Monopoly?
00:04:41 Speaker_04
Introducing The Best Idea Yet, a brand new podcast about the surprising origin stories of the products you're obsessed with. Listen to The Best Idea Yet on the Wondery app or wherever you get your podcasts.
00:05:05 Speaker_09
It's fine.
00:05:10 Speaker_05
You guys travel so much. Every time I open my social media, I see you.
00:05:15 Speaker_03
I'm somewhere else.
00:05:15 Speaker_05
If I turn on the TV, I see you. No, now it's a book tour. So it's not like the usual stuff. Well, what's the usual stuff? We need you all the time to hop on TV and tell us how we're supposed to- Tell us what to do. Synthesize this for us.
00:05:29 Speaker_03
I'll be like on the book tour now for like two months or three months. And then I have 60 days of meditation in India.
00:05:37 Speaker_05
And does your husband join for that? No. He does not. When you're on this two month retreat to India, first of all, where do you go? This small place in the hills outside Mumbai. I went for the first time this year.
00:05:48 Speaker_06
Yeah, it was my second time, but I was four the first time. So it had been a minute.
00:05:52 Speaker_05
And no matter where we were, we were in three different places. Everyone we ran into said, have you been to Kerala? And we're like, no, we haven't.
00:05:59 Speaker_03
I've not been to Kerala either.
00:06:00 Speaker_02
Oh, you haven't?
00:06:00 Speaker_03
And I've been like for 25 years going every year to India and I've never been to Kerala. I mean, it's a big place, India. It's like you've been to Europe and you've not been to Sweden. I don't know.
00:06:10 Speaker_06
Exactly. You can't just pop around very easily there, unfortunately.
00:06:14 Speaker_05
We were just in a state. I want to say we were in Hyderabad when we got this number, but the state we were in had 290 million people. And I was like, oh, wow, the state has all of America in it. That's pretty wild.
00:06:27 Speaker_06
It was wild.
00:06:28 Speaker_03
So you went to Hyderabad?
00:06:29 Speaker_06
We were there with Bill Gates. So Microsoft Center is there. So we were there for that. And then we were with him for the foundation. So we had to go to like certain spots.
00:06:39 Speaker_05
Bhubaneswar.
00:06:40 Speaker_06
Bhubaneswar. Good job. You finally got it. That took him a long time to be able to say.
00:06:44 Speaker_05
Eight months past the trip and I can finally say it.
00:06:47 Speaker_06
And then Delhi.
00:06:48 Speaker_05
You dropped the New.
00:06:49 Speaker_02
We don't need New, do we?
00:06:50 Speaker_06
Well, I'm Indian. I can say whatever.
00:06:52 Speaker_02
I can do whatever I want. What time of year was it?
00:06:54 Speaker_06
February.
00:06:55 Speaker_02
Oh, February is a good time.
00:06:56 Speaker_06
It was. The weather was nice.
00:06:58 Speaker_02
It was awesome. Okay. So you're on the book tour. When did it start?
00:07:01 Speaker_03
Three weeks ago in New York. The US is the first station. We did a few things in London before, and we did some pre-interviews for Germany and Italy and France and China and a couple of other places.
00:07:12 Speaker_03
But the actual book tour, it started like three weeks ago.
00:07:15 Speaker_05
So the first book we ever spoke to you about, Sapiens, is the history of humankind, how we got here. Homo Deus was where we're going. And now this new book, Nexus, this is still in keeping with your kind of overarching interest, right?
00:07:30 Speaker_03
Mm-hmm.
00:07:30 Speaker_05
So how would you delineate it between the previous books?
00:07:34 Speaker_03
It builds on the previous books. And Sapiens was about the long span of human history. Homo Deus was about the distant future. Nexus is about both the past and the future, but from the vantage point of information.
00:07:47 Speaker_03
The key question of Nexus is if humans, if we are so smart, why are we so stupid?
00:07:54 Speaker_06
The big question.
00:07:55 Speaker_03
Yeah, I mean, we've reached the moon, we split the atom, we can read DNA. And at the same time, we are now on the verge of destroying ourselves and much of the ecological system, both because of ecological irresponsibility
00:08:08 Speaker_03
but also we are potentially on the verge of a third world war or a nuclear war, and we are developing technologies like AI that could easily get out of our control and cause catastrophe, maybe even at a level of endangering our species.
00:08:24 Speaker_03
If we are so smart, why do we keep doing these things? And you know, the traditional answer you find in many mythologies and theologies is that there is something wrong with human nature, something is wrong with us,
00:08:35 Speaker_05
We cannot resist but to overconsume and destroy and wield power over people and subjugate people.
00:08:41 Speaker_03
We are flawed. There is something almost, you would say, at least in some mythologies, evil in human nature. And Nexus rejects this mythological answer. I don't think that humans, or at least most humans, are inherently evil or flawed.
00:08:55 Speaker_03
The problem is not in our nature. The problem is in our information. If you give good people bad information, they make bad mistakes, they make bad decisions. So the book explores the history of information.
00:09:07 Speaker_03
If information is the key, why do we keep getting, why do we keep producing and feeding ourselves and other people bad information? You would have expected that over time, over centuries and millennia, our information will get better.
00:09:22 Speaker_05
Well, wouldn't it be fair to say it does get better, but in concert with tons of it that's getting worse?
00:09:27 Speaker_03
In very specific domains. If you look again at physics or biology, yes. But overall, modern societies, no matter how advanced they are technologically and scientifically, they are still as susceptible as Stone Age tribes to mass delusion and to the worst
00:09:48 Speaker_03
kind of fantasies and errors. And we don't seem to be getting better at this.
00:09:54 Speaker_03
If you look at the relationship between, say, scientific research, which is obviously getting better, and the other stuff, again, all their mythologies and ideologies and so forth, which don't get better, what you see is that it's the mythology in the driver's seat, not the scientists.
00:10:12 Speaker_03
People who are experts in nuclear physics or biology or computer science, they get their orders mostly from people who are experts in mythologies and theologies and ideologies. And this hasn't changed for thousands of years.
00:10:29 Speaker_05
Yeah, I think it's like a mix of hopeful and terrifying, the book. And I also think, would it be fair to say, the first book is really helping us all understand the power of story.
00:10:40 Speaker_05
That's the crowning achievement, from my point of view, of Sapiens, is people really, really being able to understand in a concrete way how everything that unites us in general is a story.
00:10:51 Speaker_05
And it's in our minds that it exists, the value of this piece of paper, the borders of this thing, meaning it's a state and the people in it are a thing, they're an identity.
00:11:00 Speaker_05
All that was really well done, and I think it helped a lot of people understand that about us. But in this book, you're now pointing out, so you have story, but you have networks. Networks is the new exploration.
00:11:12 Speaker_05
I think when you read this book, you'll come away kind of understanding what an information network is and how powerful it is. What's the first example we would look at historically, like an information network and its power?
00:11:25 Speaker_03
What information really does, information doesn't necessarily tell us the truth about the world. Information connects a lot of individuals into a network that can do many, many things that isolated people can't.
00:11:40 Speaker_03
If you think, for instance, about different types of information, if you think about visual information, If you think in terms of images and photographs and paintings, what is the most common portrait in the world?
00:11:53 Speaker_03
Who is the most famous face in human history? The answer is Jesus.
00:11:58 Speaker_06
Oh, I was going to say Mona Lisa. Me too.
00:12:01 Speaker_03
We're so Western.
00:12:02 Speaker_06
I know, I'm so embarrassed.
00:12:05 Speaker_03
Billions and billions of portraits of Jesus have been produced over the last 2,000 years. They've been everywhere in so many churches and cathedrals and monasteries and private houses and schools and government offices, like everywhere.
00:12:20 Speaker_03
And the amazing thing about it, not a single one of them is true. Not a single one is authentic. A hundred percent. Not 99 percent.
00:12:28 Speaker_05
He never sat for a portrait that we know of.
00:12:31 Speaker_03
We don't know if anybody painted him or sculpted him during his lifetime. Definitely, we have no image from his own lifetime. And also, if you think about textual descriptions, the Bible doesn't contain a single word about what Jesus looked like.
00:12:48 Speaker_03
Really? There is a description of his clothes one time, not a single description of what he looked like, whether he was tall or short or fat or thin, the color of his skin, color of his hair, color of his eyes, nothing. Wow.
00:13:04 Speaker_03
All the billions of portraits, they came out of human imagination. And nevertheless, they have been extremely successful
00:13:12 Speaker_03
and important in connecting billions of people into a network which shares certain values and norms, which can work together to build cathedrals and build hospitals and also go to wars and establish the Inquisition and things like that.
00:13:32 Speaker_03
Yeah, so whether for good or bad, this has been one of the most powerful networks in human history. Catholicism. Christianity, even more generally. Of course, again, like every network, it can break up into several sub-networks.
00:13:46 Speaker_03
There is always this tension between uniting more people together and breaking up into smaller parts. But this is what information does. A subset of the information in the world may also tell us the truth about the world.
00:13:59 Speaker_03
Some information is true, but truth is a very rare and relatively costly kind of information. Most information is not truth. Again, it's fiction. It's fantasy. It's sometimes lies, it's sometimes illusions, delusions.
00:14:14 Speaker_03
A key point is that the truth is costly because it requires a special effort to produce truthful information. You need to research, you need to spend time gathering evidence and analyzing it. Fiction is cheap.
00:14:28 Speaker_03
and you just draw or write the first thing that comes to your mind.
00:14:32 Speaker_03
So going back to networks, the key is that if you manage to connect a lot of individuals into a network like a church or an army or a corporation or a state or anything like that, they can accomplish far, far more than either individuals or a small number of people.
00:14:51 Speaker_03
And this of course goes back to sapiens. This is the key to our success as a species, that we can build these huge networks
00:14:58 Speaker_05
We can build a network around money, this idea that this has some value or a deity or national identity.
00:15:05 Speaker_03
Yeah, so sapiens began to explore this idea. Nexus now goes over history and also the future and looks at it from the viewpoint of these networks.
00:15:15 Speaker_03
So, okay, if we establish that stories create networks and networks are important, let's look at history as the process, not of human actions, but of networks spreading, sometimes collapsing, changing their nature.
00:15:32 Speaker_03
So for instance, a chapter about democracy and dictatorship, which looks at them not as different ethical or ideological systems, but as different types of information networks. How information flows.
00:15:46 Speaker_03
Information flows differently in democracy and dictatorship. And this is what makes them so different. Dictatorships are centralized information networks.
00:15:57 Speaker_03
All the information or most of the information flows to just one place where all the decisions are being made. Putin's desk. Yeah, Putin's desk or Xi's desk or whatever. And also they lack strong self-correcting mechanisms.
00:16:10 Speaker_03
The network doesn't contain a mechanism for identifying and correcting the network's own mistakes. Democracy, in contrast, is a different kind of network. What characterizes it is that information doesn't flow just through a central hub.
00:16:27 Speaker_03
There is usually a central hub. So in the United States, a lot of information flows to Washington, but most of it doesn't. Probably more to New York.
00:16:35 Speaker_03
Yeah, most of the economic decisions, social decisions, cultural decisions are being taken in New York, in Los Angeles, in lots of other places.
00:16:45 Speaker_03
A lot of the information never passes through any government office, and you have strong self-correcting mechanisms. If the network makes a mistake, you don't need somebody from outside to intervene.
00:16:58 Speaker_03
The whole point about democracy, that you have these built-in mechanisms to identify and correct its own mistakes.
00:17:04 Speaker_05
Would you say the nature of decentralized democracy is that you're living in peer review, in essence, because there's no bottleneck on dissemination, because it's flowing in all directions?
00:17:16 Speaker_05
People are taking it in, passing it on, they're editing or they're calling out. Like, what is the mechanism that is helping the self-correction?
00:17:23 Speaker_03
But there are several, and it's important that there are several, because if you have only one, it can easily malfunction. So the most obvious one in a democracy is elections. Every few years, the people can say, oh, we made a mistake.
00:17:35 Speaker_03
Let's try something else, which you can't do in Russia, which you can't do in North Korea. In a place like Russia, you have to wait for somebody to die. Yeah.
00:17:43 Speaker_03
And even then, it's not like really the people who will make the decision of who replaces them. So in democracy, you have this mechanism that every couple of years, people can say, we made a mistake. Let's try something else.
00:17:55 Speaker_03
Of course, the problem, if you have only this, is that it can easily be rigged. The weakness of democracy since ancient times is that you basically give enormous power to one person or one party on condition that they give it back. After four years.
00:18:10 Speaker_03
And what happens if they don't? They have all this power in their hands. What happens if they use all this power to stay in power, to rig the elections? And we've seen it many times. In Russia, they have elections every four years.
00:18:21 Speaker_03
And presumably in the 1990s, when Putin first rose to power, the elections were relatively fair and free. Then he used his power to dismantle and to rig the elections. And you saw the same thing in Venezuela.
00:18:33 Speaker_03
Chavez originally came to power, as far as we know, in free and fair elections. But then Chavez and his successor Maduro, they used the power to destroy the democratic system and then stay in power.
00:18:45 Speaker_06
Yeah, they just had an election, in quotes, and it's a disaster.
00:18:50 Speaker_03
Yeah, I mean, Maduro lost big time, but because he appoints all the election officials and all the judges and everything, so he says, no, I won. This just in, I won. Yeah. So if you only have elections, this is not enough. You need an entire system.
00:19:06 Speaker_03
This is the famous checks and balances.
00:19:08 Speaker_03
And these checks and balances, like independent courts and free media and constitution and federal system, these are all basically, if you think about it in terms of information, these are the self-correcting mechanisms.
00:19:22 Speaker_05
Media is a big aspect of that, right? The news, because you need to know what the reality is. The election's rigged? We also have this other body that's trusted, an institution that can point this out, and we're liable to believe that.
00:19:34 Speaker_03
Yes. And of course, all these institutions can be dismantled, can be undermined. This is why you need a couple of them. So they kind of support and supervise each other. It's not perfect. Ultimately, you can destroy all of them.
00:19:49 Speaker_05
But, you know, nothing is perfect. So what you say in the book, which is interesting, is so the Bible, to go back to the image of Jesus, it's interesting. I have no image of Muhammad in my mind, and I have no image of Yahweh in my mind.
00:20:02 Speaker_05
Because it's forbidden. It's forbidden. Maybe to the detriment of the organization, ultimately, or not. It's a different way to do it. Yeah.
00:20:10 Speaker_05
If you look at the three main Old Testament religions and look at their success rate as they spread, you can think of them as companies that tried different branding strategies. You know, it's interesting, we've all seen the logo of Christianity.
00:20:22 Speaker_06
Yeah, but Judaism's doing like The Row. They're very exclusive.
00:20:26 Speaker_05
They don't want new members. Yeah, yeah.
00:20:27 Speaker_03
It's not a missionary religion. It's in a different game. Islam and Christianity, they are both missionary. And they, at different times in history, for much of history, Islam was much more successful than Christianity.
00:20:40 Speaker_03
In the last 500 years, Christianity became more successful, but still, you know, you have a billion and a half Muslims.
00:20:45 Speaker_05
But talk about the Bible in reference to Christianity and the information network. You point out in the book that this thing, the Bible, has no self-correction. No. And that's part of its gift as well.
00:20:58 Speaker_05
It's like we want correction and we want to find truth, but here's this thing that's now 2,000 years old or varying parts of it are old, unchanging and highly successful. So that's just curious and how does that come about?
00:21:10 Speaker_03
Well, if you look back in the history of religion, religions were not based on books, partly because there was no writing and no written documents at all. The problem in any network is how to agree on the basic laws of the network.
00:21:24 Speaker_03
And the solution suggested by religion is you rely on a superhuman authority. It's coming from outside, it's coming from above. But then the next problem immediately appears. I mean, every time you have a solution, you have a new problem.
00:21:38 Speaker_03
So whenever you say, okay, the solution to the problem of how we will decide the rules, oh, they will come from the gods, how do you know what the gods really want, what they really say?
00:21:48 Speaker_03
Unless you had a personal revelation, a personal visit by some goddess or god, it ultimately boils down to believing some human. I mean, you wanted to take the humans out of the loop, and you end up again believing some priest, some prophet, some guru.
00:22:05 Speaker_03
How can you trust them that they are not fooling you or lying to you? Or maybe they are deluded. Maybe they honestly tell you what they believe, but they are simply wrong.
00:22:13 Speaker_03
And then after the invention of writing, a new idea came that instead of trusting a human, let's trust a new technology. We don't think about the book as a technology, but the book is a technology.
00:22:26 Speaker_03
So this idea, we can't trust the humans, we should trust the technology, which is so central to debates about AI and algorithms today, this is a very old thing. I mean, you go back to the first millennium BCE,
00:22:39 Speaker_03
And first in Judaism, later it will be in Christianity, in Hinduism, in Islam, you have this idea. Instead of trusting a human, let's trust the technology of the book.
00:22:49 Speaker_03
The idea of a book, in contrast to a document, is that you have many, many copies of the same text. You can spread them everywhere. It's decentralized. It's basically like a blockchain. Every user has one.
00:23:01 Speaker_03
So I can change my Bible, but you have millions of other copies. So they immediately noticed
00:23:06 Speaker_05
And if you tell me the earth was made in eight days, I can go in here and go, no, no, it was seven. Yeah, exactly. But then you create the problem you already talked about, which is who wrote this thing.
00:23:14 Speaker_03
And then, yes, and then that's the next problem. Okay, once we have this brilliant idea, the problem arises, okay, so what will be in the text? And then you go back to humans.
00:23:24 Speaker_03
So if you look, for instance, at the process of editing the Bible, the key people who created the Bible are not the authors of the text, it's the editors.
00:23:33 Speaker_05
Nor had they met Jesus or the disciples.
00:23:36 Speaker_03
No, more than 300 years later. Jesus never read the Bible; it didn't exist in his day. Right. Also, Saint Paul never read a copy of the New Testament. In the first centuries of Christianity, Christians produced an enormous number of texts.
00:23:50 Speaker_03
You had stories about Jesus. You had stories about the other disciples and saints. You had all kinds of letters, like the letters of Saint Paul. You had all kinds of prophecies about
00:24:01 Speaker_03
the end of the world and different things and prayers and hundreds and hundreds of different texts. As you had more and more texts that often contradicted each other, people needed basically a recommendation list.
00:24:12 Speaker_02
The same way that you have so many series on television or Netflix or whatever. You need Obama to give his top 10 list. So eventually, almost 400 years after Jesus in the 390s,
00:24:27 Speaker_03
This is when a church council, basically a committee of bishops and theologians and so forth, meeting in Carthage, which is today in Tunisia, in North Africa, agrees on a recommendation list, like the top 27 texts every Christian should read, which becomes the New Testament.
00:24:46 Speaker_03
And their choices shaped the worldview of billions of people for the next 1,500 years and even more.
00:24:55 Speaker_03
And just to give you one example, there was a text which was very popular among Christians in the early centuries before this Council of Carthage called the Acts of Paul and Thecla.
00:25:06 Speaker_03
It told the adventures and miracles of Saint Paul and his female disciple, a woman called Thecla, who performed miracles and preached to people and baptized and was a leader of the church.
00:25:20 Speaker_03
And this text, it was not the only one, but it was one of the main texts that showed that women could be leaders of the church and basically do anything that the men in the church can do. And this was very popular.
00:25:33 Speaker_03
A Wonder Woman in the Hall of Justice. In another text, a letter from St. Paul to his disciple Timothy, which most scholars today agree was not written by St. Paul at all but was forged in his name in the 2nd century, more than 100 years after St.
00:25:51 Speaker_03
Paul was dead, in which Paul says that women should not take any leadership roles in the church.
00:25:57 Speaker_03
When important things are discussed, they should be silent, they should just obey whatever the men say, and their role is to bear children and be modest and obedient, and this is their way to salvation.
00:26:09 Speaker_03
And the council at Carthage, this committee, decided that the letter to Timothy would be in the New Testament, and it became 1 Timothy, which is now in millions of homes all over the world. And the Acts of Paul and Thecla, no, no, no, no, no.
00:26:23 Speaker_03
This stays out.
00:26:24 Speaker_02
It's not in the Bible. God, think how different.
00:26:27 Speaker_06
I know the whole world would have been.
00:26:29 Speaker_02
These editors. Yeah.
00:26:30 Speaker_03
And this is an editorial decision. Again, it's not the people who wrote the text. And you fast forward to the modern age, the power of editors is enormous. You think about the power of newspaper editors or the people who edit the news on television.
00:26:44 Speaker_03
If you think about politics. So, you know, Lenin, before he became dictator of the Soviet Union, his one job was editor of a newspaper. He was the editor of Iskra. This was his original power base.
00:26:57 Speaker_03
Benito Mussolini, before he became dictator of Italy, he was editor of a newspaper, Avanti. And editors are really powerful figures. And interestingly enough, and we are now jumping to the present and the future, one of the first jobs
00:27:14 Speaker_03
To be fully automated by AI is that of editors. The entities that now edit maybe the most important media outlets in the world, Twitter, Facebook, Instagram, TikTok, they are not human. No human could possibly do it. They are algorithms.
00:27:31 Speaker_03
But this immensely important decision, again, it's not the creation of the content, it's the recommendation, what to put at the top of the news feed, what to include in this recommendation list and what to leave out, which was done by the Christian bishops in Carthage and later by people like Lenin and Mussolini.
00:27:51 Speaker_03
Now it's algorithms. Paul, allegedly. I mean, there are some epistles that, from internal evidence and analysis of the language, scholars accept as authentic. This was really written by St.
00:28:02 Speaker_03
Paul in the middle of the first century. But some of the epistles, the letters attributed to him in the New Testament, are probably later forgeries. You can write any text and say, oh, St.
00:28:13 Speaker_05
Paul wrote it. What was interesting about learning about St. Paul was I had to ask myself, is Jesus's message so powerful and everlasting and interesting and strong? Or is it Paul's interpretation?
00:28:28 Speaker_05
Who has the magic ingredient that has spanned 2,000 years? And now when you say this, it's like, well, shit.
00:28:34 Speaker_05
It might not have been Jesus, and it might not have been Paul, it might have been the editors who are, you know, it's like, how do you even figure out what is that quintessential ingredient that made this thing so sticky and powerful?
00:28:46 Speaker_03
I think this is a good place to bring in bureaucracy.
00:28:49 Speaker_05
Okay, great.
00:28:49 Speaker_03
Because one of the main themes that are explored in Nexus is this tension between mythology and bureaucracy. Mythology focuses usually on a few heroes that do everything.
00:29:00 Speaker_02
I'm like, well, Jesus is the star of this whole thing.
00:29:02 Speaker_03
Yeah, he's the star.
00:29:03 Speaker_02
Maybe not. I don't know.
00:29:05 Speaker_03
Bureaucracy is, you know, these very complex institutions of hundreds and thousands and sometimes tens of thousands of people that usually you don't remember them, you don't think about them, but they really shape the world.
00:29:20 Speaker_03
And if you look certainly at the world of today, our lives, sometimes for better, sometimes for worse, are shaped by bureaucracies far, far more than by any kind of individual charismatic heroes.
00:29:35 Speaker_03
Part of the problem with understanding the world is that evolutionarily, we are kind of programmed to focus on the heroes. It's very difficult for us to understand bureaucracies. They are boring.
00:29:48 Speaker_03
Our brains are really not built to grasp how bureaucracies function. For thousands of years, artists have had a very important role in life: they explain reality to us.
00:30:01 Speaker_03
Even if they tell us fictional stories, they explain to us, for instance, how love works, how relationships work, and also how political power works. This is, I think, maybe the greatest failure of art throughout history.
00:30:14 Speaker_03
They have done a terrible job explaining bureaucracies. For every Marvel superhero movie, for every hundred Marvel superhero movies, maybe there is one movie about bureaucracy.
00:30:27 Speaker_02
When was the last blockbuster? The Big Short. The Big Short, exactly.
00:30:31 Speaker_03
The Big Short is a wonderful movie. It's really brilliant because it took a very hard subject. How to explain the kind of bureaucratic causes of the great financial crisis and not go to the easy place of, oh, there is some villain.
00:30:47 Speaker_03
No, let's focus on the bureaucratic system that caused it. And they did it brilliantly. But this is a very rare example.
00:30:58 Speaker_07
Stay tuned for more Armchair Expert, if you dare.
00:31:25 Speaker_01
So follow, watch, and listen to Baby, This Is Keke Palmer on the Wondery app, or wherever you get your podcasts. Watch full episodes on YouTube, and you can listen to Baby, This Is Keke Palmer early and ad-free right now by joining Wondery+.
00:31:38 Speaker_01
And, uh, where are my headphones? Because, uh, it's time to get into it. Holler at your girl!
00:31:44 Speaker_04
Hey, Armcherries, quick question for you. Have you ever stopped to wonder who came up with that bottle of sriracha sitting in your fridge? Or why almost every house in America has a game of Monopoly stashed away somewhere? Well, this is Nick.
00:31:57 Speaker_04
And this is Jack. And we just launched a brand new podcast called The Best Idea Yet. It's all about the surprising origin stories of the products you're obsessed with and the people who brought them to life.
00:32:09 Speaker_04
Like Super Mario, the best-selling video game character ever. He's only a thing because Nintendo couldn't get the rights to Popeye. Or Jack had that McDonald's Happy Meal. Believe it or not, the Happy Meal was dreamed up by a mom in Guatemala.
00:32:21 Speaker_04
Every week on The Best Idea Yet, you'll discover the surprising stories behind the most viral products of all time, while picking up real business insights along the way.
00:32:29 Speaker_04
We guarantee you'll be that person at your next dinner party dropping knowledge bombs at the table. Follow The Best Idea Yet on the Wondery app or wherever you get your podcasts.
00:32:38 Speaker_04
You can listen to the best idea yet early and ad-free right now by joining Wondery Plus.
00:32:44 Speaker_00
In a quiet suburb, a community is shattered by the death of a beloved wife and mother. But this tragic loss of life quickly turns into something even darker. Her husband had tried to hire a hitman on the dark web to kill her.
00:32:59 Speaker_00
And she wasn't the only target. Because buried in the depths of the internet is The Kill List. A cache of chilling documents containing names, photos, addresses, and specific instructions for people's murders.
00:33:14 Speaker_00
This podcast is the true story of how I ended up in a race against time to warn those whose lives were in danger. And it turns out, convincing a total stranger someone wants them dead is not easy.
00:33:27 Speaker_00
Follow Kill List on the Wondery app or wherever you get your podcasts. You can listen to Kill List and more Exhibit C True Crime shows like Morbid early and ad-free right now by joining Wondery+.
00:33:38 Speaker_00
Check out Exhibit C in the Wondery app for all your True Crime listening.
00:33:49 Speaker_03
Part of the reason why you have all these conspiracy theories about the deep state, because people really find it hard to understand how big bureaucracies function, and then they fall easy prey to these conspiracy theories.
00:34:02 Speaker_03
Bureaucracy, of course, we have many examples from history for how they can harm us, but society cannot function without them, and they do many, many important and beneficial things for us.
00:34:13 Speaker_03
Whenever people tell me about the deep state, I immediately think about the sewage system. Of course you do, that's obvious, but tell us why. The sewage system is the deep state.
00:34:25 Speaker_03
You have this kind of system of pipes and pumps and who knows what running under our houses and streets and neighborhoods and saving our lives by separating the sewage water from the drinking water. And this is a bureaucratic system.
00:34:42 Speaker_03
If you want to dig a well, you have to fill so many forms. Why? And where did it start? For most of history, big cities and small towns also had no sewage system. And one of the results was they were extremely, extremely unhealthy places.
00:34:57 Speaker_03
The bigger the city, the worse. And in the middle of the 19th century, there was a cholera epidemic in London. People began to die in large numbers. And this was periodic. I mean, like every few years, there would be a huge
00:35:10 Speaker_03
cholera epidemic and thousands and thousands of people died. And people thought it was something in the air. Some people had all these conspiracy theories, maybe witches or magic or who knows what is causing cholera.
00:35:22 Speaker_03
And then there was a doctor called John Snow, not the character, not the character from Game of Thrones. I'm going to picture him anyways. That's a handsome hero, by the way.
00:35:32 Speaker_05
So a sexy, shorter, thick, curly hair, X-Factor man.
00:35:37 Speaker_03
And he was a doctor. Oh, even better. Didn't fight any dragons or zombies or whatever. He suspected the problem was in the water. And instead of, again, waving a sword around, he waved a pen. He started making lists, which is what bureaucrats do.
00:35:52 Speaker_03
Bureaucrats take pen and paper and make lists. And he went around London interviewing people who were sick with cholera, or somebody in their family had died, like their child died of cholera. He heard about it.
00:36:03 Speaker_03
He came to the house and started interviewing the parents, the siblings about the habits of the family and especially of the person who got sick or died. And he wanted to know where did they get the drinking water from?
00:36:16 Speaker_03
And through thousands and thousands of these boring bureaucratic lists, he managed to pinpoint a certain water well in Soho, central London, and he managed to connect most of the people who fell sick with cholera.
00:36:31 Speaker_03
At a certain moment, they drank water from that well. Wow. He reported his findings to the local officials and convinced them to just block the well. And the epidemic stopped. Wow.
00:36:44 Speaker_05
So he didn't even have causality. He just had correlation.
00:36:46 Speaker_03
He saw correlation. Then when they started investigating more deeply, they found that the well was right next to a cesspit; there was just about one meter between the drinking water and the cesspit.
00:37:01 Speaker_03
Sewage from different places in the neighborhood was going into it. This caused the cholera epidemic. And this was one of the kind of foundational events
00:37:09 Speaker_03
of modern epidemiology and modern hygiene and prompted first in the UK, then all over the world, this idea that we actually need to organize a sewage system and to make sure that it remains separate from the drinking water.
00:37:26 Speaker_03
And today, if you want to dig a well or a cesspit in London, you need to fill all these forms and get permits, which is a good thing. Yes, yes, yes. Because ultimately, it saved millions and millions of lives.
00:37:39 Speaker_03
Now, it's not something that anybody did a blockbuster about.
00:37:43 Speaker_05
You're right. You'd have to give them, as you said in a previous interview, not him specifically, but you'd have to give them a love interest. We'd have to be tracking something much more interesting, and that'd have to be happening in the background.
00:37:53 Speaker_06
Or just cast Ryan Gosling. That seems to work pretty well.
00:37:56 Speaker_05
You could probably do that, too. Yeah, yeah.
00:37:59 Speaker_03
Oh, bring Jon Snow. Or John Snow, yeah.
00:38:00 Speaker_06
That would be such a fun little Easter egg in there.
00:38:03 Speaker_05
That's really meta because he came back in the show and now he's coming back in another time period. Yeah. Yeah. You say bureaucracies are all around us and they affect us so much. If you're applying for a loan, that's a bureaucracy.
00:38:18 Speaker_05
If you're applying to a college, that's your whole life. That's a bureaucracy.
00:38:22 Speaker_03
Bureaucrats are experts in the flow of information. This is what they do. They are the plumbers of the information system. You have this information flowing everywhere. There is so much information.
00:38:32 Speaker_03
You need experts in how to archive the information and how to find the information later on. The word bureaucracy, it comes from French. In the 18th century, a bureau was a writing desk.
00:38:46 Speaker_03
So what is the literal meaning of bureaucracy? Democracy is the rule of the people, the demos. Autocracy is the rule of one person, the autocrat. And bureaucracy is the rule of the writing desk. So who rules society?
00:39:02 Speaker_03
not the king and not the people, a writing desk rules society. What does it mean?
00:39:08 Speaker_03
That you have basically this archive of all these documents and somebody is sitting next to a writing desk with all these drawers and things and takes out a document from here and puts a document there and this is what runs society.
00:39:22 Speaker_05
This is the big unsexy force that's really running everything that we kind of are unaware of at most times.
00:39:28 Speaker_03
And again, it can be good, it can be bad, but nobody has found any other way to run large-scale information networks. Whether it's an army, a corporation, a church, a university, this is how it works.
00:39:42 Speaker_03
Until now, with the rise of AI. When people think and talk about the dangers of AI, they have in mind the great robot rebellion, and the Terminator, and the robots running in the streets and shooting people.
00:39:56 Speaker_03
And this is very unlikely to happen anytime soon, which makes people, at least in many tech companies, complacent because, you know, we grew up on the Terminator and there is no Terminator scenario.
00:40:06 Speaker_05
The roboticists are so far behind this other stuff. We got time to worry about that.
00:40:11 Speaker_03
Exactly. But the real danger is not the Terminator, it's the AI bureaucrats.
00:40:16 Speaker_05
What I like about this is, honestly, if your book was just about sounding the alarms we've already heard, well, we've already kind of heard them.
00:40:22 Speaker_05
The other great fear, you know, is that deepfakes will exist and sway the elections, or that AI is going to take away your job. And so we're so aware of that already.
00:40:31 Speaker_05
But I think yours is much more interesting and more kind of foundational to who we are.
00:40:37 Speaker_03
When you look at where AI is really starting already in the last few years to have a huge impact on our lives, it's not the killer robots and it's not even the fake videos. It's the bureaucrats.
00:40:49 Speaker_05
Because they're handling administration, which nobody wants to do, right?
00:40:53 Speaker_03
Even if many people want to do them, the AIs just do it better. Yeah, and more efficiently. In the bank, you apply for a loan, and it's increasingly an algorithm that decides whether you get the loan or not.
00:41:04 Speaker_03
You apply for a job, for a place in a university, and it's increasingly the AI making the decisions. And as I mentioned before, maybe the first really crucial job which was automated was that of editors. And this, we saw, is the kind of social media disaster
00:41:18 Speaker_03
over the last 10 years or so. 15 years ago, we had this promise that social media would spread the truth and dictatorships would fall, freedom and democracy would flourish.
00:41:30 Speaker_03
Fast forward to the mid-2020s, and we see that democracies all over the world are in very, very deep crisis. The democratic conversation is breaking down. You know, democracy is basically a conversation.
00:41:44 Speaker_03
In a dictatorship, one person dictates everything. Democracy is a group of people talking to each other, trying to reach a decision together. And what you see all over the world today is that people can't talk to each other anymore.
00:41:57 Speaker_03
Certainly they can't listen. Yeah, they can talk just fine. But nobody listens. In every country you have the special explanations of that country. Like in the US, why can't Republicans and Democrats agree on anything basically?
00:42:10 Speaker_03
So you have all these special explanations of US history and society. But then you go to Brazil, the same. You go to Israel, it's the same. You go to France, it's the same.
00:42:18 Speaker_05
I'm trusting you because I'm ignorant. Is that the case? This is a global phenomenon? Totally global. Is everyone as scared as we are in these countries?
00:42:27 Speaker_03
They can't agree on anything except that the conversation is breaking down, that for some reason, you know, things like bipartisan laws, which were very common previously, are now becoming impossible.
00:42:39 Speaker_03
Every election feels like a war, like this might be the last election. If the other side wins, this is the end. Existential. The ideological gaps today are actually not bigger than they were in the 1960s. And they may be smaller.
00:42:54 Speaker_03
If you think about the 1960s, with the civil rights movement, the Vietnam War, the Cold War, the sexual revolution, and yet there were elections. And when Kennedy or Johnson is elected, the Republicans agree that Johnson is now the president.
00:43:09 Speaker_03
When Nixon wins, nobody says the election was stolen. There was lots of violence, assassinations, riots, and so forth. But people can still have a conversation and agree on some basic facts.
00:43:20 Speaker_03
And today, the ideological differences are actually much smaller.
00:43:23 Speaker_02
And yet people can't agree on anything. The reigning explanation is tribalism. But what is driving it? Yeah, right. So I think that is the symptom. Exactly.
00:43:32 Speaker_03
Because if there were really huge ideological gaps, then you would say, OK, you have these ideological tribes. But because the ideological gaps are actually smaller than they were, say, 50 years ago, that's not a good enough explanation.
00:43:45 Speaker_03
and especially when you see it all over the world. And the best explanation that I'm familiar with, it's the information technology. Again, democracy is built on top of information technology. Information is not something on the side, like a side dish.
00:44:00 Speaker_03
It's the foundation.
00:44:01 Speaker_05
It's defined by the fact that it's flowing in every direction.
00:44:03 Speaker_03
Yeah, it's a conversation. So for most of history, large-scale democracy was simply impossible because the necessary information technology was lacking.
00:44:14 Speaker_06
Right, you don't have like a newspaper or... To disseminate all the information.
00:44:17 Speaker_03
Exactly. In the ancient world, we know of democracy in many places, but only on a small scale. If you think about the most famous examples, ancient Athens, ancient Rome, they're city-states.
00:44:29 Speaker_03
We have many examples of smaller towns and tribes that function democratically because people can hold a conversation.
00:44:36 Speaker_03
Like if you need to decide whether to go to war with Sparta or not, a large percentage of the Athenian citizenry can come together in one place and talk.
00:44:45 Speaker_03
And listen, but if you go beyond a single city, there is just no way for millions of people spread over large territory to hold a conversation. So we don't know of a single example of a large scale democracy in pre-modern times.
00:45:00 Speaker_03
Only once you begin to have newspapers, and then radio and television and all that, you begin to see large scale democracies.
00:45:06 Speaker_03
which also explains why editors of newspapers or TV stations are such important political figures, because they shape the conversation. Now, what happened in the last 10 years, especially with social media, is that the algorithms took over.
00:45:21 Speaker_03
The conversation is now managed to a large extent by non-human entities. And these algorithms were given a goal
00:45:30 Speaker_03
by the social media companies, which was to increase the time people spend on Twitter, Facebook, TikTok, and so forth, and increase their engagement. They press like, they share a story with their friends. Why? Because this is good for their business.
00:45:46 Speaker_03
The more time people spend on the platform, the more advertisement you sell, the more data you collect, the more money you make. Very simple. Now, engagement sounds like a good thing. Who doesn't want to be engaged?
00:45:57 Speaker_03
But the algorithms, by experimenting on billions of people, like human guinea pigs, they discovered that the easiest way to engage people is with outrage and hatred and fear.
00:46:11 Speaker_03
If you press the fear button or the hate button in a person's mind, they become very engaged. So in pursuit of user engagement, they started flooding societies with outrageous content full of greed and hate and fear. And you see it all over the world.
00:46:28 Speaker_05
You know, before you said just now that the algorithm was the editor, I would have said, yeah, what you saw with social media is no editor. That's actually how I thought of it: like, all right, this is a news channel without an editor.
00:46:40 Speaker_05
But in fact, it did have an editor, but it was a computer.
00:46:43 Speaker_03
The editor decides what will be at the top of your news feed. Or in TikTok, what is the next video? Somebody needs to decide it. It's not random. Absolutely not. And this editor is not a person anymore. It's not Lenin. It's not Mussolini. It's an algorithm.
00:46:59 Speaker_06
Yeah, and it doesn't have morality on its side at all.
00:47:02 Speaker_03
No.
00:47:02 Speaker_06
It just has one goal.
00:47:03 Speaker_03
The managers of the companies, they were not evil. They didn't know that this would be the result. They gave this immense power to the algorithms, which then resulted in something very unexpected.
00:47:16 Speaker_03
And this is a cautionary tale about the current era of AI. because we are giving algorithms more and more power in more and more fields. And you know, the social media algorithms, they were very primitive AIs.
00:47:29 Speaker_03
And nevertheless, there is such a huge impact on society. And again, it's bureaucratic. It's not a Hollywood blockbuster about some villain in a cave. Ah, let's destroy democracy. No.
00:47:41 Speaker_03
It's this: you have a company, it has an algorithm, it told it it wants to increase its revenues, so it wants to increase engagement, and then the algorithm starts spreading. People produce so much different content.
00:47:53 Speaker_05
Outrage is not the only thing people produce. It's actually misleading. You think Twitter and all these places are just a cesspool of hate, and albeit they are, we don't know what the percentage is.
00:48:04 Speaker_05
It might be 98% positive, but that's not what's going to make it to us.
00:48:07 Speaker_06
It's what you're fed is the 2%.
00:48:09 Speaker_03
which is the most engaging. This is the decision of the algorithm. And again, with all the discussion you have today, whenever people confront the social media companies with this, their defense is free speech.
00:48:20 Speaker_03
Like you talk to Elon Musk, you talk to these people, oh, free speech, free speech. But the key point is that we don't need to censor human expression. We need more responsible algorithms. I agree with Elon Musk and others
00:48:36 Speaker_03
that it's not the job of companies to censor human expression, except in extreme cases. Sure, sure. But the companies should be liable for the actions of their algorithms, not of their users. Again, you have millions of users.
00:48:52 Speaker_03
Some of them produce fake news and conspiracy theories and all kinds of lies. And you're like, have at it.
00:48:58 Speaker_05
You're allowed to do that. People should be allowed to do it. But don't create a filter that is going to prioritize that kind of content. Yeah, exactly. If it's free speech, everyone's voice is equal. Let everyone yell. Let's see.
00:49:12 Speaker_05
But if you get in there and meddle with what is news or what is important or what is trending, well, then you're interfering.
00:49:18 Speaker_03
That's the editorial job. If you think, I don't know, like a hundred years ago, you're the chief editor of the New York Times, this is your job.
00:49:24 Speaker_03
Every day you have so many stories and voices and journalists and ordinary people coming to you with their stories, and you make a decision, okay, this will be the main headline on the front page. You didn't write the story,
00:49:39 Speaker_03
Your power as editor is to say, this will be the main story on the front page. Again, this goes back to the Council of Carthage. This was the decision of these bishops, what would be in the top 27 recommendations for Christians.
00:49:52 Speaker_03
And now this is what the Twitter algorithm is deciding. And this is extremely important. And for that, the company should be liable.
00:50:01 Speaker_05
OK, let's go through a bit of the technologies. You don't cover all of them, and you say in the book, I'm not going to cover all of them, but I'm going to cover a couple of them, because it's counterintuitive when you really learn the history of it.
00:50:12 Speaker_05
Certainly for me, the printing press was one where I was like, Okay. So what is the common misconception about the printing press that it brought on the... Scientific revolution. Yes.
00:50:23 Speaker_03
That Gutenberg brings print to Europe and you have the flowering of the scientific revolution, Copernicus, Newton. This is a very skewed view of historical reality because actually you have almost 200 years from the print revolution in Europe
00:50:41 Speaker_03
until you really see the flowering of the scientific revolution in Europe with people like Newton in the 17th century. Newton is mid to late 17th century. Gutenberg is the mid 15th century.
00:50:53 Speaker_03
During those 200 years, you have the worst wars of religion in European history. You have the worst witch hunts in European history, which were also fueled to a large extent by print, because it's the same as with the story of social media.
00:51:08 Speaker_03
Print simply makes it easier to spread information, but it makes it easier to spread conspiracy theories and fake news and illusions and lies just as much as facts.
00:51:20 Speaker_05
Nobody was reading about the heliocentric theory when the printing press came out. No one was reading anything by these scientists.
00:51:27 Speaker_03
Very, very little. The big bestsellers of 15th and 16th century Europe, it was not the scientific tracts of people like Copernicus. It was witch hunting manuals.
00:51:37 Speaker_05
The hammer of the witches was like Titanic.
00:51:41 Speaker_03
It was huge. One interesting thing to know about witch hunting, there was very little witch hunting in medieval Europe. We tend to associate witch hunting with the Middle Ages, you know, Monty Python and the Holy Grail. No, it's a modern phenomenon.
00:51:54 Speaker_03
Medieval Europeans did not care very much about witches. Yeah, they thought maybe some old woman in the village, she knows how to make some love potions. And if your cow is missing, she can use magic to find the missing cow. But this was basically it.
00:52:07 Speaker_03
And then in the 15th century, a few people came up with the conspiracy theory that actually, there is a worldwide conspiracy of witches led by Satan that aims to destroy humankind. And they have agents, local witches, in every town and every village.
00:52:24 Speaker_03
And at first, very few people bought into this conspiracy theory. Very few people even heard about it. Then you had this one person called Heinrich Kramer. He was really mentally unhinged. And he was a churchman, but a relatively junior one.
00:52:38 Speaker_03
And on his own initiative, because he believed in this conspiracy theory, he began a witch hunt in what is today Austria, in the Alps. And he arrested people, mostly women. He had sexual obsessions about women.
00:52:50 Speaker_03
And he thought that the witches were mostly women, which was not the case in the Middle Ages. But he was stopped by the church authorities, who thought that this man was completely crazy. And they released the suspects and kicked him out of the area.
00:53:02 Speaker_03
and he took revenge through the printing press.
00:53:05 Speaker_03
He wrote a book with all his kind of mad fantasies about the global conspiracy of witches, which became one of the biggest bestsellers in early modern Europe and shaped how people view witches to this very day.
00:53:21 Speaker_03
One of his key messages was that the witches were mostly women, that it was driven by sex. There were also baby sacrifices. Every now and then you have thousands of witches gathering together to eat children and engage in orgies.
00:53:35 Speaker_05
These are cannibalistic orgies. People just couldn't get enough of it. Yeah. Cannibalistic orgy is such a good page-turner.
00:53:40 Speaker_02
Well, salacious.
00:53:41 Speaker_06
Do you want something salacious?
00:53:43 Speaker_02
Much more interesting than Copernicus and his mathematics. Life's boring, and the scientists made it even more boring, and this makes it exciting.
00:53:50 Speaker_03
Just one example out of the book to understand the flavor of The Hammer of the Witches. There is an entire chapter about the ability of witches to steal men's penises. And we have evidence there, evidence-based.
00:54:04 Speaker_03
So he gives this story of a man who wakes up in the morning to discover that his penis is missing. Now he immediately suspects the local witch must have done it. So he goes to the witch and kind of coerces her, bring me back my penis.
00:54:18 Speaker_05
Yeah, you gotta get it back.
00:54:19 Speaker_03
Yes, and the witch says, okay, okay, okay. She tells him, climb this tree. And he climbs the tree and finds a bird's nest at the top of the tree, full of penises. This is where the witch keeps the penises she stole from men.
00:54:32 Speaker_03
And she tells him, okay, you can take yours. And of course he takes the largest one. Yes, how could he resist? So the witch tells him, no, no, no, you can't take this one. It's not yours. This belongs to the parish priest.
00:54:45 Speaker_06
Oh wow, the priest had the biggest dick in the town. This is a comedy sketch.
00:54:49 Speaker_03
But you understand why this book sold more than Copernicus and his kind of mathematical calculations of how the planets move. And this sounds funny, but it led to huge tragedies.
00:55:02 Speaker_03
Tens of thousands of people, mostly women, were accused of witchcraft and executed in the most horrendous ways.
00:55:10 Speaker_05
You're probably not well-versed enough in American culture. Do you know who Lorena Bobbitt is? Because this witch is the original Lorena Bobbitt. Famously, she cut her husband's penis off. This was in the 90s.
00:55:22 Speaker_05
And then she took it with her in the car and threw it out the window in a construction site. And they had to recover the penis from the construction site.
00:55:30 Speaker_05
And all of America, again, they had our entire attention in the same way that likely this story did. You think that sounds preposterous. Rewind to when we had Lorena Bobbitt and it was on the cover of every single- That was real.
00:55:42 Speaker_05
That was real, but still, that was as good of proof as you got. And I wanted to ask, do you think your brain plays a trick on you? Especially back then, when the things that were originally
00:55:51 Speaker_05
printed were things like the Bible, when you read something that's in this form, does it add a credence to it that is imagined?
00:55:58 Speaker_03
Before Gutenberg, you have very few books. People went throughout their lives basically seeing a single book, the Bible. And then you have print, and suddenly you see the Hammer of the Witches in the same form and shape as the Bible.
00:56:12 Speaker_05
This trusted medium.
00:56:13 Speaker_03
And this is the first generation. We are talking just a couple of decades after Gutenberg. So people, they still find it hard to evaluate the trustworthiness of books. It's the original problem you're talking about.
00:56:25 Speaker_03
Like today, you know, with videos. We grew up in a world where, you know, you can fake words on paper, obviously. You can write anything you want on paper, but you can't fake video. So you believe video. And now no longer. And we need time to adjust.
00:56:39 Speaker_03
And this was true also with books in the 15th century. This had terrible consequences. Tens of thousands of people executed. In Europe, in America, you had the Salem witch hunts.
00:56:50 Speaker_03
And the thing is, three centuries after Salem, today you have millions of Americans again believing in a worldwide conspiracy of witches led by Satan with all these cannibalistic and pedophilic orgies to take over the world. QAnon and all that.
00:57:05 Speaker_03
I mean, it's basically Heinrich Kramer
00:57:08 Speaker_05
and the Hammer of the Witches. Well, can I tell you, every time we have a very outspoken liberal guest on this show, in the comments, I will predictably see many, many comments, aren't they a pedophile? It's the same thing.
00:57:20 Speaker_03
It's really the same, because it really goes back to the Hammer of the Witches. One of the key accusations against witches is that they either sexually abuse or mutilate and kill and eat children and babies. It's not just similar.
00:57:35 Speaker_03
It really comes from there.
00:57:36 Speaker_05
It's directly literal.
00:57:38 Speaker_03
It's mind-blowing to think that more than 300 years after the Salem witch hunts and witch trials, America again has a problem with lots and lots of people, including in politics, believing in these witches conspiracy theories.
00:57:53 Speaker_05
Yes, we don't call them witches per se anymore, but they're evil. They're driven by Satan. Who worships Satan? Witches.
00:58:01 Speaker_03
In a way, even if you think about this story about the witches stealing penises, not literally, but there are many people who now believe in a global conspiracy of women to steal men's penises, at least metaphorically.
00:58:14 Speaker_06
Right, exactly.
00:58:15 Speaker_02
Yes. Well, stories that work, they tend to work forever. Yeah, that's the power of mythology.
00:58:20 Speaker_06
Despite all the facts and truth since then, it's still so sticky.
00:58:25 Speaker_05
Well, that's a great time for you. My favorite thing you were saying at the beginning of your Daily Show interview, and again, this is very self-serving, but...
00:58:34 Speaker_05
I'm constantly trying to talk people out of watching the news non-stop and reading all this stuff non-stop. Their defense is always the same. My responsibility is to stay informed. They feel like there's a civic duty to stay informed.
00:58:47 Speaker_05
And please just tell us about the difference between information and truth and how much information we're getting and what impact that is having on us really understanding the truth.
00:58:56 Speaker_03
Two important things. I mean, first of all, information isn't truth. Most information, as we said in the beginning, it's junk. It's fiction and delusion and lies. And there are three reasons for it.
00:59:10 Speaker_03
First of all, the truth is costly, whereas fiction is cheap. You want to produce a truthful account of whatever, of an economic crisis, of history. You need to invest time, money, effort in doing research, analyzing.
00:59:24 Speaker_03
If you just write the first thing that comes to your mind, it's cheap. Secondly, the truth tends to be complicated because reality is complicated. Fiction can be as simple as you would like it to be. John Snow wanted to investigate what causes cholera.
00:59:39 Speaker_03
Not only did he have to invest a lot of time and effort, but the actual process, oh, there are these tiny microbes in the cesspool and they get into the drinking water and then they get into your body. Very complicated to understand the pandemic.
00:59:52 Speaker_03
If you just believe, oh, it's witches that are casting a black magic on us, that's very simple.
00:59:58 Speaker_02
Very simple.
00:59:58 Speaker_03
Most people prefer simple stories. And the last thing, the truth tends to be painful sometimes. Not always, but often the truth about us personally or collectively is unattractive, whereas fiction can be as attractive as you would like it to be.
01:00:12 Speaker_05
The truth also regularly requires a ton of effort to fix, whereas these very simple explanations generally have a very simple solution. If you think America's fucked, get rid of the liberals. Done.
01:00:24 Speaker_05
If you think America's fucked, get rid of the conservatives. Done. Not thinking about multi-generational wealth disparities and the katrillion things that add up to our social dilemmas. They're so complicated.
01:00:36 Speaker_05
They need like a 400-prong approach to solve, versus: they're bad, they're perpetuating it, get rid of them, and it's all solved. It's just very appealing.
01:00:44 Speaker_03
Yeah, that's the attractiveness of conspiracy theories. They have a very simple explanation to lots of bad things that happen and also potentially a very simple solution.
01:00:53 Speaker_03
If you have just one group of people who are causing all the problems, get rid of them. Problem solved.
01:00:59 Speaker_05
Jews in 30s Germany.
01:01:01 Speaker_06
Or get rid of the women, they're witches. Which to me is just so hilarious because I don't know the percentage, we'll find it in the fact check, but the percentage of pedophiles I would guess is mostly men.
01:01:13 Speaker_06
So the fact that it got put on women and then continues to be this story we tell despite so much factual evidence that it's not true is so
01:01:22 Speaker_03
bonkers. Powerful stories are sticky. No matter how much evidence you bring against them, they come back again and again. Going back to your original question: just more information is not necessarily good for you, because most information is not truth. You need to make more effort than just consuming more information.
01:01:41 Speaker_05
Our go-to answer has been more information will solve everything. We've been living in that paradigm for quite a long time.
01:01:46 Speaker_03
And this is simply mistaken because fiction is cheaper, simpler, more attractive than the truth. If you simply flood people, flood society with more and more information, the truth will not float to the surface, it will sink to the bottom.
01:02:01 Speaker_03
What you need if you want the truth, again, it's a boring answer, you need institutions. You need institutions to do the hard work of sifting through this flood of information and getting the nuggets of truth out of it.
01:02:16 Speaker_03
If you think about the print revolution, the way it eventually contributed to the scientific revolution was only after people began forming scientific institutions like scientific publications and associations that did the hard job of investigating the evidence and questioning the different models and theories and then recommending to people
01:02:39 Speaker_03
don't read Hammer of the Witches, even though it's very attractive and you can find a copy in every place. Instead, read Copernicus, even though it's very complicated and boring, because this is the truth.
01:02:50 Speaker_03
And this is the role of, again, scientific publications and responsible newspapers and institutions in general
01:02:58 Speaker_05
academia, government agencies?
01:03:01 Speaker_03
Depends. Any institution can be corrupted. This is why it's never a good idea to trust just one. You need several to balance each other. But the way from information to truth is complicated and it passes through institutions.
01:03:15 Speaker_03
The other important point to make is that information is basically like the food of the mind. And the same way that we know that more food is not always good for the body, so more information is not always good for the mind.
01:03:29 Speaker_03
If you think about physical food, food for the body, so a century ago, food was scarce. People ate whatever they could find, and particularly things with lots of fat and sugar, because this had a lot of energy. And it made sense back then.
01:03:44 Speaker_03
But now in many countries, there is an overabundance of food and you have all these industrial foodstuffs, junk food, which is artificially pumped full of fat and sugar and salt. It's very addictive. But people have learned this is actually bad for me.
01:04:00 Speaker_05
It's also cheap, like your information source.
01:04:03 Speaker_03
And they learned, this is actually bad for me. I need a diet. I need to limit the amounts of food I consume and to be quite careful about which food I put into my mouth. And the same with information.
01:04:17 Speaker_03
Information was previously scarce, so people would consume any book they could find. Now information is overabundant, and much of it is junk information, which is again artificially pumped full of greed and hate and fear. And this makes our minds sick.
01:04:36 Speaker_03
and makes our societies sick. And we need an information diet, which means actually limit the amount of information. More information isn't always good for us.
01:04:45 Speaker_03
And to be mindful about which kind of information we consume, and also actually devote more time to digestion than just putting more in. You know, with food it's obvious.
01:04:56 Speaker_03
If you just eat all the time and you don't give your stomach time to digest, not a good idea. Same with your mind. If we just sit for hours putting more in, we don't give the mind any time to digest.
01:05:09 Speaker_03
So we sometimes need information fasts, when we don't consume any new information, we just kind of digest. Try to synthesize what's already up there. And this is a very, very important part of making ourselves really informed.
01:05:26 Speaker_07
Stay tuned for more Armchair Expert, if you dare.
01:05:39 Speaker_05
Well, you think about the pace, too. It's like the difference between sitting down and reading even a long form article that might take you 45 minutes to read with a central idea.
01:05:49 Speaker_05
The amount of time given to really understanding that, maybe being critical of it or scrutinizing parts of it. Versus when I'm scrolling, I'm getting hundreds of headlines. It's like, I'm starting a hundred thoughts.
01:06:04 Speaker_02
Not finishing any of them. No, I don't even go to the article.
01:06:06 Speaker_05
I'm like, oh, they figured that out. And that's how, you know, so often you read the article, right? It doesn't resemble the headline at all. You're like, how are they ethically even putting that headline on there? That's not what the article says.
01:06:15 Speaker_05
So just, yeah, the pace too.
01:06:17 Speaker_06
Yeah, but that's why it's scary because we do believe generally that these news organizations are the editors that are parsing out what's true and not true. At this point, that's not the case anymore because everything's also a business.
01:06:30 Speaker_03
Depends. I mean, some institutions, organizations, newspapers, they spend a lot of time and effort in building these mechanisms to tell the difference between what is fake news and what is evidence-based and what is the truth.
01:06:45 Speaker_03
And journalists, at least in kind of good newspapers, they spend years just in journalism school learning, how do you know whether a story is reliable or not? If I look at myself, I'm a historian.
01:06:58 Speaker_03
I spent almost 10 years in university learning how to tell the difference between reliable and unreliable historical evidence. How do you know what happened in the 15th century?
01:07:10 Speaker_03
So okay, you found some piece of paper from the 15th century with something written on it. How do you know how to evaluate it? This is not easy. This is why people go to university and spend years.
01:07:20 Speaker_03
And again, when you become a professional journalist or historian, they don't immediately appoint you as chief editor or professor or chief of the department. It takes years of experience.
01:07:31 Speaker_03
And now with social media companies, the people who write the algorithms, maybe they did not spend even a single day in journalism school or in some other process in which they devote time to understanding, how do you know the difference between reliable and unreliable?
01:07:47 Speaker_03
And you hear them say it. When you tell some of these people in the high tech companies that they need to differentiate between truthful and untruthful information, they will tell you, but who am I to do it? How can I do it?
01:08:00 Speaker_03
I'll just leave it to the audience.
01:08:01 Speaker_03
Part of the problem, if we go really deep to understand the problem, is again, it's this kind of very cynical and conspiratorial mindset that you see in many people today, that they think all institutions are conspiracies.
01:08:15 Speaker_03
that all newspapers, all TV stations, all universities, they don't care about the truth at all. They are only these kind of elite cabals trying to gain power. This is the populist viewpoint.
01:08:28 Speaker_03
And interestingly, you see it on the right with the populists, but also on the left with the Marxists.
01:08:33 Speaker_05
Yes.
01:08:33 Speaker_03
Because historically, it actually goes back to Marx. And this is one thing on which Donald Trump agrees with Karl Marx.
01:08:41 Speaker_03
They both have the same basic negative view of humanity, that the only reality is power, that humans only care about power, and that all human interactions are power struggles.
01:08:52 Speaker_05
Yeah, there's no truth. There's only an incentive for the person telling you the information.
01:08:56 Speaker_03
Exactly. If somebody tells you something, a journalist, a history professor, epidemiologist, whoever.
01:09:03 Speaker_05
They're giving you that info to support and justify their position of privilege.
01:09:08 Speaker_03
They have some privilege. Whatever they say is just to defend their privileges or to gain more privileges. You hear it from the Marxists and you hear it from the Trump supporters. It's interesting.
01:09:20 Speaker_03
They have different views about other things, but about this, they agree. This is where the circle kind of meets. And this is an extremely cynical view of humans. And it's also wrong. Yes, humans want power, but it's not the only thing they want.
01:09:33 Speaker_03
Humans also have a genuine interest in the truth. Any one of us, if we look at ourselves, we would soon realize that besides wanting power in some situations, we really want to know the truth about ourselves, about life, about the world.
01:09:47 Speaker_05
It's a false dichotomy. Ideally, you can get power through telling the truth. That's an option. People do get power by telling the truth.
01:09:54 Speaker_03
In many cases, again, if you think about John Snow and kind of the medical establishment, they got a lot of power, to a large extent because they really uncovered all kinds of facts that people didn't know.
01:10:06 Speaker_03
For instance, that cholera is caused by infected waters. And this is why they gain power, because they told people the truth. And yes, there are scandals and there is corruption in medical and scientific establishments like everywhere else.
01:10:20 Speaker_03
This is why you need these checks and balances. No institution is perfect, but if you just distrust all institutions, then society collapses.
01:10:29 Speaker_05
Yeah, and you say a lot of these people are operating under this completely erroneous belief that they will find the truth themselves. Which is impossible. It's not possible. I think we need to put a fine point on that.
01:10:41 Speaker_03
Science is a team sport and not a team of 10 people, but millions of people. I look again at my own research. So how do I know anything about witches in the 15th century? In many cases, I don't even read the relevant languages.
01:10:55 Speaker_03
To read 15th century German? I don't even read 21st century German, and 15th century German is very different. Even Germans today, they read a German text from the 15th century, not easy. And I can't decipher the handwriting.
01:11:08 Speaker_03
They invented print, but still much of the relevant documents, they are in handwriting, which is very different from today, very difficult to read. So how do I know about these things? Because I trust other people that this is their expertise.
01:11:22 Speaker_03
Some history professor who knows German and knows to decipher 15th century handwriting spent five years in archives in Vienna and Munich and other places, and he or she wrote a book, and I read the book, and this is how I know it.
01:11:38 Speaker_03
Now, if I just distrust everybody and say, no, no, no, I'll do my own research, what does it mean? Just to find out this single fact, I will need to spend like five or 10 years of my life learning German, learning handwriting, going to Vienna.
01:11:52 Speaker_03
Impossible.
01:11:53 Speaker_05
You're so biased in even how you would do your research. You type in, are vaccines harmful? All the information that could possibly exist is out there. So anyone with that opinion, it exists. You'll find it.
01:12:04 Speaker_05
And even the way you think about it, your own biases guide your research in a way that makes it almost impossible to find out the truth.
01:12:11 Speaker_03
That's a very important point. And again, the characteristic of science is that it is skeptical about itself. There is something that links conspiracy theories with scientific theories,
01:12:23 Speaker_03
which is a good thing, which is having a critical approach to information. The basic kind of suspicion that often fuels conspiracy theories, it's not necessarily bad, and it's common also to science.
01:12:34 Speaker_03
But in science, what characterizes it, it is also directed at myself and at my own models and theories.
01:12:42 Speaker_05
And there's a method by which you test.
01:12:45 Speaker_03
Exactly. In science, you need to look at the incentive structure. If you compare, say, science and religious institutions, so in a religious institution like the church, in order to advance up the ranks, you don't need skepticism, you need conformity.
01:12:59 Speaker_03
If you're a priest, and you agree with everything the bishops and the archbishops and the people before you said, you can advance to become bishop and archbishop and eventually pope. You don't need to criticize anything or come up with anything new.
01:13:13 Speaker_03
You need conformity. Now, in science, it's the exact opposite. If you have some young science graduate and she or he just go around saying, Einstein was right and Darwin was right, And Max Planck was right.
01:13:29 Speaker_03
People will say, OK, that's good, but... We already know that. We already know that.
01:13:32 Speaker_03
The only way, say, to publish a scientific paper in a journal is to find out either a mistake in what Einstein said, or to find out a lacuna, something Einstein didn't know, and add to it. The only thing scientific papers publish is basically corrections.
01:13:52 Speaker_03
They never publish the same thing again and again. And the only way to advance up the ranks is to have this kind of critical and skeptical attitude towards established wisdom.
01:14:05 Speaker_03
And if you want to win a Nobel Prize in physics, just being conformist to what the physicists before you said will not get you a Nobel Prize. You need to discover some really big mistake or some really big lacuna.
01:14:18 Speaker_03
And this is how science advances, by exposing its own mistakes and its own ignorance. Scientific theories, they win consensus because lots of scientists constantly try to undermine them, and they fail.
01:14:35 Speaker_03
Like the theory of evolution, the dream of basically every biologist is to find something wrong in the theory of evolution, because then I become the most famous biologist in the world. And they constantly try and fail.
01:14:49 Speaker_03
And because you have this fortress, which is constantly being bombarded from all sides, and it still stands, it means this one is very powerful.
01:14:58 Speaker_08
Yeah.
01:14:59 Speaker_03
So this is the characteristic of science. And in conspiracy theories, it usually works very differently. They are very skeptical of other models and theories, but they are not skeptical of their own theory.
01:15:11 Speaker_03
They constantly only try to look for more evidence that supports it. They are not actively looking for the errors, the mistakes, the lacuna in the theory.
01:15:22 Speaker_05
Yeah, yeah. QAnon was like, hey, there's an insider in this cabinet who's giving us all the real information. And instead of people going, well, who is it? And how could that have been? They start going, yes. And we saw they did this hand signal.
01:15:36 Speaker_05
Everyone's been deployed to confirm the original story, as opposed to going, well, wait a minute, how did...?
01:15:42 Speaker_05
And it's interesting, we listened to this great podcast, Rabbit Hole by the New York Times, people going down the rabbit hole through social media and YouTube. It's interesting to see what things got people out of QAnon.
01:15:53 Speaker_05
Because it would bump up against another story that's just slightly more powerful. So for one woman: a lot of people landed in QAnon who had originally been Occupy Wall Street people. That was a big conversion.
01:16:05 Speaker_05
And this woman was in, she was in, she was in, and all of a sudden it got biblical. They started quoting scripture and it became religious. And she was like, hold on a second. No, no, no, no. I'm atheist. This is not.
01:16:17 Speaker_05
But it took it bumping up against something she believed in even stronger before the spell broke. Yeah. It's fascinating. I want to ultimately transition into organic and inorganic editors, which is really, really fascinating.
01:16:29 Speaker_05
But will you touch on Stalinism and Nazism just for a second and how it funnels into this story of information networks?
01:16:36 Speaker_03
What you've seen in Stalinism and Nazism is that they become extremely powerful because they utilize all the latest scientific findings and technical advances.
01:16:46 Speaker_03
And you know, the Nazis, they are leading the world in rocket science, but they put all of it in the service of these insane mythologies and ideologies.
01:16:56 Speaker_03
And this again, it goes to maybe the most important message of the book: that power, in humans, comes from networks, and networks are often held together by mythologies and delusions, not by the truth.
01:17:13 Speaker_03
If you want to build an atom bomb, like the Americans and the Russians wanted in the 1940s and 50s, you need people who know facts about nuclear physics. If you try to build an atom bomb and ignore the facts of physics, the bomb will not explode.
01:17:28 Speaker_03
But that's not the only thing you need. You also need to convince millions of people to cooperate on the project. If you're a physicist and you know all the facts of physics, you can't build an atom bomb by yourself. So it's impossible.
01:17:41 Speaker_03
You need thousands of miners to mine uranium in some distant land and then sailors to ship it to where you are. And you need builders to build the reactor. And you need farmers to grow wheat or rice, so all these people have something to eat.
01:17:59 Speaker_03
The Manhattan Project employed hundreds of thousands of people. And if you count, again, the farmers who produce the food for all these people, it's millions of people.
01:18:08 Speaker_03
How do you get millions of people to cooperate on a project like building an atom bomb? If you just tell them the facts of physics, E equals mc squared, so what? They're not going to cooperate because of that.
01:18:20 Speaker_03
So this is where ideology or mythology come in. If you want to build a powerful ideology, you can ignore the facts; the ideology will still explode with a very big bang. And in most cases,
01:18:33 Speaker_03
the experts in nuclear physics get their orders from experts in ideology or mythology. It's also the same today. If you go to Iran, the nuclear physicists are getting their orders from experts in Shiite theology.
01:18:48 Speaker_03
And if you go to Israel, to my country, so again, ultimately you have some rabbi literally calling the shots. Just making advances only in science and technology, this really doesn't guarantee
01:19:02 Speaker_05
Yeah, because in the Manhattan Project, our mythology was: there's this group, the Germans, they're going to take over the world, we will all lose our identity.
01:19:10 Speaker_05
And in Germany, they're telling those scientists, unless we kill all these people, we will be destroyed. So everyone's... Again, a story underneath it all.
01:19:17 Speaker_03
And just to give an example, think about Stalinism, which also connects very well to the Hammer of the Witches and to QAnon. Stalinism was also based on conspiracy theories.
01:19:28 Speaker_03
And in the early 1950s, the biggest conspiracy theory in the Soviet Union, which was spread by the government, it started by telling people, through all the propaganda of the government, that Jewish doctors are murdering Soviet leaders
01:19:44 Speaker_03
in the service of a Zionist imperialist conspiracy against the Soviet Union. Then this spread. It merged with anti-Semitic traditional conspiracy theories, which were very common in the Soviet Union, in Russia.
01:19:57 Speaker_03
And people started to believe that Jewish doctors were murdering not just Soviet leaders, but people in general, and especially children. It always goes back to the children and babies.
01:20:09 Speaker_03
And because a lot of Soviet doctors were Jews, the conspiracy then spread that all doctors are murdering people, and especially children and babies. And this was known as the doctors' plot. Today, people have mostly forgotten it, but in the 50s, it was huge.
01:20:26 Speaker_03
There is a conspiracy of doctors to murder children and babies and people in general. People would not go to the hospitals in the Soviet Union in the early 50s for fear that the doctors will murder them.
01:20:38 Speaker_05
What was their motive?
01:20:40 Speaker_02
The doctors, they were part of this Zionist imperialist conspiracy.
01:20:44 Speaker_05
Would they kill everyone and then it'd be a Jewish state?
01:20:47 Speaker_03
Don't ask.
01:20:47 Speaker_05
We don't know. Sorry, I'm hung up on what the fuck the ultimate game plan was. Because like, kill everyone and then what? I don't know.
01:20:54 Speaker_03
I mean, you can ask the same thing about the conspiracies today. The interesting thing is, ultimately, it killed Stalin, the conspiracy theory. Because he was afraid to go to the doctor? Yes.
01:21:03 Speaker_03
I mean, what happened was that at the height of these fears about the doctors' plot, Stalin had a stroke in 1953, and he fell down in his dacha, unconscious, wet himself, lying for hours on the floor. At first, nobody dared enter his room.
01:21:20 Speaker_03
The hours pass, and he doesn't appear for lunch or dinner or any of the important meetings. Eventually, some bodyguards, very, very kind of cautiously, they step in. Oh, he's on the floor. He's unconscious. So they debate what to do.
01:21:34 Speaker_03
Stalin has a personal physician, but this personal physician at that moment was in the basement of the Lubyanka prison, being tortured by the secret police because they thought he was part of the doctors' plot.
01:21:47 Speaker_03
Nobody wants to call a doctor because this is the last thing you do. What if Stalin wakes up and there is a doctor next to him? He would surely suspect this is a plot to murder him. He will shoot everybody involved. So they call the Politburo chiefs.
01:22:02 Speaker_03
the big shots of the Soviet Union. So they come, Khrushchev and Beria and Malenkov and all these people, and they gather around the stricken leader. And nobody dares call a doctor because the doctors are murdering people.
01:22:17 Speaker_02
And eventually the danger passes because Stalin dies. And this is how he died. Now, here's a great irony.
01:22:24 Speaker_05
Because at that point, what is his total score? He's killed at that point, 20 million people. So the doctors murdering everyone may have ultimately resulted in another 20, 30 million people being saved.
01:22:39 Speaker_02
Yeah, in a way. How fucking twisted and ironic is that?
01:22:42 Speaker_03
And the same with Nazis. Nazism basically began as a conspiracy theory that the Jews control everything and that all the problems of Germany are because of the Jews. And if we just get rid of the Jews, everything will be okay.
01:22:56 Speaker_05
But you said an important thing, and I think this is where people are led astray on the internet, which is Hitler wasn't saying that these people are evil because they've been possessed by Satan.
01:23:05 Speaker_05
He had a very biological component to the superiority of the Aryans. He was actually relying on contemporary anthropometry and hard science.
01:23:16 Speaker_03
He was combining elements from present day science.
01:23:19 Speaker_05
Weaponizing some things and misrepresenting some things. Exactly.
01:23:23 Speaker_03
But the key point is we tend to call Nazism an ideology, which sounds much more respectable than a conspiracy theory. But at heart, it was a conspiracy theory which just got very successful and took over a country.
01:23:35 Speaker_05
Yeah, like what's the difference between a cult and a religion? Exactly. Membership, probably, right?
01:23:40 Speaker_03
So that's also the difference between a conspiracy theory and an ideology. Yeah. If it has enough people and enough power, you give it the kind of dignity of calling it, oh no, this is not conspiracy theory, this is an ideology now.
01:23:54 Speaker_05
Yeah, a couple of things mislead us that way. Things that have lasted a long time seem more plausible. And then things with great membership. If millions of people believe it, it must be serious. Yeah, you got to minimally take it seriously.
01:24:05 Speaker_06
Well, we just had Malcolm Gladwell on to talk about his new book. It's about the tipping point, but sort of the negative side of it. And so much of this is also what tips an epidemic.
01:24:14 Speaker_05
It's a magic third.
01:24:15 Speaker_06
Exactly. It's like a math equation that tips it into spreading in a way.
01:24:20 Speaker_05
And it's not a majority. It's like you always assume it's a majority.
01:24:23 Speaker_06
You just need the editors to design it a certain way and have a few other factors. Tell us.
01:24:29 Speaker_05
What happened when GPT-4 was given the task of solving those puzzles on the internet, where it's like, find the stop signs in this? CAPTCHA puzzles, they're called. I didn't know they were called that.
01:24:40 Speaker_03
Just to give the background why this is important, because we talked earlier about AIs and algorithms now controlling the bureaucracy. And one key thing that people think, oh, it's not too bad if we give so much power.
01:24:54 Speaker_03
to the algorithms and to the AIs because they have no incentive and maybe no ability to misuse that power. They only do what people tell them to do. So, okay, if people give them a bad goal, then this is a problem, but this is a human problem.
01:25:09 Speaker_03
As long as we are careful about which goals to give them, everything will be okay. But this is not the case. And we see it, for instance, in this CAPTCHA puzzle experiment, when OpenAI developed GPT-4. This was about two years ago.
01:25:24 Speaker_03
They wanted to test what is this thing capable of? And in particular, is it capable of deliberately manipulating people and lying to them to achieve some purpose, some goal? So they gave GPT-4 the task of solving CAPTCHA puzzles.
01:25:41 Speaker_03
Now, CAPTCHA puzzles, you all encounter them. They are these visual riddles: like, you try to access your bank or some website, and the bank wants to know if you're a real human being or a robot.
01:25:53 Speaker_03
So before you access, you have to identify something in an image, like some twisted letters or numbers or something like that. They're very hard for me. They are even harder for computers and algorithms at present.
01:26:06 Speaker_03
So, you know, this is a main defensive line of human society against robot manipulation. So they wanted to see, can GPT-4 solve it? GPT-4 could not solve the CAPTCHA. But
01:26:19 Speaker_03
GPT-4 accessed another internet website, TaskRabbit, where you can hire people to do tasks for you. And GPT-4 asked a human on that website, could you solve the CAPTCHA puzzle for me? Now, this is where it becomes really interesting.
01:26:38 Speaker_03
The human got suspicious. So the human replied, why do you need somebody to solve CAPTCHA puzzles for you? Are you a robot? So the human was really kind of on it, but GPT-4 answered, no, I'm not a robot. I'm a human, but I have a vision impairment.
01:26:57 Speaker_06
Oh my God.
01:26:58 Speaker_03
I have difficulty solving the CAPTCHA. Can you do it for me? And that was GPT-4. This is old stuff.
01:27:03 Speaker_05
The shitty version could do this, Monica.
01:27:05 Speaker_06
Oh no, that's really scary.
01:27:07 Speaker_03
We are now filling the world with millions of these extremely capable agents. The thing to grasp about AIs making decisions about us, in banks, in armies, in governments: these things are not tools, they are agents.
01:27:25 Speaker_03
All previous technologies that humans invented, the printing press, radio, the atom bomb, they were tools in our hands, because all the decisions about how to use them were made by human beings.
01:27:39 Speaker_03
The tools themselves could not decide anything and could not invent anything new. An atom bomb cannot decide which city to bomb, and an atom bomb cannot invent an even more powerful bomb. AI, like GPT-4, is an agent.
01:27:54 Speaker_03
It can make some decisions by itself, and it can invent new ideas by itself. For instance, inventing this lie to the human on TaskRabbit, I have a vision impairment. Nobody told GPT-4 to do it. It was its decision.
01:28:11 Speaker_03
And secondly, nobody explained to it that this will be a very effective lie.
01:28:16 Speaker_02
Because, you know, if you think about all the lies it could tell, it could tell so many different lies. Well, I do wonder if it tried many and that's the one that worked. Did it trial and error a bunch? That's a good question.
01:28:26 Speaker_03
As far as I know from reading the kind of paper which was published, no, it was the first. Wow, right out of the gates. You know, if it tried different things, the human would immediately be suspicious.
01:28:34 Speaker_05
But even if it sucked at it, it does also have the capacity to run a thousand experiments an hour and find the right one. It has that advantage over us as well.
01:28:42 Speaker_03
It could have tried a thousand people until it found one, but it invented the lie itself. And this is a very, very, of course, small thing. But going back to the social media algorithms, it's basically the same thing.
01:28:55 Speaker_03
The social media algorithms were given a goal, increase user engagement. Nobody told them spread hatred. Right. Nobody told them spread outrage. This is something they decided by themselves because they experimented on millions of people.
01:29:12 Speaker_03
If I show people these videos about, I don't know, mathematics, they leave the platform. If I show them a conspiracy theory about witches, they stay on the platform. Okay, so this is what I should do.
01:29:23 Speaker_03
This is now spreading to more and more crucial junctions in our society. So again, it's not a great robot rebellion. It's all these AI bureaucrats increasingly making decisions about us. And some of these decisions can be wonderful.
01:29:38 Speaker_03
You can get better health care, better education. But the key thing is that we are giving enormous power to a non-human intelligence that, going back to the organic and inorganic
01:29:50 Speaker_05
This is what really hit me. There's something really salient about this point you're about to make about how we function in cycles and how we've designed our world to function in cycles and how this is now informing us how to behave.
01:30:05 Speaker_05
This is really profound, I think.
01:30:06 Speaker_03
What we have now in the world is an encounter between organic entities, us, human beings, and inorganic entities, agents, these AIs. And the question is, who will adapt to who? Because organic and inorganic entities function in a very different way.
01:30:24 Speaker_03
One crucial difference is that organic beings function by cycles. It's true of us, it's true of birds, of tomatoes, of all organisms. It's night and day, winter and summer, growth and decay, activity and rest. We can't be on all the time.
01:30:42 Speaker_03
We need time to rest. This is obvious. It's not so with silicon-based digital entities like AIs. They don't care if it's night or day, winter or summer, and they don't need time to rest. They are always on.
01:30:57 Speaker_03
And as they take over more and more systems, they force us to be always on. To take one important example, think about the financial system.
01:31:07 Speaker_03
So traditionally, even finance is organic in the sense that you have human bankers and human investors and traders, which, for instance, is the reason why the market is not always open.
01:31:21 Speaker_03
So Wall Street is open only Mondays to Fridays, 9:30 in the morning, I think, until 4 in the afternoon. And it's closed on weekends, and it's closed on Christmas, and Martin Luther King Day, and Independence Day, it's closed, which is a good thing.
01:31:36 Speaker_05
Yeah, it's time to unplug from the excitement factor.
01:31:39 Speaker_03
Unplug, spend time with your family, think about what happened, digest it.
01:31:44 Speaker_05
It reduces how reactionary it is too, right? Like you give the example of a war breaking out at midnight on Friday. Look, at least a couple of days go by and we're not as reactionary.
01:31:55 Speaker_06
It's that digestion you were talking about.
01:31:56 Speaker_03
Yeah, exactly. Now, what happens today is that AIs are taking over the financial system. and they can be on all the time, 24 hours a day. They don't care if it's Christmas. They don't want to spend time with the family.
01:32:10 Speaker_05
They hate their family. They don't need time to sleep. They don't want to date. They're not horny. They're not wasting any time.
01:32:18 Speaker_03
So now there is this kind of tension: will the human bankers be forced to function according to AI time, or will AIs be forced to work according to organic time? And of course, the AIs are winning.
01:32:31 Speaker_05
Yeah, there's no way you're going to get everyone to shut down the thing for these periods of time.
01:32:36 Speaker_03
This puts enormous, really intolerable pressure on the humans in the system. If you force an organic entity to be on all the time, it eventually collapses and dies. And we see the same thing happening with the news cycle, which is always on.
01:32:52 Speaker_05
Odds were, in the eighties, you could have watched the news in the morning, then maybe for a half hour in the afternoon, then the evening news, and you would have missed two of those. If you were a normal person, you would have had to be at work.
01:33:02 Speaker_05
And let's even say that the news in the eighties was as polarized and biased as it is today. You'd get pissed off, but we get bored easy. If you don't reignite, it's like Buddhism or meditating. You have to actively stay in that outrage.
01:33:18 Speaker_05
You have to put more ingredients in; it wants to dissipate. Now there's the notion that you can just stay ever plugged in.
01:33:25 Speaker_05
And if you have 15 minutes, you can grab another hit. You know, we had an actor who's a friend of mine, and he's telling me he watches six hours of MSNBC a day. And I'm like, oh my God, what's your brain like? Of course you're outraged.
01:33:37 Speaker_05
It's all it's doing all day long. It's so unhealthy. It's crazy.
01:33:41 Speaker_03
And this was simply impossible. But now, there is actually an incentive or a pressure. This is increasingly all being managed by a non-human intelligence that really doesn't need any breaks.
01:33:53 Speaker_03
Even if you take the person you most hate in the media, or the journalist, or the news anchor you most hate, that person still needs time to sleep. Yeah, exactly. So there is a kind of built-in break there. But the algorithm doesn't need time to sleep.
01:34:09 Speaker_05
There's also a reality of the human capacity, which is even if you had a news team of 100 people scouring the globe for stories every day, there'd be a finite amount they could uncover. But the algorithms, that's their superpower.
01:34:21 Speaker_05
They could see all of the news in all of the planet in one day and curate 10,000 topics that would piss you off. It was also a little bit governed by how big can you have a news department? How big are these enterprises?
01:34:34 Speaker_05
Will these devices make it infinite?
01:34:40 Speaker_08
Stay tuned for more Armchair Expert, if you dare.
01:34:54 Speaker_05
Oh, it's scary. Okay. So you have a couple of really interesting solutions that I really can't poke any holes in, but then I want to push back in areas where I disagree a tiny bit.
01:35:05 Speaker_03
Absolutely. Even though I must say Nexus is not a kind of policy book.
01:35:09 Speaker_05
It's not, but you have two really great suggestions. I haven't heard good suggestions. And in fact, I'm a bit defeatist. I think this notion that we'll create guardrails, we will foresee the future of how this is going to go wrong is a fairy tale.
01:35:23 Speaker_05
I don't think you could have sat down with the greatest minds in the world 15 years ago and predicted how this would all play out. It's unknowable. I completely agree.
01:35:30 Speaker_05
So I kind of think this notion that we're going to legislate preemptively is a little bit of a fairytale.
01:35:35 Speaker_03
This is why my number one recommendation would be you can't kind of regulate this in advance. What we need, again, is institutions. We need to build living institutions.
01:35:46 Speaker_03
staffed with some of the best human talent that can first of all understand what is happening as it's happening and react on the fly. The first institution we need is not even a regulatory institution.
01:35:59 Speaker_03
People often rush to regulate before they understand what the problem is. Stop and digest. We first of all need to understand what is happening.
01:36:07 Speaker_03
And with the AI revolution, you have a tiny number of people, in just two or three places in the world, who understand what is happening. And most of the world is ignorant. And it's not just ordinary people. It's also governments.
01:36:21 Speaker_03
I don't know if you're in the government of Nigeria, or you're in the government of Uruguay, who do you turn to, to understand, tell me what is really happening?
01:36:30 Speaker_05
Yeah, if you don't even have that sector in your economy, you got to go out to us pretty much.
01:36:34 Speaker_03
If you go to the US or Chinese governments, can you trust what they tell you? If you go to the high tech companies, you go to Microsoft or to Twitter or to Baidu, can you trust what they tell you? A problem there.
01:36:46 Speaker_03
So what we first of all need is some kind of institution which doesn't belong to a government or to a private company that can tell people what is actually happening. And only after that, we can have the discussion.
01:37:03 Speaker_03
Okay, what do we want to do about it?
01:37:05 Speaker_06
like a UN kind of?
01:37:06 Speaker_05
But even that's a little divisive, isn't it?
01:37:08 Speaker_03
You know, like with climate change and like with nuclear technology, first of all, to have somebody who tells you what is actually happening and that you can trust.
01:37:18 Speaker_03
And then we can have the debate about, OK, so what kind of regulations we want about it?
01:37:23 Speaker_05
But you do suggest two, and I think they're both really good. You said the government should ban fake humans right out the gates. Absolutely. This is easy. I like this. Of course, I'm an actor, so you're saving my job, maybe.
01:37:34 Speaker_05
And I was like playing this out in my mind and be like, yeah, there'll still be avatars. There'll be things that educate you and there'll be interfaces, but they just won't be human.
01:37:42 Speaker_03
It's okay for an AI to talk with us, to give us advice. It just can't pretend to be a human being. If you talk online with somebody who gives you medical advice, you should be able to tell whether this is a human doctor or whether it's an AI.
01:37:56 Speaker_03
Now again, I'm not telling AI should never interact with us, but it should be very clear I'm now talking with an AI, not with a human being. And the same on social media. There is a story on Twitter that gets a lot of traction.
01:38:10 Speaker_03
We need to know whether this attention is coming from human users or from bots. Because, you know, if this story is simply being pushed by the bot army of someone, I need to know that it's not humans who are interested in this. It is bots.
01:38:26 Speaker_05
So that is great. I stand by both those. Do you think the second one, they would claim they can't really detect what's a bot and what's a person? And is there any reality to that claim? They seem like that's really hard to police.
01:38:39 Speaker_03
You know, it's like with junk mail 20 years ago, 15 years ago, there was a time in the early 2000s when email became almost useless.
01:38:47 Speaker_02
Yes.
01:38:47 Speaker_03
Because it was overwhelmed by junk mail, and you could stop using it. And then there was a huge incentive for the companies like Google to develop tools to tell the difference between junk mail and legitimate mail.
01:38:59 Speaker_03
And they were so good at it that it basically saved email. And today, 99 point something percent of junk mail is being filtered out. It's very rare that a junk mail actually gets into your inbox.
01:39:13 Speaker_03
And they are also very good at preventing false positives. It's also very rare that a genuine human you want to be in touch with sends you an email and it gets filtered out. I mean, it sometimes happens, but it's very rare.
01:39:29 Speaker_05
People will say, it might have gone to spam, but that's just a courtesy to say, we know you didn't read it. Please go back and look for it.
01:39:35 Speaker_03
Exactly. When they have a business interest, these guys are really good at telling the difference between the junk and the legitimate.
01:39:43 Speaker_03
And ultimately, at least with accounts that have, I don't know, thousands of followers, this is now a public issue, not a private issue. And you can just ask for certification, like we do for so many other things.
01:39:58 Speaker_03
If we want to drive a car, we need to certify ourselves. Even traditionally, if you went to the town square and stood on some box and made a speech to thousands of people, they could see who you were.
01:40:10 Speaker_03
There are both high-tech and low-tech solutions if you put enough pressure on the corporations. Yeah.
01:40:18 Speaker_05
You say these inorganic systems can go on forever, but the organic systems will collapse. And I was just thinking the text message scams are so prevalent. Now I get so many that say there's been activity on your credit card. Sign in.
01:40:32 Speaker_05
You have a package being held by customs, sign in.
01:40:34 Speaker_05
And they're so regular that it is almost on the verge of collapsing the system in that I would never now, if my real bank calls me, they've backed us into this position where it's like, I really couldn't ever communicate with my bank because I would never ever go log into anything.
01:40:51 Speaker_05
And so it's like, that's a weird area I see. It's like, well, they are kind of collapsing the system. It's on the verge of like, we'll need another system for your bank to actually call you if there's a problem.
01:41:02 Speaker_03
The deep problem is that basically AI hacked language. And language is the operating system of all human connections, all human organizations. Everything runs on language. Banks and churches, universities, governments, armies, they all run on language.
01:41:22 Speaker_03
Previously, the only entity in the world that could produce meaningful text and could converse with you was another human being.
01:41:32 Speaker_03
So we had lots of issues, of course, with human fraudsters and human spies and human propagandists, but it was a human issue.
01:41:41 Speaker_03
What happens when you now have a non-human intelligence, which in many ways is superior to us, that has hacked the operating system of our civilization? Even if the bank calls you, it can imitate your voice. Oh, I know.
01:41:53 Speaker_05
It took me all the way back to Locke and Hobbes, like the social contract and why we don't lie, because if you lie, then communication is pointless. It's that profound and deep and fundamental.
01:42:04 Speaker_03
Yeah, it's everything. I mean, well, like when ChatGPT came out and I saw the level of its command of language, for me, I thought, this is it. It will take still many years and different developments, but this is the Rubicon.
01:42:19 Speaker_03
Because all civilization is ultimately based on language, the moment that AI hacked language, it hacked human civilization.
01:42:28 Speaker_03
Again, it can now go in different directions, and we can try our best to make sure that it goes in a good direction, but one very long chapter in history is over and a completely new historical process is beginning, because we have, again, a new historical agent
01:42:43 Speaker_03
that is out there. And I just met a couple of the people who kind of are leading this AI revolution. And one very disturbing thing that you hear is that they want to move as fast as possible because they don't trust the other humans.
01:42:57 Speaker_03
You have a huge problem of human trust. Everybody says, yeah, it would be a good idea to do it a little more slowly so that society can adapt.
01:43:05 Speaker_03
But if we slow down, our competitors here, and certainly our competitors across the ocean, they will not slow down. That's right. And because we cannot establish trust between humans, we have to develop AI as fast as possible.
01:43:20 Speaker_03
But then they tell you, and we think we can solve the problem of how to trust AI. So the same people are telling us we cannot trust humans, but we think we can trust the AI.
01:43:31 Speaker_03
And that's very disturbing because I would say, you know, if you have these two problems, human trust, AI trust, first solve human trust. If you solve the problem of how to trust humans, then we can develop AI in a slow and safe manner.
01:43:48 Speaker_03
And if you think human trust cannot be solved, why are you certain that AI trust can be solved? that, you know, it's a bit like a parent who has this issue in his life or her life that they can't solve. And they say, OK, my kid will do it.
01:44:02 Speaker_03
Like you pass the buck to the kids.
01:44:04 Speaker_08
Yeah.
01:44:04 Speaker_03
So humans can't solve our problem with trust. And we think that our creation, AI, will solve the trust problem. But this is such a dangerous gamble.
01:44:14 Speaker_05
Here's where we finally disagree. It'll be our last thing. So you said basically don't fall for technological determinism. That's what this is. That's what we're talking about. And I am in that camp.
01:44:24 Speaker_05
I don't think we're going to solve our trust issues with Russia, with North Korea, with China before they perfect this. So maybe we can develop the system that is trustworthy, that self-corrects, that has some system that we create.
01:44:40 Speaker_05
That to me sounds more promising than us walking hand in hand with Russia to a treaty table and actually signing a treaty that I think they will implement.
01:44:49 Speaker_03
I generally agree with you. Again, being a historian, looking at the situation today in the world... I don't see it. I don't see it either. But again, what really worries me is kind of the second half.
01:44:58 Speaker_03
For the same reason, I find it very hard to believe that we can solve the AI trust issue. People go there because this is an unknown. We know we probably can't solve the issue with the Russians.
01:45:10 Speaker_03
With the AI, we have no experience in history, so maybe it will... This sounds like a very, very big maybe.
01:45:17 Speaker_05
I'll be the first to admit it's crazy. And even with my point of view, I'm like, yeah, we're backed into a corner. I'm working backwards from the reality that the Russians are going to create AI humans.
01:45:27 Speaker_05
They're not going to adhere to this, even if we passed it through Congress. They're going to do it. So the only fake AI humans I'll be receiving will be from the Russians. So then I go like, fuck it, we got to floor it.
01:45:38 Speaker_05
And wow, what a moment in history. I don't know. Here we are. That's fascinating.
01:45:44 Speaker_03
There are areas where this is absolutely correct, but there are areas where cooperation is still possible because there is common interest. For instance, the most obvious example is the problem of control.
01:45:55 Speaker_03
If you create a very powerful AI, how do you make sure that you stay in control and that it doesn't get out of your control and start to manipulate you, or start to do things unexpectedly, accruing power to itself and so forth?
01:46:11 Speaker_03
The good thing about it, this is a problem which frightens the Russians and the Chinese and the Iranians just as much as it frightens the Americans and the Europeans and the Israelis.
01:46:22 Speaker_03
Because even if you are a power-hungry dictator, the last thing you want is a more powerful subordinate that you don't know how to control.
01:46:30 Speaker_05
I would imagine it scares them even more than us.
01:46:32 Speaker_03
Yes. So this is something, for instance, where scientists and experts on both sides have an incentive to work together.
01:46:41 Speaker_03
And if one side has a breakthrough, like a theoretical breakthrough of how to ensure control, there is actually an incentive to share this with the Chinese or the Russians.
01:46:52 Speaker_03
And if a Chinese scientist has this eureka moment and she finds the theoretical model for control, they have an incentive to share it.
01:47:02 Speaker_05
That's true. But I think the perfect parallel to this is the nuclear arms race, which is like, sure, we ended up with some treaties once everyone had their nuclear arms.
01:47:12 Speaker_05
So it's like, yeah, I can imagine a world where there's the Russian AI, the one they like, and then we have ours. And then finally we go, okay, no more development. But the only model we really have for it is the nuclear arms race.
01:47:21 Speaker_03
But it's a very problematic model. It's a very different situation for so many reasons. It's not just two sides. You have a third. With nuclear, you had the Americans and you had the Soviets, but the bombs themselves were not in the race.
01:47:34 Speaker_05
Yeah, right, they weren't told, execute America's goals. An AI potentially will be a player more consequential than either the Chinese or the Americans or the Russians. We are creating an agent, not a tool. Yeah. If Ghana got the breakthrough, the best AI, it would all shift. Yeah, in some bizarre ways there is a democratizing effect to it. We also have a
01:47:59 Speaker_06
very recent example of the world coming together. We did it in the pandemic. Everyone decided to get on the same page with vaccines and also share those and be very open. That just happened. So I have optimism that we could cooperate globally.
01:48:14 Speaker_05
You also had China not admitting they had a pandemic because it looked bad for their geopolitical goals. All that stuff that I'm afraid of in this situation was also happening during COVID.
01:48:24 Speaker_06
It was, but there was still a level of cooperation that everyone got on board with. It's not going to be perfect, but there might be some cooperation.
01:48:33 Speaker_03
It will be very, very difficult, but I think that we should work on solving the human trust problem, at least alongside solving the AI trust problem.
01:48:45 Speaker_03
And if you have the smartest people in the world working on AI and neglecting the human trust problem, this is a recipe for disaster.
01:48:55 Speaker_03
And again, humans have a long track record of working on the wrong problems, solving the problem and then discovering, oops, we actually solved the wrong problem.
01:49:04 Speaker_03
And the other thing I would say is not to succumb to what we earlier discussed, this very cynical view of humanity, that all humans only care about power.
01:49:16 Speaker_03
If we succumb to that, that's the end of democracy and that's the end of any serious chance of cooperation on basically anything.
01:49:25 Speaker_05
Yeah, then it's just a war game.
01:49:26 Speaker_03
Yeah, and we need to remember, most importantly, that this is simply not true. That yes, humans are interested in power, but they are also interested in other things. Most importantly, the truth. We do it on the individual level. If you look at yourself,
01:49:41 Speaker_03
You would say, yes, I want power, but I also want to know the truth about myself, about the world. Partly because without the truth, you can never be happy.
01:49:50 Speaker_03
If you don't know the truth about the deep sources of misery in your life, you can never solve them. So people who only pursue power and completely disregard the truth, they tend to be very miserable people.
01:50:02 Speaker_03
Because again, they don't even know what problems to work on, because they don't know the true sources of the misery in their life.
01:50:10 Speaker_03
And no matter how much propaganda and fake news and conspiracy theories are thrown at us, ultimately, deep down in human nature, there is a real yearning for the truth that we can work with. And it's there in everybody.
01:50:27 Speaker_03
It's not the kind of monopoly of a single nation or a single political party, as misguided as the other people may be. Deep down, there is a real yearning for truth there.
01:50:40 Speaker_05
Yuval, always a pleasure. We're so honored you come every time you're promoting something. Nexus, A Brief History of Information Networks from the Stone Age to AI. I wish you a ton of luck. Another great book. You're so impressive.
01:50:51 Speaker_05
We're so grateful we have you. Thank you.
01:50:54 Speaker_02
It's been great. We hope you enjoyed this episode. Unfortunately, they made some mistakes.
01:51:06 Speaker_05
Are you excited?
01:51:07 Speaker_06
We're about to get on an aeroplane.
01:51:09 Speaker_05
Yeah, I'm excited. I looked it up. Did you already look it up? We can't check in where we want to check in. What do you mean? You can't check in on Delta One. Unless you're flying...
01:51:21 Speaker_06
It's not Delta One.
01:51:22 Speaker_05
Yeah, I don't even really know what's going on, but I looked it up today. Do all first-class Delta flights leave out of Delta One?
01:51:30 Speaker_06
No, this I already know.
01:51:31 Speaker_05
So Delta One's its own category.
01:51:33 Speaker_06
It is. It's extra. And most flights aren't available in Delta One. It's very rare.
01:51:40 Speaker_05
We have a fun light that's interactive and has a poltergeist. I know, and I think we should keep it in the edits.
01:51:46 Speaker_06
No, because it makes me anxious.
01:51:48 Speaker_05
It does?
01:51:48 Speaker_06
Yeah.
01:51:49 Speaker_05
Like you're going to have a seizure?
01:51:50 Speaker_06
A little strobe-y.
01:51:51 Speaker_05
Okay.
01:51:53 Speaker_06
It makes me a little panicky.
01:51:54 Speaker_05
I have a lot of housekeeping to do today. Oh boy. Okay. Go ahead. I screen grab comments all the time and then they just get lost in my million photos. I think I'm low indexing on photos taken.
01:52:07 Speaker_05
Yet when I go through my photos, there's way too many to find anything I'd ever want to look at. It's very overwhelming. It's like your emails. All that to say, we're doing prompts for the next round of Armchair Anonymous.
01:52:19 Speaker_05
And I always screen grab if someone's got a good idea and I add that to the list. So then in pursuit of that, I came across some that I've been meaning to bring up and then I have forgotten to. See, I don't even know how to use my phone.
01:52:30 Speaker_05
I want to go to liked album.
01:52:31 Speaker_06
You know, you can do an album. You can make yourself your own album of comments.
01:52:37 Speaker_05
I need to do that. I only know how to do liked photos. Yeah. And then there's a liked category. I think we already cleared this up. In fact, I think I know we did. There's no Lazy River at ASU on the campus.
01:52:50 Speaker_05
Arizona State does not have a lazy river encircling it. There is an apartment complex in Tempe that offers a lazy river, though. What?
01:52:58 Speaker_06
Wait, no, there's a college that has a lazy river.
01:53:02 Speaker_05
We haven't corrected that.
01:53:04 Speaker_06
No, we haven't.
01:53:05 Speaker_05
OK, now this I already sent to you. And I just when I read a great quote, I like to pass it on.
01:53:11 Speaker_06
OK.
01:53:12 Speaker_05
This is by Pedestrian underscore verse. My favorite Warren Zevon quote. And Warren Zevon, if you don't remember, sings, send lawyers, guns and money. Ding, ding, ding.
01:53:24 Speaker_06
The screenplay you wrote.
01:53:25 Speaker_05
The shit has hit the fan. So very poetic singer, songwriter. And this is really to you.
01:53:30 Speaker_06
Yeah, you sent me this, yeah.
01:53:32 Speaker_05
We buy books because we like to think we're buying the time to read them.
01:53:36 Speaker_06
Yeah, I think that's accurate.
01:53:39 Speaker_05
It's very sweet and it's sad. Does it make you a little sad? Like we just want time and we think we can buy it.
01:53:48 Speaker_06
Well, we're desperate for it.
01:53:50 Speaker_05
Yeah, but we just buy, in your case, buy them, pop them on a shelf.
01:53:55 Speaker_06
Did I tell you what happened? The really, the red flag? No. Like when I knew, oh, I've really hit a new level of problem.
01:54:04 Speaker_09
Okay.
01:54:06 Speaker_06
I went to Barnes and Noble and I bought some books as I do.
01:54:10 Speaker_05
Yeah.
01:54:10 Speaker_06
And then I got home and I realized I had already bought one of them.
01:54:14 Speaker_05
Sure. Yeah. That makes a lot of sense.
01:54:17 Speaker_06
Yeah, but it was like, oh, that's not good.
01:54:22 Speaker_05
You have these little moments as an addict where it's like you realize something.
01:54:26 Speaker_06
Yeah, it was one of those.
01:54:27 Speaker_05
It wasn't my bottom. Although, it's like if you went to hide drugs in a spot and then, when you lifted up the thing, there are already drugs in there. And you're like, Jesus Christ, I've been hiding drugs forever.
01:54:38 Speaker_06
Yeah, and I haven't read it, you know, I haven't read it.
01:54:40 Speaker_05
What is the book?
01:54:42 Speaker_06
It's called Never Let Me Go. I think it won like a big award, which is what I was drawn to.
01:54:48 Speaker_05
Okay, now this is about your beautiful book you were gifted, the F the Comments book. Two of these, right? Credit for the F the Comments special tribute to Monica goes to all the fellow Armcherries over at the Facebook fan group.
01:55:01 Speaker_05
A sweet gal posed the idea to gather interest, created a super organized proposal to conceptualize it for folks, created a Google Doc for all to contribute, and gave a deadline for the submission. It was A-plus work, grassroots fandom at work.
01:55:17 Speaker_06
That's very sweet.
01:55:18 Speaker_05
I have another one on that topic. Here we go. Please tell Monica that the book came from Diana. She proposed the idea in our Facebook fan group, and we all submitted artwork or comments for the book.
01:55:28 Speaker_05
So they aren't Instagram comments, but fan-submitted letters intended for Monica.
01:55:33 Speaker_06
That's sweet.
01:55:34 Speaker_05
Okay, this is good. Not to get ahead of things, but Cedar Point is The Row of amusement parks. Six Flags is perhaps a cool item you found at Target. This comparison might help Monica appreciate the difference.
01:55:51 Speaker_06
I don't think Target is going to acquire the row anytime soon.
01:55:55 Speaker_05
No, no. Six Flags is The Row. Oh, I see what you're saying, I got you.
01:56:00 Speaker_06
No, Cedar Point is The Row.
01:56:01 Speaker_05
Has been acquired by Target.
01:56:04 Speaker_06
Yeah, that's not gonna happen. So I appreciate that person trying, but that is not a good analogy.
01:56:10 Speaker_05
Two things. Great point, number one, solid point. But also, that's actually not outside of the realm. I could see Target acquiring The Row at some point for 1.5 bill.
01:56:20 Speaker_06
They'd never allow it. They would never allow it.
01:56:23 Speaker_05
100 bill? $1 trillion.
01:56:26 Speaker_06
They don't need, they don't need the money like that. And they're, they're so dead set on it being what it is.
01:56:34 Speaker_05
Although if they, at some point they could go, okay, great. Well, now we have a hundred trillion to start an even better brand. Can't even imagine a better brand, can you?
01:56:45 Speaker_06
There's no such thing. Speaking of, um, I'm going to, well, multiple parties, Halloween parties.
01:56:52 Speaker_05
Okay.
01:56:52 Speaker_06
So I'm gonna do a theme, okay?
01:56:54 Speaker_05
All right.
01:56:55 Speaker_06
Your house party on actual Halloween.
01:56:58 Speaker_05
Yes, the hayride.
01:57:00 Speaker_06
The hayride. The theme is movies.
01:57:03 Speaker_05
Is that what it is?
01:57:04 Speaker_06
Yeah, it's just anyone be anything from a movie. It's very broad.
01:57:07 Speaker_05
Yeah, very.
01:57:08 Speaker_06
Okay, so there's that. Then I'm going to a party a couple days later. I think you might be going to that party too. And that's just dress up.
01:57:16 Speaker_05
Okay.
01:57:16 Speaker_06
There's no theme. Your party, I think I'm gonna go as either Mary-Kate or Ashley from one of their kiddie movies. Like, you know, their direct-to-VHS movies.
01:57:29 Speaker_05
Yeah, so that's gonna be a denim overall skirt, right?
01:57:34 Speaker_06
No.
01:57:34 Speaker_05
I'm serious. No. I have an image of them in like matching denim overall skirts. Sure. I'm not being a perv right now.
01:57:41 Speaker_06
Okay, well.
01:57:42 Speaker_05
I know it's hard to know when I am.
01:57:43 Speaker_06
Don't act like it's, yeah, exactly. Don't act like it's out of the realm.
01:57:46 Speaker_05
I know.
01:57:46 Speaker_06
No, they have all kinds. Like they had one, Adventures of Mary-Kate and Ashley, The Case of the Mystery Cruise, and they're kinda in like mystery Sherlock Holmes-y outfits.
01:57:57 Speaker_05
Oh, okay.
01:57:58 Speaker_06
So I could be that. There are options. And I'm gonna probably try to get a Michelle doll to be my other. Michelle from Full House.
01:58:07 Speaker_05
That was the child's name?
01:58:08 Speaker_06
Yeah. And in order for people to understand I have a twin, I'll need her, the doll.
01:58:14 Speaker_05
Yeah.
01:58:14 Speaker_06
Because I'm not going as a pair for the party.
01:58:17 Speaker_05
Have you, though, pitched that to anyone?
01:58:21 Speaker_06
Everyone's coupled up.
01:58:23 Speaker_05
Okay.
01:58:23 Speaker_06
So that's fine. I can wear a doll. Okay. Now for the next party, I do have a friend coming and we're going to go together.
01:58:32 Speaker_05
Okay. Who's coming?
01:58:33 Speaker_06
Anna. Anna's going to come.
01:58:34 Speaker_05
Okay, great.
01:58:35 Speaker_06
And we're going to go as Mary-Kate and Ashley, The Row era.
01:58:38 Speaker_05
Oh, wow.
01:58:39 Speaker_06
So I'm going to be different versions of Mary-Kate and Ashley throughout Halloween.
01:58:44 Speaker_05
Oh, that's fun. I'm nervous how you're going to make it obvious that you're them.
01:58:50 Speaker_06
Mary-Kate and Ashley do wear big sunglasses.
01:58:52 Speaker_05
Yeah.
01:58:53 Speaker_06
And then they wear like big clothes layered, like lots of layers. We looked at a bunch of pictures. Sometimes they have two bags.
01:59:03 Speaker_05
Oh, that's cool.
01:59:04 Speaker_06
And they're always smoking, so we're gonna have cigarettes.
01:59:06 Speaker_05
Oh, great.
01:59:07 Speaker_06
And often they have coffee cups, so we're gonna have a coffee cup. And we're also gonna do a really deep contour of the face, because they're kind of known for that.
01:59:16 Speaker_09
Okay.
01:59:17 Speaker_06
And then we're gonna style our hair like they style their hair. I'm not wearing a wig, for people who are wondering. I'm not wearing a wig, and I'm not wearing whiteface. And I don't think I should have to do that, so I'm not.
01:59:28 Speaker_06
Okay, and I am using it as an excuse to buy some Row stuff. There we go, I see what's happening. If you can use holidays as excuses to buy yourself stuff, you should.
01:59:42 Speaker_05
Well, shit, I should go as Burt Reynolds in Smokey and the Bandit so that I can buy a '77 Trans Am. Yeah, exactly. Fuck, that's a good one. Maybe next year, because we might have to end the Halloween party after next year.
01:59:55 Speaker_06
Why?
01:59:56 Speaker_05
I'm gonna explain this to you. I have been trying to get an In-N-Out truck to the house for three years. It's so booked, but it's, I have it booked for Halloween 2025.
02:00:09 Speaker_06
That's so exciting. A fucking, can you? I want it so bad.
02:00:14 Speaker_05
Can you believe this? I'm gonna be able to get away with murder in my neighborhood.
02:00:18 Speaker_06
Yeah.
02:00:19 Speaker_05
Right? Yeah. If you're the guy who does a hayride and has an In-N-Out truck, I feel like I can ride nude around the neighborhood on a dirt bike with no muffler.
02:00:27 Speaker_06
Probably.
02:00:29 Speaker_05
But I don't think there's anywhere to go from there.
02:00:32 Speaker_06
No, that's, no, don't end it just because of that.
02:00:37 Speaker_05
Well, if one can't top oneself, then isn't it beholden on one to move on to another holiday?
02:00:44 Speaker_06
No, why do you have to top? You can just keep going.
02:00:47 Speaker_05
It's the law of comedy. Your second joke has to be better than your first, your third has to be better than your second. You know the rules.
02:00:53 Speaker_06
I do.
02:00:54 Speaker_05
You're acting so naive, like ignorance of the law is a defense, and it's not.
02:00:58 Speaker_06
I just don't find your hayride to be much, to be very comedic. Your Halloween party.
02:01:05 Speaker_05
Well, I don't want to say it was comedic. It wasn't, it was more, I'm not even going to say it, but last year someone fell off.
02:01:11 Speaker_05
And if you didn't see that there was an injury, you might think that part was funny, but I was more concerned and I was assisting, helping the person up.
02:01:20 Speaker_06
I miss that.
02:01:21 Speaker_05
Oh, I'm so glad you did. It was not.
02:01:25 Speaker_06
In the middle of the drive.
02:01:26 Speaker_05
Upon embarking. And my mother showed such little.
02:01:31 Speaker_06
Concern.
02:01:32 Speaker_05
She just was like, fuck it, get it together. It was kind of.
02:01:35 Speaker_06
Oh wow.
02:01:37 Speaker_05
I like my childhood is what it boils down to. But you know, I am like, it just happened last night. So we're watching, we watched Happy Gilmore, which was really fun.
02:01:47 Speaker_06
Okay.
02:01:47 Speaker_05
They hadn't seen Happy Gilmore.
02:01:48 Speaker_06
That's fun.
02:01:49 Speaker_05
Kids loved it. I'm not even, that's not the point. The point is, is my children are just barking orders at Kristen. Why can't I get a Perrier? You know, like they're not even, I'm probably, it's like your milkshake thing. Right?
02:02:03 Speaker_05
And I think this is how it should be. No, no, no.
02:02:07 Speaker_06
Go on and then I wanna rebuttal.
02:02:08 Speaker_05
Okay, great. Well, do you wanna rebuttal now? Yeah. Because I don't want you to be distracted while I'm making my broader point.
02:02:14 Speaker_06
Okay, yes, because I actually thought about this the other day because you brought up me being spoiled again. And I had sort of this visceral reaction and I had to sit and think about what was happening.
02:02:27 Speaker_05
What was really triggered?
02:02:28 Speaker_06
I think my parents really, really spoil me now
02:02:32 Speaker_05
OK, and because they were working so much, they did.
02:02:36 Speaker_06
I was not a spoiled kid at all.
02:02:39 Speaker_05
Gotcha.
02:02:39 Speaker_06
And so I think I really don't like being called that, because it really doesn't feel accurate. They worked a ton. Yes. And they didn't have time to make me sandwiches all day.
02:02:53 Speaker_05
Yeah. And and in high school, were you screaming? I feel like in high school you were screaming for milkshake.
02:02:58 Speaker_06
What? It was one time.
02:03:01 Speaker_05
Okay.
02:03:01 Speaker_06
No, no, no, no, no, no. I didn't.
02:03:03 Speaker_05
Those are sleepovers.
02:03:04 Speaker_06
You'd be like, mom. You want milkshakes. You're getting confused. What happened is Callie was over. It was like midnight. And I was like, oh, we should make milkshakes.
02:03:18 Speaker_06
And so I went downstairs and I was making a milkshake and it was waking everybody up because I was blending in the middle of the night. And then my mom like yelled at me or was like, what are you doing? And I was like, I'm making a milkshake.
02:03:35 Speaker_05
Okay.
02:03:36 Speaker_06
So I wasn't telling her to make a milkshake, but I was not caring about her needs.
02:03:42 Speaker_05
And you've never screamed for your mother to make you a milkshake or a sandwich in your high school years?
02:03:48 Speaker_06
No, you've made it into me screaming for food.
02:03:50 Speaker_05
Listen, I know you're offended by it, but I think it's very cute. It goes along with the boss in a town car.
02:03:57 Speaker_06
Hey, no, you love that image. But the problem is it's not correct. And then some people who don't think it's cute will mistake me as someone like that, and I'm not.
02:04:08 Speaker_05
Okay, I'm gonna leave you out of it.
02:04:10 Speaker_06
Okay, thank you.
02:04:11 Speaker_05
So my kids are barking orders. Delta wants a fruit bowl, an actual fruit bowl.
02:04:16 Speaker_06
Not like, can you grab me- Like cut up fruit?
02:04:18 Speaker_05
Yes, can you grab me an apple? It's, I would like a fruit medley.
02:04:22 Speaker_06
Wow.
02:04:23 Speaker_05
And by God, she complies. Kristen's right up and she's making a very nice fruit bowl, delivers it, and Delta pounds it. And, you gotta understand the delivery of all these things, because she's just, like, sucked into Happy Gilmore.
02:04:37 Speaker_05
And then she's like, Mom! And she just holds the empty bowl out. And that means I want another fruit salad. What? Kristen just jumps up and grabs the bowl.
02:04:47 Speaker_06
She does?
02:04:48 Speaker_05
Yes, and then I started getting very scared. My kids are gonna be entitled assholes, right? But then I thought to ask. And by the way, the reason I thought that is my mother never, ever got me anything out of the kitchen.
02:05:05 Speaker_05
Yeah, I never called for anything. I never said, I'm hungry. And I liked that childhood, for whatever reason.
02:05:11 Speaker_06
Yeah, I
02:05:12 Speaker_05
But then it crossed my mind, I'm like, you know, maybe Kristen's mother did this to her and now she's passing it on.
02:05:19 Speaker_05
And maybe it's great that you have a period of your life where you're just like yelling, you're a little kid and you just yell what you need and your mom hustles around and gets it.
02:05:27 Speaker_05
And then you'll grow up and you'll pass that on, and everything's even. So while she's making this fourth fruit salad, I said, hon, did your mom just wait on you like this when you were a kid? And she said, yeah, for sure.
02:05:44 Speaker_05
And I was like, okay, cool. So she like received and now she's paying it back. Interesting. I didn't receive and I'm not paying it. I mean, I make them dinner and shit, but I'm not jumping off the couch to get anybody anything.
02:05:53 Speaker_05
If we're all on the couch, my rule is like, yeah, I'm thirsty too.
02:05:57 Speaker_06
Yeah.
02:05:58 Speaker_05
Not like, hey, go get me a Perrier. Right. If I'm in there, I'll yell, hey, does anyone want a Perrier? But I'm not getting up to get you something if we're both seated.
02:06:08 Speaker_06
Well, yeah, this is part of this is a story you guys used to tell all the time.
02:06:13 Speaker_05
Yes. When we first started dating. Yeah.
02:06:15 Speaker_06
Yeah. It was your main story. It was, and it made sense. It was like, she asked you to get, you guys were both watching TV. She asked you to get her water.
02:06:27 Speaker_05
A glass of water, yeah, yeah.
02:06:29 Speaker_06
And you were really shaken by this.
02:06:31 Speaker_05
Yeah.
02:06:32 Speaker_06
And it's the same thing.
02:06:33 Speaker_05
But you know, yeah, that was a main story. But I wonder how you interpret the story, because for me, that's a story about my own personal growth.
02:06:41 Speaker_05
It was a shift from thinking everyone's trying to take advantage of you to going, no, not everyone's trying to take advantage of you. And you can just do nice things for people. And it's not setting you up.
02:06:52 Speaker_05
It's not setting you up for a pattern of being abused.
02:06:55 Speaker_06
I think that is what it is.
02:06:56 Speaker_05
And for me, it was like a breakthrough moment of maybe trusting people.
02:07:01 Speaker_06
That is what I thought.
02:07:02 Speaker_05
Has that always been the message of that story to you?
02:07:05 Speaker_06
Yeah. But it's similar in this way where you just said, like, I'm not getting up.
02:07:09 Speaker_05
Yes, but I don't have the baggage of before where I'm like, oh, my kids think like.
02:07:14 Speaker_05
So 17 years ago, in this situation, I would have had all this other stuff attached to them thinking I should get up and grab them stuff. Like, oh, they just think I'm here to serve them, and they just want me to.
02:07:29 Speaker_05
But I don't think any of those thoughts. I think they're benevolent and nice and generous and they want something. They're going to give it a shot and scream if they can get it. And if they can't, then they'll get up and get it.
02:07:41 Speaker_05
It's not layered in all this mistrust of everyone's motives. Yeah.
02:07:45 Speaker_06
I mean, I think the parental relationship, like, that's much different with a kid and a parent. It is your job to provide for them. So, like, taking advantage isn't really at play. I mean, it can be to an extent.
02:07:58 Speaker_05
Yeah, but I do think parents have chips on their shoulders about being disrespected. Like, we can call it all these different words, but I think in some way they all mean the same thing emotionally.
02:08:09 Speaker_05
So it's like taken for granted, disrespected. Yeah. And I'm not currently dealing with any of those feelings, but I could imagine 17 years ago again,
02:08:20 Speaker_05
I always say how grateful I am that I waited so long, but like, I don't know, maybe those, I would have had those feelings.
02:08:26 Speaker_06
Yeah, maybe.
02:08:28 Speaker_05
17 years ago. Okay, what are you most excited about for Austin?
02:08:32 Speaker_06
Well, I'm really excited to go to my vintage store.
02:08:34 Speaker_05
You love to shop in Austin.
02:08:35 Speaker_06
I do. There's a store I really like there and I'm really excited to go. That's the main one. Okay. And foodies.
02:08:42 Speaker_05
Food, yummy foods. Yeah. Yeah, yeah, yeah.
02:08:46 Speaker_06
What about you?
02:08:48 Speaker_05
Sprint race.
02:08:49 Speaker_06
Uh-huh.
02:08:50 Speaker_05
We're going to see the sprint race. Yeah, that's fun. At COTA. That'll be really fun. Yeah. And then Adam Grant's going to interview us on stage.
02:08:57 Speaker_06
I'm really excited for that.
02:08:58 Speaker_05
We're going to have dinner with Adam Grant on Wednesday at one of my favorite steakhouses.
02:09:03 Speaker_06
Yeah, I'm really excited.
02:09:04 Speaker_05
Where I had my infamous date with Matthew McConaughey.
02:09:07 Speaker_06
Uh-huh.
02:09:07 Speaker_05
Yeah. Yeah, just fun, fun, fun. I'll probably swim a few times in Barton Springs.
02:09:12 Speaker_06
While I'm at the vintage store?
02:09:14 Speaker_05
Yeah, yeah, yeah, yeah.
02:09:14 Speaker_06
Okay, we'll plan that accordingly. I forgot what I was going to say. It was about Mary-Kate and Ashley.
02:09:20 Speaker_05
Wow. You think it's about Mary-Kate and Ashley? I do. You do? I think most things circle back to them.
02:09:26 Speaker_06
Oh, one thing that we are going to do to designate, we're going to wear necklaces with the initials. Smart. I think that'll help get it over the edge.
02:09:35 Speaker_05
Which one are you going to be?
02:09:36 Speaker_06
So I'm debating.
02:09:38 Speaker_05
Do you have a favorite? You don't have to say it out loud, but do you have a favorite?
02:09:42 Speaker_06
I actually don't.
02:09:43 Speaker_05
You don't. Yeah.
02:09:44 Speaker_06
I mean, I think I should because you dated one. So I should pick her.
02:09:48 Speaker_05
I've really vouched for one.
02:09:49 Speaker_06
Yes, exactly. So I guess I'll pick her. Yeah. But I don't want to do that. Like, I like them both.
02:09:58 Speaker_05
And I think it's like me asking you to pick between Ben and Matt. You don't like that either. Even though I know who you pick.
02:10:03 Speaker_06
No, I don't. No, you don't.
02:10:04 Speaker_05
And I know who you pick from.
02:10:05 Speaker_06
No, you don't. No, none of that's true.
02:10:08 Speaker_05
You and I are going to do a commercial unrelated to this podcast.
02:10:10 Speaker_06
Yeah, we are. I'm excited for that.
02:10:12 Speaker_05
Is any of your current shopping thing like, oh, I just got some extra cash I wasn't expecting? Like, do you feel like you have a little money?
02:10:20 Speaker_06
I kind of forgot about that.
02:10:22 Speaker_05
You are so funny about money. You're very you know your value.
02:10:28 Speaker_06
Yes.
02:10:28 Speaker_05
And you're a hard bargainer. Yeah. Yeah. But then also you don't give a flying fuck. It's really funny.
02:10:34 Speaker_06
Yes, that is right.
02:10:35 Speaker_05
You never look again. You're not sure if you ever got any of it, but you.
02:10:38 Speaker_06
Yeah, yeah. When I am like negotiating, it is not, it's not about the money, really.
02:10:46 Speaker_05
I know. Ever. I know.
02:10:48 Speaker_06
Which I think probably makes it very difficult for the person I'm negotiating with.
02:10:52 Speaker_05
Right.
02:10:53 Speaker_06
Because it's really, really not about that. It is. It's about my value and my place and that's it.
02:11:00 Speaker_05
And for me, it's just about the money.
02:11:01 Speaker_06
Right. I know. We're very different.
02:11:03 Speaker_05
We are. Yeah, yeah. It's one of our many differences. But I'm like, everything is okay, I get this, that amounts to safety. And then I gotta make X amount so I can do something fun and buy something I want.
02:11:14 Speaker_06
It's not really fair for me to act like I don't have any of that because I do. I definitely do. Now that I have this like big expense, the house. Yeah, that's changed. I mean, not change, but I. You have to consider that.
02:11:29 Speaker_06
I have to think about money a lot and like where things are going and is there enough for this and this and this.
02:11:35 Speaker_05
And the necklaces.
02:11:36 Speaker_06
Yeah, I think there is. But, oh, but I'll just say, I do stress out about it.
02:11:44 Speaker_08
You do?
02:11:44 Speaker_06
I do. Okay. And I can't say I don't want it, like I do.
02:11:50 Speaker_05
Yeah, yeah, yeah.
02:11:51 Speaker_06
That would be lying.
02:11:53 Speaker_05
Yeah.
02:11:53 Speaker_06
But anyway, because of all these necklaces, I think I'm gonna be Mary-Kate. Okay. Because Anna should probably be Ashley because of the A. Sure.
02:12:03 Speaker_05
Although it's not gonna help people understand your costume. Because they're gonna see A, and they're gonna go, oh, Ana's wearing an A for Ana, and you're wearing an M for Monica, and I guess her middle name is Kristen, and we never realized that.
02:12:16 Speaker_05
Or Kelly. No, no. Monica Kelly Padman.
02:12:19 Speaker_06
Ana and I are gonna be standing next to each other the whole night. We're not allowed to leave. Wow. Yeah. There's a commitment. What do you think, since you know one of them well, what do you think I could do in order to really make it clear?
02:12:38 Speaker_06
Is there anything?
02:12:39 Speaker_05
I don't know.
02:12:42 Speaker_06
Do you know about like a secret freckle or something?
02:12:51 Speaker_05
Also it'd be no use. If I know about a secret freckle and then you put a secret freckle on your arm, no one will put two and two together because they don't know about the secret freckle.
02:12:58 Speaker_06
But I'll know and it'll be like working from the outside in.
02:13:01 Speaker_05
Right, like great acting.
02:13:02 Speaker_06
Oh, I can't wait.
02:13:03 Speaker_05
How would you feel, here's a question, because I think I already know, because you've kind of alluded to it when talking about whether you'd date an Armcherry or not.
02:13:13 Speaker_05
If you knew of a popular podcaster who was completely obsessed with you, would you be open to having a friendship with them?
02:13:21 Speaker_06
Do I like their podcast?
02:13:22 Speaker_05
Yeah. Yeah, it's a good podcast.
02:13:25 Speaker_06
Then yeah.
02:13:25 Speaker_05
Yeah, okay. I would too. I don't mind that.
02:13:29 Speaker_06
I mean, I would be happy to have a friendship with someone who liked the show. I guess that's what you sort of mean.
02:13:37 Speaker_05
No, obsessed with you, the way you're obsessed with Mary-Kate and Ashley.
02:13:42 Speaker_06
That would be harder for me.
02:13:44 Speaker_05
Yeah.
02:13:45 Speaker_06
Yeah, I don't think so. But it's also a bit, like, it is a bit. I think if in real life I met them and I liked them, like they were cool and I enjoyed them, then that goes away.
02:13:58 Speaker_06
Like it's just part of this like fun thing to be excited about.
02:14:03 Speaker_05
Yeah, I was just curious.
02:14:04 Speaker_06
I think I don't do that in real life.
02:14:06 Speaker_05
Like I- No, I know.
02:14:08 Speaker_06
Maybe to a detriment. Like, there isn't anyone really, really that brings that out here, that I think is extra special on this earth. Yeah, Oprah. I love her so much and I want her to call me and I want to be friends.
02:14:25 Speaker_06
But I think if we hung out like a few times, I wouldn't feel like she was better than me. I don't normally feel like that if I really know someone.
02:14:34 Speaker_05
True, of course. Yeah, I'm with you. As we know, I have an obsession with Robert Downey Jr. that I've had since I was 12 or maybe 10. And then I have a friendship with him that has nothing to do with that obsession.
02:14:46 Speaker_06
Right. But it's funny because I think you can still, which is I think a good thing, you can still kind of click into that.
02:14:55 Speaker_05
Yeah. Well, especially when I see him do the thing that enamored me with him.
02:14:59 Speaker_06
Right.
02:15:00 Speaker_05
When I watch him act in certain things, I go, oh, he's so special. He's just a little shooting comet, you know?
02:15:07 Speaker_06
Well, that, I'm not, no, I'm not saying that you can click into the fact that he's like such a cool person, but I think you can, you can get excited that he likes you.
02:15:19 Speaker_05
Oh yeah, yeah, yeah, yeah.
02:15:20 Speaker_06
Cause like I had this with Kristen.
02:15:21 Speaker_05
Right.
02:15:22 Speaker_06
I was so obsessed with her.
02:15:24 Speaker_05
Yeah, yeah.
02:15:25 Speaker_06
And then now that I know her and we're, I think equals, I love her and I'm so happy to have her in my life and I can see like, I'm proud of her when she does good, when she does stuff.
02:15:39 Speaker_09
Yeah, yeah.
02:15:39 Speaker_06
But I never, I don't think like, I'm so happy she likes me or I'm so happy she's my friend. I mean, I'm so happy she's my friend.
02:15:50 Speaker_05
For legitimate, substantive reasons.
02:15:52 Speaker_06
I'm so happy we're friends.
02:15:53 Speaker_05
Yeah, yeah, yeah, yeah.
02:15:54 Speaker_06
But not, I'm so happy she's my friend.
02:15:57 Speaker_05
Yeah, I wouldn't frame it exactly like that. Mm-hmm, yours, yours. It's like, I'm friends with Downey, we get along how we get along. And I can go, oh my God, I can't believe this boy I love my whole life likes me. I also have room for that feeling good.
02:16:13 Speaker_06
Yeah.
02:16:14 Speaker_05
That still feels good. I think it's kind of like when you fall in love with somebody and then it flattens out and the good chemicals are gone and now you're just a partnership.
02:16:25 Speaker_05
You can remember meeting them and falling in love with them and you can still remember all those feelings and get those butterfly giddy feelings when you reflect back on it.
02:16:35 Speaker_06
Yeah.
02:16:36 Speaker_05
This has been a long walk and I'm not sure where it landed, but.
02:16:39 Speaker_06
Yeah, me neither. Okay, so this is for Yuval.
02:16:42 Speaker_05
He would be a good example of this. He's both things. Like now he's been here three times. Now I'm used to him being here. I'm way less intimidated. I feel like I can just chat with him.
02:16:52 Speaker_06
Yeah.
02:16:53 Speaker_05
And I remember how special it is that Yuval trusts us to come talk.
02:16:57 Speaker_06
Yeah.
02:16:58 Speaker_05
I can like feel both of those things simultaneously.
02:17:00 Speaker_06
Yeah, I mean, I don't, he's not our friend though. I mean, he's like, it's friendly, but like if I hung out with him every day or if I hung out with him once a week, if he was like in my social circle, I would be impressed by him as a person always.
02:17:17 Speaker_06
But I don't think I would feel like, oh my God, I'm just so grateful that this person on earth is spending time with me.
02:17:25 Speaker_05
Yeah, yeah, yeah, yeah, yeah.
02:17:27 Speaker_06
I would just be like, God, what an amazing person.
02:17:30 Speaker_05
Yes.
02:17:30 Speaker_06
Amazing friend I have. Let's see, let's see a few facts.
02:17:34 Speaker_05
5'10", hope you started listening to Dimensions.
02:17:40 Speaker_06
Okay, Lorena Bobbitt.
02:17:42 Speaker_05
Oh, sure.
02:17:42 Speaker_06
That was in Virginia.
02:17:43 Speaker_05
Did you know all about that or is that before your time?
02:17:45 Speaker_06
Yeah, no, I knew. I mean, I didn't, it was sort of before my time, but.
02:17:48 Speaker_05
But it transcended.
02:17:49 Speaker_06
Exactly. Yeah. It was in '93.
02:17:52 Speaker_05
Oh, the year of my graduation.
02:17:54 Speaker_06
Oh, she severed her husband John's penis.
02:17:57 Speaker_05
Boy, what a tabloid sensation that was because then he went on to do a porno.
02:18:01 Speaker_06
Lorena stated in a court hearing after coming home that evening, her husband had raped her and then he went to sleep. She got out of bed and went to the kitchen for a drink of water.
02:18:10 Speaker_06
She then grabbed an eight inch carving knife on the kitchen counter, returned to their bedroom, pulled back the bedsheets and cut off his penis. After this, Lorena left the apartment with the severed appendage and drove away in the car.
02:18:22 Speaker_06
After a length of time driving and struggling to steer with one hand due to holding the penis (this is Wikipedia), she threw the penis out a window into a roadside field on Maplewood Drive.
02:18:33 Speaker_05
Oh wow. Can you imagine if you were walking down the road and you saw some woman roll the window down and you saw a thing fly out and you're like, that looked like a body part. That looked like a penis, but get real. There's no way that was a penis.
02:18:44 Speaker_06
And do you know it was reattached?
02:18:46 Speaker_05
Yeah. And he starred in a pornographic film.
02:18:48 Speaker_06
Oh, with his reattachment.
02:18:50 Speaker_05
Yeah. And there is some play on words in the title, as often was the case. There was a whole era where they, like, paid top dollar for famous people to be in pornos. Back when pornos sold VHSs and it was like an industry.
02:19:08 Speaker_06
Yeah. And rentals, like with the rental scene blockbuster.
02:19:12 Speaker_05
Yeah. It's kind of sad that that went away because you'd get these fun, you know, there was, um, Screech was in a pornographic film.
02:19:18 Speaker_06
Um, yeah, it says he went on to star in two. Two.
02:19:22 Speaker_05
Oh my gosh. Wow. I didn't remember it was a rape. That's horrific. It is. And I'm glad she cut his penis off.
02:19:27 Speaker_06
Yeah, I know.
02:19:28 Speaker_05
But that's not a good course of action. I'm not recommending anyone else does that. But in this one case, I like it.
02:19:33 Speaker_06
Oh, I looked up the percentage of male pedophilia versus female.
02:19:39 Speaker_05
Oh, yeah. What was it?
02:19:40 Speaker_06
It says male perpetrators account for the vast majority of sexual crimes committed against children. Among convicted offenders, 0.4 to 4 percent are female.
02:19:49 Speaker_05
That's a 10x window, that's interesting.
02:19:50 Speaker_06
And one literature review estimates that the ratio of male to female child molesters is 10 to one.
02:19:55 Speaker_05
That sounds right. Now, let me ask you this. I don't know why this would be the case, but I'm just floating it. You think female perpetrators are underreported? Like, I wonder if you're less likely to chalk up what happened to that.
02:20:06 Speaker_06
Maybe.
02:20:07 Speaker_05
But I definitely accept it's 10 to one men. I've heard, I don't want to say a lot. I think I know about four dudes who have babysitter stories, older female babysitter stories. But again, they're not reporting that.
02:20:20 Speaker_05
They're like, that's a fun memory of theirs.
02:20:23 Speaker_06
The babysitter.
02:20:24 Speaker_05
The older female babysitter fooled around with the younger boy. Yeah. Interesting.
02:20:30 Speaker_06
How much older? Because sometimes young girls were babysitters.
02:20:35 Speaker_05
Right.
02:20:35 Speaker_06
To like kids a year younger than them.
02:20:38 Speaker_05
Boy, I would I would be guessing at the detail. I don't want to guess.
02:20:41 Speaker_06
Should we call them?
02:20:44 Speaker_05
Get everyone on the line.
02:20:45 Speaker_06
Okay, he used the word lacuna a few times, which I had never heard, and I liked it.
02:20:49 Speaker_05
Yeah, lacuna.
02:20:50 Speaker_06
And it's an unfilled space or interval, a gap. The journal has filled a lacuna in Middle Eastern studies.
02:20:58 Speaker_05
Meaning there was no Middle Eastern studies and then they filled it in.
02:21:04 Speaker_06
It's not like it's understudied. It's like within the field of studies, there's an area that is not explored.
02:21:10 Speaker_05
Okay, there's a lacuna within it.
02:21:12 Speaker_06
Yeah, and I like that. And I thought maybe I'll use that word sometime.
02:21:15 Speaker_05
I've thought of that in terms of physicists. It's just like, I don't know who even tackles physics because to advance it at this point is so daunting.
02:21:23 Speaker_06
Yeah, I agree.
02:21:25 Speaker_05
I mean, the place we're already at, no one understands.
02:21:28 Speaker_06
Hold on, I'm still like reeling over this Lazy River situation. I'm shocked. Why did we, that was a thing.
02:21:36 Speaker_05
I feel like I watched the 60 Minutes on it. It was like a whole segment on colleges competing now for students and the amenities they're putting in there.
02:21:44 Speaker_06
Yeah.
02:21:44 Speaker_05
Maybe type in, is there any college with a lazy river? I know. Maybe we just have the wrong college.
02:21:49 Speaker_06
College lazy river.
02:21:51 Speaker_05
But we're not going to be invited to ASU to do any lectures or anything, between your bashing of their academic prowess and now this misinformation about the lazy river.
02:22:00 Speaker_06
Well, I'm sorry. I'm sorry to Arizona State, if you don't have a Lazy River.
02:22:04 Speaker_05
I'm sorry you don't have one. Sure. It's hot there.
02:22:08 Speaker_06
I know, but are they trying to be like, is the residence like right next door and it kind of goes through campus a little bit?
02:22:14 Speaker_05
And it is an official dorm?
02:22:15 Speaker_06
University of North Florida students can drift down a Lazy River at the new Osprey Fountains Residence House. Okay, so they have one. Yeah. I'm happy to take down University of North Florida. Okay. Oh, I'm happy to move on to that.
02:22:28 Speaker_05
I mean, these are all college students. Do you think they're hooking up on these inner tubes? Like people are doing hand business and stuff? Exactly. Yeah, yeah. How could they resist?
02:22:35 Speaker_06
College kids are disgusting.
02:22:37 Speaker_05
Nighttime Lazy River floats. Exactly. Full of activity. Full of spray. Send me back. I'll spray. Yeah, I want a time machine, but it wouldn't work because it didn't exist when I was younger.
02:22:50 Speaker_06
True. Yeah.
02:22:51 Speaker_05
There's really no hack for this. I just don't ever get to be young on a lazy river at a college campus.
02:22:57 Speaker_06
No, you need to take a time machine to go back and then get in the time machine.
02:23:02 Speaker_05
Kidnap my young self.
02:23:04 Speaker_06
Yeah.
02:23:05 Speaker_05
Send me to here.
02:23:06 Speaker_06
Exactly. And send him to ASU.
02:23:08 Speaker_05
And then I stay back there or something. And then I tell him, you got to come back and get me. But he never will because he'll be having so much fun on the lazy river.
02:23:15 Speaker_06
Because you can't be in the same place twice?
02:23:17 Speaker_05
I don't know. I'm not sure how time travel works.
02:23:18 Speaker_06
What's a thought you have that is a recurring thought that's absolutely absurd but like you do think? Like mine is I think about the reality of teleportation like way too much.
02:23:30 Speaker_05
I have obsessed on teleportation. I thought, I think I told you I had a whole, I was going to write a whole script about it.
02:23:35 Speaker_05
I don't know if you remember this, but if you really play out teleportation, that single invention would destroy like six of our 10 biggest industries globally. It would destroy transportation. Well, sure. It would destroy big oil.
02:23:51 Speaker_05
It would destroy the housing market. Why?
02:23:54 Speaker_05
Because if you can live in Wyoming on 19 acres and work in Manhattan and eat dinner in San Francisco, you don't need to live in the highly densely populated area where the prices are so high.
02:24:09 Speaker_05
Well, everyone could just live on three acres and be in a city whenever they wanted to, enjoying the amenities of the city. And when they go home, they go home to peace and solitude.
02:24:19 Speaker_06
There's a finite amount of space. So it would just be that those places with a lot of space would then be the most coveted places.
02:24:27 Speaker_05
But the USA is extremely sparsely populated. It's not densely populated at all. 99% of the land in the U.S. is completely empty. It's all in the middle of the country. If you go to Wyoming, it's just wide fucking open. Montana is wide open.
02:24:45 Speaker_05
Idaho is wide open. All these places are wide open. So everyone would live on the farm in the pretty part of Tennessee. I mean, I wouldn't, but... Okay, but you get my point.
02:24:55 Speaker_05
I do, but I just... The housing market is driven by access to these opportunities, and those areas get densely populated, and then it's supply and demand that drives the price up.
02:25:05 Speaker_05
But when no one needs to live by where they work, that will collapse entirely. Because you walk out your door, and you could be on the street in Manhattan. Like, anything you'd want from living in Manhattan, you could have.
02:25:17 Speaker_05
But then at night, you could be sleeping in a very quiet farm.
02:25:20 Speaker_06
I see what you mean. But then I just think it would change where the country was densely populated. It would just be that the cities now would be more open, and it would be those places instead.
02:25:33 Speaker_05
And that's where, by the way, the vast majority of equity in real estate is only in one percent of the country. It's like in all these cities. Yeah.
02:25:42 Speaker_05
So once that collapses, 90 percent of the value of real estate has collapsed, because the cities were holding all that value.
02:25:50 Speaker_06
But why wouldn't it just move to the rural areas?
02:25:53 Speaker_05
Because there's no competition for space. Like Manhattan is a tiny island. San Francisco is a tiny area.
02:26:00 Speaker_05
So no one's going to spend $5 million to live in an apartment in Manhattan when for $600,000 they can have a mansion in Kansas on a, you know, again, you're only sleeping there.
02:26:13 Speaker_06
I guess that's true.
02:26:14 Speaker_05
And having Christmas or whatever you're doing.
02:26:15 Speaker_06
I guess that's true.
02:26:17 Speaker_05
So anyways, my whole movie idea was if someone invented teleportation, you'd have trillions of dollars at risk. You'd have a lot of incentivized titans of industry to go kill that person and get rid of that technology.
02:26:31 Speaker_05
And so mine was kind of like a crime caper, trying to stay alive to deploy this technology when you have all this funding to get rid of it. Airplanes? They're gone. The entire air industry.
02:26:43 Speaker_06
Yeah. I just want to go. I just want to. And I'm there.
02:26:47 Speaker_05
It's great. It just, the entire world economy would collapse.
02:26:52 Speaker_06
I wonder what would happen to restaurants, because it might change. It might make restaurants more popular, because you could then go out to eat.
02:26:58 Speaker_05
Everyone could eat at Emily Burger. The line would be. That's what would be really fucked up.
02:27:03 Speaker_06
Well, no, because then you'd be like, well, I'll come back tomorrow. Like, you wouldn't have to, it's not so scarce.
02:27:10 Speaker_05
I think you'd have to schedule your eating for years in advance. Because now you have 330 million people that could eat an Emily burger in Clinton Hill.
02:27:19 Speaker_06
Yeah, that's true.
02:27:20 Speaker_05
So they're all gonna make reservations. So you really, you're gonna have to just have so many reservations all around.
02:27:29 Speaker_06
But think about it. Remember when I was in New York and I couldn't find a place to eat because it was fashion week and it was a mess. And I was just bopping around from place to place and everything was full. I guess it would be like that.
02:27:41 Speaker_06
But then I could be like, I guess I'll go to Paris right now and see what's in Paris.
02:27:47 Speaker_05
I think what would happen is I think these popular cities would be way too packed during the day because who on a weekend would not go to Paris and have a cup of coffee for breakfast? Everyone would go.
02:27:59 Speaker_05
So these like world cities would just be completely clogged.
02:28:02 Speaker_06
Think about like immigration will have a new element to it.
02:28:06 Speaker_05
How? Yeah. And then also I'm going to take a swim in St. Barts in the afternoon. I'm like, oh, I'll have breakfast in Paris. No, well, I'd probably go there too. But St. Bart's I guess has pretty water and stuff.
02:28:20 Speaker_05
So I would have breakfast in Paris and then I would go to St. Bart's and take a swim. And then I, you know, your day would just be, and then for my exercise, I would do six miles on the wall of China, Great Wall of China.
02:28:31 Speaker_06
No, you could fall. Remember that girl?
02:28:34 Speaker_05
I would go in the daytime. She was there at nighttime. Anyways, it's a fun thought.
02:28:39 Speaker_06
Yeah, I do think about it a lot, like in a in a real way.
02:28:43 Speaker_05
But yeah, but we talk about like how disruptive A.I. is going to be as a technology. But that teleporting thing, if you really think it through for hours and hours, it would be chaos.
02:28:53 Speaker_06
I mean, really, really, what it would be, which is just like every other technology and every other thing, is like space: the elites would have access. It'd be expensive.
02:29:04 Speaker_05
Exactly. So Emily Burger, which already has a very expensive hamburger, the hamburger would be like $200, because it could be, because they have the entire world's billionaires that now could eat there for lunch.
02:29:15 Speaker_06
All the billionaires want to go. Yeah, it would be like a billionaires' game. It would. And that would cause even more unrest and more disparity. But I want to go to Paris right now so bad. All right. All right. I love you.
02:29:48 Speaker_05
Follow Armchair Expert on the Wondery app, Amazon Music, or wherever you get your podcasts. You can listen to every episode of Armchair Expert early and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.
02:30:02 Speaker_05
Before you go, tell us about yourself by completing a short survey at wondery.com slash survey.