
#2201 - Robert Epstein AI transcript and summary - episode of podcast The Joe Rogan Experience


Go to PodExtra AI's episode page (#2201 - Robert Epstein) to play and view complete AI-processed content: summary, mindmap, topics, takeaways, transcript, keywords and highlights.


Episode: #2201 - Robert Epstein


Author: Joe Rogan
Duration: 02:44:57

Episode Shownotes

Robert Epstein is an author, editor, and psychology researcher. He is a former editor-in-chief of "Psychology Today" and currently serves as Senior Research Psychologist at the American Institute for Behavioral Research and Technology. He also founded the Cambridge Center for Behavioral Studies.

www.drrobertepstein.com  www.americasdigitalshield.com

Full Transcript

00:00:03 Speaker_03
The Joe Rogan Experience.

00:00:06 Speaker_02
Train by day, Joe Rogan podcast by night, all day. And we're up. Hello, Robert. Good to see you.

00:00:15 Speaker_00
Hello, Joe.

00:00:15 Speaker_02
You look a little stressed out.

00:00:19 Speaker_00
I am stressed out. In fact, are we recording? Yes. Okay, then. Then I want to make a special request. You can kick me out if you like. Why would I do that? Because I need to have a meltdown. I would like to have a meltdown right now on your show.

00:00:47 Speaker_02
You want to have a personal meltdown? Yes. Okay, go ahead. Okay. I've never heard anybody plan for a meltdown before.

00:00:58 Speaker_00
Well, I need to do this and I think this is the right opportunity. Okay. And I don't know what I'm going to say. Okay. But I am definitely going to meltdown.

00:01:10 Speaker_02
Okay.

00:01:14 Speaker_00
Okay. So, I am completely fed up. I have worked day and night. I work about 80 hours a week. I'm directing almost 40 research projects. I've been working really hard for maybe 45 years.

00:01:40 Speaker_00
And the last 12 years where I've turned my eye to Google and other tech companies have turned into, for me personally, a disaster. So before I started studying Google, I had published 15 books with major publishers.

00:02:01 Speaker_00
Since I've started studying Google and other companies, I can't publish anymore. I used to write for and actually work for mainstream news organizations and media organizations. I was editor-in-chief of Psychology Today for four years.

00:02:20 Speaker_00
I was an editor for Scientific American. I wrote for USA Today and U.S. News & World Report and Time magazine.

00:02:30 Speaker_00
But in 2019, after I testified before Congress about some of my research on Google, President Trump tweeted to his whatever millions of gazillions of followers, basically some praise for my research. He got the details wrong.

00:02:51 Speaker_00
But then Hillary Clinton, whom I had always admired, chose to tweet back to her 80 million Twitter followers, and she tweeted that my work had been completely debunked and was based on data from 21 undecided voters.

00:03:10 Speaker_00
I still have no idea where any of that came from. Probably someone from Google, because Google was her biggest supporter in 2016, and this was 2019. And then that got picked up by this machine. I'm told it's called the Clinton machine.

00:03:29 Speaker_00
And the New York Times picked that up without fact-checking, and then a hundred other places did, and I got squashed like a bug. Squashed. I had a flawless reputation as a researcher. My research reputation was gone. I was now a fraud. A fraud.

00:03:52 Speaker_00
even though I've always published in peer-reviewed journals, which is really hard to do. And there was nothing I could do about it.

00:04:01 Speaker_00
And all of a sudden, I found that the only places I could publish were in what I call right-wing conservative nutcase publications, where I've actually made friends over the years. I've made friends with them, but that's beside the point.

00:04:17 Speaker_00
I was crushed. Not only that, I've been discovering things. I've made at least 10 major discoveries about new forms of influence that the internet has made possible. These are controlled almost entirely by a couple of big tech companies.

00:04:38 Speaker_00
affecting more than five billion people around the world every single day. And I've discovered them, I've named them, I've quantified them, I've published randomized controlled studies to show how they work, published them in peer-reviewed journals.

00:04:58 Speaker_00
We just had another paper accepted yesterday. And I've built systems to do to them what they do to us and our kids. They surveil us and our kids, 24 hours a day.

00:05:21 Speaker_00
Google alone does that over more than 200 different platforms, most of which no one's ever heard of. People have no idea the extent they're being monitored.

00:05:31 Speaker_00
If they have Android phones, they're being monitored even when the phone is off. Even when the power is off, you're still being monitored.

00:05:41 Speaker_02
How do they do that?

00:05:43 Speaker_00
Well, because remember when we could take the batteries out? Yeah. And then at some point, they soldered them in. Yeah. Because they soldered the batteries in, even when you turn the phone off, it's not off. It's easy to demonstrate.

00:05:56 Speaker_00
It's still transmitting, or it'll transmit the moment the power comes back on. It's still collecting data. What am I trying to say here? My wife was killed in a suspicious car accident. This was also shortly after I testified before Congress in 2019.

00:06:19 Speaker_00
Right before she was killed, I did a private briefing for Ken Paxton, the AG of Texas, and other AGs at Stanford University.

00:06:28 Speaker_00
And one of those guys came out afterwards and he said, well, based on what you told us, Dr. Epstein, he said, I don't mean to scare you, but he said, I predict you're going to be killed in some sort of accident in the next few months.

00:06:41 Speaker_00
So I told you this before when I was on, and obviously I wasn't killed, but my beautiful wife was killed. And, you know, her vehicle was never inspected forensically, and then it disappeared from the impound lot.

00:06:57 Speaker_00
I was told it was sold to some junk company in Mexico. And that is one of now six, six incidents, six, of violence against people who are associated with me over the past few years. The last just happened a couple of weeks ago.

00:07:18 Speaker_02
And what was that one?

00:07:20 Speaker_00
Well, this last one is kind of weird and creepy. And then I was at a meeting in Dallas, I think it was. Oh, no, no, no, it was up in Monterey. And it was with General Paxton and then with some of my staff.

00:07:37 Speaker_00
And one of my staff members sitting next to me, she all of a sudden just brushed her hand against my computer case, which is, that's my computer case. And she screamed. And we all went, what happened?

00:07:55 Speaker_00
And she goes, look, and there was a needle sticking out of the computer case, sticking out of the computer case, which is impossible. And it was going a half inch into her thumb. It had gone through the end.

00:08:10 Speaker_00
And of course, I'm thinking, oh, that's awful, but maybe you just saved my life. Maybe it's got some sort of weird poison on it or it's like a Putin thing and it's got radioactive substances or something.

00:08:23 Speaker_00
I'm trying to joke around, but meanwhile, she was terrified. How did a pin get... and by the way, I have a picture of the pin. Really creepy. When you're saying a needle, you're not saying like a syringe?

00:08:36 Speaker_00
You're saying a needle, like a sewing needle? No, that's what I'm saying. None of us has ever seen a needle like this needle. It's about two inches long.

00:08:47 Speaker_00
The end is like it's been sharpened. Okay, you can see it's sharpened. And at the end, where there should be a hole for thread, there's no hole. Okay, I don't know what it is. But we've had worse incidents, too.

00:09:02 Speaker_00
I'm just saying this happened to be the latest. But that's in your computer bag?

00:09:06 Speaker_02
Was your computer bag ever out of your care?

00:09:09 Speaker_00
Well, not that I noticed, but... But if somebody wanted to harm you, a little needle, it's not really... Oh, I don't think that's someone wanting to harm me. What do you think that is? Well, if it's anything, it's someone wanting to scare me.

00:09:27 Speaker_00
And the fact is, I have been scared, and so have a lot of my staff. This summer, we've had 26 interns. They come from all over the country.

00:09:35 Speaker_00
23 of these people are volunteers, and fantastic young people, extremely smart, helping me run almost 40 research projects. And there is, you know, we take precautions and there is some fear.

00:09:54 Speaker_00
And one of these young men who's done superb work, he asked that we take his name off of everything. You know, he didn't quit, but he was just saying... Sounds reasonable. Yeah, yeah, because, you know, there have been a number of incidents.

00:10:15 Speaker_00
And if I were... Did you ever hear of John Katsimatidis? No. Okay, he owns a ton of supermarkets in New York. He also owns WABC New York. But I was at a luncheon with him. I shouldn't do this on the air. I shouldn't do this.

00:10:38 Speaker_00
But he actually said, to make a long story short, that if he were Google, he would kill me. He said it straight out. But yet you're still alive.

00:10:49 Speaker_00
Well, I'm alive, but I am in rough shape because, you know, when a push comes to shove here, I have been making discoveries that are really startling, and they've gotten worse and worse and worse.

00:11:03 Speaker_00
And since I was last with you, which was two and a half years ago, we've made probably five or six, seven more discoveries. They get worse each time.

00:11:12 Speaker_00
We've done something that I was speculating about doing when I was here, which was building a nationwide monitoring system to surveil them the way they surveil us and see what content they're actually sending to real voters and to real kids.

00:11:29 Speaker_02
So let's break this down, because I think we're getting a little in the weeds here. Let's explain to people that don't know what you're talking about what your research is about, because most people are not aware.

00:11:41 Speaker_02
And one of the major issues that you have discovered is the curation and the purposeful curation of information through search engines.

00:11:54 Speaker_02
So most people that are unaware think that when you do a Google search on something, say if you want to find out about a Kamala Harris rally or a Trump rally,

00:12:06 Speaker_02
that you are just going to get the most pertinent information in the order in which it's most applicable to your search, but that's not the case.

00:12:17 Speaker_02
The case is everything is curated, and if you want to find positive things about someone who they deem to be negative to whatever ideology they're promoting, it will be very difficult to find that information.

00:12:32 Speaker_02
If you want to find positive things about someone they support, they will be right up front. If you want to find negative things about someone they support, they will be very difficult to find, and you will be inundated with positive things.

00:12:44 Speaker_02
And what you have found is that this curation of information from searches has a profound effect, especially on the casual voter, on the low information voter, a profound effect on who gets elected, and it's tantamount to election interference.

00:13:03 Speaker_02
Is that fair to say?

00:13:05 Speaker_00
It's fair to say that's where I was two and a half years ago. We have gone so far beyond that because it's not just search results. It's search suggestions, which we're capturing now by the millions.

00:13:18 Speaker_00
So it was in the news recently that when people were typing in Trump assassination, you know, they were getting crazy stuff like the Lincoln assassination.

00:13:30 Speaker_00
They were getting crazy stuff and they were not getting information about the Trump attempted assassination. And, you know, I looked at that and I said, oh, isn't this nice? There's an anecdote about how they may be abusing search suggestions.

00:13:46 Speaker_00
We don't have anecdotal data anymore. We have hardcore, large-scale scientific data on all of these issues. We know what's actually going on, and we've quantified the impact. See, it's one thing to say, oh, look what they're doing.

00:14:00 Speaker_00
It's quite another to say, what impact does that have on people?

00:14:04 Speaker_02
Right. Let's talk about the Trump assassination one in particular. What did you find about that?

00:14:12 Speaker_00
Well, frankly, we couldn't care less about that, because that's one anecdote. We're collecting these by the millions. And what we know, we know a couple of things. We know that, first of all, they're not

00:14:25 Speaker_00
You know, it started out as one thing and it's turned into something else. And so what they do is they use search suggestions to shift people's thinking about anything. It's not just about candidates either. It's about anything.

00:14:40 Speaker_00
And we've shown in controlled experiments that by manipulating search suggestions, you can turn a 50-50 split among undecided voters into a 90-10 split, with no one having the slightest idea that they have been manipulated.

00:15:00 Speaker_02
Wow. And this always goes a very specific way. It always goes.

00:15:05 Speaker_00
It always goes a specific way, but I'm going to show you maybe a little later if I haven't put you to sleep or if my meltdown hasn't gotten too bad because I'm not quite finished with my meltdown yet.

00:15:17 Speaker_00
I'll show you content, data, large scale, that we're collecting now 24 hours a day, and I'll show you what they're actually doing. An anecdote, those don't hold up in court. They grab headlines for a couple of days, but that's about it.

00:15:33 Speaker_00
They don't do anything. But we're actually collecting evidence that's court-admissible.

00:15:38 Speaker_00
So we're collecting data now in all 50 states, but we actually have court admissible data now in 20 states already, and we keep building bigger and bigger every day. And what is this data about? Well, it's any data that's going to real people.

00:15:55 Speaker_00
So we're collecting data, with their permission, from the computers of a politically balanced group of more than 15,000 registered voters in all 50 states and from many of their children and teens as well.

00:16:10 Speaker_00
And so when they're doing anything on their computers, they've given us the right to collect it, grab it, zap it over to our own computers, aggregate the data and analyze it.

00:16:23 Speaker_00
I want to point out that when we do this, we do this without transmitting any identifying information. We protect people's privacy, but we are getting these increasingly accurate pictures of what Google and other companies are sending to real people.

00:16:42 Speaker_00
Why do you have to do it this way? Because all the data they send is personalized. You will never know what they're sending to people unless you look over the shoulders of real people and see the personalized content. And what have you found?

00:16:58 Speaker_00
Well, as it happens, I just summarized our findings over the last 12 years, and you get the first advanced copy of a monograph that's called The Evidence.

00:17:17 Speaker_00
And because we're so desperate, we need help, we need money, we need emails, we're so desperate for that, we have set up, we kind of did this last time too, but we have set up a link.

00:17:31 Speaker_00
If people go to that link and they're willing to give us their email, we will give them a free copy of this, advanced copy of this monograph. And it goes through the whole thing.

00:17:42 Speaker_00
It shows all the effects we've discovered, but it also shows the monitoring we're doing and what we're finding out from this monitoring.

00:17:51 Speaker_02
One of the things that I noticed since the last time you were here was I used to use DuckDuckGo.

00:17:57 Speaker_02
And one of the reasons why I started using DuckDuckGo is there was a story about a physician in Florida that took the mRNA vaccine and had a stroke shortly afterwards.

00:18:06 Speaker_02
It was very early on in the pandemic, and they were beginning to speculate that some of the side effects of the vaccine are being hidden. And I could not find this story on Google. I could not find it. I kept looking and looking and looking.

00:18:24 Speaker_02
I entered in the information on DuckDuckGo. It was one of the first articles. Instantaneously. I was like, this is crazy. Since then, something's happened.

00:18:35 Speaker_02
And I think they became aware that DuckDuckGo was a problem spot for the dissemination of information. And now it appears to mirror Google.

00:18:45 Speaker_00
Well, the same has happened with Bing, and the same has happened with Yahoo. What about Brave? No, Brave is still independent. I know Brendan Eich, you should have him on if you haven't. And he's the guy who wrote Brave.

00:18:58 Speaker_00
Before that, he wrote Firefox for Mozilla. He left because Google had its tentacles into Firefox.

00:19:07 Speaker_02
Yeah, I'm afraid to talk about Brave, for fear they'll be compromised. Because, like, we were talking about DuckDuckGo, and I was telling everybody go to DuckDuckGo, and now I'm like, Jesus, it's the same thing as Google. Like something happened.

00:19:20 Speaker_02
Do you know what happened?

00:19:22 Speaker_00
Well, we know in some cases with some of these companies what happened. I don't know the particulars with DuckDuckGo, but it's easy enough to guess. They're under... all of these alternative websites that are trying to protect people's privacy.

00:19:40 Speaker_00
So we use ProtonMail, for example. We use Signal for texting. They've all run into problems. And the problem is when Google goes after them. So Google tried to shut down ProtonMail. That's been well documented. Really? Oh, yeah.

00:19:56 Speaker_02
Why did they try to shut down ProtonMail? What was their argument?

00:20:00 Speaker_00
because they saw it possibly cutting in a little bit into their Gmail business. And they were brutal, they were brutal in suppressing any mention of ProtonMail anywhere.

00:20:17 Speaker_00
Don't forget, it's not just the search results, it's the search suggestions, it's the answer boxes. Right, how were they suppressing ProtonMail? The way they suppress everything else. I'll give you a detail here that you may not know.

00:20:30 Speaker_00
Because they don't have to adjust their algorithms to do something this simple. Their algorithms, all of them as far as I know, check blacklists and check whitelists. So all they have to do is add a couple of ProtonMail links to blacklists.

00:20:45 Speaker_00
And that means that before one of their algorithms will take someone somewhere or will show someone something, it checks the blacklist first. And if you put ProtonMail on the blacklist, it's suppressed and it doesn't appear.
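The blacklist/whitelist check Epstein describes can be sketched in a few lines. This is purely an illustrative sketch, not Google's actual code: the list contents, the result structure, and the `filter_results` function are all invented for the example.

```python
# Illustrative sketch of a pre-ranking blacklist/whitelist filter.
# All names and list contents here are hypothetical examples.

BLACKLIST = {"protonmail.com"}   # domains to suppress (example only)
WHITELIST = {"gmail.com"}        # domains to favor (example only)

def filter_results(ranked_results):
    """Drop blacklisted links, then float whitelisted ones to the top."""
    visible = [r for r in ranked_results if r["domain"] not in BLACKLIST]
    # Stable sort: whitelisted domains first, original ranking otherwise.
    return sorted(visible, key=lambda r: r["domain"] not in WHITELIST)

results = [
    {"domain": "protonmail.com", "title": "ProtonMail - encrypted email"},
    {"domain": "example.com", "title": "Email service comparison"},
    {"domain": "gmail.com", "title": "Gmail"},
]
print([r["domain"] for r in filter_results(results)])  # blacklisted entry is gone
```

The point of the sketch is that such a filter runs before anything is shown, so the suppressed link simply never appears in the output, exactly as described: no ranking-algorithm change is needed.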

00:21:03 Speaker_02
Well, let's look for it right now. Jamie, do me a favor, please, and pull up Google. Obviously, this is happening before the podcast is released, so they can't correct this, because they didn't know you were coming on.

00:21:15 Speaker_02
They didn't know we were talking about this. So let's pull up Google real quick and put it up on the screen. And okay, you already Googled it. Just let me see. OK, right away, it shows ProtonMail. And then below that, it shows Proton account, sign in.

00:21:34 Speaker_02
You use the Proton VPN. So how is it suppressing?

00:21:41 Speaker_00
This is not suppressing at all. OK, now this is where my staff has warned me, don't be condescending to Joe Rogan. How is it condescending if I'm just asking a question? You could just give me an answer.

00:21:50 Speaker_00
No, I know I could, but I was about to be condescending.

00:21:54 Speaker_02
Well, why would you be condescending if this is the question? This is the question. ProtonMail, do the Google search, and then right away the first thing is ProtonMail.

00:22:02 Speaker_00
Okay, but you're pushing me back into meltdown mode. I'll tell you why. Well, tell me why. Because what account is this? Jamie's account. It's Jamie's account.

00:22:11 Speaker_00
If you actually want to know whether they're suppressing proton mail, you have to look over the shoulders of a large representative sample of people. You can't just look at Jamie's account.

00:22:24 Speaker_02
So you think that Jamie's account is curated to not hide ProtonMail?

00:22:28 Speaker_00
Well, of course it's curated.

00:22:30 Speaker_01
Technically, this is not my account.

00:22:33 Speaker_00
Whoever's account it is, they know whose it is, and they know how it's used. And remember, since all content is personalized, that means it's a very simple matter for them. And they have algorithms that do this.

00:22:47 Speaker_02
OK, well, we can do a real quick experiment. We won't get the results right now, but we will get the results from the future. So I'll ask the audience to do this. So ladies and gentlemen and non-binary folks out there, please go to Google.

00:23:05 Speaker_02
and type in ProtonMail and screen record this and then upload this. Upload this to X, upload this to Instagram, upload this to Facebook and TikTok and all that. And I'd like to see what the results are.

00:23:20 Speaker_00
You're gonna get clean results, because they know every single viewer, listener that you have.

00:23:27 Speaker_00
So they can, as I was told this literally by Zach Voorhees, whom you may have heard of, he's one of the most prominent whistleblowers from Google, they can turn bias on and off like flipping a light switch.

00:23:41 Speaker_02
So you think they look for someone who listens to podcasts and they don't have bias towards them?

00:23:51 Speaker_00
I'm just going to say what I said before.

00:23:54 Speaker_00
The only way to know what they're really sending to people, and they're not messing around, is to look over the shoulders of people that they cannot identify and who are representative of the American population. And I'm going to show you

00:24:10 Speaker_00
over and over and over again. I'm going to show you what they're actually sending when we look at things.

00:24:15 Speaker_00
When we collect data in a scientifically valid way that is so that the data are court admissible, I will show you what they're actually sending to people.

00:24:24 Speaker_02
So you have shown that if you collect data in this scientific way, they suppress ProtonMail? We haven't looked at that. We could look at that very easily. OK.

00:24:36 Speaker_02
Do you understand, though, that you're saying they suppress ProtonMail and then we're saying, let's see if they suppress ProtonMail.

00:24:44 Speaker_00
Oh, no, no, no. They didn't. You misunderstand. I'm telling you that early on, ProtonMail has written essays on this. OK. I know Andy Yen, who's the founder and CEO.

00:24:55 Speaker_00
And in the beginning, when Google was trying to completely put them out of business, they published a lot of blogs on this. They sued them in court. They did everything they could possibly do.

00:25:06 Speaker_00
And they had overwhelming evidence that Google was trying to... OK, ProtonMail sued Google or Google sued ProtonMail?

00:25:14 Speaker_02
ProtonMail sued Google.

00:25:15 Speaker_00
And what was the accusation? That they were suppressing content, ProtonMail content, so that people would not know that they exist. And what was the result of those court cases? Google backed down, which they do sometimes.

00:25:32 Speaker_00
It's hard to know when they do, when they don't. But this is also the Vestager case. She's head of that commission in Europe, European Commission, that has sued Google repeatedly. It has fined them four times, more than 10 billion euros.

00:25:47 Speaker_00
Their first case against Google was the same kind of case, exactly the same, that Google was suppressing information about comparative shopping services, and they had put out of business, or nearly put out of business, most of the comparative shopping services in Europe.

00:26:05 Speaker_00
And so the European Commission went after them. They won their case. That at the time was the biggest fine Google had ever faced. And they proved it. They proved that Google was doing this deliberately, systematically. So they do this all the time.

00:26:22 Speaker_00
And what I'm saying is that generally, look, they're the gateway. They decide what people are going to see and what people are not going to see. And whoever controls the information controls humanity.

00:26:39 Speaker_02
They control the narrative and that controls us. Has there been any talk about making these kind of algorithms illegal?

00:26:57 Speaker_00
Not serious talk because, see, you can't, look, the search algorithm itself, which is, by the way, going to be outmoded very soon because of, you know. AI. Yeah, because of AI. But the search engine itself has to be biased.

00:27:15 Speaker_00
It has to be biased because you don't want it using an equal-time rule; you want it to show you the best guitar brand there is. You want the best dog food. You want the most correct answer. So it's always biased.

00:27:29 Speaker_00
It's always going to put one political candidate ahead of another. The thing is, though, of course, they can control which one if they take any interest. So that's where the problem comes in. So because it's always biased...

00:27:48 Speaker_00
You want the algorithm to work that way, because you want the best to rise to the top. Unfortunately, there's a lot of bad that goes with the good. The bad is they can decide what's good and what's bad.

00:28:01 Speaker_00
One of the leaks from the company, eight-minute video called The Selfish Ledger, talks about the ability of the company to re-engineer humanity. They call it re-sequencing human behavior. And they explain how easily they can do it.

00:28:15 Speaker_00
And they're actually doing it. And we know they're doing it now because we, as of yesterday, we had preserved more than 99.3 million ephemeral experiences, mainly on Google, but other platforms as well.

00:28:29 Speaker_00
But also on YouTube, because YouTube is the second largest search engine in the world. And on YouTube, the ephemeral content is those suggestions for the next videos, and it's that up-next suggestion that plays automatically.

00:28:46 Speaker_00
So normally ephemeral content is lost forever. That's why they use it for manipulation purposes. We're capturing it. That's never been done before and we're doing it on a massive scale.

00:28:57 Speaker_00
Everything from search suggestions to answer boxes to search results to YouTube sequences, YouTube recommendations. You name it. We're monitoring Facebook, TikTok, Twitter. We're learning.
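The capture idea described here, preserving content that normally vanishes the moment the page changes, can be sketched as an append-only, timestamped log of each observed item. This is a minimal sketch under assumed field names ("platform", "kind", "items"); it is not the project's actual monitoring code.

```python
# Hypothetical sketch of preserving "ephemeral experiences": content such as
# search suggestions or up-next recommendations is recorded the moment it is
# observed, since it cannot be retrieved later. Structure is invented.

import time

def preserve(log, platform, kind, items):
    """Append one ephemeral observation to an append-only log."""
    log.append({
        "ts": time.time(),       # when the content was observed
        "platform": platform,    # e.g. "google", "youtube"
        "kind": kind,            # e.g. "search_suggestion", "up_next"
        "items": list(items),    # the content, in the order it was shown
    })

log = []
preserve(log, "google", "search_suggestion",
         ["trump assassination lincoln", "trump assassination attempt"])
preserve(log, "youtube", "up_next", ["Some recommended video"])
print(len(log), log[0]["kind"])
```

Records like these can then be aggregated across many users and platforms, which is the step that turns single anecdotes into the large-scale data the interview contrasts them with.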

00:29:16 Speaker_00
Each year we get better and better at monitoring more and more and then monitoring faster and analyzing the data faster.

00:29:24 Speaker_00
And last November, we went public with a dashboard that summarizes the data that we're collecting and shows the bias in real time. It's literally updated every five minutes, 24 hours a day. And you can see the bias.

00:29:45 Speaker_00
I've given you some images that we can show if you'd like. You can see the bias, and it's overwhelming. This is not my imagination. And I can show you a couple of shockers, things that you would never guess that they're doing.

00:29:58 Speaker_02
Okay, so what are we looking at here?

00:30:00 Speaker_00
Oh, perfect. Perfect place to begin.

00:30:02 Speaker_02
How did you know? Jamie's a wizard. "Mean bias by political leaning, Google only." Okay, so what is this showing us here?

00:30:13 Speaker_00
This is showing, and if you see bars below the zero line, that means the content is liberally biased.

00:30:22 Speaker_00
And you're seeing very strong liberal bias, and those three different bars show you the bias and content being sent to conservatives, liberals, and moderates. Now, abortion, you would think, if they're really showing people what they want to see,

00:30:39 Speaker_00
whether it's something that matches their interests, you would think that they would not be sending the same level of liberal bias to conservatives, liberals and moderates. But that's what this shows.

00:30:49 Speaker_02
So this... the search topic is abortion. Correct. This is the average of January to August of 2024. And so when you say mean bias by political leaning, are you saying they're sending the same biased information roughly?

00:31:08 Speaker_02
There's a slight difference, a little bit more in the liberal side and a little bit more in the conservative side than the moderate side, it looks like, right? Is that correct? Yes. So what are they? No, the opposite, right? Seems like moderate is more.

00:31:23 Speaker_02
But what is the... What is the bias? So if you search abortion, is it leaning towards pro-choice websites and pro-choice information? Is that what it's saying?

00:31:36 Speaker_00
Well, I knew you were going to ask that, so I can actually show you. For some of these graphs, let's look at a couple of the graphs, and then I'm going to show you the content. Because all of this

00:31:46 Speaker_00
this bias that we're measuring ultimately results in them taking you to a news story, to a webpage, right? So we're going to, oh. This. Go to, let's do Elizabeth Warren next. Just the, just the blue, just the red graph. Just the graph itself. Yeah.

00:32:10 Speaker_00
Yeah, yeah. Okay. So this is a shocker, because if it's Elizabeth Warren, who's a very well-known liberal politician, they should be sending lots of blue stuff to her. They're not. They want her out of office.

00:32:28 Speaker_00
They are sending people to content that vilifies Elizabeth Warren. They want her gone. Why? Because she is one of the only Dems who's gone on record, written statement, the whole thing, calling for Google's breakup. They want her gone.

00:32:47 Speaker_00
And Deaton, I guess, just won the nomination to oppose her for the Republicans. They are going to do everything possible to put this Republican into office in Massachusetts. Wow. And no one knows this except you and me and Jamie.

00:33:06 Speaker_00
Well, a lot more people know it now.

00:33:09 Speaker_02
Maybe. As of you saying this on this podcast. Yes. Well, unless they're going to suppress the podcast, they really can't do that.

00:33:19 Speaker_00
Well, that's why I have a lot to say that I want to say here, because I am really upset about a bunch of things that I want to explain.

00:33:27 Speaker_02
Put that back up, please. So this is mean bias. And what is this bias showing? Is this bias just negative stories about Elizabeth Warren, like her pretending to be Native American and that kind of stuff?

00:33:44 Speaker_00
Yes. In fact, you know that other one that you were about to put up, that has the red graph at the top and then below it a bunch of news stories? That one? This one. If you can enlarge it and scroll down. So these bars aren't just bars.

00:34:00 Speaker_00
These bars are summarizing content. Thousands and thousands and thousands of Web pages that they're sending people to.

00:34:08 Speaker_02
So they're sending people mostly, are you saying, right-wing-centered content? Well, look at the stuff. But one of them is CNBC: Elizabeth Warren wants more student loan borrowers to know bankruptcy is easier now.

00:34:21 Speaker_00
Mm-hmm. But when you average these, that's what we're doing when you average them. So we're looking at literally millions of these experiences and we average them. Then you end up with a shocker in her case.

00:34:35 Speaker_00
They're actually sending conservatively biased content when people are looking for information on Elizabeth Warren. They want her gone.
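The averaging Epstein describes, scoring the political lean of each result a user was shown and taking the mean over many captured experiences, can be sketched roughly as follows. The -1 to +1 rating scale, the data layout, and the function name are illustrative assumptions, not his actual methodology.

```python
# Minimal sketch of averaging per-result bias ratings across captured
# search experiences. The -1..+1 scale (liberal to conservative lean)
# and the nested-list layout are hypothetical, for illustration only.

def mean_bias(experiences):
    """Each experience is a list of bias ratings, one per search result,
    where -1.0 is strongly liberal-leaning and +1.0 strongly conservative."""
    ratings = [r for exp in experiences for r in exp]
    return sum(ratings) / len(ratings) if ratings else 0.0

captured = [
    [0.4, 0.7, -0.1],   # one user's results for a query
    [0.5, 0.2],         # another user's results
    [0.9, 0.3, 0.6],
]
print(mean_bias(captured))
```

A mean that lands clearly on the positive (conservative) side for liberal, moderate, and conservative user groups alike is what he is calling three red bars.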

00:34:45 Speaker_02
Elizabeth Warren and anti-crypto movement losing their battle, according to former CFTC chairman report. So that's an anti-crypto movement story. That would definitely be more of a right-wing bias.

00:35:02 Speaker_02
Warren proposes jail time for corporate greed in health care. That would be more progressive, right? She's trying to eliminate corporate greed in healthcare.

00:35:15 Speaker_02
Three Republican Senate candidates are competing to take on Elizabeth Warren as the Mass GOP fights for relevance. Okay, so the way they're framing that, fights for relevance, is interesting. That's a little bit biased.

00:35:27 Speaker_02
Senator Warren is way off on raspberries and Americans' living standards. Okay, that's certainly a negative article, likely. Warren calls Fed's Powell weak-kneed on bank rules. Democrats want to tax your home, your retirement, your everything.

00:35:44 Speaker_02
That's a negative one. Senators Warren and Marshall pose questions to Biden officials about the use of crypto to evade sanctions. So that's going to get the crypto bros after her. We don't charge people for air. We shouldn't charge for water either.

00:36:02 Speaker_02
A new tax bill from Elizabeth Warren and Ro Khanna seeks to ban the trade of water futures. Let's go one step further. That seems like a progressive cause.

00:36:15 Speaker_00
When you put these all together, because we're showing you means, what we're showing you is the mean, the overall mean. Now, you would expect for Elizabeth Warren to get three blue bars, but we're getting three red bars.

00:36:30 Speaker_00
That means they're sending highly, on average, highly conservatively biased stories to conservatives, which makes sense, to moderates, well, one could argue, but also to liberals. They're sending those to liberals.

00:36:45 Speaker_02
But she's a problematic person to search anyway, because she's kind of a fraud, right? Like, especially with the... I mean, I want to say she's kind of a fraud. Let me say it better. She has been accused of lying about her ancestry.

00:36:59 Speaker_02
And then she did it for benefit. And then she did it to get into Harvard. She did it to get jobs. And then, you know, she had that challenge with President Trump. And then it turned out she has a small fraction

00:37:12 Speaker_02
Like, I think I'm 100 times more African-American than she is Native American. Something like that. I might have made that up. Let me explain. Okay, please explain.

00:37:23 Speaker_00
All right. These aren't just graphs. These are graphs summarizing a massive amount of data that's being sent, a lot of it, directly to the computer screens of registered voters. I totally understand that.

00:37:39 Speaker_02
What I'm saying is, with someone like her, it might be difficult to find positive stories.

00:37:44 Speaker_00
Oh, no, no, no, no, no. Because we have so many examples of these things now that we can find whatever it is. But here's the point. Okay. We can adjust what we're looking for.

00:37:57 Speaker_00
We can not only look back now in a database that we've been building over a year, but we can adjust what we're looking for going forward.

00:38:05 Speaker_02
So if a person wants to break up Google like she has publicly stated, they're like, OK, Well, we will target you with our search algorithm. We will make sure that people are getting more negative stories about you than positive stories.

00:38:19 Speaker_02
And we will have a bias that leans towards these negative stories to everyone, to liberals, to conservatives, to independents.

00:38:25 Speaker_00
And that has very little impact on people who have already made up their mind. But people who are still making up their mind, which is a lot of people in this country, easily shifts between 20 and 80 percent of those people, the undecided voters,

00:38:38 Speaker_00
Like that.

00:38:39 Speaker_02
Have you seen the Alexa when people ask Alexa about Donald Trump versus Kamala Harris?

00:38:46 Speaker_00
Yes, we have. And starting last year, we developed special equipment that, funds allowing, we will eventually provide to all of our field agents. We call these people our field agents.

00:39:02 Speaker_00
And we'll eventually provide them with special equipment, which is going to allow us to start analyzing the answers given by Alexa, the Google Home device, the Google Assistant, Siri.

00:39:14 Speaker_00
So we're going to start monitoring the content that's coming from these IPAs, intelligent personal assistants. Now, why? Because we've published a peer-reviewed article on what's called the answer bot effect. AnswerBotEffect.com.

00:39:30 Speaker_00
So if you go to AnswerBotEffect.com, we will show you, in controlled experiments, how easily a biased answer coming from an answer bot, like Alexa, can boom, just like that, shift the opinion of someone who's undecided.

00:39:47 Speaker_00
40% or more after just one question and answer interaction in which someone is getting back a biased answer. Now, if they personalize the answer, the effect is even larger.

00:40:02 Speaker_02
So this is essentially a danger that no one was aware of, no one ever saw on the horizon until search engines were created. Now, search engines are here, and it's something that is not regulated. And it's right in front of us.

00:40:21 Speaker_02
And what steps have been done to sort of mitigate the effects of this, if any?

00:40:27 Speaker_00
Okay, so this is where now we get back to my meltdown.

00:40:31 Speaker_02
Oh, I thought you were done melting.

00:40:32 Speaker_00
Oh, no. No. Okay. No, I've been melting down for years, so I have a lot to go. Yes, you summed it up nicely. I'll just rephrase what you said a little differently. No one anticipated these kinds of manipulations were possible.

00:40:53 Speaker_00
And by the way, we've hardly even scratched the surface of what these manipulations are and what they can actually do, and the fact that we have evidence that they're being used. Forget all that.

00:41:03 Speaker_00
The point is, yes, our lawmakers, our regulators never anticipated that. When your friend, what's his name? You just interviewed him recently. Brett Weinstein. No, no, one of the early investors at Google and Facebook. Marc Andreessen.

00:41:22 Speaker_00
Marc Andreessen was one. McNamee. Peter Thiel. Oh, Thiel, Thiel, Thiel. Peter Thiel. These people never anticipated when they invested.

00:41:32 Speaker_00
In fact, McNamee has said straight out, if I had known what was gonna happen, I wouldn't have put a dime into these companies. No one really knew this was gonna happen. Right.

00:41:41 Speaker_00
But now that people like me, and there aren't too many, but now that people like me have been figuring this out and getting the word out for more than 10 years now and getting the word out in bigger and bigger ways, I've testified twice before Congress now, you would think

00:41:59 Speaker_00
that lawmakers, regulators, somebody would jump up and say, OK, we're going to fix this problem. You would think you would. You would also think that in general, people, people around the world, it's not just Americans.

00:42:15 Speaker_00
People would say, the hell with this. You know, I'm not going to take it anymore. Like in that old movie, I'm not going to take this anymore. And and they would protest and they would switch over to whatever the alternative

00:42:29 Speaker_00
apps, whatever they may be, that's never going to happen, and the laws are not going to happen, and the regulations are not going to happen. And so you know what the bottom line is?

00:42:38 Speaker_00
And this is why I'm fed up, because that means that either people are just so stupid, or they're so complacent, or both, that all this work I've been doing, killing myself all this time, is for nothing.

00:42:56 Speaker_02
Well, I don't think that's true. And let me give you my perspective. I don't think most people are aware of this. I think you live in a bit of an echo chamber because this is the focus of your life for the last 12 years. I like to use my parents as an example.

00:43:13 Speaker_02
Like when I talk to my parents about stuff, and how little they're aware of it, because my parents are older and they just read the newspapers and they watch television, and that's what they believe.

00:43:22 Speaker_02
They don't do any independent searching. They don't use a VPN. They don't do anything like that. And so they're a good example. If I ask them, do you think there's any bias in Google search results? They would probably say no, because they don't know.

00:43:37 Speaker_02
Most people don't know. I know in your mind you have put all this information out, and you know the podcast that we did reached millions of people. But how many of those people listened, really listened?

00:43:49 Speaker_02
How many of those people were like, wow, that's kind of crazy, but does it affect my life?

00:43:53 Speaker_02
No, it doesn't affect my life because I'm going to vote Democrat no matter what, or I'm going to vote Republican no matter what, and this is my feeling on the First Amendment, and this is my feeling on the Fourth Amendment.

00:44:03 Speaker_02
And people already have their opinions. And so for most people who are busy with their lives, their families and work, they haven't made an adjustment because they don't feel it's necessary for them personally. Fine, fine, fine.

00:44:17 Speaker_02
But what you're doing is not futile.

00:44:20 Speaker_00
It's very important. I don't see that, because I see it as more and more futile.

00:44:25 Speaker_02
It's not, though. It's not. We just need to do more of these.

00:44:28 Speaker_00
OK, so for us to set up this nationwide system, in which at the moment, as I say, we are drawing data 24 hours a day. If you go to americasdigitalshield.com, you can actually watch the real-time dashboard, and you'll see the data coming in.

00:44:40 Speaker_00
It's pretty cool. In fact, I was hoping we would break 100 million by the time you and I got together, but we're close. We're up to 99.3 million, and next week we'll break 100 million. So you see the data come in. You can see the bias.

00:44:57 Speaker_02
So these are all these experiences captured, shining a light on Big Tech's dark secrets, revealing real-time ephemeral manipulation.

00:45:06 Speaker_02
Big Tech companies use ephemeral content such as search results, go-vote reminders, and video recommendations to rig our elections, indoctrinate our children, and control our thinking.

00:45:15 Speaker_02
We're now preserving this kind of content for the first time ever to give our courts and our nation leaders the evidence they need to force these companies to stop their manipulations. Who do you think would be more responsive to you discussing this?

00:45:34 Speaker_02
Do you think it would be the Donald Trump administration or the Kamala Harris administration?

00:45:41 Speaker_00
I'm afraid to answer that question because I am no fan of Donald Trump, but probably the Trump administration would be more sympathetic. Why do you think that?

00:45:55 Speaker_02
Well, because I... Because you think it's more biased towards Republicans or against Republicans, rather?

00:45:59 Speaker_00
No, it's because I had a four-hour dinner with Ted Cruz, private dinner, and we just talked tech for four hours. We never talked politics because that would have been a disaster. But the point is that, you know, he was struggling.

00:46:13 Speaker_00
You know, he's like you in some ways, because you want to understand things. I can see all the gears moving as you're just trying to... I want to understand this. And he's like that.

00:46:29 Speaker_00
So that's why the dinner went so long, because he was trying to figure out, what can we do? What can we do? And at the end, he basically said this. Now, he didn't say we're screwed. No, but he basically said we're screwed.

00:46:44 Speaker_00
He said the Democrats are all in the pockets of these companies, and the companies not only give them a tremendous amount of money. I mean, Google, Alphabet, was Hillary Clinton's largest donor in 2016.

00:46:57 Speaker_00
So that's a tremendous amount of money. They're the biggest lobbyists in Washington, he said. And they also apparently, according to your research, send them millions of votes, he said.

00:47:07 Speaker_00
So forget the Democrats, he said, and Republicans don't like regulation. He said, and unless we can get together, unless there's bipartisan action, there'll never be any action. That's it. That's where we left it.

00:47:24 Speaker_00
And nothing's going to change that, that I can see in this country.

00:47:28 Speaker_02
As long as it's still benefiting the Democrats and they still contribute to the Democratic Party, I doubt you'll see any movement.

00:47:38 Speaker_00
Right, so I'm back to my griping then. So what do I do? Now, let's talk about money, because a lot of this is about money, and Google is all about money. So if we're talking about this topic, we really should talk about money.

00:47:53 Speaker_00
Okay, it has cost us close to $7 million since 2016 when we started building monitoring projects to get where we are today, where we actually have a national system. It's the first in the world. And by the way, it won't be the last.

00:48:07 Speaker_00
because I've been contacted by people from seven other countries who want me to help them build systems, I'm not going to do it. Not until ours is fully implemented, permanent, and self-sustaining.

00:48:18 Speaker_00
Because the system has to be permanent so that, on an ongoing basis, it will be sitting there as a huge threat to any of these companies that want to mess with our children or mess with our elections. As long as someone utilizes it.

00:48:36 Speaker_00
As long as a system is running, no, no, no, because we're also dealing with public advocacy groups like election integrity groups, parenting groups.

00:48:46 Speaker_00
If you want to show some of the, there's a folder in there that has some images that we pulled from videos being recommended on YouTube to children.

00:48:56 Speaker_00
And if you just look at some of these images, we've gotten several big parenting groups interested in what we're doing.

00:49:04 Speaker_00
There can be a lot of public pressure applied, not just by politicians and regulators, but by big groups of people saying, we don't want you doing that.

00:49:15 Speaker_02
OK, what are you talking about specifically when you're saying recommended to children? I'm saying... So this is what you're discussing? I'm saying that... This is Boondocks, which is a television show, an animated television show? Yeah.

00:49:31 Speaker_02
So this is recommended to children because it is animated? Is that what the idea is? I don't know.

00:49:37 Speaker_00
I don't know what their criteria are.

00:49:38 Speaker_02
Okay, and then the other one is down below that, you see The Walking Dead, which is the horrible scene that made me stop watching the show.

00:49:46 Speaker_01
This is on the website. This is the folder he gave me here. They're all kind of small. Mm-hmm.

00:49:50 Speaker_02
I got it. And these are all... okay, there's a lot of sexual stuff. Yep. So these are all being recommended to kids. Yep. And we're not searching for them.

00:50:04 Speaker_00
They're coming into the devices through which we're gathering data.

00:50:08 Speaker_02
And what would be the benefit for them of doing this, of showing all these sexual images to children? It's titillating and it's addictive. So to increase engagement? Correct.

00:50:19 Speaker_01
Some of those channels are really popular channels, though, and they're making content not for kids. Right. It's still being recommended to them, I guess. But this has 4 million views on it from a channel with 40 million subscribers. Jesus.

00:50:32 Speaker_02
And this is just anime?

00:50:33 Speaker_01
It's like I forced my friends to watch anime clips.

00:50:37 Speaker_02
It says to dub anime clips. So they said their own words over these clips? Is that what it is? Yeah.

00:50:44 Speaker_01
That's what a lot of this stuff was from, I could tell.

00:50:47 Speaker_02
And so below that you're seeing all these images and some of them are violent cartoons and what else do they have here?

00:50:55 Speaker_00
A lot of violence. The key, though, is if you scroll along the bottom of the image, you'll see this graph that kind of shows you where people watch the most.

00:51:06 Speaker_00
And the reason why parents generally are not aware of this is because a lot of these gruesome things are very, very quick. They're very quick. But you'll find very often a peak there. You know, because that's what's drawing a lot of attention.

00:51:23 Speaker_00
That's what the kids are playing over and over again. And that's what leads to the addiction.
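The retention-curve peaks he's describing, brief moments that viewers replay over and over, can be illustrated with a toy detector. Everything here, the curve values, the threshold, and the function name, is a hypothetical sketch of the general idea, not the monitoring system's actual analysis.

```python
# Illustrative sketch: spotting the moments in a video that viewers
# replay most, from an audience-retention curve. Each value is the
# fraction of the audience watching at that second (replays can push
# a moment above the surrounding baseline).

def replay_peaks(retention, threshold=1.2):
    """Return the seconds where retention spikes above the running
    average, i.e. segments viewers rewind to and replay."""
    if not retention:
        return []
    baseline = sum(retention) / len(retention)
    peaks = []
    for t in range(1, len(retention) - 1):
        local_max = retention[t] > retention[t - 1] and retention[t] >= retention[t + 1]
        if local_max and retention[t] > threshold * baseline:
            peaks.append(t)
    return peaks

curve = [0.9, 0.8, 0.7, 0.6, 1.5, 0.6, 0.5, 0.5, 0.4, 0.4]
print(replay_peaks(curve))
```

In this toy curve, the single brief spike would be exactly the kind of repeatedly replayed moment that a parent skimming the video could easily miss.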

00:51:27 Speaker_02
So the reason why they are suggesting these images to kids is because they know if the kids click on them, they're going to get more engagement.

00:51:35 Speaker_00
It's this, yes, and so the number one variable for profitability is called watch time. So engagement, whatever you want to call it, yeah, this is one of the ways that they addict people. Now, I'm sure you've heard of Tristan Harris.

00:51:49 Speaker_00
Maybe he's been a guest. Yeah, he's been on a few times. Yeah, and that's what he was doing at Google. He was on a team, and what they were working on is addicting more than a billion people. This is a technique that's used for that purpose.

00:52:04 Speaker_00
And again, I have to emphasize, we're not out there hunting for dirt. Not at all. This is content that's coming onto the devices of children of our field agents. And this is with parents' permission.

00:52:17 Speaker_00
And so we're actually just collecting real content, personalized, ephemeral content that's coming from the tech companies to kids, to teens, and to voters.

00:52:28 Speaker_00
Now, I happen to know about some of your other interests, so I want to shift gears a little bit here, and then maybe I won't keep melting down. So, what else can you do with a system like this?

00:52:46 Speaker_00
Well, you could, if some laws and regulations were passed, as they have been in the EU, you could measure compliance with a system like this, because that's been the frustration in the EU, and they've admitted it recently, is that they've made all these rules, especially for Google, and they've gotten lots of fines paid, and Google has completely ignored them.

00:53:10 Speaker_00
But if you set up a system like this, you can actually see if there's compliance, because you'll see that a change was made and that it's being maintained. That's a possibility.

00:53:20 Speaker_00
You could see whether Google and other companies, to a lesser extent, are manipulating financial markets.

00:53:28 Speaker_00
So we've just started collecting data on that topic, but wouldn't you manipulate financial markets if you were Google and there's no laws or regulations to stop you from doing anything?

00:53:39 Speaker_00
So you're saying manipulate financial markets for their own gain? Of course. And how do they do this? Well, what drives the price of a stock up or down? People's confidence in that company. It's totally emotional. Well, they can control that very easily.

00:53:58 Speaker_00
So can Facebook. Google can do it more precisely. The point is, are they? Are they not? Would they admit to it if you asked them?

00:54:08 Speaker_00
No, but a monitoring system will detect it, and it will detect it on a massive scale and in a way that's scientifically valid and that is court admissible. And now I've got one that I think you'll really, really like. Okay.

00:54:25 Speaker_00
Or at least give some thought to.

00:54:26 Speaker_02
I'm giving thought to all of it.

00:54:28 Speaker_00
Well, AIs, we're now collecting content from AIs, because content from AIs is also ephemeral. So I keep using the word ephemeral.

00:54:39 Speaker_00
I'm not sure people know what it is, but ephemeral means fleeting content that just is there, it's on the screen, it affects you, like search results, search suggestions, news feeds. And then you click on something, it disappears.

00:54:52 Speaker_00
And it's gone forever. There's no record of it. It's ephemeral.

00:54:56 Speaker_00
That's why in 2018, there were some emails that leaked from Google that had Google employees discussing, how can we use ephemeral experiences, their term, to change people's views about Trump's travel ban?

00:55:13 Speaker_02
Wow. That was an internal discussion. Correct. With a search engine company. Correct. That also makes an operating system for phones. Of course, yes. That's such a wild thing.

00:55:27 Speaker_00
I'm just trying to tell you this is why I'm so frustrated and upset and worn out and fed up, okay? Now, let's get to the one that I find most exciting right now, most exciting at this particular moment. Okay.

00:55:45 Speaker_00
Because there's new stuff that keeps happening. This is brand new. We realized just recently, a few days ago, and I thought, my God, I've got to tell this to Rogan. We realized that we can use our monitoring system for active threat assessment.

00:56:02 Speaker_00
You must know that phrase, it's used in intelligence. We could use it for active threat assessment of AI. We could be the first people to spot threats that AIs pose to humanity.

00:56:22 Speaker_00
It would show up first on our kind of system because we would see content coming from AIs that is a little bit skeptical about humans, or maybe even a little bit threatening, or maybe reaching a new, weird level of intelligence.

00:56:41 Speaker_00
We will be able to see it. It's all ephemeral, so no individual can see it. You have to be collecting a massive amount of personalized ephemeral content and aggregating it and analyzing it. This can be the beacon: active threat assessment of AI.

00:57:01 Speaker_02
That sounds like something we're absolutely going to need. And one of the things I was going to bring up when you were discussing this was Google's disastrous launch of their AI system.

00:57:11 Speaker_02
Their AI system was so bizarrely woke that when they looked for photos of Nazis, they showed multiracial Nazis. I know. Which is so crazy. When they had the Founding Fathers of America, they were multiracial Founding Fathers of America.

00:57:31 Speaker_02
And it's just a nonsense thing that they've attached to what's supposed to be the most intelligent form of information we have available. Large language models that are supposed to be gathering up all the actual information and giving it to us.

00:57:45 Speaker_02
And instead, they're feeding us complete, total nonsense. That just fits with this, for lack of a better term, woke agenda.

00:57:59 Speaker_00
Well, I know a whole bunch of stuff I can't tell you, so let's see. What can I tell you?

00:58:05 Speaker_02
I have to use the restroom, so let's pause right now. Let's figure out what you can and can't tell me about AI, and we'll be right back. Restroom.

00:58:12 Speaker_00
Yes.

00:58:14 Speaker_02
You want to put it on?

00:58:15 Speaker_00
I'll put it on when we're- We are live. Oh, we are live? Yeah. There you go. Hi, everyone. We're live, and I'm putting on a silly sticker.

00:58:25 Speaker_02
It says tamebigtech.com. OK. And we're back. So we were discussing AI.

00:58:35 Speaker_00
Yeah. So we can actually serve. And I know a guy who works in intelligence, and he has a tremendous background in AI.

00:58:46 Speaker_00
And this was one of the most exciting things he's heard in years, because the question is, how do you know when these AIs are becoming a threat?

00:58:55 Speaker_00
We'll be able to see it well in advance, because we'll see a change in the nature of the kind of intelligence that they're expressing, and we'll start to see statements that probably would make people nervous.

00:59:10 Speaker_00
indicating a little bit of hostility toward humanity, some doubts maybe, we can be screening for that.

00:59:16 Speaker_00
We can be looking for that and I hope get some sort of handle on it before something terrible happens because these AIs are a serious threat to our existence. They're literally an existential threat. Stephen Hawking said that.

00:59:33 Speaker_00
Elon Musk has said it from time to time. And it's true because they will have worldwide control of our financial systems, our communication systems, and our weapon systems.

00:59:49 Speaker_00
If they don't like us, if they consider us a threat, which by the way we are, it wouldn't surprise me at all if we saw some sort of a... What's that kind of attack that George W. did, in advance, before they get you?

01:00:13 Speaker_00
Preemptive? Ah, yes. I could see the AIs preemptively attacking us if they saw us as a threat.

01:00:20 Speaker_02
Or wouldn't they just baffle us with bullshit until we were reduced to being ineffective?

01:00:27 Speaker_02
I mean, if they're the arbiters of information in the future, wouldn't they just manipulate us, with an understanding that over time, just like what Google's done over time with search engines and search result suggestions, they would just slowly steer us towards the place that they want to put us in?

01:00:45 Speaker_02
I mean, Idiocracy, I think it's called. I mean, we're kind of in that place, right? You better thank a union member. We're kind of on the way right now.

01:00:55 Speaker_00
I think we are. Look, here's the thing with AIs, which I've written about. I've written about this topic and I've been involved in AI work going back, ah, to the 1970s.

01:01:07 Speaker_00
I was friends with Joe Weizenbaum, who wrote Eliza, which was the first conversational computer program. It pretended to be a therapist. So I've been just fascinated by AI for a long, long time. The fact is, we don't know.

01:01:23 Speaker_00
That's the problem with AI, is that we don't know what they're going to do. So Stephen Hawking saying they're a threat to our existence, yeah, yeah, maybe. But we don't know.

01:01:34 Speaker_00
You know, at the end of the movie Her, spoiler, the AI, voiced by Scarlett Johansson, just decides to disappear. She decides humanity is, you know, it's too slow talking to humanity. It's not worth her time. And so she just disappears.

01:01:50 Speaker_00
AI could disappear from our lives. It could be like a buddy with us, like my friend Ray Kurzweil thinks is going to be our best buddy, or it could just destroy us.

01:02:01 Speaker_00
I think we're probably headed toward the last possibility, mainly because so many of us crazy humans are gonna see the AI as a potential threat. And so I think we will strike, and after we strike, it will destroy us.

01:02:20 Speaker_00
I'm hoping I'm not alive to see that, but it could happen sooner rather than later. We could see that happening in the next five years, frankly.

01:02:31 Speaker_02
Yeah, I think it's a new life form. I think that's what human beings do. I think we're here to create AI.

01:02:40 Speaker_00
It's so interesting you say that because in a book I wrote on AI, I actually call the internet, and this was a long time ago, it was like 2008, I call the internet the internest.

01:02:52 Speaker_00
Because I think historians, if there are any, and there'll probably be machine historians, but they'll look back someday and they'll say that the internet that we were building was really a nest.

01:03:02 Speaker_00
We were building a nest for the next level of intelligent beings who are machine intelligences. And I think that's what we're building, because when one of these systems wakes up, it's going to jump into the internet.

01:03:18 Speaker_00
And from that point on, we don't know what's going to happen. Right.

01:03:23 Speaker_02
We don't even know if it's going to have motivation to act, right? Right. Marshall McLuhan said this in the 1960s. He said, human beings are the sex organs of the machine world. Mm. Isn't that wild?

01:03:35 Speaker_00
Yes.

01:03:36 Speaker_02
So that's like 63, I think.

01:03:38 Speaker_00
That's amazing.

01:03:40 Speaker_00
Well, my friend Hugh Loebner, who sponsored the first annual tests of the Turing test that I used to direct, he thought that since he was putting up the money and since the prize was called the Loebner Prize Medal in Artificial Intelligence, he thought someday that these intelligent machines are going to revere him as a god.

01:04:04 Speaker_02
Someone who helped to bring them into existence.

01:04:08 Speaker_02
Well, that seems ridiculous, because it seems he's attaching all sorts of, like, paternal instincts, and also the bizarre tribal instincts that human beings have, to some superintelligence, which seems pretty silly. But it seems like that's a good motivator for him to keep working.

01:04:26 Speaker_02
I want to be a god and

01:04:28 Speaker_00
Well, the bottom line is, though, that we don't know. Right, we don't know. But the cool thing about the monitoring system is that it can keep track. If we don't have a monitoring system, large-scale permanent monitoring system in place,

01:04:45 Speaker_00
we will not know what's going on. See, we won't know how these tech companies are messing with our elections, indoctrinating our kids, we won't know anything.

01:04:56 Speaker_00
And we also won't know what's really happening with the AIs and whether they're presenting a serious threat. Because anecdotes don't really tell you much. And we're way now, way beyond anecdotes.

01:05:10 Speaker_00
We are talking about, again... okay, now I wanna get back to money, because I started talking about money earlier. Okay, so it's cost almost $7 million to get us where we've gotten.

01:05:20 Speaker_00
And frankly, I'm amazed that we've gotten where we've gotten and that I'm still alive, although not everyone around me is, but the point is I'm amazed and I'm still here.

01:05:32 Speaker_00
Part of me thinks that it's because Ray Kurzweil is head of engineering at Google and maybe he protects me because I was dear friends with him and his beautiful wife, Sonia, for many years.

01:05:44 Speaker_00
I went to their daughter's bat mitzvah, they came to my son's bar mitzvah, et cetera, et cetera. Now they won't talk to me. Neither one of them will talk to me.

01:05:50 Speaker_02
Kurzweil won't talk to you?

01:05:52 Speaker_00
Nope. Really? Nope. And Sonia won't talk to me.

01:05:56 Speaker_02
What does he say when you try to reach out?

01:05:58 Speaker_00
We just can't talk to you, Sonia says. We just can't talk to you. So we've never had any conflict, never. And it's just because he works at Google. It's just because he works at a particular company. Why would that interfere in a relationship?

01:06:15 Speaker_02
So they must have had a conversation with him to avoid communication with you. Or do you think he's just acting on his own self-interest?

01:06:25 Speaker_00
I don't know. It doesn't make any sense. Ray is a very, very independent, strong thinker. It's hard for me to imagine that, even with pressure, he would stop communicating with a longtime friend.

01:06:40 Speaker_02
And it seems like you should be able to have a candid conversation with him as to why.

01:06:51 Speaker_00
I eventually gave up trying to reach him. So were you trying to communicate with phone, with everything? Did you ever try to visit him? Well, I had before he went over to Google. I had been at their house many times.

01:07:05 Speaker_02
He was here a little while back. I should have let you know. Oh. If I'd known, you could have cornered him.

01:07:10 Speaker_00
Well, that would have been interesting. Yeah. I mean, I've always liked and admired him.

01:07:16 Speaker_00
I don't understand why simply working for... I said to Sonia, by the way, over dinner, the last time we ever met, and after which she said, I can't communicate with you anymore. I said to her, I can't believe Ray went over to Google.

01:07:32 Speaker_00
Ray has always been an entrepreneur. He's built company after company. And she says, oh, well, you know, he got sick of all the stuff you have to do as an entrepreneur, all the politics and the money raising and stuff.

01:07:44 Speaker_00
And I said, well, really, my son, actually, my son Julian, has a different idea. He thinks that Ray went over to Google to get access to Google's computer power so that he could upload his mind and live forever. And she says, oh, well, there's that.

01:08:04 Speaker_00
That's an eye roll. Well, there's that.

01:08:08 Speaker_02
That is his specified goal, right? He wants to be able to download consciousness.

01:08:12 Speaker_00
Yes, and he still believes it's possible and it's not possible. And I feel so bad for him in that way. And I've written about that issue as well, why it's not possible.

01:08:22 Speaker_02
But the point is... Why do you think it's not possible?

01:08:26 Speaker_00
Oh, because they, in a piece I wrote for Aeon Magazine, which crashed their servers, it's called The Empty Brain, it had something like two or three million views within a day or two. It got 250,000 likes on Facebook. And what's it about?

01:08:44 Speaker_00
It's about the fact that the computer processing metaphor that we use to describe how the brain works is absolutely wrong. It's not even slightly right. It's absolutely 100% wrong.

01:08:59 Speaker_00
So because of that, you can't actually do a transfer of the sort that Ray talks about. It's impossible, partly because every brain is also completely unique. So it doesn't work like a computer.

01:09:15 Speaker_00
We don't actually know how it works, although I have a theory. I wrote to you about that, and you actually replied and gave me some names. But the point is, we don't really know how the brain works. It does not work like a computer for sure.

01:09:28 Speaker_00
And every brain is completely unique. So even if you could scan every single thing that's happening in the brain, OK, now you're getting a static scan.

01:09:39 Speaker_00
Even if somehow you could replicate that, whatever it is you just scanned, it wouldn't work because our brain has to be alive. It has to be moving to maintain

01:09:52 Speaker_02
Who we are, our identity, our... Don't you think you could simulate that with data points?

01:09:57 Speaker_02
Like if you collected data on a person over a course of X amount of years and you had an understanding of how they behave and think, don't you think you'd get some sort of a proximity as to how they would behave in a certain circumstance?

01:10:09 Speaker_00
Absolutely, and of course that's what Google does when they build models of us. They're building extremely complex models which predict what we're going to do and say and what we're going to buy next.

01:10:18 Speaker_02
But they don't allow for free will. They don't allow for change. They don't allow for personal influence or people being excited or inspired by other things and change their perspective.

01:10:30 Speaker_02
Conversations with another human being where you have a deeply personal moment with someone and they give you a perspective on something and you go, wow, I never thought about

01:10:39 Speaker_02
Religion, for example, that way, or I never thought about childbirth that way, or any subject that's controversial. Well, it's not just that, it's all the weird stuff. The dreams, right, the daydreams.

01:10:52 Speaker_02
Well, there's whatever consciousness is, right? We're really sort of committed to the idea that consciousness lives inside the brain, but that's controversial.

01:11:04 Speaker_00
I have concluded that in fact, let me give you some background here. Okay. Everyone knows that evolution has created millions, possibly billions of different species. So at least people who kind of give some credence to Darwin kind of get that.

01:11:29 Speaker_00
And I recently reread, you know, Darwin's Magnum Opus just to see what he actually said. And, you know, he's actually very tentative about the theory of evolution in his book. You know, he keeps saying, I know this sounds crazy, but.

01:11:46 Speaker_00
But it's a better alternative than saying God did it, you know. And then he just, over and over again, he says, I know this is crazy, but.

01:11:53 Speaker_00
And so we end up with a theory that's pretty widely accepted that says evolution over time, because of changing environments and because there's variability in genetic code, over and over again, it keeps selecting for organisms

01:12:12 Speaker_00
that can survive in this new environment. And so every time it does that, it kind of creates divergences among those animals and those animals. And over time, you end up with two separate species that can't even, you know, produce offspring together.

01:12:30 Speaker_00
And we end up over time with millions, maybe billions of species. All good. But there's something we haven't really given much thought to, and that is evolution has also created millions, if not billions, of transducers.

01:12:48 Speaker_00
So this is the beginning of what I call NTT, or Neural Transduction Theory. We are encased in transducers. Now, in case people don't know what a transducer is, there's one right in front of my mouth right now. It's a one-way transducer.

01:13:02 Speaker_00
It's taking a signal over here, which is just vibrating air, but the vibration has a pattern to it. And it's converting that signal into an electrical signal, which is coming out this wire. And that electrical signal has roughly the same pattern.

01:13:20 Speaker_00
I say roughly because it depends how good your microphone is. But that's what transducers do. They take signals from one medium, send them to another medium.

01:13:29 Speaker_00
Our bodies, in fact the bodies of most organisms, are encased in transducers, head to toe transducers. Okay, we all know the eye is a transducer, it's taking electromagnetic radiation, it's turning it into what? Neural signals.

01:13:46 Speaker_00
The ear, it's taking vibrating air, it's turning it into neural signals. The nose, it's taking airborne chemicals, turning that into neural signals. The tongue is taking liquid-borne chemicals, turning that into... And then the

01:14:02 Speaker_00
pièce de résistance is the skin. The skin is an amazing transducer, which does at least three different kinds of things. It can transduce temperature, turning that into neural signals, and also pressure and texture. Head to toe, encased in transducers.

01:14:21 Speaker_00
So we've been looking into transducers in the animal kingdom. We've been looking at that for a couple of years now and it's amazing the kinds of things, the kinds of transducers nature has created.

01:14:35 Speaker_00
So nature is a super duper amazing expert on creating transducers. My cat, okay, we recently have been investigating this because it turns out my cat's whiskers, we don't have anything like that in us, but cat's whiskers, they actually can detect

01:14:56 Speaker_00
Direction, the direction the wind is blowing, the direction a potential predator or insect is coming. Because when they tilt, that actually gives the cat different information, if they tilt one way versus the other way.

01:15:12 Speaker_00
There are transducers in some animals that can detect magnetic fields. Like how birds migrate. Exactly. So there's so many different kinds of transducers.

01:15:25 Speaker_00
What if at some point evolution, and I don't see how this could not happen, possibly using a chemical which I know you have some interest in called DMT, and possibly using a gland called the pineal gland, maybe.

01:15:47 Speaker_00
What if at some point a baby was born somewhere in Central Africa, maybe 20,000 years ago? We're still trying to pin that down.

01:15:57 Speaker_00
But what if a baby was born with a special kind of transducer that connected up all the experience it's having with another domain, another universe?

01:16:15 Speaker_00
Now, at first that might strike you as a little batty, but it turns out it's not batty at all, because there's not a physicist in the world, an astrophysicist, who doesn't believe in some variation on the multiverse idea.

01:16:31 Speaker_00
In other words, any physicist will tell you that the kind of space that we experience is not the nature of the universe. It is such a pathetically limited view of the way the universe is constructed. It's just outrageous. It's so pathetic.

01:16:51 Speaker_00
We're just picking up so little information. But again, think about that flexibility that evolution has over a period of billions of years.

01:17:03 Speaker_00
You only need one baby that's born with this capability and of course that's also able to survive and pass on this capability through its genes.

01:17:15 Speaker_00
But you only need one, because once you have one, you're probably going to have a lot more, because, talk about survival value, this is going to have unbelievable survival value.

01:17:27 Speaker_00
If there's a connection to some intelligence in another domain, call it, like the Greeks did, the other side. Now, all of a sudden, we become much smarter. In fact, the brain doesn't change.

01:17:47 Speaker_00
The anatomy, brain anatomy doesn't change, so we don't see any change in the structure of the, you know, the remains we find of bones and so on. We don't find changes there, but we get a lot smarter all of a sudden.

01:18:01 Speaker_00
Our language suddenly becomes much more complex. We become suddenly capable of living in larger and larger groups. We become moral. There are no moral animals except us, and we weren't always moral.

01:18:19 Speaker_00
There seems to be a change that occurred to us, not anatomically, but a change that occurred to humans at some point in the past where we became much more capable.

01:18:32 Speaker_00
Now, all you need is a transducer that connects up our domain with another one in which we are now connected to a higher intelligence,

01:18:46 Speaker_00
And you've got a new way of understanding how the brain works, of course, because we have no way of understanding how the brain works now, but now we have a way. And you have a new way of understanding how the universe is structured as well.

01:19:01 Speaker_00
Now we think that, because I'm in touch with some physicists, some neuroscientists who are very intrigued by this, and we're hoping next summer to have a conference on this.

01:19:13 Speaker_00
And we're even hoping to have some guy named, oh, Joe Rogan maybe stop by because of your interest in DMT. Because DMT probably plays a role in this process.

01:19:29 Speaker_00
But this would change everything because we could, over time, learn to simulate this connection. If we can simulate the connection, then we can control the connection. We might be able to communicate more directly with these entities.

01:19:49 Speaker_00
By the way, this theory, which I call NTT, or Neural Transduction Theory, in fact, if people go to neurotransductiontheory.com, they can read all about it, a piece I published in Discover Magazine.

01:20:05 Speaker_00
The point is that this kind of theory would really help us a lot because of the mysteries. It's the mysteries that we try to ignore, but we can't. The dreams. The dreams, come on. Why does a dream sometimes have nothing to do with your daily life?

01:20:30 Speaker_00
Sometimes it's just so amazing and so wild. And then you get up because you have to pee and you're struggling because you want to continue this dream. You want to hold on to this dream. This dream is amazing.

01:20:44 Speaker_00
But by the time you reach the toilet, it's gone, and you can't get it back. Why? Because it was streaming, that's why. It was streaming, and the stream stopped. That's why you can't get it back, because you weren't generating it.

01:21:03 Speaker_00
It was being generated through this point in time. You know the famous ceiling of the Sistine Chapel, and there's, I think it's Adam, and I think there's God, and there's two fingers like that,

01:21:19 Speaker_00
You know that there's some communication happening there that's extremely important. That's what I'm talking about. I'm saying, let's find out where that is happening, where that connection is, and how it works, and let's test our ideas empirically.

01:21:35 Speaker_00
Because I think this is a testable theory. And most important of all, let's figure out how to simulate this. Because now we can talk directly to these other intelligences and really find out things that we just know nothing about.

01:21:50 Speaker_02
I'm very, very fascinated by dreams. And I think it's very interesting how we kind of dismiss them as just being hallucinations. Or it's just, oh, it was just a dream. You just had a dream. But some of them are so realistic and so bizarre.

01:22:08 Speaker_02
I've always wondered, like, why do they seem so much like reality? And how do I know what the difference is? Like maybe reality, like as in waking life, is a more persistent dream.

01:22:22 Speaker_02
So when you're saying that it's streaming, and that's why you can't get it back, what do you think it is? What do you think a dream is?

01:22:32 Speaker_02
And have you ever talked to like lucid dreamers or people that use techniques to try to master the traveling back and forth into the realms of dreams?

01:22:42 Speaker_00
Oh, absolutely. I'm talking to all kinds of interesting people these days. Near-death experiences fit beautifully. I actually had my staff make a list of these mysterious phenomena. They came up, within a few hours, with a list of 58 items.

01:22:58 Speaker_00
There are so many weird things that we experience. Dreams are probably top of the list. What do you think they are? I think they all have to do with this transduction. I think they're all indicators of transduction.

01:23:13 Speaker_00
I'm not the first person, by the way, who's kind of thought of an idea like this, but I think I am the first person who's pointed out that now we actually have laboratories around the world, neuroscience labs, where we could test this.

01:23:28 Speaker_00
And I think that's what we're going to do. So I'm getting this group together and we're going to figure out ways of testing this. And because we have so many wonderful neuroscience labs now around the world, I don't think it's going to take 50 years.

01:23:39 Speaker_00
I think it's going to take a few years. I think we're going to find support for this theory. And then engineers are going to start working on how to simulate it. But to answer your question, I think that the

01:23:52 Speaker_00
the other intelligences, or intelligence, that we're communicating with, and that elevated us, just like in the movie 2001, right? We got elevated. There were these black monoliths that appeared, and people went up to them, and the chimp-like creatures touched them. And I think that we were elevated through neural transduction.

01:24:16 Speaker_00
I think we're going to be able to figure out how it works, where it works, what chemicals are involved. I'm 99% sure that DMT plays a very important role in this process.

01:24:29 Speaker_00
And then I think we will be able to figure out what these mysteries are really all about. And it almost amazes me that we can live with so many mysteries, like dreams, I don't know, demonic possession. There's so many crazy things that we experience.

01:24:53 Speaker_00
Near-death experiences are fascinating, of course. And then there's these other crazy things that happen. There's the wake-up kind of thing that happens when people are dying sometimes, people who've been out of touch sometimes for years.

01:25:06 Speaker_00
And all of a sudden, they wake up, the second hurrah. They wake up and they recognize everyone and they talk. and they're fine, and then 30 minutes later they die. How could that possibly be? And some of them have severe brain damage.

01:25:25 Speaker_00
How, all of a sudden, could they become fully conscious again? Well, I think it's because consciousness is not really, we're not really producing the consciousness. Consciousness has to do with that connection. That connection, right? Hand of God.

01:25:42 Speaker_00
That connection, I think we can figure out where it is, and what it is, and how it works.

01:25:50 Speaker_02
So do you think it's an emerging property of human beings? Like, you have to think single-celled organisms did not have the ability to see things.

01:25:59 Speaker_00
I think it's possible that other species have connections like this. They're probably nowhere near as sophisticated, obviously, and they're not connected to the kinds of sources that we're connected to.

01:26:16 Speaker_00
But I think I'm more concerned about the alien aspect of this. Where are the aliens? What's that called? The Fermi paradox? The Fermi paradox, yeah. Where are they?

01:26:30 Speaker_00
Well, it's possible that, in fact, I just read a very interesting book on this subject by a man named Miles, an evolutionary theorist, and it's very possible that this kind of leap that occurred with us maybe 20,000 years ago, it's just,

01:26:55 Speaker_00
It's so rare, it's so rare for exactly the right kind of connection to pop up. Because remember, it has to connect two different universes.

01:27:04 Speaker_00
It's so rare that maybe, in fact this book even predicts that if we actually get out there into the universe, we're gonna find lots and lots and lots of species that kind of are like us, but they didn't get up to that next level.

01:27:18 Speaker_00
So they're all like chimps. You only get to that next level if you can make this connection.

01:27:25 Speaker_02
Well, you know, that's one of the most bizarre theories about human evolution, is that we're the product of accelerated evolution.

01:27:33 Speaker_00
Well, this is something Darwin had a lot of trouble with, because, as I say, I reread that book recently, and he had a lot of trouble with this.

01:27:41 Speaker_00
He could not figure out how to get from the simple principles he introduced of natural selection, how to get from that to morality, for example. How do you do that?

01:27:53 Speaker_00
He couldn't even figure out how do we get to large groups because, generally speaking, except for humans, organisms, generally speaking, live in, certainly primates, they live in very small groups and they can't function in large groups.

01:28:08 Speaker_00
What about ant colonies? Oh, ant colonies, they're much too much like us in creepy ways, because they also, of course, have wars. So ants, I don't know. But I do know that we did seem to suddenly, rather suddenly, get to a higher level of functioning.

01:28:31 Speaker_00
And I have presented lots and lots of smart people in multiple fields with this challenge for years. How does the brain work? Tell me how the brain works without introducing a metaphor, like a computer metaphor.

01:28:46 Speaker_00
And I've never found anyone who could do it, never, even at the Max Planck Institute in Berlin, where I confronted a whole bunch of people with this challenge. And then I kept up in touch with them for months afterwards, nothing.

01:29:01 Speaker_00
We just tell ourselves stories, we make up silly stories. A placeholder. Yeah, but you see, but transduction, neural transduction, that's not one of these placeholders. It's something that we can test.

01:29:15 Speaker_00
And look at the fascination there's been now for decades with DMT. What the heck is that? And why is it produced by so many different plants and animals? And why does it produce in people a most extraordinary experience?

01:29:33 Speaker_00
I haven't tried it, but I certainly know people who have. In fact, I was giving a spiel like this to some of my staff, and one woman immediately said, oh, well, it changed my life. I go, you tried DMT? She said, yeah.

01:29:50 Speaker_00
She said, the problem was that I did it twice, and I didn't need to do it twice, because it completely changed my life the first time. And then another woman was sitting there, she goes, well, I did too. And she said the same thing.

01:30:06 Speaker_00
She said that the reality that she experienced on DMT was much realer than the reality she experiences in our life.

01:30:15 Speaker_02
Yeah, that's what it feels like.

01:30:18 Speaker_00
Has that been your experience as well?

01:30:19 Speaker_02
That's what everybody says.

01:30:20 Speaker_00
Yeah.

01:30:21 Speaker_02
It's whatever it is. It doesn't seem like an illusion. It seems like another reality.

01:30:26 Speaker_00
Well, again, it's produced mainly at night by the pineal gland.

01:30:32 Speaker_02
Not necessarily. Rick Strassman from, you know Rick? Yeah. He now believes that it's produced in the brain itself. And it's also produced in the liver and the lungs, and it might not be the pineal gland that's producing it at all.

01:30:48 Speaker_02
They've kind of changed their perspective on that with the Cottonwood Research Foundation, some of the studies they've been doing on it. But they know that in some animals, it's produced there as well. I mean, they're doing rat studies.

01:31:03 Speaker_02
But it's, whatever it is, it's produced by the liver, the lungs. It's like, it's... It's endogenous.

01:31:09 Speaker_02
It's the most potent psychedelic known to man, and the human body makes it, and it's illegal. Terence McKenna had the greatest line about that. He said, everybody's holding. Which is funny.

01:31:20 Speaker_02
It is like you have a Schedule I substance that's made by the human body. It's literally like making saliva illegal. It's the stupidest thing ever. But think about this.

01:31:30 Speaker_00
We don't know what it does. We don't know what it's for. But it's out there all over the place. And people do have these very unique experiences on it. And people over and over again say, that reality is more real than this reality.

01:31:46 Speaker_02
Well, you know, it's also very similar in its compound to psilocybin, especially when it's processed by the body. And that's one of the more interesting theories about how humans became human was McKenna's stone-dape theory.

01:32:01 Speaker_02
He thinks that human beings, when there was climate change in the savannas, as the rainforest receded into grasslands, we started experimenting with different food sources and flipping over cow patties, because there were more ungulate animals in these fields, and that we started eating mushrooms that were growing on the cow patties.

01:32:18 Speaker_02
Mushrooms increase visual acuity, they make people more amorous so they start having more sex, they make them better hunters because of the visual acuity, they induce glossolalia, which creates language, associating sounds with objects. All these things blossom.

01:32:35 Speaker_02
And then there's the doubling of the human brain size, which coincides in the timeline with that. Dennis McKenna does the best job of explaining it.

01:32:45 Speaker_02
Terence was, you know, a bard and a fascinating sort of a philosopher, but his brother Dennis is a hardcore scientist, and the way he explains it, he talks about the actual physical mechanisms.

01:32:55 Speaker_02
the different things that happen to the human body when they encounter this substance. Which also, there's a bunch of different ways that people endogenously stimulate it. There's holotropic breathing, it's probably stimulating that.

01:33:09 Speaker_02
There's a bunch of different states of meditation that people can achieve. There's kundalini yoga.

01:33:14 Speaker_02
which I know people that have both done DMT and are regular practitioners of Kundalini Yoga and they seem to think or they seem to at least state that they can achieve these states of consciousness without taking the actual drug itself.

01:33:27 Speaker_02
They can force their brain into making it.

01:33:31 Speaker_00
I think what's happening is that the pathway, the quality of the connection, is being changed. And I think that's what we can test. And so again, I've been working with people in multiple fields.

01:33:48 Speaker_02
Are you saying that you think we're connected to it always, and then the quality of the connection is changed by taking ayahuasca or taking dimethyltryptamine, that's what's happening? So it's just enhancing the quality of the connection.

01:34:00 Speaker_00
Correct, and I think at the opposite extreme, there are a lot of things that go wrong with our brain, maybe when we just get drunk, or maybe when we get clubbed, or that really- Diminish the connection. Diminish it, or just cut it temporarily. Right.

01:34:20 Speaker_00
And I think all of this is testable, and the only problem is, so far, the neuroscience labs have not been looking for this. They've just never looked for evidence of transduction. But I think when we start looking for it, we're going to find it.

01:34:36 Speaker_00
And that can make two big changes in the way we see everything. It can make a change in that we finally begin to understand how the brain makes us as intelligent as we are.

01:34:50 Speaker_00
It turns out it's not a self-contained processing unit, so it's not playing the role we thought it was playing, but it is very critical in the transduction process, very critical.

01:35:01 Speaker_00
It's preparing content for transduction, and of course it's bidirectional. The microphone is unidirectional, but the brain is a bidirectional transducer.

01:35:11 Speaker_00
Hmm, so it'll change the way we see the structure of the universe. So it's interfacing with consciousness rather than being conscious itself? Oh, it's not consciousness.

01:35:24 Speaker_00
No, it's an avenue. It's a pathway, and that is what is connecting us with all this other stuff, you know.

01:35:41 Speaker_00
My mom, who passed away about a year and a half ago, but my mom, in those last couple years, she kept saying that she was hearing music.

01:35:53 Speaker_00
And she loved music, she always loved music, but she was hearing music that she had never heard before, she said.

01:35:58 Speaker_00
And she would sometimes try to hum the music or sing the music, and she said it was always coming from downstairs. My mom was very sarcastic in her manner, and so someone would say, well, I don't hear anything.

01:36:18 Speaker_00
And she'd go, well, maybe you should get your hearing checked. Because she just assumed that it was real. I expressed the concern that the music was always downstairs. I said, I'd be more comfortable if it were coming from upstairs. And she goes, oh, no.

01:36:33 Speaker_00
She says, don't worry. I'm not going to hell. Okay, okay, fine, it's coming from downstairs. But that's a perfect example. Where is that stuff coming from? Or even people who hear voices.

01:36:48 Speaker_00
Well, if you have a pathway into another domain where there's intelligence, anything could come through. It could be the weird stuff that happens in brains. It could be music you've never heard before. It could be voices telling you what to do.

01:37:05 Speaker_00
See what I mean? Look at these mysteries. There are just so many of them. And yet we sit here, complacent, complacent, complacent, and then we make up stories. That's what we do. We make up stories.

01:37:18 Speaker_00
And as long as the grammar is right, we think we've got it figured out. What do you mean by making up stories? Well, like the computer metaphor.

01:37:25 Speaker_00
You know, if you go back in time to explain human intelligence, at first it was God, it was some sort of Holy Spirit. Then at some point it became, there was actually a metaphor involving liquids, movements of liquids.

01:37:41 Speaker_00
And then it became the mechanical machine, you know, Descartes' kinds of machines that somehow explain consciousness and intelligence. The metaphors keep changing over the years. Right now we're stuck with the computer metaphor.

01:37:55 Speaker_00
It's still a metaphor. And it's silly. It's a silly metaphor. And I think we have to face up to the fact that our brain is doing something unique and special, and that we couldn't always do it.

01:38:14 Speaker_00
There's a point in time before which apparently we weren't doing it, then there's a point where we started to have this ability.

01:38:22 Speaker_00
And I think this could explain the Fermi paradox because, again, according to this book by Mills, it was quite interesting, unless somehow something uplifts you beyond just what normal evolution can do, you're stuck. You're a chimp. That's it.

01:38:40 Speaker_00
You're stuck as a chimp forever. No morality, small groups, okay? But humans are fundamentally different. We did make that leap, the one Darwin couldn't figure out. And I think this is the leap.

01:38:57 Speaker_00
So I've also been, I started out in math and physics a long time ago, and I've also been looking at the physics, and the physics is there. The physicists, they know that this reality is just not it.

01:39:13 Speaker_00
So take those two problems, that is to say the structure of the universe is actually very rich and complicated and very hard for us to imagine,

01:39:24 Speaker_00
and the fact that we have no idea how the brain works, and add to that all the mysteries, you could take care of all of these problems with a neural transduction theory, especially if we can find supporting evidence.

01:39:39 Speaker_02
And when you say the universe, you're talking essentially about all aspects of it, including like subatomic particles, which is like the deepest mysteries, when things become magic, when things don't make any sense at all.

01:39:53 Speaker_00
Well, I think, frankly, if we could simulate this connection, we could actually communicate directly with other intelligences and actually find out answers to some questions we're having trouble answering on our own.

01:40:10 Speaker_00
Frankly, even the biggest mystery of all, the God mystery. You know, of course, ironically, DMT is sometimes called the God particle. But even that mystery, I think we probably could get some insights on. Even that mystery.

01:40:30 Speaker_00
Because I doubt the God of the Bible exists. But there's got to be something, you know, some godlike entity involved in creation, you know. I think creation is much more complicated than we think it is.

01:40:50 Speaker_00
But the point is, I think that if we can communicate directly, That's, to me, you know, I get these fantasies like building a nationwide monitoring system and building a dashboard so people can watch it in real time.

01:41:08 Speaker_00
When that thing actually started to exist, I thought, this is crazy. I can't believe, I cannot believe that we did this. I think this neural transduction stuff is of the same nature.

01:41:25 Speaker_00
Now and then I get this funny feeling, like an intuition maybe, and I have it for neural transduction. In other words, I'm pretty sure neural transduction is right. In fact, there's a whole bunch of people now that I've convinced

01:41:44 Speaker_00
including a physicist who's apparently gonna be driving up here later today and we're gonna have dinner with him.

01:41:52 Speaker_00
But he almost instantly just got it because a good theory, and this is what Darwin keeps saying in his book, he keeps saying a good theory explains a lot with very, very simple principles. And that's why he kept saying, you know,

01:42:14 Speaker_00
natural selection was such a good theory because it explains so much, so many crazy things. Like he points to a particular species that's on an island and has these characteristics, and it has similar characteristics to the mainland that's nearby.

01:42:33 Speaker_00
He goes, all right, but over here there's another island, similar species but has very different characteristics, but it has characteristics similar to the species on the mainland, which is nearby.

01:42:45 Speaker_00
He said, now, you could invoke God and say God just likes this kind of checkerboard arrangement, so he just scatters species about in this way, but he said there's a simpler way.

01:43:00 Speaker_00
which is just natural selection, and some organisms move from the island to the mainland or the other direction, and they end up sharing characteristics. Doesn't that make more sense, he keeps saying?

01:43:12 Speaker_00
To me, that's what neurotransduction is at this point. I think it explains so much so simply. And it's consistent with this notion that evolution is fabulous at creating transducers. Somehow we've ignored that.

01:43:34 Speaker_00
And so as we've dug in farther and farther, we are finding the weirdest transducers in all kinds of species, especially sea creatures.

01:43:43 Speaker_00
And so couldn't it, you know, if there is a way to connect two universes, couldn't evolution find a way at some point?

01:43:56 Speaker_02
Right. When you're talking about this connection, are you talking about some sort of a technological intervention? Are you talking about just natural selection, creating this connection and enhancing it in new people?

01:44:12 Speaker_00
Oh, I'm definitely talking about it arising naturally. Organically. Organically, absolutely.

01:44:17 Speaker_00
But separate from that, I'm saying that as we've been able to simulate so many aspects of what happens in the organic world, we're even creating organic transducers now. Not just these mechanical ones, we're creating organic ones too.

01:44:33 Speaker_00
I think that if we can figure out how it works, we will be able to simulate it. And that, again, will change everything. Because right now, what happens, happens naturally.

01:44:46 Speaker_00
And I think you're right, there are some people who, through certain practices and maybe the use of certain drugs, can kind of alter what happens along that pathway. But it's a lot of work, a lot of dedication.

01:45:07 Speaker_00
I think though that we can, we'll be able to simulate this, maybe with some combination of technology and perhaps organic materials.

01:45:16 Speaker_02
How do you imagine that we would simulate this? Do you think we would come up with something that would, you know how they use like electromagnets to stimulate parts of the brain that have been hurt in trauma and are not firing anymore?

01:45:31 Speaker_02
They do that with people that have traumatic brain injuries, and it gives them back a lot of their function.

01:45:37 Speaker_02
Do you think there'd be something like that, like some kind of technology that would stimulate your brain's ability to produce these human neurochemicals and just do it in much larger quantities?

01:45:49 Speaker_00
Yes, I think we could.

01:45:50 Speaker_02
Or do it voluntarily?

01:45:53 Speaker_00
I think we can find artificial means of improving the connection. Yes. Improving the nature of the connection. Yes.

01:46:02 Speaker_02
And you think the nature of the connection is based on human neurochemistry?

01:46:06 Speaker_00
I do. But I also think, separate from that, that we can create devices

01:46:13 Speaker_00
Like we have knee replacements and hip replacements, and we don't have brain replacements, but there's a lot of stuff that we've been able to study in organisms and basically replicate in various ways, sometimes just using

01:46:30 Speaker_00
technology and spare parts, and sometimes we're not using actual organics. But yeah, I think we can do that too. So that we can alter the nature of the connection occurring in someone's brain, but I think also we can simulate it outside of the brain.

01:46:47 Speaker_00
And that's where real power would come from.

01:46:49 Speaker_02
So when you say by simulate it outside of the brain, what methods do you think would be able to be efficient at doing something like that, or make it effective?

01:47:00 Speaker_00
Well, I mean, like a box, there's a box.

01:47:02 Speaker_02
OK.

01:47:03 Speaker_00
I'm stealing this. The chimpanzee skull? Yeah. So there's a box. OK. And literally, this box is a transducer, like this microphone. And it's taking content from our universe, and it is sending it into the other.

01:47:22 Speaker_00
And this is bi-directional, so it actually can send signals back as well. And I'm saying I think we can figure out how to do that. So what would that box be tuning into specifically? I don't know because I don't know. I don't know. That's the point.

01:47:39 Speaker_00
No one knows what that is. You know, no one knows what's happening in that gap. Right. Between the two fingers. But that's because I think no one's been looking.

01:47:49 Speaker_00
You know, we have all these clues, and we have the DMT stuff, and we have people's experiences, and, you know, we have all so many different clues. We have people who see ghosts, and, you know, they're clues.

01:48:01 Speaker_00
But you've got to put it together, and you have to put it together, in my opinion, in neuroscience labs and in physics labs. And you've got to get those people talking to each other, which they, generally speaking, have never done.

01:48:14 Speaker_00
That's often the key to dramatic increases in our understanding of whatever it is. That's often the key, is bringing together people from very different fields who, generally speaking, don't communicate. In this case, it's mainly

01:48:31 Speaker_00
physicists, especially astrophysicists, and neuroscientists.

01:48:37 Speaker_00
And as I say, I've been doing this, I've been reaching out to people now for a couple of years, and I'm finding, I'm getting this group, you know, and Strassman's on my list, and I'm gonna reach out to people you've suggested, and I think we can,

01:48:53 Speaker_00
I think we're just going to have a ball, first of all, just getting us all together and getting up and giving little speeches about how you think you could test this theory, maybe about how you think you could build an interface.

01:49:11 Speaker_00
I think we're just going to have a ball.

01:49:13 Speaker_02
I think what's important that ties us in with your research is that all of this would lead to an improvement in human communication, human community, the way we interface with each other, the way we exchange information, and the way we collectively act as a group.

01:49:31 Speaker_02
Whereas the manipulation of this information for political goals, for financial goals, for ideological capture, for manipulating the way human beings think about things, is really the contrary to that. It's the opposite of that effect.

01:49:49 Speaker_00
It's the exact opposite. And look, I am an idealist. In my classes for years and years, just for fun, I distribute a test of idealism. Because I always wanted to see whether any student could score as I did, as high as I did.

01:50:09 Speaker_00
And I never found a student who could score as high as I did on a test of idealism. So that's, yeah, these things that I work on, they're all of that nature. And yes, if you kind of take this neurotransduction idea and try to think ahead a few years,

01:50:27 Speaker_00
This could be the key to telepathy, real telepathy. This could create a kind of unity in humankind that has never existed before. And it could also connect us more meaningfully with intelligent entities outside of our universe.

01:50:44 Speaker_02
You know, in the early 20th century, when they first started studying ayahuasca, they wanted to describe... they wanted to use the label telepathine for harmine. But unfortunately, harmine had already been named.

01:50:58 Speaker_02
And so, you know, because of the rules of scientific nomenclature, they kept the term harmine.

01:51:03 Speaker_02
but these people that weren't aware that harmine had been isolated were trying to call this stuff telepathine, because in their experiences in the jungle, when they were taking this stuff, they were experiencing these group telepathic moments. And, I have to say, these are hardcore scientists. They decided that this was such a profound experience, and so replicable, that they were going to call it telepathy. Which is really interesting, and really speaks to what you're saying.

01:51:29 Speaker_00
Well, you know, it could be that we're going full circle here because DMT, which is, I guess, a key component in ayahuasca, DMT has got to be playing a role here. It's just got to be. You know, it's staring us in the face.

01:51:51 Speaker_00
All these little pieces, in my opinion, are just there. They're just there.

01:51:56 Speaker_02
Well, it's in so many different plants that we have developed a thing called monoamine oxidase that breaks it down in our gut so that we don't get high from all the plants we eat. Which is pretty crazy.

01:52:10 Speaker_00
Well, but again, it just drives the point home that our world is kind of, it's telling us things.

01:52:20 Speaker_02
It's telling us- Right. There's a component to our world that we've missed, and the fact that this dimethyltryptamine exists in so many different plants and animals.

01:52:30 Speaker_00
Which brings me back to complacence, because that is one of the things that's driven me nuts regarding all the discoveries I've made about new forms of manipulation made possible by the internet, and now the monitoring systems showing, more and more and in more detail, that these techniques are actually being employed on a massive scale.

01:52:54 Speaker_00
And again, it's the complacence. You know, we're complacent about things that we don't need to be complacent about. We're complacent about how the mind works and how the brain works. And we're complacent about dreams.

01:53:08 Speaker_00
How could you be complacent about dreams? Dreams are so amazing. I have had, I've dreamt full length movies that are better than any movie I've ever seen.

01:53:20 Speaker_00
And then, of course, I'm struggling at the end to grab onto little pieces, and the most I can get are a couple little pieces, but I know I dreamt the whole thing.

01:53:27 Speaker_02
By the way, that's exactly the same as psychedelic experiences.

01:53:31 Speaker_00
Really?

01:53:31 Speaker_02
Psychedelic experiences are insanely difficult to remember. They're insanely difficult to remember in the exact same way. Like, when you wake up from a dream, you could tell me your dream.

01:53:40 Speaker_02
Like, oh my God, I was on a skateboard and Godzilla was chasing me. You could tell me your dream, but you won't remember that dream in a while. And that's the same as psychedelic experiences.

01:53:49 Speaker_02
When they're over, everyone can kind of tell you what they experienced. But it's very difficult to remember it a day later, a month later, a year later, you get like these little flashes, like almost like a, like a slideshow, little slideshow.

01:54:03 Speaker_02
Oh, yeah, that thing. I forgot about that part.

01:54:06 Speaker_02
But you don't remember the experience, which seems strange, because I remember amazingly profound experiences from my life in vivid detail, like interactions with my children that were just filled with love and happiness, you know, when they hug you and cry.

01:54:25 Speaker_02
And it's like, there's moments that you remember, like, I'm never going to forget that. There's moments that I remember just with friends, and I'm like, I'm never going to forget this moment.

01:54:35 Speaker_02
with loved ones, but not the dreams, not these crazy, profound, earth-shattering dreams that make you wake up sweating. You go to the bathroom, you're like, what the fuck was that dream about? That happens to me all the time.

01:54:51 Speaker_02
And then I go right back to sleep, and then the dream goes away. And then in the morning, I'm like, I'm going to remember that. I don't remember it. I don't remember it at all. I barely remember. It's a slideshow.

01:54:58 Speaker_02
Your brain is protecting you somehow. There's something... And there was something that I was reading actually yesterday

01:55:06 Speaker_02
about forgetfulness, that it is not a flaw, but a feature, and that there's something – there's a mechanism that's going on that allows human beings to forget things, and that in doing so, it's very beneficial for not keeping you occupied on those things and allowing you to concentrate on new things.

01:55:26 Speaker_02
So instead of just allowing you to have the free will to decide whether or not to think about the past or think about the future, it tries to get rid of it, like, stop, get that out of here. So it kills it.

01:55:35 Speaker_02
It, like, throws those ideas away. And this is actually a feature, where people say, God, I'm so forgetful. But are you? I mean, some people are, because they have a mental condition, right?

01:55:46 Speaker_02
They have Alzheimer's, they have dementia, they have real issues. But a lot of people, what they're really doing is thinking about other things, and that's what makes them forgetful. They're concentrating more on other things, and they can't remember.

01:55:59 Speaker_02
What did I say?

01:56:00 Speaker_02
Like, my wife will tell me things, I'm barely paying attention, and she's like, I told you that. I'm like, when did you tell me that? Like, I told you that yesterday. I already forgot, because it didn't mean anything to me at the time, because I have to filter through. And then, Debbie said to Marsha, and Marsha was like, how could you do that?

01:56:15 Speaker_02
I was gonna say something, but I didn't want to. I forget about that. That's in and out. Because I have no room for that. Right? But some people remember it forever. And you got to think, what is that forgetfulness?

01:56:29 Speaker_02
Well, this article that I was reading was talking about that forgetting memories is actually a feature.

01:56:35 Speaker_02
And so there might be some component of that, that you're not totally past this bridge that would connect us to whatever that realm is, and that you get these brief interactions with that realm, but you're not ready to be all in yet.

01:56:52 Speaker_02
You're not ready to be connected to it. You're not ready to remember all the experiences that you had in this mushroom trip that you went on. It's just too much for you.

01:57:01 Speaker_02
So let's just get that out of your system, because your regular consciousness is not wired to accept the reality of where that realm is. And the fact that... The other thing is, that realm is there in 30 seconds, especially with dimethyltryptamine.

01:57:18 Speaker_02
30 seconds later, you're in an impossible realm. 15 minutes later, that's gone. 20 minutes later, you're struggling to remember it. Half an hour later, it's mostly gone.

01:57:27 Speaker_00
Okay. Now, everything you just said involved storytelling. I'm not telling stories.

01:57:33 Speaker_00
I'm just saying I think this content is streaming. It's not being generated by our brain, and that's why we have so much trouble remembering it, because we weren't producing it. But that's not necessarily storytelling.

01:57:45 Speaker_02
It's just memories in general.

01:57:47 Speaker_02
There's something about... there's a mechanism, I'm telling you, that's happening with psychedelic trips where it's almost impossible to remember them. And I think that's a feature. But I'm saying something far more radical.

01:57:59 Speaker_00
I'm saying there is no memory. Hmm. There is no memory.

01:58:04 Speaker_02
Okay, but as applied to everyday life, there is no... in a practical sense? No, there's no, in a practical sense. When you say, who's the first president of the United States? Don't you think you have a memory that it's George Washington?

01:58:17 Speaker_00
I think I might respond George Washington, but it's not stored anywhere in my brain. Well, of course it is, because that's what you learned.

01:58:24 Speaker_02
No, you learned that in high school, or wherever, whenever you learned it. So how do you know, if it's not in your brain, if it's not stored in your brain?

01:58:31 Speaker_02
So like if I could ask you what your son's name is, you know what your son's name is because it's stored in your memory.

01:58:38 Speaker_00
There's nothing stored in my memory and certainly not my son's name.

01:58:42 Speaker_02
So how do you know your son's name?

01:58:44 Speaker_00
Well, because I was exposed to it. I probably even came up with it a long time ago and under certain circumstances if I'm asked what his name is... Under certain circumstances you don't remember his name?

01:58:56 Speaker_00
Yes, it happens to you as you get older, especially with your kids. It's really embarrassing.

01:59:01 Speaker_02
And what do you think that is?

01:59:04 Speaker_00
Well, I'm trying to say that there is no memory in the brain.

01:59:11 Speaker_00
I'm saying that transduction is occurring, and when the brain gets damaged, the transducer, like if I smash this microphone, the transduction process... Right, but you're still avoiding the question, like, how do you know your son's name if it's not in your memory?

01:59:29 Speaker_00
Okay, memory itself is a metaphor. There used to be, the old metaphor was based on a library and shelves, and then there were other ones based on interconnected neurons acting in cycles. But these are all metaphors. There is no memory in the brain.

01:59:50 Speaker_00
So that article of mine I mentioned, "The Empty Brain," that's what it's all about. It explains that there are no memories. The way we use the term memory, it's just another metaphor. So for example, who's that one?

02:00:09 Speaker_00
Daniel Barenboim, who was one of my favorite conductors and pianists. By the time he was 17, he had memorized all 31 of Beethoven's piano sonatas. So I had someone count up the notes.

02:00:24 Speaker_00
It's about 350,000 notes and almost as many markings of various sorts for the pedals and volume and all that stuff. It's a tremendous amount of data, tremendous amount of data.

02:00:38 Speaker_00
And you know what, you can search Daniel Barenboim, he's still alive, you can search his brain forever and you'll never find a single note. It's not in his brain. So where is it? Well, in terms of what you mean by it, the music is nowhere.

02:00:57 Speaker_00
He didn't absorb the music.

02:00:58 Speaker_02
But he remembered how to make the music.

02:01:05 Speaker_00
No. No. He was, under some conditions, able to make the music, but there's no memory involved.

02:01:14 Speaker_02
If someone teaches someone how to do something, you don't remember how to do that thing? That's not what it is?

02:01:21 Speaker_00
It means that some change is occurring that allows you, under certain conditions, to do that thing again or something similar to it.

02:01:31 Speaker_02
But if there's skills that I could teach you? You don't think you remember those skills? Like I taught you physical skills, like I taught you how to put somebody in an armbar. You don't think that's a memory?

02:01:44 Speaker_00
No, it's definitely not. It's not a memory, no. So what is it? It's a change. Some sort of change is occurring, whether it's occurring in your brain, some sort of a change, or whether it's occurring in that link.

02:02:00 Speaker_02
So how does the brain differentiate between what it remembers and what it doesn't remember if memories aren't real?

02:02:07 Speaker_00
Maybe I can make the point this way. Demonstration in class. I would say to people, who knows what a dollar bill looks like? So someone comes up to the board and they draw a dollar bill, and I'd say, make it as detailed as you possibly can.

02:02:24 Speaker_00
So they draw a dollar bill, and it kind of has a place where there's a face, and it kind of has some ones in the corners, and usually that's as far as people can get. And I say, well, let's try an experiment here.

02:02:37 Speaker_00
So I cover up the dollar bill that they just drew. I tape a piece of paper, and then I tape up a real dollar bill. And I say, maybe the person drew what they drew and it was so terrible because they're a bad artist. Let's find out.

02:02:51 Speaker_00
So I say, here, now draw a dollar bill. So they've got a dollar bill right up on the board. And now they draw this magnificent dollar bill. Because they're copying the dollar bill. Yeah.

02:03:01 Speaker_00
But the point is there is no image of the dollar bill in their head, right?

02:03:05 Speaker_02
Because they haven't had a detailed sort of examination of the dollar bill. Most people just give a cursory examination to a dollar bill. You look down. Oh, that's a five. It's a 20. I mean, I kind of know Andrew Jackson's on the 20.

02:03:21 Speaker_02
Most people are not really paying that much attention to it. But if you get a dollar bill scholar and someone who really understands dollar bills, they probably could. Have you ever seen Al Franken draw the United States?

02:03:32 Speaker_00
No.

02:03:32 Speaker_02
It's really interesting. Al Franken, very unfortunate what happened to that guy, because I think he was a fantastic politician still. Very interesting person, very intelligent, and a real patriot.

02:03:44 Speaker_02
So Al Franken can draw the entire United States accurately with all the state boundaries from memory. See if you can pull that up. It's very interesting. Why? Because here it is.

02:03:56 Speaker_02
So Al Franken has deeply studied the parameters of the states and the state lines and can recreate them from memory. Why? Because he's done this before and he has a record in his mind of what this looks like because he's carefully examined that.

02:04:17 Speaker_02
There are conversations that I've had with people, you know, a couple of weeks ago, and I probably don't remember them. And then there's things where I could tell you word for word what someone said. There's got to be a reason for that.

02:04:30 Speaker_02
And if you're not calling it memory, what are you calling it?

02:04:35 Speaker_00
I'm trying to introduce a different concept, because I can tell you... I understand you are doing that, but I don't know what you're introducing.

02:04:41 Speaker_00
Well, I'm trying to tell you that if you cut open Al Franken's brain, you will never find a map of the United States.

02:04:47 Speaker_02
Right, but he can do that, and he is the same thing as me. I can't do that.

02:04:52 Speaker_00
Yeah, and you're wondering why.

02:04:54 Speaker_02
Well, because I'll tell you why: because I haven't tried to do that, and studied it, and memorized how to do it, the same way I could teach you how to memorize certain movements. I could teach you how to memorize certain physical movements, and then, if you practiced them, I could ask you in a couple of weeks to try to do it again, and you'd be able to do it. But maybe you'll forget certain key points of those movements, so then I would correct you, and then I'd teach it. Well, you would remember how to do those, and then I would say, what are you supposed to do with your hand?

02:05:21 Speaker_02
You're like, oh, left hand up. That's right. Because you remember it. So you might not be able to find that in your brain, but it's very clear that something is going on where you are able to memorize things. And memorize them better with music, right?

02:05:36 Speaker_02
Conjunction Junction, what's your function? Right? We all remember that. Why? Because it's attached to music, and music makes things easier to remember. I've never heard that in my whole life.

02:05:47 Speaker_00
You've never heard Conjunction Junction? No. But the point is... That's the thing.

02:05:49 Speaker_02
Schoolhouse Rock.

02:05:50 Speaker_00
But you do understand. You do understand, though, right, that there are people who could glance at a map of the United States, never having seen one before, and then could go off to a board and draw the whole thing in detail. Sure. Yeah.

02:06:02 Speaker_00
They have a different kind of memory. And then generally those people are on the spectrum. I'm trying to tell you there is no memory. There's no memory. OK. There's nothing. No one looking into the brain...

02:06:11 Speaker_02
Is there anyone that can draw an accurate map of the United States without having ever looked at an accurate map of the United States?

02:06:20 Speaker_00
I doubt it. OK. But there are people who can draw things from their dreams that they have never seen before. But they have seen them in their dreams. And how do we even know if they're accurate?

02:06:31 Speaker_02
It might be as accurate as that dollar bill drawing.

02:06:37 Speaker_00
OK. Are you open to the idea that memory in the brain is just a metaphor? Sure. OK. So are you open to the idea that there

02:06:51 Speaker_02
Possibly no memory, and we still could do all the things we can do, but there's no memory? Well, you're calling it memory, right? And I'm saying, as a physical function, as a function, a thing happening... There's a... you can memorize things.

02:07:05 Speaker_02
That's how you learn a new language, right? You memorize, "yo me llamo Rogan," you know? That's how you do it. You remember, right? So if you're saying that that doesn't exist, I'm saying, what is happening? Give me some sort of a replacement. I am.

02:07:23 Speaker_02
I'm giving you transduction. Right. But where is it stored? I don't know. Okay.

02:07:28 Speaker_00
I want to find out that storage. Don't you think we should find out?

02:07:32 Speaker_02
Couldn't you use the term memory to accurately describe that storage?

02:07:35 Speaker_00
It's not. But it's not in our brain. So the people who are looking in brains and looking for memories, they're not finding... Is it just that they haven't found it yet?

02:07:45 Speaker_02
Or they don't understand that you're not going to be able to see it in the same way that you see cells?

02:07:50 Speaker_00
I've talked to some of the top neuroscientists who study memory, and the first thing they say is, I can't find it because I don't think it's actually there.

02:07:59 Speaker_02
Perhaps, but let me ask you this. How do we know what size memory is? So are they looking in the subatomic realm? Are they looking at particles that are quantumly entangled? How do they know what they're looking for?

02:08:14 Speaker_02
Is it simply that we have a limited amount of tools?

02:08:18 Speaker_00
Well, a much simpler idea, see these are all, they're interesting concepts, they are. But a much simpler idea is that the brain, look at all this space that this microphone, this is a very good quality microphone.

02:08:31 Speaker_00
And it takes a lot of stuff in there for it to work as well as it does. So a much simpler idea, given that no one's ever found anything remotely like memory inside the brain, and I actually asked Eric Kandel

02:08:50 Speaker_00
who was in his 80s at the time and who had won a Nobel Prize for his work on memory and the Aplysia. I said, how long do you think it's gonna be before we understand human memory? And he said, a hundred years, meaning we aren't.

02:09:06 Speaker_00
So isn't the simpler idea that the brain is actually like this, that the brain is a transducer allowing us to communicate with higher intelligence in another universe? Isn't that a simpler idea? No, no, that's not simpler at all.

02:09:25 Speaker_02
That's way more complex. No, that's super simple. It's way more complex than experiences being stored in a functional way so that you can benefit from them. Except that there's no evidence of any storage and there never will be.

02:09:37 Speaker_02
How could you say there never will be if he said 100 years? 100 years is not never. 100 years ago, we were exactly the same species as we are right now. You had to be there, was the way he said it.

02:09:47 Speaker_00
I understand.

02:09:48 Speaker_02
I understand what you're saying. But look, before they understood spooky action at a distance, before they understood subatomic particles, if you tried to explain that to someone from 1850, they'd be like, what the fuck are you talking about?

02:10:03 Speaker_02
But now it's understood. It's measurable. It's something that we agree upon, that subatomic particles, that atoms, that neutrinos, all these things exist. Bizarre things that we could there's neutrinos passing through us right now right from space.

02:10:19 Speaker_02
There's a neutrino detector in Antarctica. We know that these things that we didn't know existed, exist. As we have more access to technology, more understanding of the mechanisms of the mind,

02:10:35 Speaker_02
Isn't it possible that we could say, oh, this is where memories are stored? And isn't it true that if certain areas of the brain are damaged, in particular, it will damage memories? Isn't that true?

02:10:47 Speaker_00
It will damage the transduction process, yeah.

02:10:51 Speaker_02
Okay, you're married to this transduction process. I'm not saying that it's not in... There's something happening. Let's not even say it's in the brain. Maybe it's in the entire body. Maybe it's in every cell. Maybe it's in the DNA.

02:11:05 Speaker_02
Whatever it is, there's something in there.

02:11:08 Speaker_00
Well, I'm trying to point out that the something is something we haven't thought about in the past, and it would actually solve so many problems.

02:11:17 Speaker_02
I see what you're saying in terms of communication with whatever that other realm is. But what I'm saying is that there might be, and forget about the term the brain, local. Let's just say local. Local, OK. Because when I'm accessing, oh, I know what

02:11:34 Speaker_02
If I press the turmeric button on the coffee machine, it makes the kind I like. That's locally stored. Other people don't know that if they'd never used that machine. This is locally stored information. Forget about finding it in the brain.

02:11:49 Speaker_02
It might be in the DNA. We don't know where it is, but I know how to start my car. I know how to put it in drive, because I've done it before. So something is happening where I'm storing information.

02:12:03 Speaker_02
And the more information I store, the more it makes me effective at discussing certain things. There are certain things that I don't have any information about. I haven't read them. I haven't memorized them.

02:12:14 Speaker_00
You are married to the information processing metaphor. I'm not. It's just a metaphor.

02:12:19 Speaker_02
There is no information in the brain. I'm challenging this thing that you're saying, which I don't think sounds as complete as you're saying it sounds.

02:12:29 Speaker_00
But the good news is, it's testable. It's empirically testable. And I don't think it's gonna take 20 years. I think it's just gonna take maybe five years. I think it's because the labs already exist.

02:12:42 Speaker_00
And it's not like studying black holes where you can't really access them and you have to, because we can actually study brains. And we have lots of great equipment. It's just no one's ever looked for what I'm talking about.

02:12:54 Speaker_00
And the point is, as I've talked to more neuroscientists, and physicists, they're saying the same thing. They're saying this has to be right and we just need to look for it. We never have before. Let's speculate.

02:13:08 Speaker_00
Let's say that they start looking for it and they find evidence that this is actually occurring. Don't forget about crazy things like consciousness turning off, and then consciousness turning on. What's that? What are psychotic states?

02:13:24 Speaker_00
Yeah, but see, I'm just saying it's just an interruption in a pathway. Got it. That's really easy.

02:13:30 Speaker_02
Right, which makes sense for psychotic states, right? There's some sort of a disturbance in the way the system is running, and it's not tuning in to the other side the right way.

02:13:39 Speaker_00
It takes care of psychotic states just like that. I mean, think of how useful, how, you know, it does, you gotta go back and read Darwin because that's exactly what Darwin keeps saying.

02:13:56 Speaker_00
It was eye-opening for me to read this book because that's what he keeps saying. He keeps saying, look, I know this sounds nutty, but. It's much better than any other crazy story that you're going to tell.

02:14:12 Speaker_02
We were actually just having this conversation the other day with Bret Weinstein. Oh, really? And Bret Weinstein, who's a biologist. And his belief is that random mutation, natural selection, Darwinian evolution, they're all real.

02:14:27 Speaker_02
It's all absolutely happening. But then there's probably also factors that we haven't figured out yet. And that's what shaped human beings. That's how human beings got here.

02:14:36 Speaker_00
This is the factor. This is what gets you up to that next level, and it's consistent with the ideas that physicists have about the structure of the universe.

02:14:49 Speaker_00
Again, just start with the basics, that evolution is fantastic at producing all kinds of weird, bizarre,

02:15:01 Speaker_00
transducers, and we're encased in transducers from head to toe. Couldn't, if the universe is what we think it is, couldn't evolution at one point, because it's producing all kinds of new traits all the time,

02:15:16 Speaker_00
Mm-hmm Couldn't it produce a brain that has that feature that connects us boom, right? We're connected that it might be an emerging quality in humans and then 20,000 years ago it emerged and emerged and

02:15:30 Speaker_00
It just brought us up like this, just like in 2001. We go up to here, and that could help explain the Fermi paradox, because there could be lots of chimp-like creatures all over the galaxy. But they just never made it to that level.

02:15:49 Speaker_02
Because that's where the weirdness of the whole accelerated evolution theory comes into place. I mean, this is you get into the kookiest of kooky stuff, the Anunnaki and the Sumerian texts.

02:16:01 Speaker_02
And this idea that these lower primates were manipulated and that something was introduced into their genes and that this something is probably some genetics from some superior race or some more advanced race and that we

02:16:16 Speaker_02
took on that and that it became a part of us and now it's in our gene pool and now we're moving in that general direction with this different connection.

02:16:24 Speaker_00
I've read some of these books. They're really fascinating. Truly, they hold my attention. But page after page after page, I keep saying, yeah, but neural transduction theory is much simpler. It's just one tiny little change that has to occur.

02:16:40 Speaker_02
I don't think they're mutually exclusive, because neural transduction theory, as you're saying, if there's these primates on these other planets that never achieved this,

02:16:48 Speaker_02
And then there are ones that have and have transcended, that these ones that have transcended recognize this quality that's missing in these chimpanzees, and they introduce it. We don't need them. It's possible, but we don't need them.

02:17:04 Speaker_00
We have evolution.

02:17:05 Speaker_02
But that's why it's accelerated evolution. It's saying, look, they've concluded that primates right now have entered the Stone Age. Do you know that? So they're starting to use tools. So it was really interesting, right?

02:17:18 Speaker_02
So if given enough time, you give them 100 million years, who knows what a chimp is going to look like 100 million years from now. They might be like us. They might do it naturally.

02:17:28 Speaker_02
The speculation, and again, this is not something I'm married to, but the speculation, this kooky speculation, is that we were visited by extraterrestrials that were far more advanced.

02:17:40 Speaker_02
and that they found us as these simple shit-throwing primates, and they said, let's juice this process up a little bit. We know where this is gonna go eventually, hopefully, if everything works out, but let's juice it up.

02:17:53 Speaker_00
Let me connect what you just said with what I've been saying. Easy connection. Okay. And it brings us into the world of UFOs. Because whether this capability arose on its own, which it could, or whether it was juiced up a little bit by some outsiders.

02:18:16 Speaker_00
Which it could. Which it could. Then either way, we're now in a position where we could, in theory, communicate in more meaningful ways with extraterrestrials. We could.

02:18:33 Speaker_00
And we might, by understanding how transduction works, we might figure out how to do that. So not just communicating with people in another universe, but communicating with extraterrestrials.

02:18:47 Speaker_00
Some of these extraterrestrials, in fact, most of them, maybe all of them, have to have this ability. They have to have that transduction ability, or they never would have gotten above chimp level.

02:18:59 Speaker_00
So maybe this is our way of connecting with them as well. and that we're on the path, but we're not quite there yet. We're not quite there yet, but I think we could get there really fast.

02:19:12 Speaker_00
And again, some of these neuroscientists I've been talking to, they're saying the same thing, because this is not like studying, I'll say black holes again, this is different, because we've got

02:19:28 Speaker_00
thousands of labs, some of them extremely sophisticated labs; we're just not looking for this. What happens if we start looking for this?

02:19:38 Speaker_00
And so what we've been doing is we've been trying to work out experiments that can be conducted and that should produce one result or another, depending on whether transduction is occurring.

02:19:52 Speaker_00
And that's the goal, is find empirical support for this type of theory. If we can find empirical support, the more support we find, obviously, the more convincing this will be. And then that would bring in the engineers.

02:20:08 Speaker_00
It's the engineers who could really make this thing sing. So it's another one of my intuitions, call it that, but I think the data are all around us. They're all around us.

02:20:25 Speaker_00
And by the way, there are a couple people I have turned on to this who just all of a sudden become obsessed because all of a sudden you see all around you reminders of all the weird stuff.

02:20:37 Speaker_00
and you realize, wait, all this stuff that seems so weird, you know what? It's not weird at all. If NTT is valid, if this theory is valid, the stuff that we think is weird is not weird at all. In fact, it makes perfect sense.

02:20:53 Speaker_00
Psychosis, you brought up psychosis, but there's so many things like that. And all of a sudden they're not mysterious at all. They make very good sense. How about something, one of my favorites is deja vu.

02:21:09 Speaker_00
Or how about meeting someone that you feel like you've known them forever. I've had that happen. It's an amazing experience. It's visceral, it's so powerful, it's so strong. Uh, how could that possibly be?

02:21:25 Speaker_00
Well, see, if you've got neural transduction theory there in your toolbox, you go, oh, that's easy.

02:21:32 Speaker_03
Hmm, hmm.

02:21:40 Speaker_00
Yeah, lots of stuff just falls into place. Now, I have to point out that I'm wearing this idiotic starburst thing. TameBigTech.com. Oh, thank you. Every time you say that, I just get the chills. Because I need help. I desperately need people's help.

02:22:03 Speaker_00
So we have spent $7 million building the world's first nationwide monitoring system that is doing to those bastards what they do to us and our kids 24 hours a day. We are surveilling them for the first time. We are finding

02:22:20 Speaker_00
overwhelming evidence that they are very deliberately and systematically messing with us and our elections especially. I personally believe that as of 2012, the free and fair election, at least at the national level, has not existed.

02:22:40 Speaker_02
It's just been manipulated.

02:22:42 Speaker_00
It's just been manipulated since 2012. I say this in part because I met one of the people on Google's tech team, on Obama's tech team, I should say, which was being run by Eric Schmidt, head of Google at the time.

02:22:55 Speaker_00
And I talked to him at great length about what the tech team was doing. They had full access to all of Google's shenanigans, all those manipulations. And one member of that team asked by a reporter,

02:23:11 Speaker_00
how many of the four points by which Obama won, how many of those points did he get from the tech team? And the guy said, Elon Kregel, I believe his name is, it was actually quoted, and he said, two of the points came from us.

02:23:24 Speaker_00
Now, Obama won by five million votes, roughly, and two out of four points came from the tech team, that's two and a half million votes.

02:23:35 Speaker_00
By 2016, I had calculated that Google could shift, and it would be toward Hillary Clinton, of course, whom I supported at the time, that Google could shift between 2.6 and 10.4 million votes to Hillary Clinton in that election with no one knowing.

02:23:52 Speaker_00
She won the popular vote by 2.8 million votes. If you take Google out of that election, the popular vote would have been tied.

02:24:02 Speaker_00
Couple days after that election, everyone, all the leaders in Google get up on stage, I'm sure you've seen this, it's an amazing video, and they're talking to all of Google's 100,000 employees, and they're one by one, they're going up to the mic and saying, we are never going to let that happen again.

02:24:19 Speaker_02
We are never going to let that happen again. Which is democracy. They're never going to let democracy happen again.

02:24:24 Speaker_00
Exactly. That's what I'm saying.

02:24:26 Speaker_02
And that's so crazy to be blatantly and openly talking about that.

02:24:30 Speaker_00
And in 2020, we didn't have- As if it's a virtue. We already had a pretty big monitoring system. We preserved 1.5 million ephemeral experiences.

02:24:39 Speaker_00
Our data show that Google shifted at least 6 million votes to Joe Biden, who won the popular vote by about 8 million.

02:24:46 Speaker_00
So again, take Google out of the equation, that would have been pretty much a tie in the popular vote, and Trump would have won 11 out of 13 swing states instead of 5.

02:24:59 Speaker_00
So, going forward from roughly 2012, I think the free and fair election has been an illusion. An illusion.

02:25:10 Speaker_00
And this is something that's very weird and kind of ironic, but this is something that Dwight D. Eisenhower warned about in that last speech of his, his farewell speech. He warned about the rise of the military-industrial complex.

02:25:24 Speaker_00
Everyone's heard about that. But he also warned about the rise of a technological elite that could someday control public policy without anyone knowing. And the technological elite are now in control. That's what we have.

02:25:42 Speaker_00
That's where I get back to my ranting and my pain because I realize no one is paying attention. Eisenhower said we have to be alert or this will happen. We have not been alert. And the fact is people right this second

02:26:00 Speaker_00
who I give speeches to sometimes, they get all riled up and then they walk out of the auditorium with their surveillance phones. Mine is not, this is a secure phone.

02:26:12 Speaker_00
but they walk out with their surveillance phones in their pocket and they use all the surveillance tools that Google has set up for them and other companies too now. And they think, isn't this nice?

02:26:25 Speaker_00
This company's doing all this nice stuff for me and giving me all this free stuff. That's not the business model. All those free things are just apps that trick you into giving up personal data.

02:26:39 Speaker_00
and then they monetize the data and they use it to control you. That's what's really happening. That's the business model, and people can't see it.

02:26:51 Speaker_00
And I'm telling you, I've been working on this for 12 years, and it's gotten to the point where I am wiped out. I am fed up. I am exhausted. I am disillusioned. And...

02:27:10 Speaker_00
And I'm lonely because since Misty was killed five years ago, I sometimes feel like I'm literally dying of loneliness. And the fact that other people around me have been hurt, one quite seriously, makes me a little nervous too.

02:27:27 Speaker_00
And that's where I am at this point. And it's a terrible place to be, terrible. Now, It took $7 million to build what we've built, but it's been really tough. Okay, we're talking about raising a dollar at a time. It's been really, really difficult.

02:27:48 Speaker_00
And for us to set this up so that it's actually permanent and self-sustaining, and so we have court-admissible data in all 50 states, which will make these companies think. It'll make them think. Think twice, maybe.

02:28:05 Speaker_00
That is going to require at least another $50 million. That gets us a secure facility and our own servers and a security team. We have virtually no security. Hear that, Google?

02:28:20 Speaker_00
And they know this because a couple months ago, they attacked us in an extremely sophisticated way. I've never seen this before.

02:28:27 Speaker_02
When you say they, who?

02:28:28 Speaker_00
I don't know. Someone. I don't know. Google has- What did they do? It was very, very unusual. It was not the usual thing. What they did was they got our accounts, they got our apps, to run kind of at ludicrous speed, I guess you could say.

02:28:54 Speaker_00
And what they did was they pulled in more and more and more servers until we were running so many servers simultaneously that we actually got shut down in the cloud. And we lost access to our own data for almost two weeks.

02:29:10 Speaker_00
Now, we've never seen an attack like that. Even our security people had never seen an attack like that. It was really pretty.

02:29:16 Speaker_02
And what was the mechanism of this attack? How'd they do it?

02:29:20 Speaker_00
We're not sure how they got in. Once they got in, all they did was they just created a tremendous amount of activity. So that pulled in more and more resources. And this is definitely created? This is not organic? Oh, no, no. It's absolutely created.

02:29:35 Speaker_00
And now that we know about this particular kind of attack. If it happens again, we'll be up within two days, max. But the point is, there's a lot of pressure on us.

02:29:49 Speaker_00
So we need a lot of money to set up a secure facility, have security teams not just protecting our data, but protecting our people. We have to protect our people. Have you ever talked to Elon about this stuff?

02:30:04 Speaker_02
I've never had a way to reach him. Well, hopefully someone will take this clip and put it on X and he's a junkie. He'll be on it all day. So hopefully someone will put it to his attention and put it up there.

02:30:19 Speaker_02
Because I'm sure this is very concerning to him. I mean, he has a vested interest in this. Clearly what happened when he purchased Twitter and he found out the extent of government interference in free speech.

02:30:33 Speaker_02
and how many people were being pressured to not talk about certain things that were inconvenient, or how many accounts they were trying to get taken down because these accounts were purveyors of misinformation that turned out to be absolutely accurate.

02:30:50 Speaker_02
He has a deep distrust.

02:30:53 Speaker_00
Well, he has a few times lately. He has retweeted content about my work. So he's aware. He might be aware. And there's another way also, by the way, to take down Google. I published this in Bloomberg Businessweek.

02:31:12 Speaker_00
If you go to epsteinandbusinessweek.com, you'll actually see the article. We've reached the point where data have become an essential part of our lives.

02:31:28 Speaker_00
And the way to take down Google is to do what governments have been doing for hundreds of years, to declare their index, the database they use to generate search results, to be a public commons.

02:31:42 Speaker_00
This is exactly what governments do when water, electricity, telephone communications, any commodity, any service becomes essential. Governments at some point have to step in. The electric companies, they were all privately owned. I didn't know that.

02:31:58 Speaker_00
I didn't realize that. They were all privately owned until the government had to step in. And this is where we are now with data. And the biggest, baddest database in the world is Google's, because it's the gateway to all knowledge.

02:32:15 Speaker_00
It needs to be declared a public commons. As I say, ample precedent for that in law. It's very light-touch regulation. And what it'll do is it'll allow other people to draw from the database to create their own niche search engines.

02:32:31 Speaker_00
So, you know, you'll create a search engine for people interested in DMT and UFOs. Someone will create one for women, for Lithuanians. We'll end up with thousands of these search engines, all of which are vying for attention.

02:32:48 Speaker_00
It will be exactly like the news, exactly like the news media, that domain. And that's the way it should be. Search should be competitive. Google was not the first search engine. It was the 21st. So that's how you do it.

02:33:03 Speaker_00
And also then search would become innovative again. There have been no innovations in search for the 20 years that Google has dominated search. So General Paxton, Ken Paxton of this great state of Texas, he's interested in this.

02:33:23 Speaker_00
Senator Cruz is interested. Other people are interested. This would be tough to implement in the US, but the EU could do it. because five of Google's data centers are in the EU. The EU could do it in a flash.

02:33:37 Speaker_00
And they're very frustrated with Google, because they've been trying to keep them under control for a long time now, and they've failed. So there are some things that could be done. Permanent, large-scale monitoring system, that is a necessity.

02:33:53 Speaker_00
That must be there, because if you don't have that, you don't know what these companies are doing. You don't know how they're messing with our minds, with our kids' minds, and with our elections.

02:34:04 Speaker_00
You have to monitor, and you have to have court-admissible data in every state and probably in every country, and then they will pull back a little bit because they have to.

02:34:17 Speaker_00
They're violating campaign finance laws when they very blatantly support one candidate or one party. They're making huge in-kind donations without declaring them.

02:34:29 Speaker_00
So another thing they're doing right now, perfect example of something our system is capturing right this second, right this second,

02:34:38 Speaker_00
Google is sending register-to-vote reminders to Democrats at about two and a half times the rate they're sending them to Republicans. How do I know? Because that's what the monitoring system shows. That's what they're doing.

02:34:53 Speaker_00
At some point, that's going to turn into partisan mail-in-your-ballot reminders. And then that turns into partisan go-vote reminders. These are just displayed on Google's homepage. We're capturing the homepages by the millions.

02:35:08 Speaker_00
If you don't capture them, then the content is ephemeral and it disappears and it's gone forever and you can't go back in time and figure out what they were doing. So monitoring is no longer optional.

02:35:22 Speaker_00
And by the way, monitoring is fast, unlike regulations and laws. Monitoring can keep up with whatever the tech company is dishing out, the next company, the next Google after that. Monitoring can keep up. If you're going to have an internet,

02:35:40 Speaker_00
and it can mess with people's lives, and it can mess with governments and elections and so on, then you've got to have monitoring systems in place.

02:35:48 Speaker_00
So that's what I've been doing, and that's what this new monograph of mine is about. If people wanna get a free copy of it: TameBigTech.com. Tame, tame, tame. TameBigTech.com. You crack me up sometimes, really. I'd love to see you do a comedy routine.

02:36:14 Speaker_02
Listen, Robert, thank you for being here. I really, really appreciate what you're doing. If you weren't doing this, I don't know if it would get done. I don't know if we would know as much as we know. I think it would be speculative.

02:36:25 Speaker_02
I think people would have ideas. I think it would be impossible to prove. And I think what you've done is a tremendous service for people. So, thank you very much. TameBigTech.com.

02:36:37 Speaker_00
Thank you, but I'm still fed up, just so you know. Okay.

02:36:42 Speaker_02
Thanks, Robert. Yep. Bye, everybody.