
How Monsters are Made AI transcript and summary - episode of podcast Hidden Brain


Go to PodExtra AI's episode page (How Monsters are Made) to play and view complete AI-processed content: summary, mindmap, topics, takeaways, transcript, keywords and highlights.



Episode: How Monsters are Made


Author: Hidden Brain, Shankar Vedantam
Duration: 00:46:43

Episode Shownotes

What makes ordinary people do evil things? It was a question that long fascinated the psychologist Philip Zimbardo, who died in October. Zimbardo was best known for the controversial Stanford prison experiment, in which he created a simulated prison in the basement of a university building and recruited volunteers to act as prisoners and guards. This week, we explore how Zimbardo came to create one of psychology's most notorious experiments – and inadvertently became the poster child for the human weaknesses he was trying to study. We're bringing Hidden Brain to the stage in San Francisco and Seattle in February 2025! Join our host Shankar Vedantam as he shares seven key insights from his first decade hosting the show. Click here for more info and tickets.

Summary

In this episode of Hidden Brain, Shankar Vedantam examines the Stanford prison experiment led by Philip Zimbardo, exploring the conditions under which ordinary people can commit evil acts. Zimbardo's insights stemmed from his own challenging childhood and observations of social dynamics, leading to a significant focus on leadership, group behavior, and situational factors. The episode discusses the emotional conflicts faced by Zimbardo and his assistant Christina Maslach, highlighting ethical concerns around the experiment's design and implications for understanding human nature, power dynamics, and the psychological roots of malevolence.


Full Transcript

00:00:00 Speaker_02
Hey there, Shankar here with a quick note before we start today's show. I'm bringing Hidden Brain to the stage in San Francisco and Seattle in February 2025. I'll share seven psychological insights from the last decade of hosting Hidden Brain.

00:00:13 Speaker_02
Each insight has made my life better, and I think it will do the same for you. Please join me for an evening of science and storytelling. More information and a link to tickets is at hiddenbrain.org slash tour. That's hiddenbrain.org slash t-o-u-r.

00:00:31 Speaker_02
All attendees receive one year's complimentary membership to the meditation and sleep app Calm. Hope to see you there.

00:00:39 Speaker_10
Again, go to hiddenbrain.org slash tour. This is Hidden Brain. I'm Shankar Vedantam.

00:00:50 Speaker_02
The biblical King Solomon is said to have constructed a religious edifice nearly 3,000 years ago.

00:00:57 Speaker_02
Accounts of the Temple of Solomon, largely drawn from the Hebrew Bible, say that Solomon placed an object of incalculable value within a windowless room of the temple. It was the Ark of the Covenant, a wooden chest decorated with gold.

00:01:13 Speaker_02
Inside it were tablets given to Moses by God inscribed with the Ten Commandments. Remnants or artifacts from the temple have never been found.

00:01:29 Speaker_02
About 300 years ago, the Temple of Solomon became a subject of intense interest to a gifted mathematician in England. Isaac Newton came to believe that biblical accounts of the temple contained messages and clues that could be mathematically decoded.

00:01:44 Speaker_02
He also felt the temple's architecture contained geometrical secrets and a blueprint of human history.

00:01:51 Speaker_02
Based on his calculations and deductions, he predicted a major event in the 21st century, one that some people took to mean the end of the world as we know it.

00:02:03 Speaker_02
Even as he dabbled in what would now be considered the occult, Isaac Newton also famously revolutionized our understanding of the physical world. He came up with laws of physics that are still taught in high schools today.

00:02:15 Speaker_02
His contributions to mathematics, especially the science of calculus, are used on a daily basis in fields as diverse as space travel and epidemiology.

00:02:25 Speaker_02
How did the scientist who helped us understand the law of gravity come to believe he had special powers to decode scripture? Why did one of the most influential mathematicians in history spend so many years trying to turn base metals into gold?

00:02:42 Speaker_02
It's easy to say Isaac Newton was being brilliant when he was inventing calculus and foolish when he dabbled in the occult. The truth was that Isaac Newton saw both of these as forms of exploration.

00:02:55 Speaker_02
It's just that we know that one turned out to be right and the other turned out to be wrong. Today, we're going to tell you the story of another explorer. He was a psychologist who wanted to answer big questions.

00:03:10 Speaker_02
But in trying to do so, he dreamed up one of the most notorious experiments of the 20th century and inadvertently became the poster child for the human weaknesses he was trying to study. What I try to do is create evil.

00:03:23 Speaker_02
It's really studying evil from the inside out. The risks of exploration, and the lessons of hindsight, this week on Hidden Brain.

00:03:42 Speaker_02
In February 1933, shortly after Adolf Hitler was appointed chancellor of Germany, a fire broke out in the Reichstag building, home to the German parliament.

00:03:52 Speaker_02
The Nazis cited the fire as proof that communists were planning a violent takeover of the country. The solution they proposed was to suspend the constitution, declare emergency rule, and bring an end to civil rights.

00:04:05 Speaker_02
It paved the way for Adolf Hitler to become the dictator of Germany. A month after the Reichstag fire, a baby boy was born in the South Bronx to a family of Italian origin.

00:04:16 Speaker_01
My parents were second generation Sicilian. My family background was my grandfather was a barber. My other grandfather was a shoemaker. So it was really, you know, tradespeople.

00:04:27 Speaker_02
This is Philip Zimbardo. In time, he was to become one of the most influential psychologists of the 20th century, and much of his life's work would attempt to understand how Adolf Hitler and the Nazis came to transform the minds of ordinary Germans.

00:04:45 Speaker_02
But in 1933, Philip's family was not thinking about Nazi Germany. It was the height of the Great Depression. In the United States, nearly a quarter of all workers were unemployed.

00:04:57 Speaker_02
Some people were living in shanty towns, while others were on the move, desperate to find work. In addition to coming from a poor family, Philip was prone to deadly illnesses.

00:05:08 Speaker_01
When I was a child, I was very sickly. In fact, I almost died from pneumonia and whooping cough at a time when there was no penicillin or sulfa drugs for contagious diseases.

00:05:19 Speaker_01
So I was hospitalized for six months, and kids around me all died, and I survived somehow. Resilience or hardiness, I'm not sure what.

00:05:36 Speaker_02
Young Philip wasn't seen as resilient or hardy by the other kids in his neighborhood. They looked at this kid, who had escaped a brush with death, and saw a weakling, someone who could be pushed around.

00:05:48 Speaker_01
When I came out of the hospital because I was really sick, I used to get beaten up all the time, also because I looked Jewish. And then I realized that the world is made up of leaders and followers, and followers are going to get beaten up.

00:06:03 Speaker_01
So I really, as a 6, 7, 10-year-old kid, started trying to understand, what was it about some kids who got to be leaders? And I think I figured out a kind of recipe. They were bigger. They were the first ones to talk up. They usually had a joke.

00:06:19 Speaker_01
They usually had a big, stronger guy backing them up. And they always gave the group some interesting activity.

00:06:27 Speaker_02
Philip was not yet a psychologist, but already he was coming up with psychological theories about the world. In what was to become a hallmark of his future career, he came up with bold, intuitive leaps to explain the world he saw.

00:06:41 Speaker_02
And even as he engaged in the rites and rituals of childhood, there was always a part of his mind that played the observer, that watched himself engage in those rites and rituals.

00:06:53 Speaker_02
Philip noticed there were things you had to do to become accepted into a group.

00:06:57 Speaker_01
In order to get into the gang, and the gang was the kids on the block, the newest kid had to physically fight the most recent kid who was in the gang until one of you got a bloody nose. And I hated violence.

00:07:10 Speaker_01
I could see the stupidness of these kinds of rituals. Philip also noticed how hierarchy worked in groups. Some kids give orders, some kids follow orders. And I thought, you know, typically the orders they follow are stupid.

00:07:24 Speaker_01
But once you get used to that, it becomes a habit.

00:07:27 Speaker_02
If some kids were going to give stupid orders and some kids were going to follow them, Philip realized he much preferred to be in the first group. He decided he would study the tricks and techniques leaders used to get to the top.

00:07:40 Speaker_02
He set to work developing those habits.

00:07:43 Speaker_01
But it was really structuring situations, being the one that came up with a new idea to say, hey, marble season is always boring. Why don't we do stickball? So essentially, it's coming up with the idea of what to do

00:07:56 Speaker_01
that other kids would say, yeah, that's interesting, let's do it.

00:08:00 Speaker_02
It worked. Philip shed his image as a sickly kid. He became an athlete, excelling in track, softball, and baseball. He was voted the most popular kid in his high school class.

00:08:13 Speaker_02
But there was another realm where Philip realized he could not outshine the competition. That's because the competition, when it came to being the smartest kid in class, was no competition at all.

00:08:24 Speaker_02
A kid named Stanley had run off with those honors, and there was absolutely no catching him.

00:08:30 Speaker_01
He won all the medals at graduation, so obviously nobody liked him because we were all envious of him. But he was super smart and super serious.

00:08:40 Speaker_02
That Stanley was Stanley Milgram, who just happened to be another famous psychologist of the 20th century. What were the odds that Stanley Milgram and Philip Zimbardo would be classmates at the same high school in the Bronx?

00:08:56 Speaker_02
World War II had just ended a few years earlier, and the conflict was still fresh in everyone's minds. Philip and Stanley were fascinated and horrified by the Nazi regime's ability to mobilize the German people.

00:09:08 Speaker_02
How had Adolf Hitler managed to get ordinary Germans to join in the mass extermination of millions of people?

00:09:16 Speaker_02
Stanley Milgram found it unsettling how easily the Nazis managed to associate large groups of people with being inherently different and inferior.

00:09:25 Speaker_01
He was worried that the Holocaust could happen again in America. And everybody said, Stanley, that was Nazi Germany. That was then. We're not that kind of people. And he would say, I'll bet they thought the same thing.

00:09:36 Speaker_01
And the bottom line, he says, how do you know how you would act unless you're in this situation? Because we all think we're good people.

00:09:43 Speaker_02
We all think we're good people, and yet, our circumstances can prompt us to do things we might never anticipate. Philip felt that good kids were not always good, and bad kids were not always bad. It was the situation that made them what they were.

00:09:59 Speaker_01
If you're middle class, you don't do things for money because your parents give you the money. And if you're poor, nobody's going to give you things.

00:10:07 Speaker_01
So if you want to buy sneakers, then somebody comes on the corner and says, hey, carry this package down the other and give it to some guy named Charlie.

00:10:14 Speaker_01
Well, you knew it had to be something illegal because they're going to give you $10 for carrying a package down the street. But you also knew that if you got caught, there would be consequences. And I was tempted. I could use $10.

00:10:27 Speaker_01
I mean, I used to work in a laundry truck, delivering laundry in Harlem. I think we got $2 a day or something. So the temptation is always there.

00:10:40 Speaker_02
This, of course, is the central claim of the field of social psychology. Our behavior is not just about who we are as people. It's a product of who we are as individuals and the situations in which we find ourselves.

00:10:54 Speaker_02
Stanley Milgram went on to become a psychologist at Yale. Philip Zimbardo became a psychologist at Stanford. As they began their research careers, both men continued to circle the same questions they had asked one another as teenagers.

00:11:09 Speaker_02
Can you get good people to do terrible things? Are all humans pliable under the right conditions? Do we all have inner monsters just waiting for the right circumstances to be unleashed?

00:11:21 Speaker_00
Under what conditions would a person obey authority, who commanded actions that went against conscience? These are exactly the questions that I wanted to investigate at Yale University.

00:11:31 Speaker_02
This is Stanley Milgram in a documentary he later produced about his work. In 1961, he began a series of experiments on obedience. The experiments were set up as a way for one volunteer to help improve the memory of another volunteer.

00:11:48 Speaker_02
A person in the role of a learner was given a list of items they had to remember. They would then recite the list to a volunteer who was asked to play the role of teacher.

00:11:57 Speaker_02
The person running the study told the teacher that every time the learner got an answer wrong, the teacher had to punish the learner by administering an electric shock.

00:12:07 Speaker_01
The brilliance of the experiment is that the teacher is sitting in front of a shock box with 30 switches. The first switch is only 15 volts. When you press a button, the learner who's in another room experiences some minimal level of shock.

00:12:22 Speaker_01
And then of these 30 switches, the increment is 15 volts. 15, 30, 45, et cetera. And the person doesn't respond until it gets up to nearly 100. But the problem is that you are now on a slippery slope down.

00:12:38 Speaker_01
So each increment is not noticeable from the previous. And so now when you hit 100 and the guy starts to say, hey, that really hurts, you're only slightly different from where you were. And at some point, you realize you should have stopped sooner.

00:12:52 Speaker_02
We talked about this experiment at length in an episode of Hidden Brain titled The Influence You Have. No one was actually being shocked in the experiment. The learner was actually an actor working with Stanley Milgram.

00:13:05 Speaker_02
But the experiment unsettled people. That's because large numbers of volunteers who were playing the role of the teacher seemed perfectly willing to administer electric shocks that were ostensibly powerful enough to kill the learner.

00:13:18 Speaker_02
When the psychologists later debriefed some of the volunteers, they explained why they had followed orders.

00:13:24 Speaker_04
Why didn't you stop anyway? I did stop, but he kept going, keep going. But why didn't you just disregard what he said? He says it's got to go on, the experiment.

00:13:39 Speaker_02
Stanley Milgram's obedience studies made him famous well beyond the bounds of psychology classrooms and academic conferences. Across the country at Stanford, Philip Zimbardo was also developing the idea for a psychology experiment.

00:13:53 Speaker_02
It would be so audacious, so controversial, that it would generate even more shockwaves than the work of his former classmate.

00:14:02 Speaker_06
Prisoners must report all rule violations to the guards. Prisoners must report all rule violations

00:14:08 Speaker_02
Coming up, how Philip Zimbardo came to create the infamous Stanford Prison Experiment. You're listening to Hidden Brain. I'm Shankar Vedantam.

00:14:40 Speaker_08
This is Hidden Brain.

00:14:41 Speaker_02
I'm Shankar Vedantam. It was the summer of 1971. The Vietnam War was grinding on, as were increasingly large protests against it. The New York Times had begun publishing excerpts from the Pentagon Papers.

00:14:58 Speaker_02
The cult leader Charles Manson was behind bars, recently convicted of first-degree murder and conspiracy to commit murder.

00:15:06 Speaker_02
It was a time of disruption and unrest, a time when Americans were feeling uneasy about their leaders and their nation's role in the world.

00:15:15 Speaker_02
Many young people in particular had the sense that the United States had gone astray and lost its moral compass. It was against this backdrop that Philip Zimbardo put an ad in newspapers in Palo Alto, California.

00:15:33 Speaker_02
He was looking for male college students to volunteer for a study. It was designed to understand how the roles we play shape how we behave.

00:15:43 Speaker_01
Who we are is really shaped not so much by somebody telling you what to do and not to do. It's really that we play roles. We're a student, we're a teacher, we're a worker. And those roles are always in some setting.

00:15:55 Speaker_01
And within those, you belong to some subgroups. Initially, you're with the new workers, you're with the freshmen, and so forth. And I said, so that's where we really want to study how power operates.

00:16:07 Speaker_02
He started to think about various scenarios where people play highly specific roles and have varying degrees of power.

00:16:15 Speaker_01
I said, well, what kind of setting could we use to illustrate that? Now, you could have done a summer camp.

00:16:22 Speaker_01
In fact, since I did my experiment, many people have written to me saying, oh my God, I was in a summer camp where the counselors were brutal guards. But I thought prison, because prisons are all about power. Prison.

00:16:34 Speaker_02
What Phil had in mind was not to study how prisons operate in a state like California, but something much more audacious. He wanted to build his own prison.

00:16:44 Speaker_02
In the basement of Stanford University's psychology department, Phil and his colleagues created prison cells, complete with bars, each designed to hold three people. A corridor served as the prison yard.

00:16:56 Speaker_02
A closet was set aside for prisoners who were to be placed in solitary confinement. For two weeks, Phil announced, they would simulate a prison that would run 24 hours a day.

00:17:06 Speaker_02
Half the volunteers would play the role of prisoners, and half would play the role of prison guards. They sifted through about 75 applications they'd received from young men who were interested in the study.

00:17:19 Speaker_02
It wasn't difficult to pick volunteers, except for one problem. No one wanted to play the role of a prison guard.

00:17:26 Speaker_01
It's 1971. These are anti-war activists. These are civil rights activists. Everybody's got hair down to here, not only long hair. And nobody wanted to be a guard. Guards are pigs. I didn't go to college to become a prison guard.

00:17:42 Speaker_02
Phil says the research team ended up flipping a coin to determine who would be guards and who would be prisoners. The day before the experiment began, he gathered the volunteers who had been chosen as guards for an orientation.

00:17:55 Speaker_02
Phil had decided that he himself would play the role of prison superintendent. An undergraduate research assistant would be the warden. Phil told the guards that they weren't allowed to physically harm the prisoners.

00:18:07 Speaker_02
but they were given broad latitude to maintain order.

00:18:10 Speaker_01
We want them to own the prison, okay? And so what it means is we go with them to buy uniforms at army, navy stores. They have military kind of uniform. We give them symbols of power, handcuffs, billy clubs, whistles.

00:18:24 Speaker_01
And then I impose an interesting subtle piece from the movie Cool Hand Luke. Namely, everybody, when they were in contact with the prisoners, had to wear silver reflecting sunglasses, which means that nobody can see your eyes.

00:18:43 Speaker_02
I want to stop here for a moment to note how unusual all of this was, and how unthinkable it would be to run an experiment like this today. Phil did not have a host of administrators looking over his shoulder, making sure he didn't cross the line.

00:18:57 Speaker_02
From a scientific standpoint, he was simultaneously playing both researcher and participant. He was the prison superintendent, and also the person who was supposed to be dispassionately studying how the prison operated.

00:19:10 Speaker_02
The theatricality of the experiment was no accident. Phil was an impresario, and he loved being the center of attention.

00:19:18 Speaker_02
One student who took an introductory psychology class at Stanford in the 1970s told me that Phil showed up on the first day wearing a long black cape.

00:19:28 Speaker_02
The professor stood before the slack-jawed students, swept his cape 360 degrees around him like a magician, and intoned, welcome to psychology. Others remembered Phil would wear that cloak on his travels, including when he went through airports.

00:19:45 Speaker_02
With his piercing eyes and flamboyant style, I bet he looked every part the wizard. To make the drama really pop for his prison experiment, Phil even involved the Palo Alto Police Department. Again, zero chance this happens today.

00:20:02 Speaker_01
I recruited the Palo Alto Police Department, the real police department, to make simulated arrests. To go to each kid's place, ask for the name of the kid, and then give them their Miranda rights.

00:20:16 Speaker_01
You have a right to remain silent, a right for a lawyer. And then bring them down to where the squad car is with lights flashing, lean them against the car, handcuff them, give them their rights again.

00:20:28 Speaker_01
And again, neighbors and everybody's looking around.

00:20:33 Speaker_02
The police then took the students to the real police station to be fingerprinted and have their mug shots taken. They then put a blindfold on them and had them wait in a cell. If all of this wasn't bad enough, what happened next was truly shocking.

00:20:47 Speaker_01
Then my graduate students came, took the prisoner, put him in their car, took him down to our prison, and they stripped him naked.

00:20:56 Speaker_01
And when they take off the blindfold, the kid is standing naked, and all the guards are around, you know, laughing, mocking him, and say, welcome to the Stanford jail.

00:21:07 Speaker_02
The prisoners were given smocks and flip-flops. They were each assigned numbers that were sewn onto their clothing.

00:21:13 Speaker_01
And they all had an iron chain and a lock on one ankle that was there all the time. And instead of cutting their hair, they wore women's nylon stocking caps. So it was really de-individuation, dehumanization.

00:21:29 Speaker_02
By the end of the first day, the experiment wasn't going as Phil thought it would.

00:21:34 Speaker_02
He briefly considered shutting it down, not because he had qualms about what he had put his volunteers through, but because the young men playing the role of guards refused to embrace their roles.

00:21:44 Speaker_01
The kids playing the role of guards just felt awkward. In fact, you can hear. We videotaped a lot of it. They say, come on, you guys, stop laughing. This is serious business. Nobody could take it seriously.

00:21:54 Speaker_06
So let's hear everyone say, Mr. Correctional Officer, I feel fine.

00:22:00 Speaker_05
Mr. Correctional Officer, I feel fine. Mr. Correctional Officer, I feel fine.

00:22:04 Speaker_02
Phil decided to keep the experiment going. Early in the morning on day two, the prisoners revolted after being woken by guards blasting whistles and ordering them out of their cells for a count.

00:22:21 Speaker_01
They ripped off their stocking caps. They ripped off their numbers. They barricaded themselves in their cell. And then they made a huge mistake. They started ridiculing the guards. Like, you little punk, when I get out, I'm going to kick your butt.

00:22:33 Speaker_01
And suddenly, the guards come to me and say, what are we going to do? I said, it's your prison. What do you want to do? They said, we need reinforcements. So it means there are three guards on the morning shift. They call in all the other guards.

00:22:43 Speaker_01
The guards meet. They have a meeting. They say, OK, we've got to treat force with force. So they break down the doors. They drag the prisoners out. They strip them all naked. They put the ringleaders of the rebellion in solitary confinement.

00:22:58 Speaker_01
Suddenly, they say, these are dangerous prisoners.

00:23:03 Speaker_02
These are dangerous prisoners.

00:23:13 Speaker_01
That's the switch. They're no longer college students like you. Everybody knows they're college students. So now they are dangerous prisoners. And what do you do with dangerous prisoners? You have to teach them that they have no power.

00:23:23 Speaker_01
They have minus power. Because the potential for rebellion is always there. And this is true of real guards in real prisons.

00:23:32 Speaker_02
How do you convey to prisoners that you hold all the power and they hold none?

00:23:37 Speaker_01
So in every way and every day, you have to suppress their freedom, suppress their likelihood to rebel. And you had to demonstrate, mostly by doing arbitrary, stupid things, that you had power. So you tell a joke and they laugh, you punish them.

00:23:54 Speaker_01
They tell a joke and they don't laugh, you punish them. So you create a totally arbitrary environment where they have, where prisoners, college students, have no idea what to do.

00:24:02 Speaker_02
Years later, Phil would be called in to help understand the behavior of U.S. guards at the Abu Ghraib prison in Iraq. Young American service members sadistically humiliated Iraqi prisoners and pretended they were going to electrocute them.

00:24:17 Speaker_02
They stripped prisoners naked, threatened them with snarling dogs, and laughed and took photos as prisoners screamed in terror. At the Stanford prison, some prisoners were denied bathroom breaks and were forced to use buckets as toilets.

00:24:35 Speaker_01
And then punishment is usually just push-ups, jumping jacks, but then it escalates, and it always escalates, as we saw in Abu Ghraib, toward the sexual.

00:24:43 Speaker_01
So the guards say, you're Frankenstein, you're Mrs. Frankenstein, walk like Frankenstein, okay, and hug them, say, I love you. And then little things like this, or, you know, you're female camels, you're male camels, bend over, now, you know, hump them.

00:25:00 Speaker_01
So it's a play on words. And they're simulating sodomy in an experiment with college students. Now, some of this I didn't see, because the experiment's going on 24 hours a day. And most of the things, the worst things happen at night.

00:25:14 Speaker_02
The guards separated their captives into good prisoners and bad prisoners, those who were compliant and those who rebelled.

00:25:21 Speaker_06
Prisoners must report all rule violations to the guards. Prisoners must report all rule violations to the guards. Again.

00:25:30 Speaker_05
Prisoners must report all rule violations to the guards.

00:25:36 Speaker_06
Failure to obey any of the above rules may result in punishment.

00:25:46 Speaker_02
As the days passed, a few prisoners requested to be released from the experiment and were allowed to leave. On the third day of the experiment, those who remained were allowed short visits from parents and friends.

00:25:59 Speaker_02
Two days later, in the evening, another visitor came to the jail, a graduate student named Christina Maslach. She was also Phil's girlfriend. What she saw appalled her.

00:26:11 Speaker_01
She sees something which, from her point of view, is unimaginable that it should happen anywhere, unimaginable that it should happen in an experiment. Namely, guards are cursing and screaming and pushing prisoners. Prisoners have bags over their head.

00:26:23 Speaker_01
They're shuffling. Their legs are chained. It's like a chain gang. It's like you see in pictures of the South in Louisiana prisons.

00:26:32 Speaker_01
And I'm looking at this, and literally, I have 8 o'clock breakfast, 10 o'clock parole board hearing, 12 o'clock, and it's 10 o'clock, and it simply says, toilet run.

00:26:43 Speaker_02
What Christina saw as a dehumanizing abuse of power, Phil, in his role as prison superintendent, saw as a toilet run. In other words, it was simply time to check off an item on his bureaucratic to-do list.

00:26:56 Speaker_01
I look up and I put a checkmark. So what I'm seeing is nothing more than an administrative checkmark. So I am now in the mentality of a prison superintendent, which is the mentality of an administrator.

00:27:12 Speaker_01
And she's looking at this as a person with feelings. She had been a student, she's about to be a professor, and she sees this as horrific.

00:27:31 Speaker_02
Christina started to cry and fled outside, with Phil at her heels.

00:27:36 Speaker_01
Now we're standing outside in the fresh air. It's now 11 o'clock at night or 10.30 at night in front of the psychology building. And she's saying, how could you see what I see and not get upset? And I'm saying, what do you see?

00:27:51 Speaker_01
It's the dynamics of human nature. It's the power of the situation. I'm giving all the psychological jargon, and she's giving me the humanity of the situation. Boys are suffering. They're not prisoners. That's what she's saying. They're not prisoners.

00:28:05 Speaker_01
They're not guards. They're boys, and you're responsible. How could you allow this to happen?

00:28:12 Speaker_02
They stood there in the California night, and a gulf grew between them.

00:28:26 Speaker_01
There's this chasm between us. I'm over here and you're over here, and we're looking at the same thing, and we see two totally different worlds. And then she says, I'm not sure I want to continue my relationship with you if this is the real you.

00:28:37 Speaker_01
I thought you were someone else when I started dating you, and I don't know who this is. So that's the ultimate power of a situation to transform.

00:28:44 Speaker_01
She's looking at me, and she knew me for a number of years as a professor there, and she's looking at me saying, I don't know who you are. And really what she's saying, do you know who you are? And the answer is no.

00:28:53 Speaker_01
That this is what I had become is abhorrent. I mean, I fight authoritarianism. I fight, you know, I'm as liberal as most people get. I'm anti-authoritarian, anti-control, anti-structure, all these things. And that's what I became.

00:29:08 Speaker_01
I mean, I became my worst inner enemy. And at that point, I just stopped. I mean, I think it's when she said, I'm not sure I want to continue my relationship. It was like a double slap in the face. I said, oh my God, what's happened to me?

00:29:23 Speaker_01
It was really like she should have just shaken me and said, wake up, the dream is over, the game is over.

00:29:31 Speaker_02
In a theatrical irony even he could not have devised, Phil had demonstrated in his own behavior what he hoped his experiment would demonstrate to others.

00:29:41 Speaker_02
He had become so caught up in the roles he was playing, the swashbuckling scientist, the prison superintendent, the high wizard of psychology, that he had lost track of his own values.

00:29:53 Speaker_02
Six days after it started, the Stanford prison experiment was over. But before the volunteers could leave, Phil interviewed them.

00:30:02 Speaker_01
Then we had all the prisoners come together to debrief. We spent hours, then all the guards separately, and then the prisoners and guards. Because I use that as a moral re-education, because I could say we all did bad things, including me.

00:30:14 Speaker_01
What do we learn from this? We learn about the power of the situation. We learn to be aware of how easily each of us can get seduced into a role.

00:30:23 Speaker_02
He followed up again weeks later with the volunteers and asked the prisoners if they thought they would have behaved like the guards if the roles had been reversed.

00:30:31 Speaker_01
They said, I don't know, I probably would have played by the rules, but I would not have been as creative. That the worst guards were the ones who clearly went beyond the rules.

00:30:41 Speaker_01
That is, it was clear what you had to do to be a guard, and it was going beyond the boundary of your role, that in every role, there's a moral latitude. And clearly, some guards went beyond it. You know, you could say, do 10 push-ups, do 10 more.

00:30:57 Speaker_01
But then to tell somebody to sit on your back when you're doing push-ups, that's going beyond the thing. You know, to tell somebody to kiss the other guy as the Bride of Frankenstein, that's being creatively evil.

00:31:09 Speaker_01
So again, most of the prisoners said, either I'm not sure what I would do, or I would be a guard who played by the rules and did not develop new rules.

00:31:25 Speaker_02
Phil's study sparked intense public interest and criticism, both of which have continued to this day.

00:31:32 Speaker_02
For starters, there was the inhumane treatment of the volunteers playing the role of prisoners, the decision to strip them naked, the physical punishments and verbal abuse, the restricted access to toilet facilities.

00:31:46 Speaker_02
Both the Stanford Prison Experiment and Stanley Milgram's Obedience Studies prompted universities to create more rigorous review processes before greenlighting experiments.

00:31:56 Speaker_02
I think it's impossible that any review board at any major university today would greenlight a study like the Stanford Prison Experiment.

00:32:06 Speaker_02
In addition to the ethical concerns, scholars have also criticized the experiment as bad science, or not really science at all. Some have argued that the guards were essentially primed to be abusive.

00:32:18 Speaker_02
As Phil noted earlier, the guards were given to understand the prison was theirs and that they should do what was needed to maintain order. They were arguably being nudged to amp up their behavior to please the researchers.

00:32:31 Speaker_02
Remember how Phil almost shut the experiment down the first day because nothing much was happening?

00:32:36 Speaker_02
The guards may well have picked up on that, a possibility confirmed recently by volunteer Dave Eshelman, who played a guard known for being particularly aggressive to the prisoners.

00:32:46 Speaker_02
He was interviewed for a documentary film titled The Stanford Prison Experiment, Unlocking the Truth. Dave says he thought the experiment was designed to show that prisons were terrible. He believed his role was to provide proof.

00:32:59 Speaker_03
Proof that prisons are an evil environment. Given the times and given the fact we were students and very anti-establishment, we would have done anything to prove that this prison system was an evil institution. We were happy to play that role.

00:33:18 Speaker_02
For his part, Phil Zimbardo continued to defend the Stanford Prison Experiment for the rest of his very long career.

00:33:24 Speaker_02
He felt the question he was trying to answer was a vital one, and the best way to do so was to see how people responded to a situation in the real world, in real time, rather than creating an abstract scenario in which they imagined how they might respond.

00:33:39 Speaker_01
All of that research, in a way, really is trying to answer the question from childhood, what makes good people do bad things? And my focus has always been on trying to understand how situations shape us, mold us, and corrupt us.

00:33:55 Speaker_01
So starting with an evil orientation, what I try to do is create evil. It's really studying evil from the inside out. You know, theologians, poets, dramatists, sociologists, you know, criminologists have studied evil. But they've studied evil in place.

00:34:10 Speaker_01
So what I try to do that's unique is create it. You see the process of transformation. You see people who start off on day one, normal, healthy. You put them in, and then you see the divergence. The role becomes the person.

00:34:24 Speaker_01
And you play a character, and then it becomes your identity.

00:34:30 Speaker_02
Over the years, Phil went on to work on other topics. For a time, he was president of the American Psychological Association. Around 2010, I remember attending a talk he gave at a psychology conference.

00:34:43 Speaker_02
The line of young scholars who waited to shake his hand afterwards stretched the length of a very long ballroom. Phil Zimbardo died in October 2024. He was 91 years old. When we come back, the larger lessons of the Stanford Prison Experiment.

00:35:04 Speaker_02
You're listening to Hidden Brain. I'm Shankar Vedantam.

00:35:18 Speaker_10
This is Hidden Brain.

00:35:19 Speaker_02
I'm Shankar Vedantam.

00:35:22 Speaker_02
In 2004, after the horrific abuses at Iraq's Abu Ghraib prison came to light, American leaders like President George W. Bush, Vice President Dick Cheney, Defense Secretary Donald Rumsfeld, and the Chairman of the Joint Chiefs of Staff Richard Myers said that the servicemen and women who had perpetrated the abuses were just bad apples.

00:35:42 Speaker_02
The lawyer for one of the servicemen got in touch with Phil Zimbardo.

00:35:46 Speaker_01
I became an expert witness for one of the guards, Chip Frederick, because I knew that these were good apples that somebody put in a bad barrel. That's the metaphor I started using. In fact, I think I know I gave an interview at NPR.

00:35:58 Speaker_01
And I was the first to say, I want to believe our American soldiers are good. They're not bad apples, as Cheney and Rumsfeld and Bush and General Myers say.

00:36:08 Speaker_01
I believe they were good apples when they got there, and somebody put them in a very bad barrel. And that barrel looks exactly like the prison study. So I got to know, really, everything there is to know about Abu Ghraib.

00:36:19 Speaker_02
The more Phil read about the abuses in the prison, the greater the parallels he saw with the Stanford Prison Experiment. It told him, notwithstanding all his errors and misjudgments, the study had been on to something really important.

00:36:32 Speaker_01
The only abuses happened in that prison on the night shift. None happened on a day shift. And it turns out, in Abu Ghraib, part of the unit was the center of interrogation. So you have military intelligence with its set of interrogators interrogating.

00:36:49 Speaker_01
Now, they're not getting any information. For the military, it's called actionable intelligence, because what happened was, when the insurgency broke out, the military was caught blind. They had no idea this was going to happen.

00:37:00 Speaker_01
So they start arresting all men and boys around explosions. So they had no information. But now they're interrogating them and they're getting nothing. So military intelligence goes to the head of military police and says, we need your guys to help us.

00:37:12 Speaker_01
They've got to prepare the prisoners for interrogation, break them, take the gloves off, all these euphemisms. And so when we interrogate them, they're going to spill the beans.

00:37:21 Speaker_02
Much like in the Stanford Prison Experiment, the young guards in Abu Ghraib received the message that their prisoners needed to be humiliated and brought to heel. At Stanford, those messages implicitly came from Phil himself.

00:37:33 Speaker_02
In Iraq, they came from senior officers and in the name of national security and patriotism.

00:37:39 Speaker_01
And then in three months, no senior officer ever goes down to the dungeon to see what's happening. So this gives them complete liberty to reproduce the Stanford Prison study in spades, namely to do whatever you want.

00:37:52 Speaker_01
The guards were not part of the interrogation team. They were simply humiliating, tormenting the prisoners to break their will. So this is the clearest situational variable. They give guards total power with no oversight, a recipe for abuse.

00:38:08 Speaker_01
And in fact, what they said was, every time an explosion goes off and one of your buddies dies, his blood is on your hands, explicitly. So it's not like these guys were getting off on torture.

00:38:22 Speaker_01
Essentially, you are part of our national security realm. And if you remember, in the war, it was everybody is for us or against us. So really, if you don't do this, you're a suspect.

00:38:37 Speaker_02
Once the sides were clearly drawn and marked, once the stakes of doing their jobs were laid out, Phil says the situation came to shape the behavior of Chip Frederick and the other young guards.

00:38:47 Speaker_01
So the guards got sucked in, but then, as we said earlier, it's now the night shift. So the biggest contributor to evil is boredom.

00:38:57 Speaker_01
And so you're bored, you've got 12 hours to kill, and the place is filled with stress and danger, and the only playthings are prisoners. And most of the prisoners are already naked. And it's never been the case that we have female guards with naked prisoners.

00:39:11 Speaker_01
Again, for Muslims, you never show yourself naked in front of a woman. And so the sexual agenda is there. I mean, it's just boiling over. And it only goes down. It gets worse and worse and worse.

00:39:23 Speaker_01
So the images, the dozen images that were shown on television, are almost the least objectionable. The others are even worse.

00:39:36 Speaker_02
When I interviewed Phil in April 2013, he told me he continued to be fascinated by the idea that ordinary people could turn into monsters. He had published a book called The Lucifer Effect: Understanding How Good People Turn Evil.

00:39:52 Speaker_02
Sitting across from me at a studio at NPR, Phil cited the words of the historian and philosopher Hannah Arendt about the Nazi leader Adolf Eichmann.

00:40:01 Speaker_01
Hannah Arendt, in trying to understand Eichmann, this brutal killer, coined the term the banality of evil, meaning this guy looks like your Uncle Charlie. I mean, her phrase is he was terrifyingly normal.

00:40:15 Speaker_02
Phil told me he had started to ask himself a new question. If circumstances and situations could turn people bad, couldn't different circumstances and situations turn people good?

00:40:26 Speaker_01
And I said, well, isn't that true? If we flip it, isn't there the banality of heroes? That most heroes are ordinary people, everyday people.

00:40:33 Speaker_01
They do little things each day that we never know about unless they live in a major media city and somebody has to make a videotape. Then I started thinking of heroic things that I knew people did.

00:40:43 Speaker_01
And then I said, gee, now we should celebrate heroism more than we do. Well, the problem is with a lot of the stuff on heroes, it really made them seem extra special. These are male warriors, Agamemnon, Achilles, samurai.

00:41:00 Speaker_01
And there's almost no research on heroism.

00:41:08 Speaker_02
Phil started working in high schools, helping students understand the steps that lead people to betray their values and how they could stand up to injustice.

00:41:16 Speaker_01
We started developing classroom modules, which first started off by saying, be aware of the power of the dark side. So we teach you about the prison study, the messages of a Milgram study.

00:41:28 Speaker_01
We show videos of a woman lying in a subway station, Liverpool Street station in London. In five minutes, the little clock is going, 35 people pass right by and nobody stops. The question is, what's wrong? Are these bad people?

00:41:41 Speaker_01
And so the kids say, we would help. Well, what's the difference between you sitting here and people there? And they come up with situational differences. And then we teach them these lessons. The bystander effect, prejudice and discrimination.

00:41:54 Speaker_01
We get the college kids to teach high school kids. High school kids, they teach middle school kids. It offsets all the evil I've done. It's been really enriching.

00:42:13 Speaker_02
In 1492, Christopher Columbus set sail from Spain in search of the great continent of Asia. He figured he needed to go west.

00:42:21 Speaker_02
Some of his misjudgment was based on erroneous maps and miscalculations about the size of the earth, and some of it was simple overconfidence. Convinced he was on the right track, Christopher Columbus sailed across the Atlantic Ocean.

00:42:35 Speaker_02
When he landed in the Bahamas, he believed he had found Asia. He labeled the people of the region Indians. He kept going, believing he would soon stumble on China and Japan.

00:42:49 Speaker_02
In the centuries that followed his voyage, hundreds of cartographers and historians corrected his numerous mistakes.

00:42:56 Speaker_02
They mapped the Americas, documenting the lives of the indigenous people who had lived there for centuries, and the many harms that European visitors had done to these native groups.

00:43:07 Speaker_02
Where the Italian explorer was all swashbuckling derring-do, they were careful and prized accuracy. In every era, there are people like Christopher Columbus. They are the heroes and villains of our history books, the bulls in our china shops.

00:43:23 Speaker_02
They leap before they look. They take a grand or outlandish idea and run with it. They go out on limbs in pursuit of their ambitions. Sometimes they change the world for the better. Sometimes they're a disaster, a cautionary tale for the rest of us.

00:43:42 Speaker_02
Cartographers, by contrast, are careful, detail-oriented. They're the ones who pressure test the claims of the explorers, the ones who develop the detailed maps of the new worlds the explorers have haphazardly sketched for us.

00:43:57 Speaker_02
Cartographers often roll their eyes at the antics of explorers. Phil Zimbardo was an explorer. He wanted to answer a big question and make a splash while doing it.

00:44:12 Speaker_02
The pull of big, provocative ideas was irresistible to him, and his work continues to provoke conversations about the nature of good and evil. But in his zeal to explore these big ideas, he was also reckless.

00:44:25 Speaker_02
He overlooked details and cut corners in the way he constructed his most famous experiment. He rode roughshod over the safety and concerns of the young volunteers who were ostensibly in his care.

00:44:40 Speaker_02
And yet, it is also true that he generated enormous public interest in psychology. In the late 1990s, when the internet was still in its infancy, I remember running a Google search for Stanford.

00:44:53 Speaker_02
I wanted to look up the university directory for an expert I needed to interview. But the first result that popped up on my screen was not for the university. It was a link to the Stanford Prison Experiment.

00:45:05 Speaker_02
Phil helped his colleagues see that questions of morality were not merely philosophical questions, but scientific questions. The field of moral psychology is an extraordinarily robust area of study today.

00:45:19 Speaker_02
Phil also helped the general public recognize the value of psychology in answering some of the most pressing questions of our time.

00:45:27 Speaker_02
When we see genocides and mass murder around the world today, when we see human beings dehumanize one another, many of us no longer look for religious or theological explanations.

00:45:39 Speaker_02
We too ask ourselves the question that a boy once asked himself at a high school in the Bronx. What is it that prompts good people to do evil things? Hidden Brain is produced by Hidden Brain Media.

00:45:58 Speaker_02
Our audio production team includes Annie Murphy Paul, Kristin Wong, Laura Kwerel, Ryan Katz, Autumn Barnes, Andrew Chadwick, and Nick Woodbury. Tara Boyle is our executive producer. I'm Hidden Brain's executive editor.

00:46:13 Speaker_02
If you love our show, please help to support it. Join our podcast subscription, Hidden Brain Plus, for ideas and interviews you won't hear anywhere else. Go to either apple.co slash hiddenbrain or support.hiddenbrain.org.

00:46:29 Speaker_02
You'll be providing us with vital support to bring you many more episodes of Hidden Brain in the future. I'm Shankar Vedantam. See you soon.