Back in 1961, Yale researcher Stanley Milgram performed a now-controversial experiment. He recruited volunteers for a psychology study supposedly about learning and memory. When they arrived, they were told the setup: a pair of participants would play two roles, teacher and learner, while the experimenter (a stern man in a lab coat) observed. The trick, however, was that the real participant was always “randomly” assigned to be the teacher, while the second alleged participant, assigned as learner, was in fact always an accomplice of the experimenters.
For the experiment, the participant (as teacher) was moved to a separate room from the learner. Through an intercom, the participant read a list of word pairs to the learner, who then had to choose the matching pairs when quizzed. After each incorrect answer, the participant was to flip a switch to shock the learner; the panel at their desk had switches labeled from 45 volts up to 450 volts. The participant had watched the learner get strapped into the shock equipment. The learner mentioned in passing that he had a heart condition, after which the experimenter authoritatively assured him that there was no danger (again, this was all acted out, with the participant believing the learner was just another volunteer). Back in the test room, the participant received a not-insignificant 45-volt shock to see what it felt like, and then the word-pair testing began.
The teacher read the words, and the learner appeared to respond, getting shocked at successively higher levels after each mistake. In fact, there were no real shocks; a pre-recorded tape played reactions to each one. As the shock levels went up, the learner feigned increasing pain and eventually banged on the wall, complained about his heart condition, and asked to be released. If the participant continued past that point, the learner stopped responding altogether.
At a certain level, many participants paused to question the experiment, turning to the experimenter, who was observing from the same room. When the participant did so, the experimenter verbally pushed them to continue (“Please continue”, “The experiment requires that you continue”, “It is absolutely essential that you continue”, “You have no other choice, you must go on”). Only if the participant still refused to go on was the experiment ended; otherwise it continued until they had delivered three shocks at the maximum level (450 volts, labeled “dangerous” on their panel).
Amazingly, most of the subjects (65%) delivered the final level of shock. While every participant at some point questioned the experimenter, not a single one dropped out before the 300-volt level. Even those who quit before the final level simply withdrew their own participation, without trying to stop the experiment itself or check on the health of the learner. (Remember, the learner had complained of a heart condition before and during the experiment, and had gone silent after a possibly dangerous shock.)
This experiment and its later replications vividly demonstrate the power of authority over people’s behavior. In this case it was a man in a lab coat at a university, but the experiment itself was inspired by the “just following orders” defense used by Nazis in war crime trials after World War II. It’s not just wartime Germans, though; atrocities can be committed by anyone, including Americans. As Michael Albert put it:
“I have long since understood that the Germans weren’t different than the Brits or Americans or anyone else, though their circumstances were different, but for those who still don’t understand mass subservience to vile crimes induced by structural process of great power and breadth, I have to admit that I mostly just want to shout: Look around, dammit!”
But Milgram showed us that it is not just soldiers who can be influenced by authority to do bad things; ordinary people can be too. One might counter that the Milgram results came from a very different era, and that people today would act differently. However, the experiment has recently been replicated with similar results. See the ABC video below:
Related experiments have further confirmed how situational effects can bring normal people to commit horrible acts. In 1971, researcher Philip Zimbardo at Stanford created a mock prison in the basement of the psychology building. The experiment assigned participants to play either a prisoner or a guard role, paying them by the day to stay in the study. Prisoners were confined to the basement 24/7, while the guards worked shifts in the mock prison and then went home to their normal lives. What’s interesting, though, is that the guards quickly took their roles seriously, abusing the prisoners and showing genuinely sadistic tendencies. Guards adapted their behavior to conform to what they thought Zimbardo, playing Prison Superintendent, wanted. (See this YouTube video for more.)
Haslam and Reicher (2003) partially replicated Zimbardo’s prison experiment, demonstrating the crucial role of a leader (in this case, Zimbardo as superintendent) in establishing these patterns of behavior.
However, it is not just authority that can alter a normal person’s behavior for the worse. Conformity to a larger group can make people go against their better judgement. In the 1950s, Solomon Asch ran an experiment in which a group of participants was presented with a simple task: shown a card with a plain line labeled X and three parallel comparison lines (A, B and C), choose which comparison line is the same length as X.
Crucially, all but one of the alleged participants in the group were actually accomplices of the experimenter. The group members responded in order, out loud, saying which line matched the example line’s length. On certain trials, though, the accomplices uniformly selected a wrong answer, one that anyone with normal vision could see was wrong. But after hearing a row of people claim this wrong answer, the real participant often conformed and gave an answer they knew was wrong.
In later interviews, many subjects claimed they did not actually believe their conforming answers, but some actually did. Was it possible that their very perceptions were changed by the pressure to conform?
A more recent replication, published in 2005, used brain imaging to show that different parts of the brain were active depending on whether the participant was conforming or independently dissenting. When not conforming, regions of the brain associated with emotion were active, suggesting an emotional cost to going against the group. Whether the participant conformed to a wrong answer or not, there was no increased activity in the parts of the brain associated with conscious decision-making. Crucially, though, when conforming to a unanimous wrong answer, a brain area devoted to spatial awareness lit up. In other words, it appears that how we see things, not just metaphorically but literally what we see, is affected by social pressure.
Together, these and similar experimental results show how tenuous our control is over our own behavior. Under the combined influence of authority, leadership and conformity, perfectly normal people can come to disregard their own beliefs, morals and even perceptions. Deep down in all of us lurks a potential monster.
Perhaps an upside may be found in the variation of behavior. Some people quit Milgram’s shock experiment (after giving quite a few shocks, of course). A few guards in Zimbardo’s prison experiment did small favors for the prisoners. Not everyone conformed in Asch’s line-perception experiments (and having someone else dissent before you makes it much more likely that you will dissent too). And, of course, even some Nazi officers refused to participate in the atrocities of Hitler’s regime.
In other words, there is some hope. Further work in this line of study will hopefully identify the factors that encourage non-conformity in such situations, so that we might better understand what pitfalls to avoid in the future. Until then, we had best keep these experiments in mind as a reminder that we are all susceptible to influences that may push us far beyond our normal ethical boundaries.