Reicher and Haslam Study Evaluation Essay

Stanford Prison Experiment

by Saul McLeod, updated 2017

Aim: To investigate how readily people would conform to the roles of guard and prisoner in a role-playing exercise that simulated prison life.

Zimbardo (1973) was interested in finding out whether the brutality reported among guards in American prisons was due to the sadistic personalities of the guards (i.e., dispositional) or had more to do with the prison environment (i.e., situational).

For example, prisoner and guards may have personalities which make conflict inevitable, with prisoners lacking respect for law and order and guards being domineering and aggressive. Alternatively, prisoners and guards may behave in a hostile manner due to the rigid power structure of the social environment in prisons.

If the prisoners and guards behaved in a non-aggressive manner, this would support the dispositional hypothesis; if they behaved the same way as people do in real prisons, this would support the situational explanation.

Procedure: To study the roles people play in prison situations, Zimbardo converted a basement of the Stanford University psychology building into a mock prison. He advertised asking for volunteers to participate in a study of the psychological effects of prison life.

Seventy-five applicants answered the ad and were given diagnostic interviews and personality tests to eliminate candidates with psychological problems, medical disabilities, or a history of crime or drug abuse. The final sample comprised 24 male college students, who were paid $15 per day to take part in the experiment.

Participants were randomly assigned to either the role of prisoner or guard in a simulated prison environment. There were two reserves, and one participant dropped out, finally leaving 10 prisoners and 11 guards. The guards worked in sets of three (being replaced after an 8-hour shift), and the prisoners were housed three to a room. There was also a solitary confinement cell for prisoners who 'misbehaved.' The prison simulation was kept as “real life” as possible.

Prisoners were treated like every other criminal, being arrested at their own homes, without warning, and taken to the local police station. They were fingerprinted, photographed and ‘booked.’ Then they were blindfolded and driven to the psychology department of Stanford University, where Zimbardo had had the basement set out as a prison, with barred doors and windows, bare walls and small cells. Here the deindividuation process began.

When the prisoners arrived at the prison they were stripped naked, deloused, had all their personal possessions removed and locked away, and were given prison clothes and bedding. Each prisoner was issued a uniform and referred to only by his ID number, a way to make prisoners feel anonymous; prisoners could refer to themselves and each other only by number. Their clothes comprised a smock with their number written on it, but no underclothes. They also had a tight nylon cap to cover their hair, and a locked chain around one ankle.

All guards were dressed in identical uniforms of khaki, and they carried a whistle around their neck and a billy club borrowed from the police. Guards also wore special sunglasses, to make eye contact with prisoners impossible. Three guards worked shifts of eight hours each (the other guards remained on call). Guards were instructed to do whatever they thought was necessary to maintain law and order in the prison and to command the respect of the prisoners. No physical violence was permitted.

Zimbardo observed the behavior of the prisoners and guards (as a researcher), and also acted as a prison warden.

Findings: Within a very short time both guards and prisoners were settling into their new roles, with the guards adopting theirs quickly and easily.

Asserting Authority

Within hours of beginning the experiment some guards began to harass prisoners. At 2:30 A.M. prisoners were awakened from sleep by blasting whistles for the first of many "counts." The counts served as a way of familiarizing the prisoners with their numbers. More importantly, they provided a regular occasion for the guards to exercise control over the prisoners.

The prisoners soon adopted prisoner-like behavior too. They talked about prison issues a great deal of the time. They ‘told tales’ on each other to the guards. They started taking the prison rules very seriously, as though they were there for the prisoners’ benefit and infringement would spell disaster for all of them. Some even began siding with the guards against prisoners who did not obey the rules.

Physical Punishment

The prisoners were taunted with insults and petty orders, they were given pointless and boring tasks to accomplish, and they were generally dehumanized. Push-ups were a common form of physical punishment imposed by the guards. One of the guards stepped on the prisoners' backs while they did push-ups, or made other prisoners sit on the backs of fellow prisoners doing their push-ups.

Asserting Independence

Because the first day passed without incident, the guards were surprised and totally unprepared for the rebellion which broke out on the morning of the second day.

During the second day of the experiment, the prisoners removed their stocking caps, ripped off their numbers, and barricaded themselves inside the cells by putting their beds against the door. The guards called in reinforcements. The three guards who were waiting on stand-by duty came in and the night shift guards voluntarily remained on duty.

Putting Down the Rebellion

The guards retaliated by using a fire extinguisher which shot a stream of skin-chilling carbon dioxide, and they forced the prisoners away from the doors. Next, the guards broke into each cell, stripped the prisoners naked and took the beds out. The ringleaders of the prisoner rebellion were placed into solitary confinement. After this, the guards generally began to harass and intimidate the prisoners.

Special Privileges

One of the three cells was designated as a "privilege cell." The three prisoners least involved in the rebellion were given special privileges. The guards gave them back their uniforms and beds and allowed them to wash their hair and brush their teeth. Privileged prisoners also got to eat special food in the presence of the other prisoners who had temporarily lost the privilege of eating. The effect was to break the solidarity among prisoners.

Consequences of the Rebellion

Over the next few days, the relationships between the guards and the prisoners changed, with a change in one leading to a change in the other. Remember that the guards were firmly in control and the prisoners were totally dependent on them.

As the prisoners became more dependent, the guards became more derisive towards them. They held the prisoners in contempt and let the prisoners know it. As the guards’ contempt for them grew, the prisoners became more submissive.

As the prisoners became more submissive, the guards became more aggressive and assertive. They demanded ever greater obedience from the prisoners. The prisoners were dependent on the guards for everything so tried to find ways to please the guards, such as telling tales on fellow prisoners.

Prisoner #8612

Less than 36 hours into the experiment, Prisoner #8612 began suffering from acute emotional disturbance, disorganized thinking, uncontrollable crying, and rage. After a meeting with the guards where they told him he was weak, but offered him “informant” status, #8612 returned to the other prisoners and said “You can't leave. You can't quit.” Soon #8612 “began to act ‘crazy,’ to scream, to curse, to go into a rage that seemed out of control.” It wasn’t until this point that the psychologists realized they had to let him out.

A Visit from Parents

The next day, the guards held a visiting hour for parents and friends. They were worried that when the parents saw the state of the jail, they might insist on taking their sons home. Guards washed the prisoners, had them clean and polish their cells, fed them a big dinner and played music on the intercom.

After the visit, rumor spread of a mass escape plan. Afraid that they would lose the prisoners, the guards and experimenters tried to enlist the help and facilities of the Palo Alto police department. The guards again escalated the level of harassment, forcing them to do menial, repetitive work such as cleaning toilets with their bare hands.

Catholic Priest

Zimbardo invited a Catholic priest who had been a prison chaplain to evaluate how realistic the prison situation was. The chaplain interviewed each prisoner individually, and half of the prisoners introduced themselves by their number rather than their name. The priest told them the only way they would get out was with the help of a lawyer.

Prisoner #819

Eventually, while talking to the priest, #819 broke down and began to cry hysterically, just as two previously released prisoners had. The psychologists removed the chain from his foot and the cap from his head, and told him to go and rest in a room that was adjacent to the prison yard. They told him they would get him some food and then take him to see a doctor.

While this was going on, one of the guards lined up the other prisoners and had them chant aloud:

"Prisoner #819 is a bad prisoner. Because of what Prisoner #819 did, my cell is a mess, Mr. Correctional Officer."

The psychologists realized #819 could hear the chanting and went back into the room where they found him sobbing uncontrollably. The psychologists tried to get him to agree to leave the experiment, but he said he could not leave because the others had labeled him a bad prisoner.

Back to Reality

At that point, Zimbardo said, "Listen, you are not #819. You are [his name], and my name is Dr. Zimbardo. I am a psychologist, not a prison superintendent, and this is not a real prison. This is just an experiment, and those are students, not prisoners, just like you. Let's go."

He stopped crying suddenly, looked up and replied, "Okay, let's go," as if nothing had been wrong.

An End to the Experiment

Zimbardo (1973) had intended that the experiment should run for a fortnight, but on the sixth day it was terminated. Christina Maslach, a recent Stanford Ph.D. brought in to conduct interviews with the guards and prisoners, strongly objected when she saw the prisoners being abused by the guards.

Filled with outrage, she said, "It's terrible what you are doing to these boys!" Out of the 50 or more outsiders who had seen the prison, she was the only one who ever questioned its morality.

Zimbardo (2008) later noted, “It wasn't until much later that I realized how far into my prison role I was at that point -- that I was thinking like a prison superintendent rather than a research psychologist.”

Conclusion: People will readily conform to the social roles they are expected to play, especially if the roles are as strongly stereotyped as those of the prison guards. The “prison” environment was an important factor in creating the guards’ brutal behavior (none of the participants who acted as guards showed sadistic tendencies before the study). Therefore, the findings support the situational explanation of behavior rather than the dispositional one.

Zimbardo proposed that two processes can explain the prisoners' 'final submission.'

First, deindividuation may explain the behavior of the participants, especially the guards. This is a state in which people become so immersed in the norms of the group that they lose their sense of identity and personal responsibility. The guards may have been so sadistic because they did not feel that what happened was down to them personally – it was a group norm. They may also have lost their sense of personal identity because of the uniforms they wore.

Second, learned helplessness could explain the prisoners' submission to the guards. The prisoners learned that whatever they did had little effect on what happened to them. In the mock prison, the unpredictable decisions of the guards led the prisoners to give up responding.

After the prison experiment was terminated, Zimbardo interviewed the participants. Here’s an excerpt:

‘Most of the participants said they had felt involved and committed. The research had felt "real" to them. One guard said, "I was surprised at myself. I made them call each other names and clean the toilets out with their bare hands. I practically considered the prisoners cattle and I kept thinking I had to watch out for them in case they tried something."

Another guard said "Acting authoritatively can be fun. Power can be a great pleasure." And another: "... during the inspection I went to Cell Two to mess up a bed which a prisoner had just made and he grabbed me, screaming that he had just made it and that he was not going to let me mess it up. He grabbed me by the throat and although he was laughing I was pretty scared. I lashed out with my stick and hit him on the chin although not very hard, and when I freed myself I became angry."’

Most of the guards found it difficult to believe that they had behaved in the brutalizing ways that they had. Many said they hadn’t known this side of them existed or that they were capable of such things. The prisoners, too, couldn’t believe that they had responded in the submissive, cowering, dependent way they had. Several claimed to be assertive types normally. When asked about the guards, they described the usual three stereotypes that can be found in any prison: some guards were good, some were tough but fair, and some were cruel.

Critical Evaluation: Demand characteristics could explain the findings of the study. Most of the guards later claimed they were simply acting. Because the guards and prisoners were playing a role, their behavior may not have been influenced by the same factors that affect behavior in real life. This means the study's findings cannot reasonably be generalized to real-life settings, such as prisons; in other words, the study has low ecological validity.

However, there is considerable evidence that the participants did react to the situation as though it was real. For example, 90% of the prisoners’ private conversations, which were monitored by the researchers, were on the prison conditions, and only 10% of the time were their conversations about life outside of the prison. The guards, too, rarely exchanged personal information during their relaxation breaks - they either talked about ‘problem prisoners,’ other prison topics, or did not talk at all. The guards were always on time and even worked overtime for no extra pay. When the prisoners were introduced to a priest, they referred to themselves by their prison number, rather than their first name. Some even asked him to get a lawyer to help get them out.

The study may also lack population validity, as the sample comprised US male students. The study's findings cannot be applied to female prisons or to prisons in other countries. For example, America is an individualist culture (where people are generally less conforming), and the results may be different in collectivist cultures (such as Asian countries).

A strength of the study is that it has altered the way US prisons are run. For example, juveniles accused of federal crimes are no longer housed before trial with adult prisoners (due to the risk of violence against them).

Another strength of the study is that the harmful treatment of participants led to the formal recognition of ethical guidelines by the American Psychological Association. Studies must now undergo an extensive review by an institutional review board (US) or ethics committee (UK) before they are implemented. A review of research plans by a panel is required by most institutions, such as universities, hospitals, and government agencies. These boards review whether the potential benefits of the research are justifiable in light of the possible risk of physical or psychological harm. They may request that researchers make changes to the study's design or procedure, or in extreme cases deny approval of the study altogether.

Ethical Issues: The study has received many ethical criticisms, including lack of fully informed consent by participants as Zimbardo himself did not know what would happen in the experiment (it was unpredictable). Also, the prisoners did not consent to being 'arrested' at home. The prisoners were not told partly because final approval from the police wasn’t given until minutes before the participants decided to participate, and partly because the researchers wanted the arrests to come as a surprise. However, this was a breach of the ethics of Zimbardo’s own contract that all of the participants had signed.

Also, participants playing the role of prisoners were not protected from psychological harm, experiencing incidents of humiliation and distress. For example, one prisoner had to be released after 36 hours because of uncontrollable bursts of screaming, crying and anger.

However, in Zimbardo's defense, the emotional distress experienced by the prisoners could not have been predicted from the outset. Approval for the study was given by the Office of Naval Research, the Psychology Department and the University Committee of Human Experimentation. This Committee also did not anticipate the prisoners’ extreme reactions that were to follow. Alternative methodologies were looked at which would cause less distress to the participants but at the same time give the desired information, but nothing suitable could be found.

Extensive group and individual debriefing sessions were held, and all participants returned post-experimental questionnaires several weeks later, then several months later, then at yearly intervals. Zimbardo concluded there were no lasting negative effects.

Zimbardo also strongly argues that the benefits gained in our understanding of human behavior, and of how we can improve society, should outweigh the distress caused by the study. However, it has been suggested that the US Navy was not so much interested in making prisons more humane and was, in fact, more interested in using the study to train people in the armed services to cope with the stresses of captivity.

Discussion Questions

What are the effects of living in an environment with no clocks, no view of the outside world, and minimal sensory stimulation?

Consider the psychological consequences of stripping, delousing, and shaving the heads of prisoners or members of the military. What transformations take place when people go through an experience like this?

After the study, how do you think the prisoners and guards felt?

If you were the experimenter in charge, would you have done this study? Would you have terminated it earlier? Would you have conducted a follow-up study?


Haney, C., Banks, W. C., & Zimbardo, P. G. (1973). A study of prisoners and guards in a simulated prison. Naval Research Review, 30, 4-17.

How to reference this article:

McLeod, S. A. (2017). Zimbardo - Stanford Prison Experiment. Retrieved from



Understanding of the psychology of tyranny is dominated by classic studies from the 1960s and 1970s: Milgram's research on obedience to authority and Zimbardo's Stanford Prison Experiment. Supporting popular notions of the banality of evil, this research has been taken to show that people conform passively and unthinkingly to both the instructions and the roles that authorities provide, however malevolent these may be. Recently, though, this consensus has been challenged by empirical work informed by social identity theorizing. This suggests that individuals' willingness to follow authorities is conditional on identification with the authority in question and an associated belief that the authority is right.

Citation: Haslam SA, Reicher SD (2012) Contesting the “Nature” of Conformity: What Milgram and Zimbardo's Studies Really Show. PLoS Biol 10(11): e1001426.

Published: November 20, 2012



If men make war in slavish obedience to rules, they will fail.

Ulysses S. Grant [1]

Conformity is often criticized on grounds of morality. Many, if not all, of the greatest human atrocities have been described as “crimes of obedience” [2]. However, as the victorious American Civil War General and later President Grant makes clear, conformity is equally problematic on grounds of efficacy. Success requires leaders and followers who do not adhere rigidly to a pre-determined script. Rigidity cannot steel them for the challenges of their task or for the creativity of their opponents.

Given these problems, it would seem even more unfortunate if human beings were somehow programmed for conformity. Yet this is a view that has become dominant over the last half-century. Its influence can be traced to two landmark empirical programs led by social psychologists in the 1960s and early 1970s: Milgram's Obedience to Authority research and Zimbardo's Stanford Prison Experiment. These studies have not only had influence in academic spheres. They have spilled over into our general culture and shaped popular understanding, such that “everyone knows” that people inevitably succumb to the demands of authority, however immoral the consequences [3],[4]. As Parker puts it, “the hopeless moral of the [studies'] story is that resistance is futile” [5]. What is more, this work has shaped our understanding not only of conformity but of human nature more broadly [6].

Building on an established body of theorizing in the social identity tradition—which sees group-based influence as meaningful and conditional [7],[8]—we argue, however, that these understandings are mistaken. Moreover, we contend that evidence from the studies themselves (as well as from subsequent research) supports a very different analysis of the psychology of conformity.

The Classic Studies: Conformity, Obedience, and the Banality of Evil

In Milgram's work [9],[10] members of the general public (predominantly men) volunteered to take part in a scientific study of memory. They found themselves cast in the role of a “Teacher” with the task of administering shocks of increasing magnitude (from 15 V to 450 V in 15-V increments) to another man (the “Learner”) every time he failed to recall the correct word in a previously learned pair. Unbeknown to the Teacher, the Learner was Milgram's confederate, and the shocks were not real. Moreover, rather than being interested in memory, Milgram was actually interested in seeing how far the men would go in carrying out the task. To his—and everyone else's [11]—shock, the answer was “very far.” In what came to be termed the “baseline” study [12] all participants proved willing to administer shocks of 300 V and 65% went all the way to 450 V. This appeared to provide compelling evidence that normal well-adjusted men would be willing to kill a complete stranger simply because they were ordered to do so by an authority.

Zimbardo's Stanford Prison Experiment took these ideas further by exploring the destructive behaviour of groups of men over an extended period [13],[14]. Students were randomly assigned to be either guards or prisoners within a mock prison that had been constructed in the Stanford Psychology Department. In contrast to Milgram's studies, the objective was to observe the interaction within and between the two groups in the absence of an obviously malevolent authority. Here, again, the results proved shocking. Such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just 6 days. Zimbardo's conclusion from this was even more alarming than Milgram's. People descend into tyranny, he suggested, because they conform unthinkingly to the toxic roles that authorities prescribe without the need for specific orders: brutality was “a ‘natural’ consequence of being in the uniform of a ‘guard’ and asserting the power inherent in that role” [15].

Within psychology, Milgram and Zimbardo helped consolidate a growing “conformity bias” [16] in which the focus on compliance is so strong as to obscure evidence of resistance and disobedience [17]. However their arguments proved particularly potent because they seemed to mesh with real-world examples—particularly evidence of the “banality of evil.” This term was coined in Hannah Arendt's account of the trial of Adolf Eichmann [18], a chief architect of the Nazis' “final solution to the Jewish question” [19]. Despite being responsible for the transportation of millions of people to their death, Arendt suggested that Eichmann was no psychopathic monster. Instead his trial revealed him to be a diligent and efficient bureaucrat—a man more concerned with following orders than with asking deep questions about their morality or consequence.

Much of the power of Milgram and Zimbardo's research derives from the fact that it appears to give empirical substance to this claim that evil is banal [3]. It seems to show that tyranny is a natural and unavoidable consequence of humans' inherent motivation to bend to the wishes of those in authority—whoever they may be and whatever it is that they want us to do. Put slightly differently, it operationalizes an apparent tragedy of the human condition: our desire to be good subjects is stronger than our desire to be subjects who do good.

Questioning the Consensus: Conformity Isn't Natural and It Doesn't Explain Tyranny

The banality of evil thesis appears to be a truth almost universally acknowledged. Not only is it given prominence in social psychology textbooks [20], but so too it informs the thinking of historians [21],[22], political scientists [23], economists [24], and neuroscientists [25]. Indeed, via a range of social commentators, it has shaped the public consciousness much more broadly [26], and, in this respect, can lay claim to being the most influential data-driven thesis in the whole of psychology [27],[28].

Yet despite the breadth of this consensus, in recent years, we and others have reinterrogated its two principal underpinnings—the archival evidence pertaining to Eichmann and his ilk, and the specifics of Milgram and Zimbardo's empirical demonstrations—in ways that tell a very different story [29].

First, a series of thoroughgoing historical examinations have challenged the idea that Nazi bureaucrats were ever simply following orders [19],[26],[30]. This may have been the defense they relied upon when seeking to minimize their culpability [31], but evidence suggests that functionaries like Eichmann had a very good understanding of what they were doing and took pride in the energy and application that they brought to their work. Typically too, roles and orders were vague, and hence for those who wanted to advance the Nazi cause (and not all did), creativity and imagination were required in order to work towards the regime's assumed goals and to overcome the challenges associated with any given task [32]. Emblematic of this, the practical details of “the final solution” were not handed down from on high, but had to be elaborated by Eichmann himself. He then felt compelled to confront and disobey his superiors—most particularly Himmler—when he believed that they were not sufficiently faithful to eliminationist Nazi principles [19].

Second, much the same analysis can be used to account for behavior in the Stanford Prison Experiment. So while it may be true that Zimbardo gave his guards no direct orders, he certainly gave them a general sense of how he expected them to behave [33]. During the orientation session he told them, amongst other things, “You can create in the prisoners feelings of boredom, a sense of fear to some degree, you can create a notion of arbitrariness that their life is totally controlled by us, by the system, you, me… We're going to take away their individuality in various ways. In general what all this leads to is a sense of powerlessness” [34]. This contradicts Zimbardo's assertion that “behavioral scripts associated with the oppositional roles of prisoner and guard [were] the sole source of guidance” [35] and leads us to question the claim that conformity to these role-related scripts was the primary cause of guard brutality.

But even with such guidance, not all guards acted brutally. And those who did used ingenuity and initiative in responding to Zimbardo's brief. Accordingly, after the experiment was over, one prisoner confronted his chief tormentor with the observation that “If I had been a guard I don't think it would have been such a masterpiece” [34]. Contrary to the banality of evil thesis, the Zimbardo-inspired tyranny was made possible by the active engagement of enthusiasts rather than the leaden conformity of automatons.

Turning, third, to the specifics of Milgram's studies, the first point to note is that the primary dependent measure (flicking a switch) offers few opportunities for creativity in carrying out the task. Nevertheless, several of Milgram's findings typically escape standard reviews in which the paradigm is portrayed as only yielding up evidence of obedience. Initially, it is clear that the “baseline study” is not especially typical of the 30 or so variants of the paradigm that Milgram conducted. Here the percentage of participants going to 450 V varied from 0% to nearly 100%, but across the studies as a whole, a majority of participants chose not to go this far [10],[36],[37].

Furthermore, close analysis of the experimental sessions shows that participants are attentive to the demands made on them by the Learner as well as the Experimenter [38]. They are torn between two voices confronting them with irreconcilable moral imperatives, and the fact that they have to choose between them is a source of considerable anguish. They sweat, they laugh, they try to talk and argue their way out of the situation. But the experimental set-up does not allow them to do so. Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”) [39]. But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse. Once again, received wisdom proves questionable. The Milgram studies seem to be less about people blindly conforming to orders than about getting people to believe in the importance of what they are doing [40].

Tyranny as a Product of Identification-Based Followership

Our suspicions about the plausibility of the banality of evil thesis and its various empirical substrates were first raised through our work on the BBC Prison Study (BPS [41]). Like the Stanford study, this study randomly assigned men to groups as guards and prisoners and examined their behaviour within a specially created “prison.” Unlike Zimbardo, however, we took no leadership role in the study. Without this, would participants conform to a hierarchical script or resist it?

The study generated three clear findings. First, participants did not conform automatically to their assigned role. Second, they only acted in terms of group membership to the extent that they actively identified with the group (such that they took on a social identification) [42]. Third, group identity did not mean that people simply accepted their assigned position; instead, it empowered them to resist it. Early in the study, the Prisoners' identification as a group allowed them successfully to challenge the authority of the Guards and create a more egalitarian system. Later on, though, a highly committed group emerged out of dissatisfaction with this system and conspired to create a new hierarchy that was far more draconian.

Ultimately, then, the BBC Prison Study came close to recreating the tyranny of the Stanford Prison Experiment. However, it was neither passive conformity to roles nor blind obedience to rules that brought the study to this point. On the contrary, it was only when they had internalized roles and rules as aspects of a system with which they identified that participants used them as a guide to action. Moreover, on the basis of this shared identification, the hallmark of the tyrannical regime was not conformity but creative leadership and engaged followership within a group of true believers (see also [43],[44]). As we have seen, this analysis mirrors recent conclusions about the Nazi tyranny. To complete the argument, we suggest that it is also applicable to Milgram's paradigm.

The evidence, noted above, about the efficacy of different "prods" already points to the fact that compliance is bound up with a sense of commitment to the experiment and the experimenter over and above commitment to the learner (SA Haslam, SD Reicher, M Birney, unpublished data) [39]. This use of prods is but one aspect of Milgram's careful management of the paradigm [13], aimed at securing participants' identification with the scientific enterprise.

Significantly, though, the degree of identification is not constant across all variants of the study. For instance, when the study is conducted in commercial premises rather than in prestigious Yale University labs, one might expect identification to diminish and (as our argument implies) compliance to decrease. It does. More systematically, we have examined variations in participants' identification with the Experimenter and the science that he represents as opposed to their identification with the Learner and the general community. They always identify with both to some degree—hence the drama and the tension of the paradigm. But the degree matters, and greater identification with the Experimenter is highly predictive of a greater willingness among Milgram's participants to administer the maximum shock across the paradigm's many variants [37].

However, some of the most compelling evidence that participants' administration of shocks results from their identification with Milgram's scientific goals comes from what happened after the study had ended. In his debriefing, Milgram praised participants for their commitment to the advancement of science, especially as it had come at the cost of personal discomfort. This not only inoculated them against doubts concerning their own punitive actions, but also led them to support more such actions in the future. "I am happy to have been of service," one typical participant responded, "Continue your experiments by all means as long as good can come of them. In this crazy mixed up world of ours, every bit of goodness is needed" (SA Haslam, SD Reicher, K Millward, R MacDonald, unpublished data).


Conclusion

The banality of evil thesis shocks us by claiming that decent people can be transformed into oppressors as a result of their "natural" conformity to the roles and rules handed down by authorities. More particularly, the inclination to conform is thought to suppress oppressors' ability to engage intellectually with the fact that what they are doing is wrong.

Although it remains highly influential, this thesis loses credibility under close empirical scrutiny. On the one hand, it ignores copious evidence of resistance even in studies held up as demonstrating that conformity is inevitable [17]. On the other hand, it ignores the evidence that those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists [45].

What was truly frightening about Eichmann was not that he was unaware of what he was doing, but rather that he knew what he was doing and believed it to be right. Indeed, his one regret, expressed prior to his trial, was that he had not killed more Jews [19]. Equally, what is shocking about Milgram's experiments is that rather than being distressed by their actions [46], participants could be led to construe them as “service” in the cause of “goodness.”

To understand tyranny, then, we need to transcend the prevailing orthodoxy that this derives from something for which humans have a natural inclination—a “Lucifer effect” to which they succumb thoughtlessly and helplessly (and for which, therefore, they cannot be held accountable). Instead, we need to understand two sets of inter-related processes: those by which authorities advocate oppression of others and those that lead followers to identify with these authorities. How did Milgram and Zimbardo justify the harmful acts they required of their participants and why did participants identify with them—some more than others?

These questions are complex and full answers fall beyond the scope of this essay. Yet, regarding advocacy, it is striking how destructive acts were presented as constructive, particularly in Milgram's case, where scientific progress was the warrant for abuse. Regarding identification, this reflects several elements: the personal histories of individuals that render some group memberships more plausible than others as a source of self-definition; the relationship between the identities on offer in the immediate context and other identities that are held and valued in other contexts; and the structure of the local context that makes certain ways of orienting oneself to the social world seem more “fitting” than others [41],[47],[48].

At root, the fundamental point is that tyranny does not flourish because perpetrators are helpless and ignorant of their actions. It flourishes because they actively identify with those who promote vicious acts as virtuous [49]. It is this conviction that steels participants to do their dirty work and that makes them work energetically and creatively to ensure its success. Moreover, this work is something for which they actively wish to be held accountable—so long as it secures the approbation of those in power.


References

1. Strachan H (1983) European armies and the conduct of war. London: Unwin Hyman. p.3.
2. Kelman HC, Hamilton VL (1990) Crimes of obedience. New Haven: Yale University Press.
3. Novick P (1999) The Holocaust in American life. Boston: Houghton Mifflin.
4. Jetten J, Hornsey MJ (Eds.) (2011) Rebels in groups: dissent, deviance, difference and defiance. Chichester, UK: Wiley-Blackwell.
5. Parker I (2007) Revolution in social psychology: alienation to emancipation. London: Pluto Press. p.84.
6. Smith JR, Haslam SA (Eds.) (2012) Social psychology: revisiting the classic studies. London: Sage.
7. Turner JC (1991) Social influence. Buckingham, UK: Open University Press.
8. Turner JC, Hogg MA, Oakes PJ, Reicher SD, Wetherell MS (1987) Rediscovering the social group: a self-categorization theory. Oxford: Blackwell.
9. Milgram S (1963) Behavioral study of obedience. J Abnorm Soc Psych 67: 371–378.
10. Milgram S (1974) Obedience to authority: an experimental view. New York: Harper & Row.
11. Blass T (2004) The man who shocked the world: the life and legacy of Stanley Milgram. New York: Basic Books.
12. Russell NJ (2011) Milgram's obedience to authority experiments: origins and early evolution. Br J Soc Psychol 50: 140–162.
13. Haney C, Banks C, Zimbardo P (1973) A study of prisoners and guards in a simulated prison. Nav Res Rev September: 1–17. Washington (D.C.): Office of Naval Research. p.11.
14. Zimbardo P (2007) The Lucifer effect: how good people turn evil. London: Random House.
15. Haney C, Banks C, Zimbardo P (1973) A study of prisoners and guards in a simulated prison. Nav Res Rev September: 1–17. Washington (D.C.): Office of Naval Research. p.12.
16. Moscovici S (1976) Social influence and social change. London: Academic Press.
17. Haslam SA, Reicher SD (2012) When prisoners take over the prison: a social psychology of resistance. Pers Soc Psychol Rev 16: 152–179.
18. Arendt H (1963) Eichmann in Jerusalem: a report on the banality of evil. New York: Penguin.
19. Cesarani D (2004) Eichmann: his life and crimes. London: Heinemann.
20. Miller A (2004) What can the Milgram obedience experiments tell us about the Holocaust? Generalizing from the social psychology laboratory. Miller A, editor. The social psychology of good and evil. New York: Guilford. pp. 193–239.
21. Browning C (1992) Ordinary men: Reserve Police Battalion 101 and the Final Solution in Poland. London: Penguin Books.
22. Overy R (2011) Milgram and the historians. Psychologist 24: 662–663.
23. Helm C, Morelli M (1979) Stanley Milgram and the obedience experiment: authority, legitimacy, and human action. Polit Theory 7: 321–346.
24. Akerlof GA (1991) Procrastination and obedience. Am Econ Rev 81: 1–19.
25. Harris LT (2009) The influence of social group and context on punishment decisions: insights from social neuroscience. Gruter Institute Squaw Valley Conference, May 21, 2009. Law, Behavior & the Brain. Available at SSRN:
26. Lozowick Y (2002) Hitler's bureaucrats: the Nazi Security Police and the banality of evil. H. Watzman, translator. London: Continuum.
27. Blass T (Ed.) (2000) Obedience to authority: current perspectives on the Milgram paradigm. Mahwah (New Jersey): Erlbaum.
28. Benjamin LT, Simpson JA (2009) The power of the situation: the impact of Milgram's obedience studies on personality and social psychology. Am Psychol 64: 12–19.
29. Haslam SA, Reicher SD (2007) Beyond the banality of evil: three dynamics of an interactionist social psychology of tyranny. Pers Soc Psychol Bull 33: 615–622.
30. Vetlesen AJ (2005) Evil and human agency: understanding collective evildoing. Cambridge: Cambridge University Press.
31. Mandel DR (1998) The obedience alibi: Milgram's account of the Holocaust reconsidered. Analyse und Kritik 20: 74–94.
32. Kershaw I (1993) Working towards the Führer: reflections on the nature of the Hitler dictatorship. Contemp Eur Hist 2: 103–108.
33. Banyard P (2007) Tyranny and the tyrant. Psychologist 20: 494–495.
34. Zimbardo P (1989) Quiet rage: the Stanford prison study [video]. Stanford: Stanford University.
35. Zimbardo P (2004) A situationist perspective on the psychology of evil: understanding how good people are transformed into perpetrators. Miller A, editor. The social psychology of good and evil. New York: Guilford. pp. 21–50.
36. Milgram S (1965) Some conditions of obedience and disobedience to authority. Hum Relat 18: 57–76.
37. Reicher SD, Haslam SA, Smith JR (2012) Working towards the experimenter: reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspect Psychol Sci 7: 315–324.
38. Packer D (2008) Identifying systematic disobedience in Milgram's obedience experiments: a meta-analytic review. Perspect Psychol Sci 3: 301–304.
39. Burger JM, Girgis ZM, Manning CM (2011) In their own words: explaining obedience to authority through an examination of participants' comments. Soc Psychol Personal Sci 2: 460–466.
40. Reicher SD, Haslam SA (2011) After shock? Towards a social identity explanation of the Milgram 'obedience' studies. Br J Soc Psychol 50: 163–169.
41. Reicher SD, Haslam SA (2006) Rethinking the psychology of tyranny: the BBC prison study. Br J Soc Psychol 45: 1–40.
42. Tajfel H, Turner JC (1979) An integrative theory of intergroup conflict. Austin WG, Worchel S, editors. The social psychology of intergroup relations. Monterey (California): Brooks/Cole. pp. 33–47.
43. Packer DJ (2008) On being both with us and against us: a normative conflict model of dissent in social groups. Pers Soc Psychol Rev 12: 50–72.
44. Packer DJ, Chasteen AL (2010) Loyal deviance: testing the normative conflict model of dissent in social groups. Pers Soc Psychol B 36: 5–18.
45. Haslam SA, Reicher SD, Platow MJ (2008) The new psychology of leadership: identity, influence and power. Hove, UK: Psychology Press.
46. Milgram S (1964) Issues in the study of obedience: a reply to Baumrind. Am Psychol 19: 848–852.
47. Bruner JS (1957) On perceptual readiness. Psychol Rev 64: 123–152.
48. Oakes PJ, Haslam SA, Turner JC (1994) Stereotyping and social reality. Oxford: Blackwell.
49. Reicher SD, Haslam SA, Rath R (2008) Making a virtue of evil: a five-step model of the development of collective hate. Soc Personal Psychol Compass 2: 1313–1344.
