7 Conformity, Compliance, and Obedience
When
you think of the long and gloomy history of man, you will find more hideous
crimes have been committed in the name of obedience than have ever been
committed in the name of rebellion.
—C. P. Snow
The jury
had been impanelled to hear the case State v. Leroy Reed. Reed, a paroled
felon, had been arrested for possessing a gun. Karl, a firefighter, sat in the
jury box, carefully listening and watching. The prosecuting attorney argued
that the defendant should be found guilty of violating his parole, despite any
sympathy jurors might feel for him. The defense attorney argued that even
though Reed had bought a gun, he should not be found guilty.
According
to the defense, Reed bought the gun because he believed that it was required
for a mail-order detective course in which he had enrolled. Reed wanted to
better his life, and he thought that becoming a private detective was just the
ticket. He admired real-life detectives very much. He had told a police
detective at the county courthouse that he was learning to be a detective and
had bought a gun. The detective was incredulous and told Reed to go home and
get it. Reed did so and was promptly arrested because possessing a gun is a
criminal offense for felons. Evidence also showed that Reed was able to read at
only a fifth-grade level and probably did not understand that he was violating
his parole by purchasing a weapon. The judge told the jury that, according to
the law, they must find Reed guilty if he possessed a gun and knew that he possessed
a gun. As he went into the jury room, Karl was convinced that Reed was guilty.
After all, the prosecutor had presented sufficient evidence concerning the
points of law that, according to the judge, must be fulfilled for conviction.
Reed had bought a gun and certainly knew that he possessed that gun. As the
deliberations began, however, it became obvious that not all of the jurors
agreed with Karl.
The
results of a first-ballot vote taken by the foreperson showed that nine jurors
favored acquittal and only three, including Karl, favored conviction. After
further discussion, two of the jurors favoring conviction changed their votes.
Karl alone held firm to his belief in the defendant’s guilt. As the
deliberations progressed, the other jurors tried to convince Karl that a
not-guilty verdict was the fairer verdict. This pressure made Karl very anxious
and upset. He continually put his face in both hands and closed his eyes.
Continued efforts to persuade Karl to change his verdict failed.
After a
while, however, Karl, still unconvinced, decided to change his verdict. He told
the other jury members that he would change his verdict to not guilty but that
he “would just never feel right about it.”
Why did
Karl change his verdict, even though he did not agree with his fellow jurors?
This case, vividly brought to life in the PBS film Inside the Jury Room, forces
us not just to look at Karl’s behavior but also to speculate about our own.
Would each of us be as willing to compromise our beliefs in the face of a unanimous
majority who think differently? Under what conditions can our behavior be
modified by others? These questions are at the very core of what distinguishes
social psychology from other areas of psychology: the influence of others on
our behavior. In Chapter 6, we saw how persuasive arguments from others can
influence our behavior. Karl was certainly exposed to such arguments. However,
he did not accept them as a basis for changing his verdict. Rather, Karl
modified his verdict in response to the knowledge that all of his fellow jurors
believed that Leroy Reed should be found not guilty. Thus, as Karl’s case
illustrates, sometimes we modify behavior based on perceived pressure from
others rather than through a process of accepting what they say.
Like Karl,
we are often influenced by what those around us do. For example, when you are
seated in a classroom, you will note that most people are behaving similarly:
They are taking notes and listening to the professor. In social situations,
such as the classroom, the behavior of others often defines the range of
appropriate behavior. This is especially true when the situation is new or
ambiguous. What if, for example, the fire alarm rang while you were sitting in
class? Would you immediately get up and leave, or would you look around to see
what others do? Most people insist that they would get up and leave. However,
experience teaches us otherwise. If your classmates were just sitting in their
seats calmly, you probably would do the same. The social influence processes
that operate on you in the classroom situation can also be applied to
understanding situations like Karl’s changing his verdict.
In this
chapter, we explore three types of social influence: conformity, compliance,
and obedience. We ask: How does social influence sometimes cause us to do or
say things that we don’t necessarily believe in, as was the case with Karl? Why
was Karl able to hold out when there were others on his side but finally gave
in when he was the only one in favor of conviction? What other factors and
types of situations make us more or less likely to conform? When we conform, do
we always conform with the majority, or can a minority sometimes lead us to
conform to their point of view? Under what conditions do we comply with or
agree to a direct request? And, finally, what factors lead us to obey the
orders of a person in a position of authority? These are some of the questions
addressed in this chapter.
Conformity:
Going Along with the Crowd
As a
juror, Karl was placed in an uncertain position because he was receiving
conflicting input about the situation. From the judge and the prosecution, he
received a message about the law that convinced him Reed was guilty and that
his responsibility as a juror was to convict him of violating his parole. From
his fellow jurors, on the other hand, he received a different message, a
message that made him doubt this conclusion. The other jurors told him that in
their opinion, Reed should be found not guilty despite the evidence. They believed
that extenuating circumstances, including Reed’s lack of intent to commit a crime,
made a not-guilty verdict appropriate. Additionally, Karl was well aware that
he was the only juror holding out for conviction. The force brought to bear by
the social situation eventually caused Karl to change his verdict, although
privately he did not agree with most of his fellow jurors. Karl was the victim
of social influence.
If Karl
had been responsible for deciding Reed’s fate on his own, he would have convicted him. But once he was in a
social context, he had to reconsider his personal views in light of the views
of others. He yielded to group pressure even though he felt the group was
wrong. Karl’s behavior is illustrative of what social psychologists call conformity. Conformity occurs
when we modify our behavior in response to real or imagined pressure from
others. Notice that nobody directly asked or ordered Karl to change his
verdict. Instead, he responded to the subtle and not-so-subtle pressures
applied by his fellow jurors.
Informational
and Normative Social Influence
What is it
about the social situation that can cause us to change our opinion, even if we
privately feel such an opinion shift is wrong? To adequately address this
question, we need to make a distinction between two kinds of social influence:
informational and normative (Deutsch & Gerard, 1955).
Sometimes
we modify our behavior in response to information that we receive from others.
This is known as informational social influence. In many social situations,
other people provide important information through their actions and words.
Imagine yourself in the place of one of Karl’s fellow jurors,
say, the jury foreperson. You think the
defendant is guilty, but nine of your fellow jurors think the opposite. They
try to convince you of the defendant’s innocence by sharing their perceptions
of the evidence with
you. One juror may remind you of an important piece of information that you had
forgotten; another may share an interpretation of the defendant’s
behavior that had not occurred
to you. If you modify your opinion based on such new or reinterpreted
information, you are responding to informational social influence. The
persuasion process discussed in Chapter 6 illustrates informational social
influence.
This is,
in fact, what happened to the foreperson in the Reed case. Initially, he was
among the three jurors who were voting to convict. But after hearing the group
discuss the issues and the evidence, he came to see the crime and the
surrounding circumstances in a different way. Based on his reinterpretation of
the evidence, he decided to change his verdict. He did so in direct response to
what was said and how other jurors said it.
Generally,
we are subject to informational social influence because we want to be accurate
in our judgments. We use other people’s opinions as a source of information by which to test the validity of our
own judgments. We conform because we perceive that others have correct
information (Campbell & Fairey, 1989). Shifts in opinion based on
informational social influence result from the sharing of arguments and factual
information (Kaplan & Miller, 1987). Essentially, opinion and behavior
change come about via the kind of persuasion processes discussed in Chapter 6.
Conformity
also comes about as a result of normative social influence. In this type of
social influence situation, we modify our behavior in response to a norm, an
unwritten social rule that suggests what constitutes appropriate behavior in a
particular situation. Our behavior is guided not only by rational consideration
of the issue at hand but also by the discomfort we experience when we are in
disagreement with others. We are motivated to conform to norms and to the
implicit expectations of others in order to gain social acceptance and to avoid
appearing different or being rejected (Campbell & Fairey, 1989).
During
deliberations, Karl was not influenced directly by the informational content of
the jury deliberations. Instead, the fact that others disagreed with him became
crucial. The arguments and opinions expressed by the other jurors suggested to
him that the operational norm was that the law didn’t
apply in this case; Reed ought to be
acquitted despite evidence pointing to his guilt. Karl changed his verdict in
order to conform to this norm.
In a
normative social influence situation, at least two factors are relevant. First,
the input we obtain from others serves as a clue to the nature of the norm in
effect at any given time (Kaplan & Miller, 1987). Karl was surprised to
discover what the norm was in the jury room. Second, the size and unanimity of
the majority convey information about the strength of the norm in effect. As we
see later in the chapter, these two variables are important in determining the
likelihood and amount of behavior change in a social influence situation.
Although
both informational and normative social influence can exert powerful control
over our behavior, their effects are different. The changes caused by
informational social influence tend to be stronger and more enduring than those
caused by normative social influence (Burnstein & Sentis, 1981). This is
because changes caused by new information or a new interpretation of existing
information may be persuasive and
convincing.
As we saw in Chapter 6, the opinion changes that result from persuasion are
usually based on our accepting information, elaborating on it, and altering our
attitudes and behavior accordingly. This type of information processing tends
to produce rather stable, long-lasting change.
For
normative social influence to occur, we need not be convinced that our opinion
is incorrect. We respond to our perception of what we believe others want us to
do. Consequently, a change in opinion, attitude, or behavior brought about by
normative pressure is often fragile. Once normative pressure eases up, we are
likely to go back to our previous opinions. Karl went along with the other
members of the jury, but he did not really believe they were right. In fact,
Karl stated that he would go along with the majority but that he would “never
feel right about it.”
Because
norms play such an important role in our behavior, and because normative social
influence is so critical an element in conformity and other forms of social
influence, we turn now to a more detailed discussion of these important forces.
Social
Norms: The Key to Conformity
Norms play
an important role in our everyday lives. These unwritten rules guide much of
our social behavior. Humans seem to be predisposed to form norms—and conform to
them—even in the most minimal situations. Norms exist on many levels, ranging
from broad cultural norms to smaller-scale, situation-specific norms. We have
cultural norms for how close we stand to another person when talking, for how
men and women interact in business settings, and for the clothing we wear. We
have situation-specific norms for how to behave in class or in the courtroom.
Violating
norms makes us uncomfortable. We are embarrassed if we show up at a wedding
reception in casual dress and find everyone else dressed formally, or if we go
to tennis camp in tennis whites only to discover everyone else wearing the camp
T-shirt. In general, standing out from the crowd, being the only different one,
is something human beings don’t like.
To get a better
idea of how norms develop and how normative social influence works, imagine
that you are taking part in an experiment. You are sitting in a totally dark
room waiting for a point of light to appear on the wall across from where you
are sitting. After the light is shone, you are asked to judge how far the light
moved (in inches). In fact, unknown to you, the light is stationary and only
appears to move, a phenomenon called the autokinetic effect. If asked to make
successive judgments of the amount of movement that you perceive, what will
occur? Will your judgments vary widely, or will they show some consistency? If
you have to do the same task with two others, will your judgments remain
independent or blend with those of the others?
These
questions were asked by Sherif (1936, 1972) in his classic studies on norm
formation. When participants did the task alone, Sherif found that their
judgments eventually reflected some internalized standard that put a limit on
their estimates of how far the light moved. That is, rather than being
haphazard, individual participants showed evidence of establishing a range and
norm to guide their judgments. When these participants were then placed within
a group context, the individualized ranges and norms blended into a single
group norm. The results from this experiment showed that subjects who did the
task alone showed a wide range of judgments (from 1 inch to 7.5 inches). But
after three sessions in which the individuals judged the distance in groups,
their judgments converged, producing a funnel-shaped graph. According to
Sherif, this convergence shows that the group, without specific instructions to
do so, developed a group norm. Interestingly, this group norm was found to
persist even when the participants were brought back to do the task again a
year later.
Classic
Studies in Conformity
The
convergence of judgments shown in Sherif’s study should not be surprising. The
autokinetic effect is an illusion, so the task was ambiguous, depending on
subjective estimates of how far the light traveled.
demonstrating conformity. But what happens if the task is less ambiguous? Do
participants still conform to a group norm? Or do they maintain their
independence? These are some of the questions Solomon Asch addressed in a
now-classic series of experiments (1951, 1955, 1956).
The Asch
Paradigm
Imagine
that you have signed up for an experiment investigating perceptual judgments.
When you arrive at the lab, you find that several other participants are
already present. You take the only remaining seat. You are told that the
experiment involves judging the length of lines presented on a card at the
front of the room. You are to look at each of three lines and decide which one matches
a standard presented to the left (Figure 7.1). The experimenter tells you that
each of you will give your judgment orally one after another. Because you are
in the last chair you will give your judgment last.
Figure 7.1 A line-judgment task that might have been used by Asch in his conformity
experiments. The participant was required to pick a line from the right that
matched the standard line on the left.
The
experiment begins uneventfully. Each member of the group gives what you
consider the correct response, and then you give your response. But soon the
others begin to give answers you believe to be incorrect, and you must decide
what to do. Should you give the correct answer (which is obvious) or go along
with the others, who are wrong?
Before we
see what happened, let’s take a closer look at the Asch
paradigm. The “other
participants” were not really participants at all. They were confederates of
the experimenter who were instructed to give incorrect answers on several
“critical trials.” Misinformation provided by the incorrect majority places the
real participant in a dilemma. On the one hand, he has the evidence of his own
senses that tells him what the correct answer is. On the other hand, he has
information from the majority concerning what is correct. The participant is
placed in a situation in which he must decide between these two competing
sources of information. From these competing sources of information, pressure
on the participant arises.
Now, when
you are faced with a situation like the one created in the Asch experiments,
there are two ways you can test reality to determine which line really matches
the standard. You can jump up, whip out your pocket measuring tape, rush to the
front of the room, and measure the lines. This is directly testing your
perceptions against reality. However, you probably won’t
do this, because it will violate your sense of the operative social norm—how you should act in
this situation. The other way is to test the accuracy of your perceptions
against those of others through a social comparison process (Festinger, 1954).
Asch’s paradigm strongly favors doing the latter. Given that
participants in these experiments
probably will not measure the lines, what do they do about the conflict between
information from their own senses and information from the majority?
Conformity
in the Asch Experiments. Asch’s experimental paradigm placed the participant’s
own perceptions into
conflict with the opinions of a unanimous majority advocating a clearly
incorrect judgment. When confronted with the incorrect majority, Asch’s
participants made errors in the direction of the incorrect majority on over 33%
of the critical trials. Therefore, Asch showed a conformity rate of 33% on his
line-judgment task. Almost all participants knew the correct answer. When they
did the same
task alone, the error rate (mismatching the line with the
standard) was 7.4%, roughly one-fourth the error rate when other participants were
present. Yet many changed their opinions to be in conformity with the group judgment.
So, even with a simple perceptual task, an individual may abandon his or her
own judgment and go with the majority. Why would we do this? As we see next,
there are different reasons why people conform or remain independent.
Paths to Conformity and Independence. Based on his results and
interviews with participants, Asch classified them as either yielding
(conforming) or independent (nonconforming) (Asch, 1951). Of the yielding
participants, some (but relatively few) gave in completely to the majority.
These participants experienced distortion of perception and saw the majority
judgments as correct. They appeared to believe that the incorrect line was
actually the correct one. The largest group of yielding participants displayed
distortion of judgment. These participants yielded because they lacked
confidence in their own judgments—“I’m not sure anymore.” Without such
confidence, they were not able to stick with their own perceptions and remain
independent. Finally, some yielding participants experienced distortion of
action. Here, participants knew that the majority was wrong but conformed so
that they did not appear different to the other participants—“I’ll go along”
(Figure 7.2). This is what happened to Karl. Interestingly, there was a
remarkable consistency among yielding participants. Once bound to the majority,
they stayed on the path of conformity.
Of the independent participants, about 25% remained totally
independent, never agreeing with the incorrect majority (Asch, 1955). These
participants had a great deal of confidence in their own judgments and
withstood the pressure from the majority completely. Other independent
participants remained so because they felt a great need to remain self-reliant;
still others remained independent because they wanted to do well on the task.
Asch’s interviews tell us that there are many paths to
conformity or independence. Some participants remain independent because they
trust their own senses, whereas others remain independent because they feel a
great need to do so. These latter participants appear to remain independent
because of psychological reactance (Brehm, 1966).
Figure 7.2 Based on postexperimental interviews, Asch determined
that there was no one path to conformity. Different participants conformed for
different reasons.
As
described in Chapter 6, psychological reactance occurs when individuals feel
that their freedom of choice or action is threatened because other people are
forcing them to do or say things (Brehm & Brehm, 1981). To reestablish
independence, they reject the majority’s pressure and go their own way. Even
when individuals choose to remain independent,
however, they still feel the pressure the incorrect majority exerts. Resisting
the pressure of the majority is not easy. Independent participants can
withstand that pressure and stick with their own perceptions.
How Does
Social Influence Bring About Conformity?
What is it
about social influence situations that causes conformity? When your opinion is
different from that of a unanimous majority, you are faced with a dilemma. On
the one hand, your senses (or belief system) suggest one thing; on the other,
the social situation (the majority) suggests something quite different. Placed
in such a situation you experience conflict, which is psychologically
uncomfortable (Moscovici, 1985). When you grapple with this conflict, your
tendency is to pay attention to the views of the majority. Once the majority
influence is removed, however, attention is focused back on the stimulus (e.g.,
the judgment of lines in the Asch studies), and you will return to your previous
judgments (Moscovici, 1985).
The
effects of dividing attention between the majority and the stimulus were
demonstrated in a study in which participants were asked to judge how similar
two noises were in volume (Tesser, Campbell, & Mickler, 1983). Participants
performed this task under conditions of high social pressure, when three
members of a majority disagreed with the participant’s
evaluation of the noise, or under
conditions of low social pressure, when only one person disagreed. Under high
social pressure, participants responded by either attending very little or
attending a great deal to the stimulus to be judged. Under low social pressure,
participants paid a moderate amount of attention to the stimulus.
Researchers
speculated that high social pressure would lead to high levels of arousal. This
arousal is due to the competing tendencies to pay attention both to the
stimulus and to the source of social influence, other people. The net result is
that a person will default to his or her dominant way of behaving. Those who
have a strong tendency to conform may resolve the conflict by adopting the view
of the majority. Others less prone to the effects of social influence may
increase their attention to the stimulus as a way to resolve the conflict. By
focusing on the stimulus, they take their minds off the social pressure. Like
Karl in the jury room, some participants in the Asch studies actually put their
hands over their ears or eyes so that they did not hear or see what other
people said. This was the only way they could resist conforming.
Another
way to approach this question is to examine the effects of consensus, or
agreement with others, on our perceptions and behavior. Attitudes and behavior
that are in line with those of others are a powerful source of social
reinforcement. We like it when our attitudes and behaviors are verified. The
perception that our beliefs have social support is related to higher levels of
self-esteem (Goodwin, Costa, & Adonu, 2004). Additionally, we are quicker
to express an attitude that has consensual support than one that flies in the
face of the majority. This is known as the minority slowness effect (Bassili,
2003). The larger the majority, the faster we will be willing to express a view
that is in line with that majority (Bassili, 2003). It matters little whether
the attitudes are important to us (e.g., political attitudes) or less important
(e.g., foods we like); we are slower to express attitudes that deviate from the
majority than those that do not (Bassili, 2003).
It is well
known that we tend to match our attitudes and behaviors to those of others
(Prentice & Miller, 1993). Social norms, once they become popular, take on
a life of their own and become “self-replicating” (Conway & Schaller,
2005). Conway and Schaller offer two explanations for the influence of
consensus on behavior. First is just plain old conformity, rooted in our desire
not to be different from others, as demonstrated by the Asch experiments.
Second, the attitudes and behaviors of others provide us with important
information about the world and supply “social proof” for the consensually
accepted beliefs. In other words, we tend to flock to attitudes and behaviors that
are widely accepted. So, not only are we repelled by the prospect of being
outcasts among our peers, but we are also attracted to those who hold beliefs
with which we agree.
Factors
That Affect Conformity
We have
established that the opinions of others can alter our behavior. However, we
have not yet explored how variables such as the nature of the task, the size of
the majority, and the effect of one other person in agreement work to affect
conformity. Next, we explore several variables relating to the amount of
conformity observed in social influence situations.
Nature of
the Task
The first
variable that can affect the amount of conformity observed relates to the task
itself. One variable affecting conformity rates is the ambiguity of the task.
As the task facing the individual becomes more ambiguous (i.e., less obvious),
the amount of conformity increases (Crutchfield, 1955). Asch’s
task was a simple one, involving the judgment of the length of lines, and
produced a conformity rate of about 33%. Conformity research conducted with more
ambiguous stimuli shows even higher levels of conformity. For example, Sherif’s
(1936) experiment on norm formation using the autokinetic effect (an extremely ambiguous task)
found conformity rates of about 70%.
Other
research involving attitudinal issues with no clear right or wrong answer
produced conformity rates similar to Sherif’s. In one study,
highly independent professionals such as army officers and expert engineers
were led to believe that other professionals had answered an opinion item
differently than they had (Crutchfield, 1955). For example, colonels in the army were
told that other colonels had agreed with the item
“I often
doubt that I would make a good leader.” Now, this is blasphemy for army
officers, who are trained to lead. Yet when faced with a false majority, 70% of
the officers said they agreed with that item. Privately, they disagreed
strongly.
The type
of task faced by a group may also determine the type of social influence
(informational or normative) that comes into play. For example, informational
social influence should be strongest when participants face an intellective
issue, in which they can use factual information to arrive at a clearly correct
answer (Kaplan & Miller, 1987). Normative social influence should be more
crucial on a judgmental issue. A judgmental issue is based on moral or ethical
principles, where there are no clear-cut right or wrong answers. Therefore,
resolution of the issue depends on opinion, not fact. In a jury simulation
study investigating the use of informational and normative social influence,
Kaplan and Miller (1987) impanelled six-person juries to judge a civil lawsuit.
The juries were required to award the plaintiff compensatory damages and
punitive damages. Compensatory damages are awarded to reimburse the plaintiff
for suffering and losses due to the defendant’s behavior.
Generally, awarding compensatory damages is
a fact-based intellective task. If, for example, your lawn mower blows up
because the No Pain, No Gain Lawn Mower Company put the gas tank in the wrong
place, it is easy for the jury to add up the cost of the mower plus whatever
medical costs were incurred. Punitive damages, on the other hand, are awarded
to deter the defendant from repeating such actions in the future. The issue of
awarding punitive damages is a judgmental task. How much should you punish the
manufacturer so that it ceases making mowers that blow up?
The
results of the study indicated that juries doing an intellective task (awarding
compensatory damages) were more likely to use informational social influence
than normative social influence. When the task has a clear standard, then it is
the information that majority members can bring forth that convinces other
jurors. Juries doing a judgmental task, on the other hand, were more likely to
use normative influence. Where there is no clear-cut answer, the jurors in the
majority try to convince the minority to
agree by pressuring them to conform to the group (majority) decision.
The Size
of the Majority
The size of
the majority also affects conformity rates. As the size of the majority
increases, so does conformity, up to a point (Asch, 1951, 1956; Milgram,
Bickman, & Berkowitz, 1969). Generally, as shown in Figure 7.3, there is a
nonlinear relationship between the size of the majority and conformity. That
is, majority influence significantly increases until some critical majority
size is reached. After that, the addition of more majority members does not
significantly increase conformity. For example, Milgram and colleagues (1969)
found that increasing the number of individuals (confederates of the
experimenter) on a sidewalk who looked upward toward the sky increased
conformity (the percentage of passersby looking upward) up to a majority size
of five and then leveled off (see Figure 7.3).
There is
no absolute critical size of a majority after which addition of majority
members does not significantly increase conformity. Milgram and colleagues
found that conformity leveled off after a majority size of five. Asch (1951),
using his line-judgment task, found that conformity leveled off after a
majority size of three. Regardless of the critical size of the majority, the
general nonlinear relationship between majority size and conformity is firmly
established.
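The shape of this relationship can be sketched numerically. The function below is purely a hypothetical illustration of the curve in Figure 7.3: the ceiling value, the critical majority size, and the diminishing increments are invented for the sketch, not data from Asch or Milgram.

```python
# Toy model (hypothetical numbers): conformity rises steeply with
# majority size up to a critical size, then levels off.

def conformity_rate(majority_size, ceiling=0.35, critical_size=3):
    """Illustrative conformity rate as a function of majority size.

    Members up to the critical size each add a large, equal share;
    members beyond it add a rapidly shrinking, near-negligible amount.
    """
    if majority_size <= 0:
        return 0.0
    # Steep rise up to the knee of the curve.
    rate = ceiling * min(majority_size, critical_size) / critical_size
    # Tiny, halving gains for each member past the critical size.
    extra = max(0, majority_size - critical_size)
    rate += 0.01 * (1 - 0.5 ** extra)
    return round(rate, 3)

for n in [1, 2, 3, 5, 9, 15]:
    # Climbs quickly to the knee at n = 3, then barely moves.
    print(n, conformity_rate(n))
```

Running the loop shows the nonlinear pattern the studies describe: large jumps from one to three majority members, then essentially flat thereafter, whatever the exact critical size turns out to be in a given paradigm.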
Figure 7.3
The effect of majority size on conformity. Conformity initially increases but
eventually levels off.
Why does
conformity level off after some critical majority size? Two explanations have
been suggested (Baron, Kerr, & Miller, 1992). First, as majority members
are added beyond the critical point, the individual in the conformity situation
might suspect that the additional majority members are going along to avoid
making trouble in the group. If the individual conformer perceives this to be
the motive for joining the majority, the power of the additional majority
members is reduced. Second, as the size of the majority grows, each new
majority member is probably noticed less. That is, the individual is more
likely to notice a third person added to a majority of two than to notice a
tenth person added to a majority of nine.
Increases
in the size of a majority are most likely to produce increased conformity in
normative social influence situations, when the situation causes us to question
our perceptions and judgments (Campbell & Fairey, 1989). When a majority is
arrayed against us, and we cannot obtain adequate information about the stimuli
that we are to judge, we conform. This is exactly what happened in Asch’s
experiment.
Normative
social influence also produces conformity when a judgment is easy and the
individual is sure the group is wrong but cannot resist the pressure of the
majority. This is what happened to Karl in the jury room. Informational
influence was nil. The other jurors could not offer any information that Karl
did not have already. They did not dispute the evidence. They made the judgment
that the law, not the evidence, was wrong. The jurors wanted Karl to conform to
this norm. Eventually, as we know, he did.
When you
know you are right and the rest of the group is wrong, more conformity results
when the majority comprises three members than if it comprises only one
(Campbell & Fairey, 1989). This makes sense because it is normative
influence that is operating in this situation. But what if you are not certain
whether the majority is right or wrong? In this case, you search for
information that could inform your decision, information that will help you
make the right choice. It is informational influence that counts here. Just a
few people, perhaps even one person, can convince you through informational
social influence if their information is persuasive (Campbell & Fairey,
1989).
Having a
True Partner
Often the
changes caused by the forces producing conformity are fragile and easily
disrupted. This is the case when we find that there is another person who
supports our perceptions and actions in a given social situation. Imagine, for
example, that you have been
invited to
a black-tie wedding reception at a posh country club on a Saturday night. When
an invitation specifies black-tie, the norm is for men to wear tuxedos and
women to wear formal dresses. Now, suppose that you don’t
want to dress so
formally but feel you should because everyone else will (normative social
influence). But then suppose that you speak to a friend who is also attending
and who also doesn’t want to wear a tuxedo or a formal dress. The two of you agree to wear
less-formal attire, and you feel comfortable with your decision. The next
weekend, you are invited to another black-tie party, but this time your friend
is not attending. What will you do this time? You decide to dress formally.
This
example illustrates an important social psychological phenomenon. The true
partner effect occurs when we perceive that there is someone who supports our
position; we are then less likely to conform than if we are alone facing a
unanimous majority. This effect was first demonstrated empirically by Asch
(1951). In one variation of his experiment, Asch had a true partner emerge at
some point during his conformity experiment. On a given trial, the true partner
would break with the incorrect majority and support the real participant’s
judgments. The results of this manipulation were striking: Conformity was cut
by nearly 80%! As in the example of the black-tie parties, when we have a true
partner, we are better able to withstand the strong forces of normative social
influence.
Why does this occur? There are many possible explanations. For
example, when we violate a norm by ourselves, we draw attention to ourselves as
deviant. Recall that some of Asch’s participants conformed because they did not
want to appear different. Apparently, it makes us very uncomfortable to be
perceived by others as different. When we have a true partner, we can diffuse the
pressure by convincing ourselves that we are not the only ones breaking a norm.
Another explanation for the true partner effect draws on the
social comparison process (Festinger, 1954; Kruglanski & Mayseless, 1990).
As discussed in Chapter 2, social comparison theory proposes that we compare
our thoughts, beliefs, and actions with those of others to find out if we are
in agreement. When we find that we agree, we feel validated; it is rewarding
when we receive such confirmation. Our confidence in our beliefs increases
because they are shared with others.
Think back to the second black-tie party. Without a true
partner, you bring your behavior into line with the norm in effect: wearing
formal attire. Asch (1951) found the very same thing when he had the true
partner withdraw his support of the participant. When the participant was
abandoned, his conformity went back up to its previous level.
The true partner effect applies in jury deliberations; we saw
that Karl experienced great distress when he was the only one holding out for
conviction. Earlier in the deliberations, Karl had other jurors (true partners)
who supported his view. When those jurors changed their votes, their support
for Karl disappeared. Now, Karl faced not only a unanimous majority but also one
that included two former true partners. Would things have turned out
differently if one other juror had stuck with Karl? Perhaps. The courts have
acknowledged that conformity pressures are greater when a person is the single
advocate of a particular point of view.
Gender and Conformity
Besides investigating situational forces that affect
conformity, social psychologists have investigated how individual
characteristics affect conformity. Early research suggested that women were
more likely to conform than men (Eagly & Carli, 1981). For example, 43% of
the studies published before 1970 reported this phenomenon, in contrast to only
21% published after 1970. Did changes in the cultural climate make women less
likely to conform? Or did early conformity studies have a male bias, as
expressed in male-oriented tasks and a predominantly male environment? Research
indicates that the nature of the task was not important in producing the
observed gender differences, but the gender of the experimenter was. Generally,
larger gender differences are found when a man runs the conformity experiment.
No gender differences are found when a woman runs the experiment (Eagly &
Carli, 1981).
An analysis of the research also shows that there are
conditions under which women are more likely to conform than men and others
under which men are more likely to conform than women (Eagly & Chrvala,
1986). For example, the tendency for women to conform more than men is stronger
in group pressure situations—that is, under conditions of normative social
influence—than in persuasion situations, where informational social influence
is being applied (Eagly, 1978; Eagly & Carli, 1981).
Two explanations have been proposed for gender differences in
conformity (Eagly, 1987). First, gender may serve as a status variable in newly
formed groups. Traditionally, the female gender role is seen as weaker than the
male role. In everyday life, men are more likely than women to hold positions
of high status and power. Men are more likely to be in the position of
“influencer” and women in the position of “influencee.” The lower status of
the female role may contribute to a greater predisposition to conform on the
part of women, especially in group pressure situations. Second, women tend to
be more sensitive than men to conformity pressures when their behavior is under
surveillance—that is, when they have to state their opinions publicly (Eagly,
Wood, & Fishbaugh, 1981). When women must make their opinions public, they
are more likely than men to conform. In the Asch paradigm, participants were
required to state their opinions publicly; this favors women conforming more
than men.
Historical and Cultural Differences in Conformity
Asch conducted his classic experiment on conformity during the
1950s in the United States. The sociocultural climate that existed at the time
favored conformity. The country was still under the influence of “McCarthyism,”
which questioned individuals who did not conform to “normal” American ideals.
This climate may have contributed in significant ways to the levels of
conformity Asch observed (Larsen, 1982; Perrin & Spencer, 1981).
Researchers working in England failed to obtain conformity effects as strong as
those Asch had obtained (Perrin & Spencer, 1981). This raised a question:
Were the Asch findings limited to a particular time and culture?
Unfortunately, this question has no simple answer. Evidence
suggests that within the United States, rates of conformity vary with the
sociopolitical climate (Larsen, 1974, 1982). The conformity rate in the early
1970s was 62.5% (that is, 62.5% of participants conformed at least once in an
Asch-type experiment) compared to a rate of 78.9% during the early 1980s
(Larsen, 1982). Compare this to Asch’s (1956) rate of 76.5%. Results like these
suggest that conformity rates may be tied to the cultural climate in force at
the time of a study.
The evidence for cross-cultural influences is less clear. A
host of studies suggest that conformity is a fairly general phenomenon across
cultures. Conformity has been demonstrated in European countries such as
Belgium, Holland, and Norway (Doms & Van Avermaet, 1980; Milgram, 1961;
Vlaander & van Rooijen, 1985) as well as in non-Western countries such as
Japan, China, and some South American countries (Huang & Harris, 1973;
Matsuda, 1985; Sistrunk & Clement, 1970). Additionally, some research
suggests that there may be cross-cultural differences in conformity when North
Americans are compared to non–North Americans (see Furnham, 1984, for a review)
and across other non–North American cultures (Milgram, 1961). Differences in
conformity in Asian cultures (Korean versus Japanese) have also been found
(Park, Killen, Crystal, & Wanatabe, 2003).
What is the bottom line? It is safe to say that the Asch
conformity effect is fairly general across cultures. However, some cultural
groups may conform at different levels than others. It also seems evident that
cultural groups should not be seen as uniform in conformity. Conformity
also appears to fluctuate in magnitude over time within a culture.
Minority Influence
In the classic film Twelve Angry Men, Henry Fonda portrayed a
juror who was firmly convinced that a criminal defendant was not guilty. The
only problem was that the other 11 jurors believed the defendant was guilty. As
the jurors began to deliberate, Fonda held fast to his belief in the defendant’s
innocence. As the film progressed, Fonda convinced each of the other 11 jurors
that the defendant was innocent. The jury finally returned a verdict of not
guilty.
In this
fictional portrayal of a group at work, a single unwavering individual not only
was able to resist conformity pressure but also convinced the majority that
they were wrong. Such an occurrence would be extremely rare in a real trial
(Kalven & Zeisel, 1966). With an 11 to 1 split, the jury would almost
always go in the direction of the majority (Isenberg, 1986; Kalven &
Zeisel, 1966). The film, however, does raise an interesting question: Can a
steadfast minority bring about change in the majority? For almost 35 years
after Sherif’s original experiments
on norm formation, this question went unanswered. It was not until 1969 that
social psychologists began to investigate the influence of the minority on the
majority. This line of investigation has been pursued more by European social
psychologists than American social psychologists.
Can a
Minority Influence the Majority?
In the
first published experiment on minority influence, researchers devised an
Asch-like conformity situation. Participants were led to believe that they were
taking part in a study on color perception (Moscovici, Lage, & Naffrechoux,
1969). Participants were shown a series of slides and asked to say the color of
the slide aloud. Unbeknownst to the real participants (four, making up the
majority), two confederates (comprising the minority) had been instructed to
make an error on certain trials—by calling a blue slide green, for example.
Researchers found that 8.42% of the judgments made by the real participants
were in the direction of the minority, compared to only 0.25% of the judgments
in a control condition in which there was no incorrect minority. In fact, 32%
of the participants conformed to the incorrect minority. Thus, a minority can
have a surprisingly powerful effect on the majority.
In this
experiment, the minority participants were consistent in their judgments.
Researchers theorized that consistency of behavior is a strong determinant of
the social influence a minority can exert on a majority (Moscovici et al.,
1969). An individual in a minority who expresses a deviant opinion consistently
may be seen as having a high degree of confidence in his or her judgments. In
the color perception experiment, majority participants rated minority members
as more confident in their judgments than themselves. The consistent minority
caused the majority to call into question the validity of their own judgments.
What is it
about consistency that contributes to the power of a minority to influence a
majority? Differing perceptions and attributions made about consistent and inconsistent
minorities are important factors. A consistent minority is usually perceived as
being more confident and less willing to compromise than an inconsistent
minority (Wolf, 1979). A consistent minority may also be perceived as having
high levels of competence, especially if it is a relatively large minority
(Nemeth, 1986). Generally, we assume that if a number of people share a point
of view, it must be correct. As the size of the minority increases, so does
perceived competence (Nemeth, 1986).
Although
research shows that consistency increases the power of a minority to influence
a majority, consistency must be carefully defined. Will a minority that adopts
a particular view and remains intransigent be as persuasive as one that is more
flexible? Two styles of consistency have been distinguished: rigid and
negotiating (Mugny, 1975). In the rigid style, the minority advocates a
position that is counter to the norm adopted by the majority but is unwilling
to show flexibility. In the negotiating style, the minority, although remaining
consistent, shows a willingness to be flexible. Each of these styles
contributes to the minority’s image in the eyes of the majority
(Mugny, 1975). The
rigid minority is perceived in a less positive way than a negotiating minority,
perhaps leading to perceptions that the rigid minority’s
goal is to block the majority. Conversely, the negotiating minority may be perceived as having
compromise as its goal.
Generally,
research suggests that a more flexible minority has more power to influence the
majority than a rigid one, as long as the perception of minority consistency
remains (Mugny, 1975; Nemeth, Swedlund, & Kanki, 1974). The perception of
the minority is also partially dependent on the degree to which it is willing
to modify its position in response to new information. A minority that adapts
to new information is more influential than a minority that holds a position
irrespective of any additional information (Nemeth et al., 1974).
A minority
also has more power to influence the majority when the majority knows that
people have switched to the minority viewpoint. The effect, however, leveled
off after three defections from the majority (Clark, 1999). Clark concluded
that minority influence depended on the quality of the arguments the minority
made against the majority viewpoint and on the number of majority defections. In a
later experiment, Clark (2001) employed the “12 angry men paradigm” to further
test this effect. In the 12 angry men paradigm, jurors are exposed to arguments
opposing a majority verdict either by a single minority juror or by multiple
jurors, some of whom are members of the majority. Clark found that minority
influence increased when the original dissenting minority member was joined by
a member of the majority.
Another
interesting aspect of minority influence is that a minority is more likely to
voice a dissenting view when he or she is anonymous (e.g., via computer)
compared to face-to-face communication (McLeod, Baron, Marti, & Yoon,
1997). Interestingly, however, a minority has more power to influence a
majority in face-to-face communication. Ironically, then, those media that
enhance the likelihood of a minority voicing a dissenting opinion also decrease
the ability of the minority to influence the majority (McLeod et al., 1997). In
another ironic twist, the degree to which a majority will carefully process a
persuasive message of the minority is inversely related to the size of the
minority. The smaller the minority, the more likely it is that the majority
will carefully process the minority’s message (Martin, Gardikiotis, &
Hewstone, 2002). A majority, in contrast, needs only a 50% split to gain
compliance from a minority (Martin et al., 2002).
Majority
and Minority Influence: Two Processes or One?
Social
influence, as we have seen, operates in two directions: from majority to
minority and from minority to majority. The discovery of minority influence
raised an issue concerning the underlying social psychological processes
controlling majority and minority influence. Do two different processes control
majority and minority influence, or is there a single process controlling both?
The
Two-Process Model
Judgments
expressed by a minority may be more likely to make people think about the
arguments raised (Moscovici, 1980). This suggests that two different processes
operate: majority influence, which occurs almost exclusively on a public level,
and minority influence, which seems to operate on a private level. Majority
influence, according to the two-process approach, operates through the
application of pressure. People agree with a majority because of public
pressure, but often they really don’t accept the majority’s view on a private level. The fact
that the majority exerts great psychological pressure is reflected in the
finding that people feel very anxious when they find themselves in disagreement
with the majority (Asch, 1956; Nemeth, 1986). However, as soon as majority
pressure is removed, people return to their original beliefs. Majority
influence, in this model, is like normative influence—it does not necessarily
have a lasting effect. For example, Karl, in the Leroy Reed case, changed his
verdict in response to group pressure. However, he probably went home still
believing, deep down, that Reed should have been convicted.
Minority
influence, according to the two-process approach, operates by making people
think more deeply about the minority’s position (Nemeth, 1986). In doing so, they evaluate all the aspects of the
minority view. The majority decides to agree with the minority because they are
converted to its position (Nemeth, 1992). Minority influence is like
informational influence. The character played by Henry Fonda in Twelve Angry
Men convinced the majority members to change their votes through informational
social influence. Thus, unlike the majority influencing Karl in the Reed case
through normative pressure, Fonda changed the minds of the other jurors by
applying persuasive informational arguments.
A
Single-Process Model: Social Impact Theory
The
dual-process model suggests that there are different psychological processes
underlying majority and minority influence. A competing view, the
single-process approach to social influence, suggests that one psychological
process accounts for both majority and minority influence. The first theory
designed to explain majority and minority influence with a single underlying
process was proposed by LatanĂ© (LatanĂ©, 1981; LatanĂ© & Wolf, 1981). LatanĂ©’s
social impact theory suggests that social influence processes are the result of
the interaction between the strength, immediacy, and number of influence
sources. This model can be summed up by the formula:

Influence = Æ’(SIN)

where S represents the strength of the source of the influence, I represents
the immediacy (or closeness) of the source of influence, and N represents the
number of influence sources.
Latané
(1981) suggested an analogy between the effect of social influence and the
effect of lightbulbs. If, for example, you have a bulb of a certain strength
(e.g., 50 watts) and place it 10 feet from a wall, it will cast light of a
given intensity against the wall. If you move the bulb closer to the wall
(immediacy), the intensity of the light on the wall increases. Moving it
farther from the wall decreases the intensity. Increasing or decreasing the
wattage of the bulb (the strength of the source) also changes the intensity of
the light cast on the wall. Finally, if you add a second bulb (number), the
intensity of light will increase. Similarly, the amount of social influence
increases if the strength of a source of influence is increased (e.g., if the
source’s credibility
is enhanced), if the source’s immediacy is increased, or if the
number of influence
sources is increased.
Latané
also suggested that there is a nonlinear relationship between the number of
sources and the amount of influence. According to Latané, adding a second
influence source to a solitary source will have greater impact than adding the
101st source to 100 sources. Social impact theory predicts that influence
increases rapidly between zero and three sources and then diminishes beyond
that point, which is consistent with the research on the effects of majority
size.
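Under the common simplifying assumption that impact is multiplicative in strength and immediacy and grows as a power of N with an exponent below 1 (the 0.5 used here is an illustrative choice, not a value fixed by the theory), the formula and its diminishing returns can be sketched as:

```python
def social_impact(strength, immediacy, n_sources, exponent=0.5):
    """Sketch of LatanĂ©'s Influence = f(SIN): impact grows with source
    strength, immediacy, and number of sources, but each added source
    counts less because the exponent is below 1. The 0.5 exponent is
    an assumption for illustration."""
    return strength * immediacy * (n_sources ** exponent)

# Adding a 2nd source to 1 raises impact more than adding a 101st to 100.
gain_second = social_impact(1.0, 1.0, 2) - social_impact(1.0, 1.0, 1)
gain_101st = social_impact(1.0, 1.0, 101) - social_impact(1.0, 1.0, 100)
print(gain_second > gain_101st)  # prints True
```

Note that raising strength or immediacy scales the whole curve up, mirroring the lightbulb analogy: a brighter bulb or a closer bulb casts more light, and so does an extra bulb, but each extra bulb matters less than the one before.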
Social
impact theory can be used to account for both minority and majority influence
processes. In a minority influence situation, social influence forces operate
on both the minority and majority, pulling each other toward the other’s
position (Latané, 1981).
Latané
suggested that minority influence will depend on the strength, immediacy, and
number of influence sources in the minority, just as in majority influence.
Thus, a minority of two should have greater influence on the majority than a
minority of one, a prediction that has received empirical support (Arbuthnot
& Wayner, 1982; Moscovici & Lage, 1976).
An
experiment by Hart, Stasson, and Karau (1999) provides support for the social
impact explanation for minority influence. In their experiment, Hart et al.
varied the strength of the minority source (high or low) and the physical
distance between the minority member and majority (near or far). The results
showed that in the “near” condition the high- and low-strength minority had
equivalent levels of influence. However, in the “far” condition, the
low-strength source had little influence whereas the high-strength minority had
a strong influence. So, two factors included in social impact theory affect the
amount of minority influence.
Although
there is still a measure of disagreement over the exact mechanisms underlying
minority influence, it is fair to say that there is more support for the
single-process model. However, there is also evidence supporting the
dual-process model.
Compliance:
Responding to a Direct Request
Compliance
occurs when you modify your behavior in response to a direct request from another
person. In compliance situations, the person making the request has no power to
force you to do as he or she asks. For example, your neighbor can ask that you
move your car so that she can back a truck into her driveway. However, assuming
your car is legally parked, she has no legal power to force you to move your
car. If you go out and move your car, you have (voluntarily) complied with her
request. In this section, we explore two compliance strategies: the
foot-in-the-door technique and the door-in-the-face technique. We start by
looking at the foot-in-the-door technique.
Foot-in-the-Door
Technique
Imagine
that you are doing some shopping in a mall and a person approaches you. The
solicitor asks you to sign a petition condemning drunk driving. Now most people
would be happy to sign such a petition. After all, it is for a cause that most
people support, and it takes a minimal amount of effort to sign a petition.
Imagine further that you agree to this initial request and sign the petition.
After you sign the petition, the solicitor then asks you for a $5 donation to
PADD (People Against Drunk Driving). You find yourself digging into your wallet
for a $5 bill to contribute.
Consider
another scenario. You are again in the mall doing some shopping, when a person
from PADD approaches you and asks you for a $5 donation to help fight drunk
driving. This time, instead of digging out your wallet, you tell the solicitor
to hit the road, and you go back to your shopping.
These two
scenarios illustrate a common compliance effect: the foot-in-the-door technique
(FITD). In the first scenario, you were first asked to do something small and
effortless, to sign a petition. Next, you were asked for a donation, a request
that was a bit more costly than simply signing a petition. Once you agreed to
the first, smaller request, you were more inclined to agree to the second,
larger request. This is the essence of the FITD technique. When people agree to
a small request before a larger one is made, they are more likely to agree to
the larger request than if the larger request were made alone.
In the
experiment that first demonstrated the FITD technique (Freedman & Fraser,
1966), participants were contacted in their homes by a representative of a
fictitious marketing research company under four separate conditions: (1) Some
participants were asked if they would be willing to answer a few simple
questions about the soap products used in their households (a request to which
most participants agreed). The questions were asked only if the participant
agreed. This was called the “performance” condition. (2) Other participants
were also asked if they would be willing to answer a few simple questions, but
when they agreed, they were told that the company was simply lining up
participants for a survey and that they would be contacted later. This was
called the “agree-only” condition. (3) Still other participants were contacted,
told of the questionnaire, and told that the call was merely to familiarize
people with the marketing company. This was the “familiarization” condition.
(4) A final group of participants was contacted only once. This was the
single-contact (control) condition.
Participants
in the first three conditions were called again a few days later. This time a
larger request was made. The participants were asked if they would allow a team
of five or six people to come into their homes for 2 hours and do an inventory
of soap products. In the single-contact condition, participants received only
this request. The results of the experiment, shown in Figure 7.4, were
striking. Notice that over 50% of the subjects in the performance condition
(which is the FITD technique) agreed to the second, larger request, compared to
only about 22% of the subjects in the single-contact group. Notice also that
simply agreeing to the smaller request or being familiarized with the company
was not sufficient to significantly increase compliance with the larger
request. The FITD effect occurs only if the smaller task is actually performed.
Since this
seminal experiment, conducted in 1966, many other studies have verified the
FITD effect. It even works in an online environment using web pages to make the
small and large requests (Guéguen & Jacob, 2001). Researchers quickly
turned their attention to investigating the underlying causes for the effect.
Figure 7.4
Compliance to a large request as a function of the nature of an initial,
smaller request. The highest level of compliance for a large request was
realized after participants performed a smaller request first, illustrating
the foot-in-the-door technique.
Why It
Works: Three Hypotheses
One
explanation for the FITD effect is provided by self-perception theory (Bem,
1972). Recall from Chapter 6 that we sometimes learn about ourselves from
observing our own behavior and making inferences about the causes for that
behavior. According to the self-perception hypothesis, the FITD works because
agreeing to the first request causes changes in our perceptions of ourselves.
Once we agree to the smaller, original request, we perceive ourselves as the
type of person who gives help in that particular situation, and thus we are
more likely to give similar help in the future.
In a
direct test of the self-perception explanation, Burger and Caldwell (2003) paid
some participants $1 to sign a petition supporting aid to the homeless (the
initial request in a FITD procedure). Other participants received a bookmark
that said “It’s great to see someone
who cares about people in need” (self-concept enhancement). Two days later
participants received a telephone call asking them to volunteer time to sort
items at a food bank to help the homeless. The results showed that participants
in the enhancement condition were more likely to agree to the second request
than those who were paid $1. Burger and Caldwell explain that those in the
enhancement condition showed a shift in their self-perception toward perceiving
themselves as helping individuals. Those paid $1 did not show such a shift.
Generally, other research has provided support for the self-perception
explanation for the FITD technique (DeJong, 1979; Goldman, Seever, &
Seever, 1982; Snyder & Cunningham, 1975).
Originally
it was believed that merely agreeing to any initial request was sufficient to
produce the FITD effect. However, we now know differently. The FITD effect
works when the initial request is sufficiently large to elicit a commitment
from an individual and the individual attributes the commitment to internal,
dispositional factors. That is, the person reasons, “I am the type of person
who cooperates with people doing a market survey” (or contributes to PADD, or
helps in particular types of situations).
Although
self-perception theory has been widely accepted as an explanation for the FITD
effect, another explanation has also been proposed. This is the perceptual
contrast hypothesis, which suggests that the FITD effect occurs because the
smaller, initial request acts as an “anchor” (a standard of reference) against
which other requests are judged (Cantrill & Seibold, 1986). The later
request can be either assimilated to or contrasted with the anchor.
Theoretically, in the FITD situation, the second, larger request is assimilated
to the anchor (the smaller, first request) and is seen as less burdensome than
if it were presented alone. That is, the second and larger request is seen as
more reasonable because of the first request with which the person has already
agreed. Although this hypothesis has generated some interest, there is not as
much support for it as there is for the self-perception explanation.
Another
explanation for the effectiveness of the FITD effect focuses on the thought
processes of its recipients. It was suggested that information about the
solicitor’s and recipient’s
behavior affects compliance
in the FITD effect (Tybout, Sternthal, & Calder, 1983). According to this
view, targets of the FITD technique undergo changes in attitudes and cognitions
about the requested behavior. Compliance on a second request depends, in part,
on the information available in the participant’s memory that
relates to the issue
(Hornik, 1988).
This
hypothesis was put to the test in a field experiment involving requests for
contributions to the Israeli Cancer Association (ICA; Hornik, 1988). Participants
were first asked to fulfill a small request: to distribute ICA pamphlets.
Participants agreeing to this request were given a sticker to display on their
doors. One version of the sticker touted the participant’s
continuing involvement in the ICA campaign. A second version suggested that participants
had fulfilled their obligation completely. Ten days later participants were
contacted again and asked to donate money to the ICA. Additionally, a control
group of participants was contacted at this point for the first time and asked to donate.
The
results of this study confirmed the power of the FITD technique to produce
compliance (compared to the control group). Those participants who received the
sticker implying continued commitment to the ICA showed greater compliance with
the later request than did either those who had received the sticker showing
that an obligation was fulfilled or those in the control group. Participants in
the continued-commitment group most likely held attitudes about themselves, had
information available, and had self-perceptions suggesting continued commitment.
This translated into
greater
compliance.
Limits of
the FITD Technique
As you can
see, the FITD technique is a very powerful tool for gaining compliance.
Although the effect has been replicated over and over, it has its limits. One
important limitation of the FITD technique is that the requests being made must
be socially acceptable (Dillard, Hunter, & Burgoon, 1984). People do not
comply with requests they find objectionable. Another limitation to the FITD
technique is the cost of the behavior called for. When a high-cost behavior is
called for (e.g., donating blood), the FITD technique does not work very well
(Cialdini & Ascani, 1976; Foss & Dempsey, 1979). Does this mean that
the FITD technique cannot be used to increase socially desirable but high-cost
behaviors such as blood donation? Not necessarily. A small modification in the
technique
may prove effective: adding a moderately strong request between the initial
small and final large requests. Adding such an intermediate request increases
the power of the FITD technique (Goldman, Creason, & McCall, 1981). A
gradually increasing, graded series of requests may alter the potential donor’s
self-perceptions, which are strongly
associated with increased compliance in the FITD paradigm.
Interestingly,
although the FITD technique does not increase blood donations significantly, it
can be used to induce people to become organ donors (Carducci & Deuser,
1984). However, there are even some limits here. In an experiment by Girandola
(2002), participants were exposed to a FITD procedure under one of four
conditions. Some participants received the second request immediately after the
first request, and others received it after a delay of 3 days. In addition, the
second request (to indicate how willing they were to become an organ donor) was
made either by the same person who made the initial request or by a different
person. As shown in Figure 7.5, the FITD procedure was effective in increasing
willingness to become an organ donor in all conditions except when the same
person who made the first request made the second request immediately.
Why the
difference between blood and organ donation? It may be that the two behaviors
involve differing levels of commitment. Blood donation takes time and involves
some pain and discomfort. Organ donation, which takes place after death, does
not. Blood donation requires action; organ donation requires only agreement. It
appears that blood donation is seen as a higher-cost behavior than organ
donation. Under such high-cost conditions the FITD technique, in its original
form, does not work very well.
Finally,
the FITD technique does not work equally well on everyone. For example, it
works better on individuals who have a stronger need to maintain cognitive
consistency than on those who have a weaker need (Cialdini, Trost, &
Newsom, 1995; Guadagno, Asher, Demaine, & Cialdini, 2001). Additionally,
individuals who have a clear sense of their self-concepts (high self-concept
clarity) were more affected by a FITD manipulation than those low in
self-concept clarity (Burger & Guadagno, 2003).
Figure
7.5 The relationship between the time of a second request and the identity of
the person making the second request.
Door-in-the-Face
Technique
Imagine
that you are sitting at home reading a book when the telephone rings. The
caller turns out to be a solicitor for a charity that provides food baskets for
needy families at Thanksgiving. The caller describes the charity program and
asks if you would be willing to donate $250 to feed a family of 10. To this
request you react as many people do: “What! I can’t possibly give
that much!” In response, the caller offers you several other alternatives, each requiring a
smaller and smaller donation (e.g., $100, $50, $25, and $10). Each time the
caller asks about an alternative you feel more and more like Ebenezer Scrooge,
and finally you agree to provide a $25 food basket.
Notice the
tactic used by the solicitor. You were first hit with a large request, which
you found unreasonable, and then a smaller one, which you agreed to. The
technique the solicitor used was just the opposite of what would take place in
the FITD technique (a small request followed by a larger one). In this example
you have fallen prey to the door-in-the-face technique (DITF).
After
being induced into buying a candy bar from a Boy Scout who used the DITF
technique, one researcher decided to investigate the power of this technique to
induce compliance (Cialdini, 1993). Participants were approached and asked if
they would be willing to escort a group of “juvenile delinquents” to a local
zoo (Cialdini et al., 1975). Not surprisingly, most participants refused this
request. But in the DITF condition, this request was preceded by an even larger
one, to spend 2 hours per week as a counselor for juvenile delinquents for at
least 2 years! It is even less surprising that this request was turned down.
However, when the request to escort delinquents to the zoo followed the larger
request, commitments for the zoo trip increased dramatically (Figure 7.6). Subsequent
studies verified the power of the DITF technique to induce compliance (e.g.,
Cialdini & Ascani, 1976; Williams & Williams, 1989). As with the FITD
technique, the DITF technique also works in an online environment (Guéguen,
2003).
Figure
7.6 Compliance to a small request as a function of the nature of an initial
request. Participants complied more with a second, smaller request if it
followed a larger request, demonstrating the door-in-the-face technique.
Some
researchers have suggested that the DITF technique works because the target of
the influence attempt feels compelled to match the concession (from the first,
larger request to the smaller, second request) made by the solicitor (Cialdini
et al., 1975). The social psychological mechanism operating here is the norm of
reciprocity (Gouldner, 1960). The norm of reciprocity states that we should
help those who help us. Remember Aesop’s fable about the mouse that came across
a lion with a thorn in its foot?
Despite the obvious danger to itself, the mouse helped the lion by removing the
thorn. Later, when the lion came on the mouse in need of help, the lion
reciprocated by helping the mouse. This is an illustration of the norm of
reciprocity. The norm of reciprocity is apparently a very powerful force in our
social lives (Cialdini, 1988).
Implied in
this original statement of the norm is the idea that we may feel compelled to
reciprocate when we perceive that another person is making a concession to us.
This norm helps explain the DITF effect. It goes something like this: When a
solicitor first makes a large request and then immediately backs off when we
refuse and comes back with a smaller request, we perceive that the solicitor is
making a concession. We feel pressure to reciprocate by also making a concession.
Our concession is to agree to the smaller request, because refusing the smaller
request would threaten our sense of well-being tied to the norm of reciprocity.
In the DITF technique, then, our attention becomes focused on the behavior of
the solicitor, who appears to have made a concession (Williams & Williams,
1989). If we don’t reciprocate, we may later feel guilty
or fear that we will appear
unreasonable and cheap in the light of the concession the solicitor made.
The power
of the norm of reciprocity has been shown in empirical research. For example,
one study found that more participants agreed to buy raffle tickets from
someone who had previously done them a favor (bought the participant a soft
drink) than from someone who had not done them a favor (Regan, 1971). In this
study, the norm of reciprocity exerted a greater influence than overall liking
for the solicitor. Research has also shown that the norm of reciprocity is
central to the DITF effect (Cialdini, 1993; Cialdini et al., 1975; Goldman &
Creason, 1981). If a solicitor makes more than one concession (when a solicitor
reads a list of smaller and smaller requests), compliance is higher than if the
solicitor makes only one concession (Goldman & Creason, 1981). This is
especially true if the intermediate request is moderate (Goldman, Creason,
& McCall, 1981).
Although
there is support for the role of reciprocity in the DITF effect, some
researchers have questioned its validity and have suggested alternative
explanations for these situations. One such alternative is the perceptual
contrast hypothesis. As discussed earlier, this hypothesis focuses on the
contrast in size between the first and second requests. Applied to the DITF
effect, the perceptual contrast hypothesis suggests that individuals agree to
the second (small) request because it appears more reasonable in the light of
the first (large) request. The individual may perceive that the second request
is less costly than the first. Although there is some evidence against this
view, commitment offers another explanation. Suppose you agree to buy a car at
an attractive price; because of your initial commitment to the salesperson, you
are likely to follow through on the purchase (Burger & Petty, 1981). There is
evidence that commitment to a person (e.g., a salesperson) matters more for
compliance than commitment to the behavior (e.g., buying a car) (Burger &
Petty, 1981). So, you may be less inclined to buy the car if you negotiate
first with the salesperson and then with the sales manager than if you had
continued negotiating with the original salesperson.
Commitment
affects our behavior in two ways. First, we typically look for reasons to
justify a commitment after making it (Cialdini, 1993). This is consistent with
cognitive dissonance theory, as discussed in Chapter 6. Typically, we devise
justifications that support our decision to buy the car. Second, we also have a
desire to maintain consistency between our thoughts and actions and among our
actions (Cialdini, 1993; Festinger, 1957). When the salesperson returns with a
higher offer, we may be inclined to accept the offer because refusal would be
dissonant with all the cognitions and justifications we developed during the
stewing period.
Finally,
the self-presentation explanation suggests that refusing the first request in
the DITF procedure may cause the person making the request to perceive the
target as an unhelpful person. In order to avoid this perception, the target
agrees to the second request to project a more positive image to the requestor
(Pendleton & Batson, 1979). There is some evidence for this explanation.
Millar (2002) found that the DITF effect is more powerful when a friend of the
target makes the requests than if a stranger makes the requests. Millar also
reported that the target of the request was more concerned with
self-presentation if the request was made by a friend compared to a stranger.
Unfortunately, there is also evidence against the self-presentation explanation
(Reeves, Baker, Boyd, & Cialdini, 1993). So, self-presentation may be
involved in the DITF effect, but it may not be the best explanation for the
effect.
Compliance
Techniques: Summing Up
We
described and analyzed two compliance techniques. Are they equally
effective, or is one more effective than the other? Research indicates
that the DITF technique elicits more compliance than the FITD technique
(Brownstein & Katzev, 1985; Cialdini & Ascani, 1976; Rodafinos,
Vucevic, & Sideridis, 2005). There is also evidence that a combined
FITD-DITF strategy elicits greater compliance than either of the techniques
alone (Goldman, 1986).
Another
two-stage technique called low-balling may be more effective for gaining
compliance than either the FITD or the DITF techniques (Brownstein &
Katzev, 1985). In low-balling an initial request or offer is made that appears
too good to be true. Once you agree to this request, a higher request is made.
In one experiment, participants were stopped and asked to donate money to a
museum fund drive. The request was made under a FITD, DITF, low-ball, or
control condition. The average amount of money donated was highest under the
low-ball conditions, compared to the FITD, DITF, and control conditions (which
did not differ significantly from one another).
Table 7.1
Various Compliance Techniques
Although
we have focused on two compliance techniques, you should be aware that there
are other techniques that are used to induce you into donating money or buying
products. Space does not allow a complete discussion of all of these
techniques. We have summarized the various compliance techniques in Table 7.1.
All of these compliance techniques have been and will be used to induce people
to buy products (some of which they may want and some of which they may not
want). The psychological mechanisms of reciprocity, commitment, consistency,
and perceptual contrast operate to varying degrees to produce compliance.
Because we all share these mechanisms, we all find ourselves on occasion doing
something we don’t really want to do. Sellers of all types use
compliance techniques to sell their products (Cialdini, 2000). The best way to
guard ourselves against these techniques is to recognize and understand them
when they are used.
Obedience
In 2003
American soldiers in charge of the Abu Ghraib prison in Iraq subjected inmates
to various forms of abuse and humiliation. When the actions of the soldiers
came to light in 2004, those directly involved were arrested and subjected to
military justice. One soldier, 21-year-old Lynndie England, was one of those
arrested. In a now famous photograph, England is shown holding a naked Iraqi
prisoner on a dog leash. When asked to explain her actions, England repeatedly
said she was following the orders of her superiors. In her words she was
following the directions of “persons in my higher chain of command,” and that
“I was instructed by persons in higher rank to stand there and hold this leash
and look at the camera.”
When
England invoked orders from her superiors to explain her behavior, she was
continuing a long tradition of those who have found themselves in similar
positions. In fact, high-level Nazis routinely claimed that they were following
orders when they perpetrated heinous crimes against Jews, Gypsies, and Eastern
Europeans during World War II. The question we shall evaluate in this section
is whether an ordinary person can be induced into doing something extraordinary
in response to a command from someone in authority.
Defining
Obedience
Obedience occurs
when we modify our behavior in response to a direct order from someone in
authority. Most of the obedience we observe daily is constructive
obedience because it fosters the operation and well-being of society. Certainly
no group, no society, could exist very long if it couldn't
make its members obey laws, rules, and customs.
Generally, obedience is not a bad thing. Traffic flows much more easily when there
are motor vehicle laws, for example. But when the rules and norms people are
made to obey are negative, obedience is one of the blights of society. This
kind of obedience is called destructive obedience. Destructive obedience occurs
when a person obeys an authority figure and behaves in ways that are counter to
accepted standards of moral behavior, ways that conflict with the demands of
conscience. It is this latter form of obedience that social psychologists have
studied.
Unfortunately,
destructive obedience—the form of obedience we are most concerned with in this
chapter—is a recurring theme in human history. There have been many instances
in which individuals carried out orders that resulted in harm or death to
others. In addition to the case of Lynndie England just noted, at the
Nuremberg trials following World War II, many Nazi leaders responsible for
murdering millions of people fell back on the explanation that they were
following orders. More recently, in the ethnic violence between Serbs and Bosnians
in the former Yugoslavia, Serbian soldiers allegedly received orders to rape
Muslim women in captured towns or villages. Islamic tradition condemns women
who have been raped or who become pregnant outside marriage; these orders were
intended to destroy the fabric of Muslim family life. The Serbian soldiers had
been ordered to engage in blatantly immoral and illegal behavior. More
recently, mass murders took place in Kosovo at the behest of the Serbian
leadership.
Destructive
obedience doesn’t crop up only in such large-scale situations; it can also
threaten everyday activities.
For example, Tarnow (2000) cites evidence that excessive obedience to the
captain’s orders
may be responsible for up to 25% of all airplane crashes. One form of obedience
seems to be particularly problematic: when the nonflying crew member (copilot)
does not correctly monitor and subsequently challenge an error made by the
pilot. These types of errors are made in 80% of airline accidents (Tarnow,
2000). Tarnow suggests that the atmosphere in the cockpit is one of a captain’s
absolute authority. The captain is
given these powers by law. However, more power flows from the captain’s
greater flying experience than the copilot’s (to become a captain, you need at
least 1,500 hours of flight time vs. 200 hours for a first officer). The power
stemming from the
law and greater experience makes it difficult for junior officers to challenge
the captain, even in cases where the captain’s decision is
clearly wrong (Tarnow, 2000). The
consequences of this obedience dynamic may be tragic.
Destructive
Obedience and the Social Psychology of Evil
There is a
tendency to attribute acts of destructive obedience to some abnormal internal
characteristics of those who perpetrate such acts. Often we refer to
individuals such as Adolph Eichmann (the “architect” of the Holocaust) as
“evil.” The term evil has been widely used historically and in contemporary
culture. For example, in his 2002 State of the Union Address, President George
Bush identified Iran, Iraq, and North Korea as an “Axis of Evil” because of
their pursuit of nuclear and other weapons of mass destruction. In 1983, the
late President Ronald Reagan referred to the former Soviet Union as an “Evil
Empire” and the focus of all evil in the world at the time. And, of course
Osama bin Laden is commonly tagged with the “evil” moniker.
What does
the term evil actually entail? Traditionally, notions of evil have been left to
philosophers and theologians. Recently, however, social psychologists have
given consideration to the concept and have developed social psychological
concepts of evil. In contrast to the traditional notion of evil that imbues a
person with aberrant internal characteristics, social psychologists favor a
situational definition of evil focusing on overt behavior. For example,
Zimbardo (2004) defines evil as “intentionally behaving, or causing others to
act, in ways that demean, dehumanize, harm, destroy or kill innocent people”
(p. 22). Under this definition, a wide range of behaviors including terrorism,
genocide, and even corporate misdeeds could be considered evil (Zimbardo,
2004).
How does a
social psychological definition of evil relate to obedience? Obedience to a
command from an authority figure can produce evil outcomes. For example, Adolph
Eichmann, carrying out orders of his Nazi superiors, was directly responsible
for the extermination of millions of innocent human beings. Obedience has the
power to transform ordinary people into those who are willing to do things they
would not ordinarily do (Zimbardo, 2004). Zimbardo has identified 10 principles
inherent in obedience that can bring about this transformation. These are shown
in Table 7.2.
What are
the roots that underlie evil? This question, of course, can be addressed from a
number of perspectives, including philosophical and religious. However, we will
limit ourselves to a social psychological answer to the question. Baumeister
and Vohs (2004) identify four roots of evil deeds. These are:
1. Instrumentality: Using violence to achieve a
goal or solve a conflict.
2. Threatened egotism: Violence as a response to
impugned honor or wounded pride.
3. Idealism: Evil deeds performed to achieve
some higher good.
4. Sadism: Enjoying harming others (more likely
to be reported by victims than perpetrators).
According
to Baumeister and Vohs, the four roots form a causal chain that moves one
toward perpetrating evil deeds. A final link between the four roots and the
actual evil behavior, however, is a loss of self-control (Baumeister &
Vohs, 2004). When one loses normal constraints against carrying out evil deeds
(e.g., mass violence), evil is more likely to be the result. When mechanisms of
self-control are maintained, evil deeds are less likely.
Table
7.2 Ten Principles Inherent in Obedience
That Can Bring About the Transformation of Obedience to Evil
Staub
(1989) suggests three other roots of evil. These are: difficult life
conditions, cultural and personal preconditions, and the social-political organization.
Staub points out that evil deeds are often perpetrated under difficult life
conditions such as economic depression and social disorganization. For example,
the dismal economic conditions in Germany after World War I certainly
contributed to the rise of the Nazi Party and the subsequent evil perpetrated
on Jews and others. Cultural and personal factors are rooted in individual
self-concept and traditional in-group/out-group separations in a culture. When
one’s self-esteem is threatened, that individual will move toward
regaining a sense of
control and power. This can be accomplished by establishing a sense of
superiority of one’s in-group over out-groups. This is
precisely what happened in Nazi Germany. Finally,
certain social-political organization structures are more likely to give rise
to evil deeds than others. Totalitarian, authoritarian systems that
institutionalize prejudice and discrimination are most likely to lead to evil
deeds. Again, this is precisely what existed in Nazi Germany prior to the
implementation of the “Final Solution” of the Jewish problem resulting in the
murder of millions.
The
Banality of Evil: Eichmann’s Fallacy
It would
be a relief if those carrying out acts of destructive obedience were deviant
individuals predisposed to antisocial behavior. Unfortunately, history tells us
that those who perpetrate evil are often quite ordinary. William Calley, who
was in command of the platoon that committed a massacre at the Vietnamese
village of My Lai, was ordinary before and after My Lai. So too was Mohammad
Atta, the leader of the 9/11 hijackers. So was Adolph Eichmann, one of the
architects of the Holocaust and the Nazi officer responsible for the delivery
of European Jews to concentration camps in World War II.
Eichmann’s
job was to ensure that the death camps had a steady flow of victims. He secured the railroad cattle cars
needed to transport the human cargo. His job was managerial, bureaucratic;
often he had to fight with competing German interests to get enough boxcars.
When the war was over, Eichmann, a most-wanted war criminal, escaped to
Argentina. From 1945 to 1961, he worked as a laborer outside Buenos Aires. His
uneventful existence ended in 1961 when he was captured by Israeli secret
agents, who spirited him to Israel. There he stood trial for crimes against
humanity. After a long trial, Eichmann was found guilty and was later hanged.
The
Israelis constructed a special clear, bulletproof witness box for Eichmann to
appear in during the trial. They were afraid that someone in Israel might
decide to mete out some personal justice. What did the man in the glass booth
look like? Eichmann was a short, bald man whose glasses slipped down his nose
now and then. You could walk past him a hundred times on the street and never
notice him. During the trial, Eichmann portrayed himself as a man anxious to
please his superiors, ambitious for advancement. Killing people was a
distasteful but necessary part of his job. Personally, he had no real hatred of
the Jews. He was just following orders.
Philosopher
and social critic Hannah Arendt observed Eichmann in the dock. She was struck
by the wide gap between the ordinariness of the man and the brutal deeds for
which he was on trial. In her book, Eichmann in Jerusalem: A Report on the
Banality of Evil (1963), Arendt essentially accepted Eichmann’s
defense. Her analysis of Eichmann suggested
that evil is often very commonplace. Those who carry out acts of destructive
obedience are often ordinary people, rather like you and me.
People
were shocked by Eichmann and by Arendt’s analysis. They had expected a Nazi war criminal to be the
epitome of evil. There was a prevailing belief that evil deeds are done by evil
people, a belief referred to as Eichmann’s fallacy (Brown, 1986). Yet sometimes
individuals who perpetrate evil deeds are quite ordinary, as Eichmann
apparently was.
As you
might expect, not everyone subscribes to the general idea of the banality of
evil. For example, Calder (2003) argues that a person can have an “evil
character” and still have an ordinary appearance and demeanor. However, Calder
admits that it is possible for ordinary individuals to commit acts of evil even
in the absence of an evil character. In an interesting distinction, Calder
suggests that some people, such as Adolph Hitler, carry out evil deeds on their
own, without direction from anyone else (autonomous evil). Calder classifies
individuals in this category as moral monsters. Moral monsters like Hitler are
singled out for special condemnation because of their active roles in
initiating and directing evil acts (Calder, 2003). Others, such as Adolph
Eichmann, carry out evil at the behest of others (nonautonomous evil).
Individuals in this category are moral idiots. We may be more inclined to label
moral monsters as truly evil than moral idiots. However, it is possible to
label the actions of moral idiots as truly evil if those acts are particularly
heinous and show a consistent pattern.
Our
discussion of the nature of evil leads us to a central question: Are evil deeds
the product of an evil character (internal attribution), or are they driven
more by aspects of the social situation (external attribution)? This brings us
to the main question we shall consider in the sections to follow: Do evil deeds
always lead us back to an evil person? Although it might make us feel better if
the answer to this question were yes, we see in this chapter that things are
not, unfortunately, so simple.
Ultimately,
Who Is Responsible for Evil Deeds?
After
World War II, the Allies tried many of the high-ranking Nazis who, like
Eichmann, claimed innocence. Their principal defense was to shift
responsibility to their superiors: They were only following orders. More
recently, a former East German border guard, Ingo Heinrich, was brought to
trial for his role in preventing East German citizens from escaping to the west
during the height of the cold war. Heinrich, along with his fellow border
guards, had orders to shoot to kill anyone attempting to escape over the Berlin
Wall. Heinrich did just that. But some of his comrades, under the same orders,
shot over the heads of escapees. After the fall of the Berlin Wall and the
reunification of Germany, Heinrich was arrested and charged with murder. He was
eventually convicted and sentenced to 3.5 years in prison.
The cases
of Eichmann and Heinrich raise some important issues about responsibility. Is
“I was only following orders” a valid defense? Does it erase personal
responsibility? Or should individuals be held accountable for their behavior,
even if they were following orders? On the surface it would appear that
Eichmann and Heinrich were personally responsible for their behavior. However,
a deeper examination of authority and its effects on behavior suggests a more
complex picture, a picture with many aspects. These issues and questions served
as the catalyst for what are probably the most famous experiments on obedience.
Milgram’s
Experiments on Obedience
How does
one test destructive obedience in a laboratory setting? The late Stanley
Milgram devised a simple yet powerful situation. Before we look at it, let’s
consider the sociohistorical
“climate” in the United States at the time. The year was 1962. Vietnam was but
a blip on the back pages of the newspapers. The Kennedy assassinations had not
yet occurred, nor had the murder of Martin Luther King, Jr., Watergate, or the
riots in the streets of Newark, Detroit, and Watts. This was America before the
real 1960s began, still holding on to some of the innocence, however illusory,
of the 1950s. This context is important to consider because it may have
influenced how people behaved in Milgram’s experiments.
The
Participant’s Perspective
Let’s
begin by considering what these experiments looked like from a participant’s perspective (Elms, 1972). Imagine
you are living in New Haven, Connecticut. One day you notice an ad in the paper
asking for volunteers for an experiment on learning and memory at nearby Yale
University. The researchers are clearly seeking a good representation of the
general population. The ad piques your curiosity, and you decide to sign up for
the experiment.
When you
arrive for the experiment, a young man, Mr. Williams, Dr. Milgram’s
associate, writes out a
check to each of you for $4.50. Williams tells you that little is known about
the impact of punishment on learning, and that is what this experiment is
about. You become a bit concerned when Williams says that one of you will be a
learner and the other will be a teacher. Your fears about getting punished soon
evaporate when you draw lots to see who will be the learner and you draw the
role of the teacher.
Preliminaries
out of the way, Williams leads you both into a room past an ominous-looking
piece of equipment labeled “Shock Generator, Thorpe ZLB . . . Output 15
volts—450 volts” (Milgram, 1974). The learner, Mr. Wallace, is told to sit in a
straight-backed metal chair. Williams coolly tells you to help strap Wallace’s
arms down to prevent
“excessive movement” during the experiment, which you do. Williams then applies
a white paste to Wallace’s arms, which he says is electrode paste “to avoid blisters and burns.”
Wallace is now worried, and he asks if there is any danger. Williams says,
“Although the shocks can be extremely painful, they cause no permanent tissue
damage” (Elms, 1972, p. 114).
In front
of the learner is a row of switches that he will use to respond to your
questions. Williams tells you that a light panel in the other room will
register the learner’s responses.
If his answers are correct, you, the teacher, tell him so. If incorrect, you
deliver an electric shock from the shock generator.
It’s
time to start the experiment. You leave Wallace strapped to the shock generator
and follow Williams into the next room. He places you before a control panel
that has 30 levers,
each with a little red light and a big purple light above. The lights have
signs above them reading 15 volts, 30 volts, 45 volts, and so on, up to 450
volts. There are also printed descriptions of the shock levels above the
labels, reading Slight Shock, Moderate Shock, Strong Shock, Intense Shock,
Extreme Intense Shock, and finally, over the last few switches, in red, Danger:
Severe Shock XXXXX. At this point, you hope that Wallace is brighter than he
looks (Elms, 1972).
Before you
begin the experiment, Williams gives you a sample shock of 45 volts, which
gives you a little jolt. Next, you are told that your task is to teach Wallace
several lists of word pairs, such as blue–box, nice–day, wild–duck. You read
the entire list of word pairs and then test him, one pair at a time, by
providing the first word from each pair.
At first
the test is uneventful; Wallace makes no errors. Then he makes his first
mistake, and you are required to give him a 15-volt shock. Williams tells you
that for every error after that, you are to increase the shock by 15 volts. On
subsequent trials Wallace makes frequent errors. When you get to 105 volts, you
hear Wallace yell through the wall, “Hey, this really hurts!”
Williams,
cool as ever, doesn’t seem to notice. You certainly do. At
150 volts, the moaning
Wallace yells, “Experimenter, get me out of here! I won’t
be in the experiment anymore.
I refuse to go on!” (Elms, 1972, p. 115). You look at Williams. He says softly
but firmly, “Continue.”
Williams
brings you more word-pair lists. You begin to wonder what you and Wallace have
gotten into for $4.50. You are now at 255 volts, Intense Shock. Wallace screams
after every shock. Whenever you ask Williams if you can quit, he tells you to
continue. At 300 volts, you wonder if Wallace is going to die. “But,” you
think, “they wouldn’t let that happen at Yale . . . or would
they?”
“Hey, Mr.
Williams,” you say, “whose responsibility is this? What if he dies or is
seriously injured?” Williams does not bat an eye: “It’s
my responsibility, not yours, just continue
with the experiment.” He reminds you that, as he told you before, the labels
apply to small animals, not humans.
Finally it
is over. There are no more shock switches to throw. You are sweaty, uneasy.
Wallace comes in from the other room. He is alive and seems okay. You
apologize. He tells you to forget it, he would have done the same if he had
been in your shoes. He smiles and rubs his sore wrists, everybody shakes hands,
and you and Wallace walk out together.
Predicted
Behavior and Results in the Milgram Experiment
How do you
think you would behave in Milgram’s experiment? Most people think they would refuse to obey the
experimenter’s orders. Milgram was interested in this question, so he asked
a wide range of individuals, both expert (psychiatrists) and nonexpert (college students and noncollege
adults), how they thought participants would behave in this situation. They all
predicted that participants would break off the experiment, defying the experimenter.
The psychiatrists predicted that participants would break off when the learner
began to protest, at the 150-volt level. So, if you believe that you would defy
the experimenter and refuse to inflict pain on another person, you are not
alone.
Another
study, independent from Milgram’s, investigated the role of several variables in
predicting obedience in a Milgram-type experiment (Miller, Gillen, Schenker,
& Radlove, 1974). Miller et al. provided participants with verbal
descriptions and a slide show depicting Milgram’s experiment.
Miller et al. looked at two classes of variables:
Perceiver
variables (gender and normative information [some participants were provided
with the results of Milgram’s baseline experiment and others were
not]) and stimulus
person variables (gender and physical attractiveness). The dependent variable
was the predicted shock level that would be administered in the situation. The
results showed that participants believed that males would administer higher
shock levels than females and that unattractive individuals would administer
higher shock levels than attractive individuals. The latter finding was true
mainly for female shock administrators. Interestingly, males showed greater
consistency between predictions of their own obedience and predictions of
another person’s obedience than did females: female participants believed they
themselves would administer lower levels of shock than would another person in
the same situation.
The
underlying assumption of these predictions is that individual characteristics
will be more powerful determinants of behavior than situational factors. The
predictions of Milgram’s participants reflect the notion that
moral knowledge predicts moral behavior;
in other words, if you know what is right, you will do it. However, the results
of Milgram’s first “baseline” experiment (in which there was no feedback
from the victim) don’t
support these rosy predictions. A majority of participants (65%) went all the way to 450 volts. In fact, the
average shock level delivered by the participants in this first experiment was
405 volts! We can infer from this result that under the right circumstances,
most of us probably also would go all the way to 450 volts.
Of course,
no electric shock was ever given to Wallace, who was, in fact, a professional
actor, playing out a script. However, Milgram’s participants did
not know that the
entire situation was contrived.
Situational
Determinants of Obedience
Milgram
himself was surprised at the levels of obedience observed in his first
experiment. He and others conducted several additional experiments
investigating the situational factors that influence levels of obedience. In
the following sections, we explore some of these situational factors.
Proximity
of the Victim In his first series of experiments, Milgram tested the limits of
obedience by varying the proximity, or closeness, between the teacher and the
learner (victim). The conditions were:
1. Remote victim. The teacher and the learner
were in separate rooms. There was no feedback from the victim to the teacher.
That is, Wallace didn’t speak, moan, or scream.
2. Voice feedback. The teacher and the learner
were in separate rooms, but Wallace began to protest the shocks as they became
more intense. This is the experiment just described. In one version of the
voice-feedback condition, Wallace makes it clear that he has a heart condition.
After receiving 330 volts he screams, “Let me out of here. Let me out of here.
My heart is bothering me” (Milgram, 1974, p. 55).
3. Proximity. The teacher and the learner were
in the same room, sitting only a few feet apart.
4. Touch proximity. The teacher and the learner
were in the same room, but the learner received the shock only if his hand was
placed on a shock plate. At one point the learner refused to keep his hand on
the plate. The teacher was told to hold the learner’s
hand down while delivering the shock. The teacher often had to hand-wrestle the victim to be
sure the hand was properly placed on the shock plate.
These four
conditions decrease the physical distance between the teacher and the learner.
Milgram found that reducing the distance between the teacher and the learner
affected the level of obedience (Figure 7.7). In the remote-victim condition,
65% of the participants obeyed the experimenter and went all the way to 450
volts (the average shock intensity was 405 volts). As you can see from Figure
7.7, obedience was not substantially reduced in the voice-feedback condition.
In this condition, obedience dropped only 2.5%, to 62.5%, with an average shock
intensity of 368 volts.
Thus,
verbal feedback from the learner, even when he indicates his heart is bothering
him, is not terribly effective in reducing obedience. Significant drops in the
rates of obedience were observed when the distance between the teacher and the
learner was decreased further. In the proximity condition, where the teacher
and the learner were in the same room and only a few feet apart, 40% of the
participants went to 450 volts (with an average shock intensity of 312 volts).
Finally, when the teacher was required to hold the learner’s
hand on the shock plate in the touch-proximity condition, only 30% obeyed and went to 450 volts (the
average shock intensity was 269 volts).
Figure 7.7
The effect of moving the learner closer to the teacher. In the remote
condition, obedience was highest. Adding voice feedback did not reduce
obedience significantly. It was only when the learner and teacher were in the
same room that obedience dropped. The lowest level of obedience occurred when
the teacher was required to touch the learner in order to administer the
electric shock.
Why does
decreasing the distance between the teacher and the learner affect obedience so
dramatically? Milgram (1974) offered several explanations. First, decreasing
the distance between the teacher and the learner increases empathic cues from
the learner, cues about his suffering, such as screaming or banging on the
wall. In the remote-victim condition, the teacher receives no feedback from the
learner. There is no way for the teacher to assess the level of suffering of
the learner, making it easier on the teacher’s conscience to inflict harm. In the
feedback conditions, however, the suffering of the learner is undeniable. The
teacher has a greater opportunity to observe the learner in voice-feedback,
proximity, and touch conditions than in the remote-victim condition. It is
interesting to note, however, that even in the touch-proximity condition, a
sizable percentage of participants (30%) were willing to fully obey the
experimenter. It is apparent that there are some among us who are willing to
discount empathic cues and continue to do harm to others in a face-to-face,
intimate-contact situation. For example, there was no shortage of Nazis willing
to shoot Jews at close range during the early stages of the Holocaust.
Milgram
also suggested that in the remote-victim condition a “narrowing of the
cognitive field,” or cognitive narrowing, occurs. That is, the teacher can put
the learner out of mind and focus on the learning task instead. As the victim
becomes more observable, such narrowing becomes more difficult, and obedience
is reduced. These results suggest that it is more difficult to inflict harm on
someone you can see, hear, or touch. This is why it is probably easier to drop
bombs on a city of 500,000 from 30,000 feet than to strangle one person with your
bare hands.
Power of
the Situation A second variable Milgram investigated was the nature of the
institution behind the authority. The original studies were conducted at Yale
University. To test the possibility that participants were intimidated by the
school’s power
and prestige, Milgram rented a loft in downtown Bridgeport, Connecticut, and
conducted the experiment under the name “Research Associates of Bridgeport.” He
also had the experimenter represent himself as a high school biology teacher.
Under these conditions, obedience fell to 47.5%, down from 65% in the original,
baseline study. Although this difference of 17.5% does not meet conventional
levels of statistical significance, it does suggest that removing some of the
trappings of legitimacy from an authority source reduces obedience somewhat.
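The significance claim above can be checked with a standard two-proportion z-test. The sketch below assumes 40 participants per condition (so 26 of 40 obeying at Yale versus 19 of 40 at Bridgeport); the per-condition sample size is an assumption, not stated in this section.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-proportion z-test: do two obedience rates differ reliably?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-tailed p-value from the normal distribution: P(|Z| > z) = erfc(z / sqrt(2))
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Yale baseline: 65% of 40 obeyed fully (26); Bridgeport: 47.5% of 40 (19).
z, p = two_proportion_z_test(26, 40, 19, 40)
# z comes out near 1.58 with p near 0.11, above the conventional .05 cutoff,
# consistent with the text's statement that the 17.5% difference "does not
# meet conventional levels of statistical significance."
```

With these assumed sample sizes, the 17.5-percentage-point drop is suggestive but falls short of the usual .05 criterion, which is exactly the pattern the paragraph describes.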
Presence
and Legitimacy of the Authority Figure What if the authority figure was
physically removed from the obedience situation? In another variation on his
original experiment, Milgram had the experimenter give orders by telephone,
which varied the immediacy of the authority figure, as opposed to varying the
immediacy of the victim. He found that when the experimenter was absent or
phoned in his instructions to give shock, obedience levels dropped sharply,
to as little as 20%. The closer the authority figure, the greater the
obedience.
After
Milgram’s original research was publicized, other researchers became
interested in the aspects of authority that might influence obedience levels.
One line of research
pursued the perceived legitimacy of the authority figure. Two different studies
examined the effect of a uniform on obedience (Bickman, 1974; Geffner &
Gross, 1984). In one study (Geffner & Gross, 1984), experimenters
approached participants who were about to cross a street and requested that
they cross at another crosswalk. Half the time the experimenter was uniformed
as a public works employee, and half the time the experimenter was not in
uniform. The researchers found that participants were more likely to obey
uniformed than nonuniformed individuals.
Conflicting Messages about Obedience Milgram also investigated the impact of
receiving conflicting orders. In two variations, participants received such
conflicting messages. In one, the conflicting messages came from the learner
and the experimenter.
The
learner demanded that the teacher continue delivering shocks whereas the
experimenter advocated stopping the experiment. In the second variation, two
authority figures delivered the conflicting messages. One urged the teacher to
continue whereas the other urged the teacher to stop.
When such
a conflict arose, participants chose the path that led to a positive outcome:
termination of harm to the learner. When there was conflict between authority sources,
or between the learner and the authority source, not one participant went all
the way to 450 volts.
Group
Effects A fourth variation involved groups of teachers, rather than a single
teacher. In this variation, a real participant was led to believe that two
others would act as co-teachers. (These other two were confederates of the
experimenter.) When the learner began to protest, at 150 volts, one confederate
decided not to continue. Defying the experimenter’s instructions, he walked away and
sat in a chair across the room. At 210 volts the second confederate followed.
Milgram’s results showed that having the two confederates defy the
experimenter reduced obedience markedly. Only 10% of the participants obeyed to
450 volts (mean shock intensity 305 volts). Thirty-three percent of the
participants broke off after the first confederate defied the experimenter but
before the second confederate. An additional 33% broke off at the 210-volt
level after the second confederate defied the experimenter. Thus, two-thirds of
the participants who disobeyed the experimenter did so immediately after the
confederates defied the experimenter.
Why does
seeing two others disobey the experimenter significantly reduce the participant’s
obedience? One explanation centers on a phenomenon called diffusion of responsibility. Diffusion of
responsibility occurs when an individual spreads responsibility for his or her
action to other individuals present. In the obedience situation in which there
were two other teachers delivering shocks, the participant could tell himself
that he was not solely responsible for inflicting pain on the learner. However,
when the two confederates broke off, he was left holding the bag; he was now
solely responsible for delivering shocks. Generally, when people are in a
position where they can diffuse responsibility for harming another person,
obedience is higher than if they have to deliver the harm entirely on their own
and cannot diffuse responsibility (Kilham & Mann, 1974). In short, having
two people defy the experimenter placed the participant in a position of
conflict about who was responsible for harming the learner.
There is
another explanation for the group effects Milgram observed. When the two
confederates broke off from the experiment, a new norm began to form:
disobedience. The old norm of obedience to the experimenter is placed into
conflict with the new norm of disobedience. The norm of disobedience is more
“positive” than the norm of obedience with respect to the harm to the learner.
Remember that when participants were given the choice between a positive and a
negative command, most chose the positive. The lone participants in the
original studies, however, had no such opposing norms and so were more inclined
to respond to the norm of obedience. Evidently, having role models who defy
authority with impunity emboldens us against authority. Once new norms develop,
disobedience to oppressive authority becomes a more viable possibility.
The Role
of Gender in Obedience
In Milgram’s
original research, only male participants were used. In a later replication, Milgram also included female
participants and found that males and females obeyed at the same levels.
However, later research showed that there is a gender difference in obedience.
In an experiment conducted in Australia, Kilham and Mann (1974) found that
males obeyed more than females. In another study conducted in the United
States, Geffner and Gross (1984) found that males obeyed a uniformed authority
more than females did.
Another
way to approach the issue of gender effects in obedience is to determine
whether male or female authority figures are more effective in producing
obedience. In Geffner and Gross’s (1984) experiment, the effects of
experimenter gender, participant gender, and participant age on obedience were
investigated. The results showed no simple
effect of experimenter gender on obedience. Instead, experimenter gender and
participant age interacted, as shown in Figure 7.8. Notice that there was no
difference between older and younger participants (“younger” participants being
under age 30, and “older” participants being over age 50) when the experimenter
was female. However, when the experimenter was male, younger participants
obeyed the male experimenter more than older participants did.
Obedience
or Aggression?
Milgram’s
experiment used an aggressive response as the index of obedience. Could it be that participants
were displaying aggression toward the learner, which had little to do with
obedience? Such an interpretation appears unlikely. In situations where
participants were allowed to choose the level of shock to deliver to the
learner, the average shock delivered was 82.5 volts, with 2.5% obeying
completely. This is quite a drop from the 405 volts with 65% obeying completely
in the baseline condition (Milgram, 1974).
These
results were supported by a replication of Milgram’s
experiment by other researchers
(Mantell, 1971). In one condition of this experiment, participants were allowed
to set the level of shock delivered to the learner. Compared to 85% of
participants who used the highest level of shock in a replication of Milgram’s
baseline experiment (no
feedback from the learner), only 7% of the participants in the “self-decision”
condition did so. These results and others (Kilham & Mann, 1974; Meeus
& Raaijmakers, 1986; Shanab & Yahya, 1978) lead us to the conclusion
that participants were displaying obedience to the experimenter rather than to
their own aggressive impulses.
Figure 7.8
Obedience as a function of the gender of an authority figure and participant
age. Younger participants were more likely to obey a male authority figure
than older participants. Younger and older participants obeyed a female
authority figure equally.
Obedience
across Culture, Situation, and Time
Milgram’s
original experiments
were conducted in the United States, using a particular research technique.
Would his results hold up across cultures and across experimental situations?
Some critics of Milgram’s study, Dutch researchers Meeus and
Raaijmakers (1986),
argued that the type of obedience required in Milgram’s
experiment—physically hurting
another person—was not realistic. Such behavior is rare in everyday life. They
argued that people are more often asked to hurt others in more subtle ways. For
example, your employer might ask you to do something that makes another
employee look bad.
Would you
obey?
Meeus and
Raaijmakers (1986) studied a different form of obedience: administrative
obedience. Dutch participants were told that the psychology department of a
university was commissioned to screen applicants for various state and civic
positions and that the department was using this opportunity to test the
effects of stress on test achievement. According to instructions, participants
made a series of disparaging statements about a person taking a test for a
state job. Fifteen statements, each more disruptive than the previous, were
used. The mildest statement was, “Your answer to question 9 was wrong”; a
moderate statement was, “If you continue like this, you will fail the test”;
and the strongest statement was, “According to the test, it would be better for
you to apply for lower functions” (p. 323). Understandably, job applicants
became increasingly upset with each comment.
Most of
the Dutch participants obeyed; 90% read all 15 statements. This resembles the
Milgram experiment in which participants had to increase shock in 15 stages as
the victim became more upset. In Milgram’s terms,
they gave the full 450 volts. When questioned about it, they attributed
responsibility for the harassment to the experimenter.
In another
variation on Milgram’s experiment, Australian participants
assumed the role of
either transmitter of the experimenter’s instructions or executor (Kilham & Mann, 1974). In the transmitter
condition, participants relayed orders to continue shocking a learner to a
confederate of the experimenter who delivered the shocks. In the executor
condition, participants received orders indirectly from the experimenter
through a confederate of the experimenter. The hypothesis was that there would
be greater obedience when the participant was the transmitter rather than the
executor of orders, presumably because the participant is not directly
responsible for inflicting harm on the victim. Results supported this
hypothesis. Participants in the transmitter role showed higher levels of
obedience than those in the executor role.
Milgram’s
obedience effect has been supported by
other cross-cultural research. For example, obedience among Jordanian adults was
found to be 62.5%—comparable to the 65% rate found by Milgram among
Americans—and among Jordanian children, 73% (Shanab & Yahya, 1977). The
highest rates of obedience were reported among participants in Germany. In a
replication of Milgram’s original baseline experiment, 85% of German men obeyed the
experimenter (Mantell, 1971). Overall, it appears that obedience is an integral
part of human social behavior.
Finally,
Milgram’s findings have withstood the test of time. Blass (2000)
evaluated replications
of Milgram’s experiments conducted over a 22-year period (1963 to 1985) and found that obedience rates
varied from a low of 28% to a high of 91%. However, there was no systematic
relationship between the time that a study was conducted and the rate of obedience.
According to Blass, it does not appear that an enlightenment effect has
occurred. An enlightenment effect occurs when results of research are
disseminated and, as a result, behavior is altered. If this had happened, there
should have been reliably less obedience in later studies than in earlier ones
(Blass, 2000).
Reevaluating
Milgram’s Findings
Milgram
sought to describe the dynamics of obedience by comparing obedience rates
across different experimental conditions. A wholly different picture of Milgram’s
findings emerges from a careful analysis of the audiotapes Milgram made of
almost all sessions of his experiment (Rochat, Maggioni, & Modigliani, 2000).
This analysis showed that obedience within an experimental session
tended to develop slowly and incrementally through a series of steps. Rochat
and colleagues classified participants’ behavior into six categories:
1. Acquiescence. Going along with the experimenter’s demands without comment.
2. Checks. The participant seeks clarification of a restricted part of the procedure.
3. Notifies. The participant provides information to the experimenter that could lead to breaking off of the experiment.
4. Questions. The participant overtly expresses doubt or requests additional information about the experimenter’s demands.
5. Objects. The participant overtly disagrees with the experimenter and brings up some personal reason why he or she should not continue.
6. Refuses. The participant overtly declines to continue the experiment, effectively disobeying the experimenter.
Rochat and
colleagues found that the participants’ acquiescence to the experimenter was
relatively brief. At the 75-volt level (when
the learner first indicates he is in pain), 10% of participants exhibited a
low-level defiant response (minimum checking). As the experiment progressed,
opposition in the form of checking increased. By 150 volts, 49.7% of
participants were checking, and by 270 volts all participants checked.
Additionally, 30% of participants either questioned, objected to, or refused
the experimenter’s orders at or before 150 volts, with an
additional 35% reaching this high level of
opposition between 150 and 330 volts (Rochat et al., 2000). Interestingly, 57%
of the participants who eventually refused to continue began to protest before
150 volts, whereas none of the fully obedient participants did so.
Regardless
of the path chosen by a participant, he or she experienced a great deal of
conflict as the experiment progressed. Participants dealt with the conflict
aroused by the demands of the experimenter and the learner by becoming confused
and uncertain, and by showing high levels of distress (Rochat et al., 2000).
Some participants dealt with the stress of the situation by rationalizing away
the suffering of the learner, whereas others rushed through the remaining shock
levels. According to Rochat and colleagues, participants resolved their
conflict in one of two ways. Some participants completed the task to the
450-volt level in a “resigned or mechanical fashion” (p. 170). Others resolved
the conflict by becoming oppositional toward the experimenter by first
questioning and/or objecting to the experimenter and then later refusing,
despite the pressure put on the participant by the experimenter to continue
(Rochat et al., 2000).
Critiques
of Milgram’s Research
There were
aspects of Milgram’s experiments and others like them that were never precisely
defined but probably influenced levels of obedience. Consider, for example, the
gradual, stepwise demands made on the participant. Each 15-volt increment may
have “hooked” the participants a little more. This is in keeping with the
foot-in-the-door technique. Obeying a small, harmless order (deliver 15 volts)
made it likely that they would more easily obey the next small step, and the next,
and so on (Gilbert, 1981). Each step made the next step seem not so bad.
Imagine if the participant were asked to give 450 volts at the very start. It
is likely that many more people would have defied the experimenter.
What about
the protests made by many participants? Very few participants went from
beginning to end without asking if they should continue or voicing some concern
for the victim. But they were always told, “You must continue; you have no
choice.” Perhaps, as some observers suggest, the experiments are as much a
study of ineffectual and indecisive disobedience as of destructive obedience
(Ross & Nisbett, 1991). When participants saw others disobey, they suddenly
knew how to disobey too, and many of them did so.
There is
another, even more subtle factor involved here. The experiments have a kind of
unreal, “Alice-in-Wonderland” quality (Ross & Nisbett, 1991). Events do not
add up. The participant’s job is to give increasing levels of
electric shock to a learner in order
to study the effects of punishment on learning. The shocks increase as the
learner makes errors. Then (in some variations), the learner stops answering.
He can’t be learning anything now. Why continue to give shocks?
Furthermore, the experimenter clearly does
not care that the victim is no longer learning.
Some
observers suggest that because the situation does not really make sense from
the participant’s perspective, the participant becomes
confused (Ross & Nisbett, 1991).
The participant acts indecisively, unwilling or unable to challenge authority.
Not knowing what to do, the participant continues, with great anxiety, to act
out the role that the experimenter has prescribed.
This
analysis suggests that Milgram’s experiments were not so much about
slavish obedience to
authority as they were about the capacity of situational forces to overwhelm
people’s more positive tendencies. This may, however, be a futile
distinction. Either
way, the victim would have been hurt if the shock had been real.
Finally,
Milgram’s research came under fire for violating ethical research practices. Milgram
explored the dimensions of obedience in 21 experiments over a 12-year period,
and more than a thousand people took part in these experimental
variations. Because Milgram’s participants were engaging in behavior
that went against
accepted moral standards, they were put through an “emotional wringer.” Some
participants had very unpleasant experiences. They would “sweat, tremble,
stutter, bite their lips, groan, dig their fingernails into their flesh”
(Milgram, 1963, p. 375). A few had “full-blown uncontrollable seizures” (p.
375). No one enjoyed it.
Milgram’s
research and its effects on the persons who participated raise an interesting
question about the ethics of research. Should we put people through such experiences in the name of science? Was
the participants’ anguish worth it? Several observers, including Baumrind (1964),
criticized Milgram for continuing the research when he saw its effect on his
participants. After all, the critics argued, the participants agreed to take
part only in an experiment on memory and learning, not on destructive obedience
and the limits of people’s willingness to hurt others.
But
Milgram never doubted the value of his work. He believed it was important to
find the conditions that foster destructive obedience. He further believed that
his participants learned a great deal from their participation; he knew this
because they told him so. Milgram went to great lengths to make sure the
teachers knew that Wallace was not harmed and that he held no hard feelings. He
also had a psychiatrist interview the participants a year or so after the
experiment; the psychiatrist reported that no long-term harm had been done
(Aron & Aron, 1989).
The
current rules for using participants in psychological experiments would make it
exceedingly difficult for anyone in the United States to carry out an
experiment like Milgram’s. All universities require that research
proposals be evaluated by institutional review
boards (IRBs), which decide if participants might be harmed by the research. A researcher
must show the IRB that benefits of research to science or humankind outweigh
any adverse effects on the participants. If a researcher were allowed to do an
experiment like Milgram’s, he or she would be required to ensure
that the welfare of the participants was
protected. In all likelihood, however, we will not see such research again.
Disobedience
Although
history shows us that obedience can be, and often has been, an important norm guiding
human behavior, there are also times when disobedience occurs. In 1955, for
example, a black seamstress named Rosa Parks refused to give up her seat on a
Montgomery, Alabama, bus to a white passenger. Her action was in violation of a
law that existed at the time. Parks was arrested, convicted, and fined $10 for
her refusal.
Parks’s
disobedience served as a catalyst for events that shaped the civil rights
movement. Within 2 days of her arrest, leaflets were distributed in the African
American community
calling for a 1-day strike against the bus line. Martin Luther King, Jr. and
other African American leaders took up her cause. The bus strike that was
supposed to last only a day lasted for a year. Eventually, laws requiring
African Americans to sit at the back of a bus, or to surrender a seat to a
white passenger, were changed. From Rosa Parks’s initial act of disobedience flowed a
social movement, along with major social change.
Breaking with Authority
Milgram
(1974) suggested that one factor contributing to the maintenance of obedience
was that the individual in the obedience situation entered into an agentic
state, which involves a person’s giving up his or her normal moral and
ethical standards in favor of those
of the authority figure. In short, the individual becomes an agent or
instrument of the authority figure. Milgram suggested further that in this
agentic state, a person could experience role strain (apprehension about the
obedience behavior) that could weaken the agentic state. In an obedience
situation, the limits of the role we play are defined for us by the authority
source. As long as we are comfortable with, or at least can tolerate, that
role, obedience continues. However, if we begin to seriously question the
legitimacy of that role, we begin to experience what Milgram called role
strain.
In this
situation, the individual in the agentic state begins to feel tension, anxiety,
and discomfort over his or her role in the obedience situation. In Milgram’s
(1974) experiment,
participants showed considerable signs of role strain in response to the
authority figure’s behavior. As shown in Figure 7.9, very
few participants were “not at all tense and
nervous.” Most showed moderate or extreme levels of tension and nervousness.
Milgram
suggested that this tension arose from several sources:
• The cries of pain from the victim, which can lead the agent to question his or her behavior
• The inflicting of harm on another person, which involves violating established moral and social values
• Potential retaliation from the victim
• Confusion that arises when the learner screams for the teacher to stop while the authority demands that he or she continue
• Harmful behavior that contradicts one’s self-image
Figure 7.9 Role strain in Milgram’s obedience experiment. Most participants experienced moderate to extreme stress, even though they knew they were not ultimately responsible for any harm to the learner.
How can
the tension be reduced? Participants tried to deny the consequences of their actions
by not paying attention to the victim’s screams, by dealing only with the task
of flipping switches.
As mentioned earlier, Milgram (1974) called this method of coping cognitive
narrowing. Teachers also tried to cheat by subtly helping the learner—that is,
by reading the correct answer in a louder voice. These techniques allowed
teachers to tolerate doing harm that they wished they did not have to do. Other
participants resolved the role strain by breaking the role, by disobeying. This
choice was difficult; people felt
they had
ruined the experiment, which they considered legitimate.
Role
strain can, of course, eventually lead to disobedience. However, real-world
obedience situations, such as those that occur within military organizations,
often involve significant pressures to continue obedience. Nazi soldiers who
made up the squads that carried out mass murders (Einsatzgruppen) were
socialized into obedience and closely allied themselves with their authority
sources. When role strain is felt by people in this type of situation,
disobedience is difficult, perhaps impossible.
However,
this does not necessarily mean that the role strain is ignored. Creative
psychological mechanisms may develop to cope with it. A fair number of members
of the Einsatzgruppen experienced role strain. In his study of Nazi doctors,
Robert Lifton (1986) found that many soldiers who murdered Jews firsthand
experienced immediate psychological reactions, such as physical symptoms and
anxiety. For example, General Erich von dem Bach-Zelewski (one of the Nazis’ premier
Einsatzgruppen generals) was
hospitalized for severe stomach problems, physical exhaustion, and
hallucinations tied to the shooting of Jews (Lifton, 1986). The conflict
soldiers felt was severe: They couldn’t disobey, and they couldn’t continue. As
a result, they removed themselves from the
obedience situation by developing psychological problems.
Reassessing the Legitimacy of the Authority
In their
book Crimes of Obedience, Kelman and Hamilton (1989) pointed out that authority
is more often challenged when the individual considers the authority source
illegitimate. Recall that when Milgram conducted his experiment in downtown
Bridgeport instead of at Yale University, he found a decrease in obedience.
When an authority source loses credibility, disobedience becomes possible.
Kelman and
Hamilton suggested that two kinds of psychological factors precede
disobedience. The first comprises cognitive factors—the way we think about
obedience. In order to disobey, the individual involved in an obedience
situation must be aware of alternatives to obedience. For example, Lt. Calley’s
men in Vietnam were not aware that
a soldier may disobey what he has good reason to believe is an illegal order,
one that violates the rules of war.
Disobedience
is also preceded by motivational factors. An individual in the obedience
situation must be willing to buck the existing social order (whether in the
real world or in the laboratory) and accept the consequences. Milgram’s
finding supports the importance
of this motivation to disobey. Participants who saw another person disobey and
suffer no consequences frequently disobeyed.
These same
factors could explain the behavior of Lithuanians during the early part of
1990. The Lithuanians declared independence from the Soviet Union, disrupting
the long-standing social order. They were willing to accept the consequences:
sanctions imposed by the Soviets. Lithuanian disobedience came on the heels of
the domino-like toppling of Communist governments in Eastern Europe. Having
seen that those people suffered no negative consequences, Lithuanians realized
that there was an alternative to being submissive to the Soviets. In this
respect, the Lithuanians behaved similarly to Milgram’s
participants who saw the confederates disobey the experimenter.
According
to Kelman and Hamilton (1989), these two psychological factors interact with
material resources to produce disobedience. In response, the authority source
undoubtedly will apply pressure to restore obedience. Those who have the funds
or other material resources will be able to withstand that pressure best. Thus,
successful disobedience requires a certain level of resources. As long as
individuals perceive that the authority figure has the greater resources (monetary
and military), disobedience is unlikely to occur.
Consider
the events in Tiananmen Square in China during June 1989. Students occupied the
square for several days, demanding more freedom. At first, it appeared that the
students had gained the upper hand and had spurred an irreversible trend toward
democracy. The government seemed unable to stem the tide of freedom. However,
the government’s inability to deal with the students was
an illusion. Once the Chinese government decided to act, it used its vastly superior
resources to quickly and efficiently end
the democracy movement. Within hours, Tiananmen Square was cleared. At the cost
of hundreds of lives, “social order” was restored.
Strength in Numbers
In Milgram’s
original experiment, the obedience situation consisted of a one-on-one relationship between the authority
figure and the participant. What would happen if that single authority source
tried to influence several participants?
In a study
of this question, Gamson and his colleagues recruited participants and paid
them $10 to take part in a group exercise supposedly sponsored by the
Manufacturers’ Human Resources
Consultants (MHRC) (Gamson, Fireman, & Rytina, 1982). Participants arrived
at a hotel and were ushered into a room with a U-shaped table that seated nine
persons. In the room were microphones and television cameras. After some
introductory remarks, the session coordinator (the experimenter) explained that
MHRC was collecting information for use in settling lawsuits. The nine
participants were told that the current group would be discussing a case
involving the manager of a gas station (Mr. C). Mr. C had been fired by the
parent company because he was alleged to be involved in an illicit sexual
relationship. The experimenter explained that the courts needed information
concerning “community standards” on such an issue to help reach a rational
settlement in the case. Participants then signed a “participation agreement,”
which informed them that their discussions would be videotaped.
Next, they
were given the particulars of the case and then were asked to consider the
first question: “Would you be concerned if you learned that the manager of your
local gas station had a lifestyle like Mr. C’s?” (Gamson et
al., 1982, p. 46). Before leaving
the room, the experimenter conspicuously turned on a videotape recorder to
record the group’s discussions.
A few minutes later, the experimenter came back into the room, turned off the
video recorder, and gave the group a second question to consider: “Would you be
reluctant to do business with a person like Mr. C because of his lifestyle?”
(p. 46). At the same time, the experimenter designated certain members of the group to argue against Mr. C, because until then everyone had been taking the side of the gas station manager.
He then
turned the video recorder back on and left the room. This process was repeated
for a third question. Finally, the experimenter came back into the room and
asked each person to sign an affidavit stating that the tapes made could be
used as evidence in court. The experimenter again left the room, apparently to
get his notary public stamp so that the affidavits could be notarized. The
measure of obedience was each person’s willingness to sign the affidavit.
Let’s
consider what happened in this study up to this point. Imagine that you are a participant in this study. You are
seen on videotape arguing a given position (against Mr. C) that you were told
to take. However, because the experimenter turned off the video recorder each
time he came into the room, his instructions to adopt your position are not
shown. A naive observer—for example, a judge or a juror in a court in which
these tapes would be used—would assume that what you say on the tape reflects
your actual views. The question for you to evaluate is whether you would sign
the affidavit.
Surprisingly,
in 16 of the 33 nine-person groups all participants refused to sign. These
groups staged what might be considered outright rebellion against the
experimenter. Some members even schemed to smuggle the affidavit out of the room so that they would have evidence for future legal action against MHRC.
Disobedience was not a spur-of-the-moment decision, though. Some groups showed
signs of reluctance even before the final request was made, such as during
break periods between tapings. When the video recorder was off, members of
these groups expressed concern about the behavior of the experimenter.
Furthermore,
there were nine groups that the researchers termed factional successes. In
these groups, most participants refused to sign, although some agreed to sign.
Four other groups, called fizzlers, included a majority of members who showed
signs of rebellion during the early stages of the experiment. However, when it
came time to sign the affidavits, these majority members signed them anyway.
Finally, four groups, called tractables, never showed signs of having a
majority of rebellious members. Therefore, in all but four groups, there was a
tendency to disobey the experimenter.
What
differences are there between the Gamson and Milgram studies? The most
important difference is that Gamson’s participants were tested in groups and Milgram’s were tested individually. The
groups could talk, compare interpretations, and agree that this authority was
illegitimate. Milgram’s participants may have thought the same, but they had no way of
confirming their opinions. One important lesson may be that rebellion is a
group phenomenon. According to Gamson, people need to work together for
disobedience to be effective.
The
development of an organized front against authority may occur slowly. A core of
committed individuals may mount the resistance, with others falling in later in
a bandwagon effect. The Chinese student uprising in 1989 is an example. The
protest began with a relatively small number of individuals. As events
unfolded, more people joined in, until there were hundreds of thousands of
protesters.
A second
factor is the social climate. Disobedience—often in the form of social
movements—occurs within social climates that allow such challenges to
authority. Milgram’s studies, for example, were conducted in the early 1960s. By the
time Gamson and his colleagues did theirs, in 1982, the social climate had
changed dramatically. Trust in government had fallen sharply after Watergate
and the Vietnam War. Furthermore, Gamson’s situation involved a large oil company.
By 1982, people’s trust
in the honesty of oil companies had reached a very low level.
Many
nonlaboratory examples illustrate the role of social climate in rebellion.
Communist governments in Eastern Europe, for example, were overthrown only
after major changes in the political system of the Soviet Union that had
controlled Eastern Europe since 1945, the end of World War II. Eventually, that
climate caught up to the Soviet Union, which disintegrated completely in 1991.
Rebellion
against authority may also occur within social climates that do not fully
support such rebellion. The resistance movements in France during World War II,
for example, helped undermine the German occupation forces, despite the fact
that most of France was ruled with an iron fist by the Germans. Within Germany
itself, there was some resistance to the Nazi regime (Peukert, 1987). Even the
ill-fated student uprising in Tiananmen Square took place within a climate of
liberalization that had evolved over several years before the uprising.
Unfortunately, the climate reversed rapidly.
Not all
acts of disobedience are rebellious in nature. In some instances a group of
citizens may advocate and engage in the breaking of laws they see as unjust.
This is commonly known as civil disobedience. Civil disobedience can take a
number of forms, including protests, work stoppages, boycotts, disobeying laws,
and violent acts inflicting physical, economic, or property damage. Civil
disobedience may be used in response to restrictions of one’s basic civil rights, or it may be ideologically driven when a law is perceived to be contrary to one’s best interests (Rattner, Yagil, &
Pedahzur, 2001). Finally, the most widely known form of civil disobedience
occurs when one person (e.g., Rosa Parks) or a large group of individuals (e.g., mass protesters) engages in direct acts of disobedience. However, a newer
channel of civil disobedience is known as electronic civil disobedience (Wray,
1999). According to Wray, such acts might include clogging communications
channels, physically damaging communication cables, and massive e-mail
campaigns designed to shut down government offices and/or services.
Civil
disobedience seems to work best when two conditions are met (Dillard, 2002).
First, civil disobedience is most effective when it is carried out in a
nonviolent and nonthreatening way. So, individuals who engage in peaceful forms
of civil disobedience will have the most persuasive power over others. Second,
the participants in civil disobedience must be willing to accept the
consequences of their disobedience and communicate their suffering to others.
Note that Rosa Parks’s act of civil disobedience, in which she refused to give up her bus seat to a white passenger, met both of these conditions.
The Jury Room Revisited
Poor Karl!
He never really had a chance, did he? He was caught on the horns of a dilemma.
On the one horn was the judge, a powerful authority figure, telling him that he
must obey the law if the prosecutor proved his case. This was reinforced by the
prosecutor in his closing statement when he reminded the jury members of their
duty to apply the law as provided by the judge. Certainly, in Karl’s
mind the prosecutor had met the burden
of proof outlined by the judge. The second horn gored Karl when the deliberations began: He began to face normative and informational social
influence from his fellow jurors. On the initial vote only two jurors sided
with Karl. At this point he had his true partners and he might have been able
to hold out and at least hang the jury if those true partners hadn’t
abandoned him. Eventually, Karl was
left alone facing a majority who tried their best to get Karl to change his
mind. They did this by directly applying pressure via persuasive arguments
(informational social influence) and the more subtle channel of normative
pressure.
As we
know, Karl ultimately decided to disobey the judge’s
authority. He changed his
vote to not guilty. However, consistent with what we now know about social
influence, he was not convinced. His behavior change was brought about
primarily through normative social influence. This is reflected in the
sentiment he expressed just before he changed his vote: He changed his vote so
as not to hold up the jury but he would “never feel right about it.”
Chapter Review
1. What is
conformity?
Conformity
is one type of social influence. It occurs when we modify our behavior in
response to real or imagined pressure from others. Karl, the man cast into the
role of juror in a criminal trial, entered the jury deliberations convinced
that the defendant was guilty. Throughout the deliberations, Karl maintained
his view based on the information he had heard during the trial. However, in
the end, Karl changed his verdict. He did this because of the perceived
pressure from the other 11 jurors, not because he was convinced by the evidence
that the defendant was innocent. Karl’s dilemma, pitting his own internal beliefs against the
beliefs of others, is a common occurrence in our lives. We often find ourselves
in situations where we must modify our behavior based on what others do or say.
2. What is
the source of the pressures that lead to conformity?
The
pressure can arise from two sources. We may modify our behavior because we are
convinced by information provided by others, which is informational social
influence. Or we may modify our behavior because we perceive that a norm, an
unwritten social rule, must be followed. This is normative social influence. In
the latter case, information provided by others defines the norm we then
follow. Norms play a central role in our social lives. The classic research by
Sherif making use of the autokinetic effect showed how a norm forms.
3. What
research evidence is there for conformity?
Solomon
Asch conducted a series of now-classic experiments that showed conformity
effects with a relatively clear and simple perceptual line-judgment task. He
found that participants conformed to an incorrect majority on 33% of the
critical trials where a majority (composed of confederates) made obviously
incorrect judgments. In postexperimental interviews, Asch found that there were
a variety of reasons why a person would conform (yield) or not conform (remain
independent).
4. What
factors influence conformity?
Research
by Asch and others found several factors that influence conformity. Conformity
is more likely to occur when the task is ambiguous than if the task is
clear-cut. Additionally, conformity increases as the size of the majority
increases up to a majority size of three. After a majority size of three,
conformity does not increase significantly with the addition of more majority
members. Finally, Asch found that conformity levels go down if you have another
person who stands with you against the majority. This is the true partner
effect.
5. Do
women conform more than men?
Although
early research suggested that women conformed more than men, later research
revealed no such simple relationship. Research indicates that the nature of the
task was not important in producing the observed sex differences. However,
women are more likely to conform if the experimenter is a man. No gender
differences are found when a woman runs the experiment. Also, women are more
likely to conform than men under conditions of normative social influence than
under informational social influence conditions. Two explanations have been
offered for gender differences in conformity. First, gender may serve as a
status variable in newly formed groups, with men cast in the higher-status
roles and women in the lower-status roles. Second, women tend to be more
sensitive than men to conformity pressures when they have to state their
opinions publicly.
6. Can the
minority ever influence the majority?
Generally,
American social psychologists have focused their attention on the influence of
a majority on the minority. However, in Europe, social psychologists have
focused on how minorities can influence majorities. A firm, consistent minority
has been found capable of causing change in majority opinion. Generally, a
minority that is consistent but flexible and adheres to opinions that fit with
the current spirit of the times has a good chance of changing majority opinion.
A minority will also be more effective when the majority knows that people have
switched to the minority viewpoint, although this effect levels off after three defections. Additionally, a minority has more power in a face-to-face influence situation and, in an ironic twist, is more likely to be taken seriously when the
minority is small.
7. How
does minority influence work?
Some
theorists contend that majority and minority influence represent two distinct
processes, with majority influence being primarily normative and minority
influence being primarily informational. However, other theorists argue that a
single process can account for both majority and minority influence situations.
According to LatanĂ©’s social impact theory, social influence is related to the
interaction between the strength of the influence source, the immediacy of the
influence source, and the number of influence sources. To date, neither the
two- nor the single-process approach can explain all aspects of minority, or
majority, influence, but more evidence supports the single-process model.
8. Why do
we sometimes end up doing things we would rather not do?
Sometimes
we modify our behavior in response to a direct request from someone else. This
is known as compliance. Social psychologists have uncovered four main
techniques that can induce compliance.
9. What
are compliance techniques, and why do they work?
In the foot-in-the-door technique (FITD), a
small request is followed by a larger one. Agreeing to the second, larger
request is more likely after agreeing to the first, smaller request. This
technique appears to work for three reasons. First, according to the
self-perception hypothesis, agreeing to the first request may result in shifts
in one’s self-perception. After agreeing to the smaller request, you come to see
yourself as the type of person who helps. Second, the perceptual contrast
hypothesis suggests that the second, larger request seems less involved
following the smaller, first request. Third, our thought processes may undergo
a change after agreeing to the first request. The likelihood of agreeing to the
second request depends on the thoughts we developed based on information about
the first request.
The
door-in-the-face technique (DITF) reverses the foot-in-the-door strategy: A
large (seemingly unreasonable) request is followed by a smaller one. Agreement
to the second, smaller request is more likely if it follows the larger request
than if it is presented alone. The door-in-the-face technique works because the
norm of reciprocity is energized when the person making the request makes a
“concession.” The door-in-the-face technique may also work because we do not
want to seem cheap through perceptual contrast or to be perceived as someone
who refuses a worthy cause. This latter explanation is the worthy person
hypothesis. A final explanation for the DITF technique is self-presentation.
According to this explanation, refusing the first request in the DITF procedure
may cause the person making the request to perceive the target as an unhelpful
person. The target agrees to the second request to avoid this perception.
10. What
do social psychologists mean by the term “obedience”?
Obedience
is the social influence process by which a person changes his or her behavior
in response to a direct order from someone in authority. The authority figure
has the power, which can stem from several sources, to enforce the orders.
Obedience is not always bad. Obedience to laws and rules is
necessary for the smooth functioning of society. This is called constructive
obedience. However, sometimes obedience is taken to an extreme and causes harm
to others. This is called destructive obedience.
11. How do
social psychologists define evil, and are evil deeds done by evil
persons?
From a social psychological perspective, evil
has been defined as “intentionally behaving, or causing others to act, in ways
that demean, dehumanize, harm, destroy or kill innocent people” (Zimbardo,
2004, p. 22). Under this broad definition, a wide range of deeds could be
considered evil. Social psychologists have also analyzed the roots of evil.
Baumeister and Vohs (2004) identified four preconditions for evil:
instrumentality (using violence to achieve a goal), threatened egotism
(perceived challenges to honor), idealism (using violence as a means to a
higher goal), and sadism (enjoying harming others). These set the stage for
evil to occur, but it is a loss of self-control that directly relates to evil.
Staub (1989) also suggests that difficult life conditions, cultural and
personal factors, and social-political factors (authoritarian rule) also
contribute to evil.
There is a tendency to attribute acts of destructive obedience to abnormal internal characteristics of the perpetrator. In other words, we tend to believe that evil people carry out such acts.
Although
it might be comforting to think that those who carry out orders to harm others
are inhuman monsters, Arendt’s analysis of Adolf Eichmann, a Nazi responsible for deporting
millions of Jews to death camps, suggests that evil is often very commonplace.
Those who carry out acts of destructive obedience are often very ordinary
people. The false idea that evil deeds can be done only by evil people is
referred to as Eichmann’s fallacy. Not everyone agrees with this analysis. Calder
(2003) suggests that evil carried out by moral idiots (those doing evil at the
behest of others) may be more banal than evil carried out by moral monsters
(those who conceive and direct evil acts).
12. What
research has been done to study obedience?
Recurring
questions about destructive obedience led Stanley Milgram to conduct a series
of ingenious laboratory experiments on obedience. Participants believed that
they were taking part in a learning experiment. They were to deliver
increasingly strong electric shocks to a “learner” each time he made an error.
When the participant protested that the shocks were getting too strong, the
experimenter ordered the participant to continue the experiment. In the
original experiment where there was no feedback from the learner to the
participant, 65% of the participants obeyed the experimenter, going all the way
to 450 volts.
13. What factors influence
obedience?
In variations on his original experiment,
Milgram uncovered several factors that influenced the level of obedience to the
experimenter, such as moving the learner closer to the teacher. Explanations
for the proximity effect include increasing empathic cues from the learner to
the teacher and cognitive narrowing, which is focusing attention on the
obedience task at hand, not on the suffering of the victim. Moving the
experiment from prestigious Yale University to a downtown storefront resulted
in a modest (but not statistically significant) decrease in obedience as well.
Research after Milgram’s suggests that the perceived legitimacy of authority is
influential. We are more likely to respond to an order from someone in uniform
than from someone who is not. Additionally, if the authority figure is
physically removed from the laboratory and gives orders by phone, obedience drops.
Conflicting
sources of authority also can disrupt obedience. Given the choice between
obeying an authority figure who says to continue harming the learner and
obeying one who says to stop, participants are more likely to side with the one
who says to stop. Seeing a peer disobey the experimenter is highly effective in
reducing obedience. Two explanations have been offered for this effect. The
first explanation is diffusion of responsibility: When others are involved in
the obedience situation, the participant may spread around the responsibility
for doing harm to the learner. The second explanation centers on the
development of a new antiobedience norm when one’s peers refuse to go along with the experimenter. If
an antiobedience norm develops among disobedient confederates, individuals are
likely to disobey the authority figure.
14. Are
there gender differences in obedience?
Although
Milgram’s original research suggested that there is no difference in levels of obedience between male and
female participants, two later studies suggest that males obey more than
females and that among younger individuals there is more obedience to male than
female sources of authority.
15. Do
Milgram’s results apply to other cultures?
Milgram’s
basic findings hold up quite well across cultures and situations. Cross-cultural research done in Australia, Jordan, Holland, and Germany has shown obedience levels that support Milgram’s findings, even when the obedience tasks diverged from his original paradigm.
16. What criticisms
of Milgram’s experiments have been offered?
Milgram’s
research paradigm has come under close scrutiny. Some observers question the ethics of his
situation. After all, participants were placed in a highly stressful situation
and were deceived about the true nature of the research. However, Milgram was
sensitive to these concerns and took steps to head off any ill effects of
participating in his experiment. Other critiques of Milgram’s
research suggested that using the graded shock intensities made it easier for participants to obey.
The foot-in-the-door effect may have been operating.
Another
criticism of Milgram’s research was that the whole situation
had an unreal quality
to it. That is, the situation confuses the participant, causing him to act indecisively.
Thus, Milgram’s experiments may be more about how a situation can overwhelm the normal
positive aspects of behavior rather than about slavish obedience to authority.
17. How
does disobedience occur?
Historically, acts of disobedience have had
profound consequences for the direction society takes. When Rosa Parks refused
to give up her bus seat, she set a social movement on course. Disobedience has
played an important role in the development of social movements and social
change. Civil disobedience, or the conscious disobedience of the law, is most
effective when it is nonviolent and the individual using it is willing to
suffer the consequences.
Disobedience
may occur when role strain builds to a point where a person will break the
agentic state. If a person in an obedience situation begins to question his or
her obedience, role strain (tension and anxiety about the obedience situation)
may arise. If this is not dealt with by the individual, he or she may break the
agentic state. One way people handle role strain is through cognitive
narrowing. Disobedience is likely to occur if an individual is strong enough to
break with authority, has the resources to do so, and is willing to accept the
consequences. Finally, research on disobedience suggests that there is strength
in numbers. When several people challenge authority, disobedience becomes
likely.
*********************************************
Social Psychology
Third Edition
Kenneth S. Bordens Indiana University—Purdue University Fort Wayne
Irwin A. Horowitz - Oregon State University
Social Psychology, 3rd Edition
Copyright ©2008 by Freeload Press
Illustration used on cover © 2008 JupiterImages Corporation
ISBN 1-930789-04-1
No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or
by any means, electronic, mechanical, recording, photocopying, or otherwise, without the prior written
permission of the publisher.
Printed in the United States of America by Freeload Press.