The Psychology of Compliance

How The Complicit Center Enables Atrocity


Most people, most of the time, act decently towards each other. Psychologists call this "prosociality." Those who do not behave this way are broadly classified as "antisocial." And yet, human history is frequently punctuated by genocides and other atrocities, generally perpetrated by large groups of regular people against smaller groups of other regular people. The Holy Inquisition, the Witch Trials, the Holocaust, the Chinese Cultural Revolution, the Cambodian Genocide, the Balkan Wars, and countless other horrifying events are characterized by the mass acceptance of antisocial behavior - destruction, torture, and murder - as normal, and even desirable.

Several research experiments conducted around the middle of the 20th century attempted to shed light on how everyday folks become complicit in atrocity. The common thread running through all of them was the corrosive influence of authority, both on those who wield it and on those who obey it.

The Milgram Obedience Experiment

After WWII, an American researcher named Stanley Milgram set out to determine whether red-blooded Americans were capable of doing what the Germans had done: committing murder with the excuse that they "were just following orders." His hypothesis - and that of his colleagues - was that the vast majority of Americans would not do what the Germans had done.

To test this hypothesis, Milgram set up an elegant experiment, in which a paid volunteer was told that he would be participating in a study on learning and memory. The volunteer's role was that of "teacher," and another volunteer (actually an actor) would be serving as the "learner." The learner was strapped into a chair equipped with electrodes, and was supposed to memorize lists of words. If he made a mistake, the teacher was instructed by the scientist running the experiment to administer increasingly severe electric shocks. As the scenario progressed, the learner complained more strenuously about the shocks, then begged to be released from the experiment, and ultimately (after a purported 300-volt shock) fell completely silent. Shockingly (pun intended), roughly 65% of the volunteers continued to administer shocks even after the learner had fallen silent, and was seemingly either unconscious or dead. As with the German soldiers, the rationale given by the volunteer teachers was that the scientist conducting the experiment had told them to continue.

Milgram's experiment has been repeated with various populations and in different countries, and the results have remained fairly consistent: about two-thirds of subjects will obey blatantly evil instructions from an authority figure.

The Asch Paradigm

While Milgram's experiments investigated the degree to which average people would follow orders that violated moral norms, Solomon Asch researched a more subtle form of influence. In his experiment, a volunteer was told that he would be part of a group participating in a study of visual perception. Seated together in a room, the group was shown a pair of cards: one card had a reference line, and the other had three lines of varying length. The question was, which of the three lines was the same length as the reference line? Unbeknownst to the volunteer, all the other people in the room were confederates of the researcher. After giving the correct answer a few times, these confederates would begin unanimously selecting the wrong line. Asch ran several variations of the experiment, and found that approximately one third of the subjects would conform to the group - that is, they would pretend to agree with the obviously incorrect answer in order to fit in.

The Stanford Prison Experiment

Philip Zimbardo, a professor at Stanford University, decided to study the question of authority from a different direction. He recruited a group of college students, randomly assigned them to be either prisoners or guards, and then set up a mock "prison" on the Stanford campus. In short order, both prisoners and guards began to show profound psychological changes, with guards displaying sadistic behavior even though they knew the whole scenario wasn't real. The situation became so traumatic that Zimbardo had to terminate the experiment after only six days of a planned two weeks. Lord Acton, the British historian, famously observed that "power tends to corrupt, and absolute power corrupts absolutely." Zimbardo's experiment bore this out.

Conformity vs. Compliance

In discussing obedience to authority, it's important to distinguish between conformity and compliance. As defined by Nicholas Pollis and Herbert Kelman, conformity is true agreement with something. This is "I support the current thing" either because you believe in it ("internalization") or because you identify with the people telling you to support it ("identification"). Compliance, on the other hand, is going along with the current thing because you want to gain a reward (such as social status) or avoid a punishment. Then, of course, there is non-compliance - what Milgram called "defiance" - which is actively opposing the current thing.

What the Asch and Milgram experiments taught us is that about 1/3 of people will go along with the current thing no matter how wrong it is, 1/3 will oppose it, and the remaining 1/3 will probably comply if an authority figure tells them to.

This means that it is on the ambivalent 1/3 of the population that history rests. The zealots at either end of the spectrum are highly unlikely to change their minds, so the goal of policy-makers is always to persuade or coerce the ambivalent center to join the fold.

It's always easier to convince someone when yours is the only voice they hear, which is why censorship and propaganda have been used by every regime, ancient and modern. In Asch's experiments, the presence of even one dissenting voice dramatically weakened the group's influence. The concept will be familiar to anyone who knows the children's story "The Emperor's New Clothes." As long as everyone agrees that a lie is true, it works. But if one person points out that it is false, the spell is broken. This is why, in today's world, the full force of AI and global technology is focused on silencing anyone naive or recalcitrant enough to point out that, far from wearing a magnificent suit of fine clothes, the emperor is butt-ass naked.

If, through censorship, propaganda, influence, and brute force, the authorities can bring the ambivalent center into compliance, it is then easy to destroy the non-compliant 1/3 of the population. As Hermann Goering said, "The people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same way in any country." Indeed, it always works like a charm. Monty Python joked that "Nobody expects the Spanish Inquisition," but this is wrong. So long as power is concentrated in the hands of any group, that group will find a way to turn it to its own benefit, and to the destruction of others.

The emperor - every emperor - has no clothes, and that is why we must always expect the Spanish Inquisition.