Back in the 1960s, in a Yale University basement, psychologist Stanley Milgram carried out a series of interesting and controversial experiments. His findings showed that people would inflict pain on another person purely because someone in a position of authority told them to.
So how easy is it to convince good people to do bad things?
According to a new study, it depends on how much control that person feels over his or her own choices, and it is that “sense of agency” that affects the way the brain processes the outcome of those actions.
Researchers have taken Stanley Milgram’s classic experiments one step further, and can now provide new evidence that might help to explain why people are so easily coerced.
According to the study, by researchers from University College London and the Université Libre de Bruxelles in Belgium, when someone gives us an order, we actually feel less responsible for our actions and their painful consequences.
Patrick Haggard from University College London said in a statement:
“Maybe some basic feeling of responsibility really is reduced when we are coerced into doing something.
“People often claim reduced responsibility because they were ‘only obeying orders.’ But are they just saying that to avoid punishment, or do orders really change the basic experience of responsibility?”
Haggard and his colleagues searched for the answer to this question by measuring a phenomenon called “sense of agency”: the feeling that one’s own actions have caused some external event. For instance, Haggard explains, if you flip a light switch and a light comes on, you often experience those events as being nearly simultaneous, even if there is a delay.
Milgram’s experiments involved participants asking an unseen individual a series of questions, then delivering increasingly painful electric shocks for wrong answers.
Despite the controversy over the ethics of Milgram’s methods, the results were striking: many participants showed signs of emotional distress even as they followed orders, and a majority were willing to follow a command even when it went against their own judgment.
Haggard and his colleagues have determined that when coerced into taking an action that adversely affects another person, individuals experience reduced agency, altering their perceptions of cause and effect.
In the new study, which was published in the Cell Press journal Current Biology, the researchers conducted a series of experiments. First, an “agent” would deliver mild physical pain or financial harm to a “victim,” a decision that was either coerced or made freely.
The participants would then trade places, so they would know exactly what kind of harm they were inflicting. In the second part of the study, the researchers analyzed the effects of the “coercion” and “free-choice” conditions on brain activity.
According to EurekAlert!:
“The researchers report that coercion led to a small but significant increase in the perceived time interval between action and outcome in comparison to situations in which participants freely chose to inflict the same harms.
“Interestingly, coercion also reduced the neural processing of the outcomes of one’s own action. The researchers concluded that claims of reduced responsibility under coercion could indeed correspond to a change in basic feelings of responsibility — not just attempts to avoid social punishment.”
Haggard says it would be interesting to find out whether some people more readily experience a reduced sense of agency under coercion than others.
“When you feel a sense of agency — you feel responsible for an outcome — you get changes in the experience of time, where what you do and the outcome you produce seem closer together.
“Fortunately for society, there have always been some people who stand up to coercion.”
Being told to do something morally questionable is no defense for indefensible behavior. But this study at least helps to explain why people are so willing to cause harm, simply because an authority figure told them to.
Learn more about the Milgram experiment in this video by BigHistoryNL: