Moral Choices Emerge In The Moment

Imagine being one of the Capitol protesters on January 6, 2021. Imagine you’ve been convinced by right-wing pundits that the election was indeed stolen from Donald Trump (despite dozens of federal judges, including Trump appointees, reviewing the claims and finding no evidence). Now imagine that the loud but peaceful protest you’ve been involved in grows violent.

The barricades nearby are being pushed aside and Capitol police officers are being forced to retreat. You are faced with a split-second decision.

Do you move against the flow of your compatriots and back away from the Capitol and the burgeoning riot? Or do you go along with them and join the riot?

If you are a good imaginer, then you can see the difficulty of the dilemma. If you are a bad imaginer, you probably blithely conclude, “Oh, I would never join a riot.” But the Milgram experiments and the Stanford Prison Experiment have shown us that even people who do not think of themselves as violent will often go along with others who are encouraging violence.

Many of those pro-Trump political activists advertise themselves as “good Christians,” even though they clearly violated the Bible’s teachings that day. They weren’t thinking about “Do unto others as you would have them do unto you” when they were spraying bear spray into the faces of Capitol police officers or striking them in the head with flagpoles.

A recent psychology experiment by Ben Falandays and colleagues, published last month, delves into the millisecond timing of difficult moral dilemmas like this. Inspired by a 2015 eye-tracking experiment by Philip Pärnamets and colleagues, Falandays had participants respond to moral quandaries like “Is murder ever justifiable?” by mouse-clicking one of two response options on the computer screen, such as “Sometimes justifiable” and “Never justifiable.” 

However, what the participants didn’t know was that the computer mouse was hacked to move slightly faster in one direction than the other. 

On each trial of the experiment, the software randomly selected one of the response options to be its own “preferred target.” It then biased the computer mouse to move slightly faster in the direction of that option.
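To make the manipulation concrete, here is a minimal sketch, in Python, of how such a cursor bias could work. It is purely illustrative and not the authors’ actual experiment code; the function names, the 10% gain, and the left/right coding are assumptions made only for the example.

```python
import random

def pick_preferred_target() -> int:
    """Randomly choose which response option the software will favor
    on this trial (0 = left option, 1 = right option)."""
    return random.choice([0, 1])

def bias_cursor_step(dx: float, preferred: int, gain: float = 1.10) -> float:
    """Scale a horizontal mouse movement up slightly when it heads toward
    the preferred option; leave movement toward the other option untouched.
    The 10% gain is an illustrative assumption, not a reported value."""
    moving_right = dx > 0
    toward_preferred = (preferred == 1) == moving_right
    return dx * gain if toward_preferred else dx

# Example trial: suppose the software happens to favor the right-hand option.
preferred = 1
print(bias_cursor_step(4.0, preferred))   # rightward nudge, amplified to 4.4
print(bias_cursor_step(-4.0, preferred))  # leftward nudge, left at -4.0
```

Over many small movements, that asymmetry can nudge a hesitant trajectory toward the computer’s chosen option without the participant noticing.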

If your computer mouse was biased, could it alter your decisions?

When a moral quandary posed a difficult dilemma for the person, they tended to move their mouse slowly toward the midpoint between the two equally attractive response options as they mulled over the decision. Any brief, accidental leaning toward one side or the other was magnified if it was in the direction of the computer’s “preferred target,” and not magnified if it was in the direction of the other response option.

As participants saw their own mouse cursor adventitiously drifting ever so slightly toward one of the response options, they may have felt as though their arduous decision was finally coming to fruition. In a handful of instances, however, that drift was clearly due to the subtle bias in the computer mouse, not to the participant’s internal decision-making processes.

Instead of choosing the computer’s “preferred target” 50% of the time, as chance would predict, participants chose it 52% of the time. The effect is small but statistically reliable, and similar effects have been reported in related eye-tracking experiments (Falandays & Spivey, 2020; Ghaffari & Fiedler, 2018; Pärnamets et al., 2015).
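To see why a shift of only two percentage points can still be statistically reliable, here is a small back-of-the-envelope check using a normal-approximation test of a proportion against chance. The trial count below is a hypothetical round number chosen purely for illustration; it is not the sample size reported by Falandays and colleagues.

```python
from math import sqrt, erfc

def two_sided_p(k: int, n: int, p0: float = 0.5) -> float:
    """Normal-approximation z-test of an observed proportion k/n against
    a chance rate p0, returning a two-sided p-value."""
    z = abs(k / n - p0) / sqrt(p0 * (1 - p0) / n)
    return erfc(z / sqrt(2))

n_trials = 5000                       # hypothetical pooled trial count (illustrative)
k_preferred = round(0.52 * n_trials)  # choices landing on the "preferred target"
print(two_sided_p(k_preferred, n_trials))  # ~0.005: unlikely under chance alone
```

The point is simply that a small per-trial bias, aggregated over enough trials, can clear conventional significance thresholds even though any individual choice is still mostly driven by the person.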

Thus, perhaps you shouldn’t think of a choice you make, even an immoral one that might involve violence, as the product of a decision process that takes place entirely inside your brain. You don’t really have a “moral compass” brain region that performs those decisions all by itself. Rather, perhaps you should think of a choice as something that emerges in the moment, resulting from brain processes, body processes, and environmental processes. The biases (or specific timing) of events in your environment, relative to mental events in your brain, can occasionally sway your moral choices one way or the other.

Now, if we return to that exercise of imagining ourselves as one of the Capitol rioters, perhaps we can see how a person who thinks they have good intentions can find themselves doing obviously bad things. I am most certainly not saying that people who get swept up in the fray of groupthink are not responsible for their actions; they should be held accountable. But at least now we have a little more insight into how those actions emerge in the moment, and that opens up the possibility of tracing some of the environmental causes that also bear a portion of the responsibility.

Moral of the story: When you can, choose carefully the environments you place yourself in. For those who cannot, civil society needs to do a better job of safeguarding environments that promote good moral decisions.

Michael J. Spivey, Ph.D.