Why is it so difficult to change people's minds about important ideas? Why do people believe things in spite of evidence? How can you get people to see what is right before their eyes, yet somehow fail to see? There are several neural and psychological findings we could draw upon to explain this incredibly frustrating phenomenon, but to put it simply - it is both too risky and too much effort to change one's mind about things the mind considers foundational and important, especially when they are important in a social context.
Knowledge that is merely interesting is relatively easy to change. For example: "What year was the song 'Khe Sanh' released?" "1980." "Nope, it was 1978." "Oh, I thought it was a little later than that. Oh well, '78 it is." You might wander away thinking, "I was sure it was...", but since nothing more than a point in a trivia contest is at stake, we happily re-calibrate our knowledge.

Knowledge that is foundational to my sense of self, or to how the world is supposed to work, is much more difficult. Recent research (published December 2016) on the neural correlates of maintaining one's political beliefs in the face of counterevidence demonstrates that positions on topics such as abortion rights, same-sex marriage, and how wealth ought to be distributed in a society are really hard to shift [Kaplan, Gimbel, Harris 2016]. Why? Well, you've probably heard of concepts like 'confirmation bias' (we see what we expect to see, and prefer what is expected or familiar). Your Facebook feed is a shining, and scary, example of confirmation bias: you get more of whatever you pay attention to and are interested in. But that still doesn't explain why people stubbornly refuse to change their minds in the face of extensive evidence. Acceptance of evolution through natural selection is the obvious example - there is more scientific evidence for Darwinian evolution than there is for gravity(!) and yet there are people who simply cannot accept it as fact. Or, for even greater absurdity - the Flat Earthers.
That's right, people who genuinely and staunchly believe the earth is flat. As it so often does in the evolution of species, this has a lot to do with energy and survival. As we grow, our brains interact with the world and develop models of it, learning how it works and how we ought to navigate it. In forming these models, we also form a 'sense of self', as a reflection of, or an interlocutor with, these models. So, for example, self-described "Liberal/progressives" are typically in favour of same-sex marriage, and self-described "Right wing/Conservatives" are typically against it.

The social/political context in which a question sits makes a difference to how contradictory evidence is treated. People are significantly more resistant to contradictory evidence, and to changing or even softening their position, when it relates to social or political constructs. The more neural real estate an idea occupies, and the more emotionally relevant that real estate is, the more resistant our brains are to changing it (which makes obvious sense when you consider the brain as a physical network of connections - the connections need to change, and the more connections there are, the more effort is required to change them). It seems that, because so much of the 'self' is interconnected with core social and political ideas, a challenge to one such idea is a challenge to much of the world model the brain relies on to navigate the world.

The brain loves consistency - it relies on it. When you put your foot down, your body expects to find the ground; when it doesn't, there is a problem. When your brain interprets an idea - for example, the statement "Same-sex marriage is a moral improvement" - the stimulus is compared to your brain's internal model of the world. This model has grown with you - or, more accurately, the growth of this model IS you. Some people's brains hear this stimulus and find congruence, so their response is: "D'uh, what's the problem?"
Other people's brains experience significant incongruence with established and important models, so their response is: "No! This does not match. This requires a significant re-think and re-wire of the neural networks. That is going to cost energy, time, and change in my social context." It is easier, therefore, to dismiss, undermine or otherwise ignore the evidence or argument. The same would be true, in reverse, if the statement were "Same-sex marriage is not a good idea for society."

I am no evolutionary biologist; however, we can speculate on how this problem developed. Firstly, we need to remember that contemporary humans have it very easy: we have so much consumable energy available that one third of Western populations are overweight. It is easy to forget that resources were once very scarce and scrounging for enough to eat was our full-time job. Brains that habitually re-think well-established and usable knowledge are wasting energy. More directly, though, as evolved social apes, being 'in the group' was critical to survival. Your 'sense of self' is tied to your sense of the tribe you are part of, complete with normative views and behaviours. Your social beliefs - how the tribe ought to be run, and your place in it - matter. You don't want to risk ostracism; you rely on the group, and therefore you rely on thinking like the group you're in. This means that challenges to your social/political beliefs are potentially life-threatening. (This paragraph is my speculative assessment of where this phenomenon emerged from. I'd welcome comments from anyone who knows better, or is prepared to guess better. Now, back to the scientifically-validated material.)

Recent research (cited above), using fMRI, has demonstrated increased activity in the amygdala when evidence contradicting firmly established social/political views is presented. This does not occur when 'non-political' ideas are contradicted. The amygdala is, among other things, the trigger for your 'fight/flight' response.
So, challenges to our social/political views are "threatening". Further, earlier research found that ideas we strongly disagree with produce activity in a region of the brain associated with generating the emotion of disgust (i.e., the same area that fires when you smell bad food). So, challenges to our well-established views are also "disgusting" - hence the scrunched-up-nose response when someone says something we consider preposterous. In other words, your brain treats threats to your important social/political (and religious) beliefs as potentially life-threatening and quite literally 'disgusting' [Harris, Sheth, Cohen 2008, "Functional Neuroimaging of Belief, Disbelief and Uncertainty"]. Is it any wonder that political panel shows resemble a blood sport at times? Is it any wonder that the audience get frustrated and ask, "Why can't they collaborate, or maturely state their arguments, without descending into ad hominem?"

Importantly for workplaces, the same is true of other foundational beliefs - for example, in leadership, "having a good work ethic." How many bosses do you know who measure productivity by the HOURS you're at work, rather than the OUTPUT from your work? Beliefs about 'work ethic' and 'showing up' confuse the objective of getting stuff done.

So, how CAN you change people's minds? Michael Shermer, in a recent article in "Scientific American" (a concept which I hope does not become an oxymoron under a Trump Presidency), offers these steps:
1. Keep emotions out of the exchange.
2. Discuss, don't attack.
3. Listen carefully and try to articulate the other position accurately.
4. Show respect.
5. Acknowledge that you understand why someone might hold that opinion.
6. Try to show how changing facts does not necessarily mean changing worldviews.
The neuro-psychotherapeutic approach, which is what I practice, concurs with the above steps. Most important is the sense of safety: the brain of the person you're talking with needs to feel safe enough to be open to letting inconsistent information in and incorporating it into their worldview. Even then, this does not always work - but then, not much does, and it is far more effective than applying the traditional tools of blame, guilt, fear and judgment. Let's hope our political and workplace leaders are up to it.

Changing minds is as simple, and as difficult, as building neural networks from the bottom up and the inside out. It requires safety - long-held beliefs need to be replaced with logically-consistent ideas that build within the mind. If you want to change someone's mind, attacking, judging and blaming them for "thinking wrong" is dumb. Argue your case, defer to evidence and logical consistency, and be open to adjusting your own position based on new evidence.