How To Change A Mind

Peter Holmes
15 min read · Apr 27, 2020


1. Beliefs

Talking with people about important subjects can be frustrating. People seem to believe what they believe, and it’s hard to tell them anything different. And yet across time and across cultures, beliefs tend to change dramatically. In the late 1600s, various American communities came to believe in the existence of witches, and used that belief as a reason to execute many innocent people. Until around 1870, people didn’t believe in germs, much to the chagrin of Ignaz Semmelweis, who had shown decades earlier that hand-washing saved lives and was mocked for advising doctors to do it.

In 2020 America we no longer believe in witches, but we do believe in germs, and freedom, and eating animals, and capital punishment. Obviously not everybody believes in those things, but enough people do that our laws permit them.

Granted, it wasn’t until the Enlightenment that people started to believe that the government and its laws should represent what people mostly believe (democracy). But even an authoritarian system reflects a form of consensus belief, because dictators do not govern in a social vacuum. For example, the Holocaust did not happen because of one man alone: Germans across all levels of society bought into the belief that Jewish people were inferior, supported Hitler’s rise to power, and fought a world war.

All of which is to say that our beliefs, individually and collectively, are critically important. Because they become our reality.

2. What Is A Belief?

Beliefs are a construct of human cognition. More specifically, they are patterns or abstractions that our brains use as tools to help us understand and navigate the world. For example, I know (strong belief) that if I get stuck underwater too long I will die. So whenever I’m swimming, my mind leverages that knowledge to form decisions about my actions, making sure I come up for air and keep some land within reasonable distance.

Drowning is a very concrete example, though, because beliefs can be formed about anything, and at different levels of abstraction. Racism is the belief that certain types of people are inferior. Religion is the belief that life was endowed by a specific creator for a specific reason. Both of these beliefs operate at a much higher level of abstraction, and are thus more controversial and varied across cultures (the necessity of breathing is universally agreed upon).

Higher level beliefs also have nested beliefs within them. For example, I believe that food, shelter and healthcare should be guaranteed to all people. But the belief that we should provide for everyone contains the nested belief that we are able to provide for everyone. If a global famine struck, and we no longer had enough food, I would be forced to form a new belief about how our food should be distributed.

3. How Are Beliefs Formed?

Beliefs are formed from information consumed and processed by our brains. For example if my neighbor informs me it will rain tomorrow, I’ll form an assumption (weak belief) that it will rain tomorrow. But if I then check my weather app and it’s predicting beautiful, sunny days all week, I now have conflicting information. To resolve this, my brain will apply some processing (aka. intelligence), weighing the credibility of my neighbor versus my app, and then form a revised belief about tomorrow’s weather: it’s likely to be sunny.
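
To make that weighing step concrete, here is a minimal sketch of a credibility-weighted update. It is purely my own illustration, not anything drawn from the article or from cognitive science: each source backs a claim with some amount of credibility, and the revised belief is simply the claim with the most credibility behind it. The sources, weights, and claims are made-up assumptions.

```python
# Minimal sketch of a credibility-weighted belief update (illustrative only).
from collections import defaultdict

def revise_belief(reports):
    """reports: list of (source, credibility_weight, claim) tuples."""
    support = defaultdict(float)
    for source, weight, claim in reports:
        support[claim] += weight  # accumulate credibility behind each claim
    # adopt whichever claim has the most total credibility behind it
    return max(support, key=support.get)

reports = [
    ("neighbor",    0.3, "rain tomorrow"),   # one person's guess
    ("weather app", 0.9, "sunny tomorrow"),  # millions of users, professional forecasts
]

print(revise_belief(reports))  # -> sunny tomorrow
```

Real belief formation obviously weighs far more than two sources, but the shape of the calculation, credibility in and revised belief out, is the same.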

The notion of credibility is thus quite important in the process of forming beliefs. If someone tells you something and it’s wrong, their credibility is damaged. If they continually prove correct, their authority expands. As an example of exceptionally strong authority and credibility, young children rely on their parents almost exclusively for information about the world, not to mention their survival. So if your Dad tells you at age 7 that the world was created by a white man named God, you are likely to incorporate that into your core beliefs about how the world works (aka. your worldview).

4. The Credibility of Consensus

Credibility increases dramatically when someone is in a position of social authority, due to the implied social contract. If many others view them as credible, what are the chances everyone is wrong? (Note the nested belief: if they were wrong, someone would have said something.) The weather app on my phone is used by millions of people, so placed up against my neighbor, the app easily won the credibility battle because of its inherent social authority, and the assumed professional rigor that goes with it, meaning a team of trained meteorologists deploying advanced scientific techniques and technology. The same applies to anyone in a position of authority, such as major media figures (e.g. Anderson Cooper), major government officials (e.g. Colin Powell), renowned academics (e.g. Noam Chomsky), top economic officials (e.g. Jerome Powell), and so on. The same holds locally for preachers and pastors, mayors, teachers, and coaches.

That said, it’s also commonly believed that power is prone to corruption (fake news!). So I’m not saying that anyone promoted into prominence is automatically trusted by others. Rather, each individual evaluates credibility from within their own belief system, and the belief system of an individual is a complex tapestry of information, sources, lived experiences, and internal calculation.

But as social animals, it is in our nature to give credibility to authority figures because of the implied social consensus. And that credibility is amplified to its strongest power when multiple authoritative voices confirm beliefs in parallel, feeding into the powerful influence of “consensus reality”, otherwise known as common sense. Common sense is a powerful bit of psychology because unlike more controversial beliefs, something deemed common sense isn’t even worth hearing evidence against: anyone or anything that disagrees can just be written off. When Hitler made his argument that the Jewish people were inferior, it wasn’t just him saying it; it was his officials, and his news outlets, and his scholars, and his scientists. The power of consensus belief became a strong force of psychological manipulation for the German people to overcome, and most did not, which is why they fought.

They were believers.

5. The Emotionality of Truth

Human biology and psychology have evolved over millennia to meet one fundamental goal: survival. And beliefs were an important part of that evolution. Beliefs are mental tools that help us survive a challenging and unpredictable world by synthesizing complex sets of information into actionable truths and heuristics. If your tribe was hit by a drought, your belief about where to find water meant the difference between life and death. Not surprisingly, then, many beliefs are generated by fear as part of our primitive survival instinct. Many xenophobic and racist belief systems can be traced to people’s fears that those who are different pose a threat. And many religious belief systems can be traced to a fear of the unknown, where it is far preferable to have faith (belief) in a preset collection of rules and explanations than to face a world so full of uncertainty, with nobody watching over you.

Except this creates a very uncomfortable situation for us humans. Because we know our beliefs are supposed to be truth-based, and yet they are heavily based on emotion and fear. So we end up forming beliefs that might not be true but are useful emotionally. And that’s all fine and dandy, up until life confronts us with facts and evidence that directly contradict our beliefs. This is known as “cognitive dissonance”: the uncomfortable feeling we get when presented with information that is difficult to reconcile with our beliefs. You love your father, and yet he is abusing you. You trust your leader, and yet he is lying to you. Cognitive dissonance happens to all of us, because our beliefs were never more than helpful generalizations and patterns based on emotion.

So our brains have adopted a handy solution: just throw it away. Like a piece of trash you don’t need, just disregard the evidence, or discount it. The following passage from Hitler’s infamous Mein Kampf describes this remarkable bit of psychology, and also speaks to the force of credibility and consensus:

All this was inspired by the principle — which is quite true within itself — that in the big lie there is always a certain force of credibility… It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously. Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation. For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.

It’s an amazing phenomenon: when presented with clear evidence contradicting a core belief, people will find some way to simply disregard the new information. Cognitive dissonance is not a comfortable process, though (basically mental vomiting), so as humans we try to avoid it and surround ourselves with confirmations rather than contradictions. In the age of information we now live in, this commonly takes the form of watching only particular sets of news and information sources, only following certain people on Twitter and Facebook, and generally leveraging the vastness and diversity of the internet into a giant tool for confirmation bias. In the words of Jeff Bezos:

“I think social media is unfortunately increasing identity politics, and tribalism. I think the internet in its current incarnation is a confirmation bias machine. If you have a going in point of view, and you go do some searches, you find confirmation of your point of view.”

Taken together, we now see the full challenge of trying to change a mind. We aren’t merely up against a deficit of information. As one person talking to another, you must also negotiate the rather weak credibility of an individual voice, a lifetime of confirmation bias, and the emotionality of truth, meaning an instinctual fear of change.

6. You Probably Can’t

The unfortunate conclusion here is that you probably will not be able to change someone’s mind, no matter how strong/credible your evidence. Especially if it’s a subject that relates to a core belief system. Any facts you bring to bear will be discounted, any clarity of logical argumentation will be convoluted, and the most likely outcome is that the person will get angry and leave. Subjects like religion and politics are simply too emotional for most people to discuss rationally.

Some people are much more open-minded than others, though, which I suspect is because they have incorporated rationality directly into their belief system. That provides their views much greater flexibility, because the nested belief is that they don’t really know anything unless it’s proven logically. A good example is the famed Galileo, who was willing to contradict an entire society of people insisting that the sun revolves around the earth, because the science-driven conclusion of Copernicus was quite clear to him. For people like Galileo, changing their mind is simple: provide credible evidence.

Everyone likes to think that they’re like Galileo, but the reality is that most people are like the folks who put him under house arrest.

7. If You Decide To Try

If you are still determined to give it a shot, which I commend you for, then I offer the following advice:

• Learn to Identify Good Faith & Bad Faith Argumentation

In the world of law, good faith refers to an earnest attempt at understanding and resolution, whereas bad faith refers to seeking only to promote or defend one’s own viewpoint. In my experience most people will actually weave in and out of good and bad faith positions throughout the course of a discussion, but if you learn to identify when your subject shifts into bad faith arguments, you can try to reset things by taking a step back and redefining the goals of the discussion (settle them down).

• Identify the Goal Belief

Conversations almost always jump around between subjects, and at various levels of abstraction, which creates a confusing mess. But the goal is to change their mind on a particular topic, not the entirety of who they are as a person, so it’s important to single out the belief or concept you want to address and focus on it. For example, state it clearly, multiple times, and indicate you want to explore the basis for it in more detail.

• Externalize the Belief

It also helps to put as much distance between them and the belief as possible, so rather than stating, “You believe X”, use non-possessive language: “It’s an interesting question of whether X is true. So let’s examine it in more detail.” You could even try writing it down, which provides both focus and externality.

• Provide New Information, Delicately

Changing minds is all about providing new information that allows for rewiring individual beliefs, and potentially entire belief systems. So although it will likely not suffice in itself, be sure to at least put forward the new information you possess which contradicts their existing belief.

• Seek Credibility and Corroboration on Their Terms

The more credible your new information is, the better chance it will be accepted, so try to source your information to people with lots of credibility. However, establishing credibility can be difficult when so many different authorities can be leveraged on both sides of any argument (i.e. the internet as a confirmation-bias machine), so bashing someone with arguments from respected people whom they don’t personally like will only aggravate them. Rather, ask them what types of sources they consider credible, and if possible, seek corroboration of your goal belief from someone within their own sphere of credibility.

• Empathize & Reassure

As much as possible, try to let your subject know that the goal belief has nothing to do with their worth as a person, and that changing it poses no threat to them personally. For example, build up your subject: “I know what a critical thinker you are.” Or tone down the implied challenge: “I can see how people might take both perspectives on this.” Or reassure them: “Either way, it’s ok for friends not to agree on everything.”

• Accept Failure and Stay Patient

It’s frustrating to give up an argument where you have the facts on your side. It feels like folding a winning hand. But it’s important to remember the emotional nature of truth, and to accept that when emotions start to ride high, the rules of the game shift, and you can actually lose with a winning hand: you can lose friends. So sometimes the best thing you can do is hit the pause button, let emotions subside, and then approach the subject later.

8. The Ministry of Truth

As a single individual you have influence on those around you, but it’s quite limited. You just don’t have the power to compete. However, that equation would change dramatically if, hypothetically, I gave you control of a large national TV property, such as Fox News. Now you have a 24-hour channel to support your position like a drumbeat, by bringing on experts and cherry-picking stories, over and over, day after day, week after week. That would carry significant influence and change a lot of minds. With control over the entire channel you could create your own little consensus reality, and pull a lot of people into it. And all the more so if you stoked people’s emotions and fears, incorporating themes like nationalism, national security, and xenophobia.

But it wouldn’t convince everyone. Lots of people would dismiss your network as biased and click over to CNN, or turn the TV off entirely and gulp down the New York Times. Serious journalism. So now let’s extend our imaginary exercise further: let’s imagine I was good friends with Arthur Sulzberger, publisher of the New York Times. And also John Stankey, head of Time Warner (which owns CNN), and Mark Zuckerberg, CEO of Facebook. And also the folks running Comcast, Disney, Viacom, and the small number of extremely large media companies that control every local news channel and almost every national media outlet in the United States.

If you had all those folks around a table, willing to do what you wanted, what do you think you could make people believe? What would be the extent of your ability to manipulate the masses? The answer is quite a lot. You could probably make everyone believe just about anything you wanted. And just as importantly, you could do whatever you wanted, and nobody would know about it.

Not to mention, the more outlandish your lies the better, because “in the big lie there is always a certain force of credibility.”

Can you imagine?

Ok, end of imaginary exercise, time to end your reign as the Ministry of Truth. Although to be fair, media in the United States has been consolidated into the hands of just a few companies in the modern era. That part is true. And we have been lied to by the entire national media before- for example about Weapons of Mass Destruction in Iraq. But there’s no way the entire media is operating in a coordinated manner to influence our beliefs, right? That’s simply not possible, right?

9. The Reality Based Community

In 2004, a prize-winning journalist named Ron Suskind reported a fascinating quote from a senior official in the George W. Bush administration, who is rumored to have been Karl Rove, but preferred to remain anonymous. In Ron’s words:

The aide said that guys like me were ‘in what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality…That’s not the way the world really works anymore,’ he continued. ‘We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors…and you, all of you, will be left to just study what we do’.

They create their own reality? I don’t know about you, but I find that passage deeply troubling, because it suggests that our American media and social systems have been entirely compromised by established power. Although, assuming that was true, would we have any way of knowing it or proving it? And if we were somehow able to prove it, what use would that be if you couldn’t get anyone else to believe it?

How could one person on the internet citing various blogs and news articles possibly overrule Anderson Cooper and the millions of other “highly credible” people who implicitly contradict that notion by going on TV every night and not talking about it? (Or occasionally weighing in only to laugh at the suggestion… recall the power of consensus reality- it requires no defense because it is already true.)

The answer is that if that happened, you really couldn’t do anything about it. But frankly I’m not sure it even matters whether our media has been compromised or not, because regardless of the answer, the one thing that we know is that the structure of our information system is already broken. Specifically:

• Our media has been almost entirely consolidated into the control of a handful of companies

• Within those organizations, editorial choices are made by specific individuals with no transparency into the process

• Information from sources outside the major media operators is highly limited in scope of influence

Given this structure, and regardless of your opinion on whether “freedom of the press” has already been compromised, logically speaking the current system makes it almost inevitable that full corruption either has happened or will happen at some point in the future. So the one thing we should all agree on is that our systems of information need to be reformed.

But how?

10. Change the Information, Change the Minds

The solution is just as obvious as the problem itself: decentralize and democratize our information systems.

  • The problem is that our information has become centralized, meaning a small number of people control what we see
  • The solution is decentralization, meaning allowing everyone to collectively decide what everyone else should see

Just as our forms of government have slowly evolved toward the concept of representing what everyone in a society believes, our systems of information must also follow that track, by democratizing the flow of information. Thankfully we live in an era of innovation and new capabilities driven by the digital revolution, where interactive media and decentralized concepts like Bitcoin and the internet itself are proving that decentralization is more feasible than ever before.

As consumers we must seize on this technological opportunity and strive to end passive news and information consumption. We must demand that our information platforms provide the tools for interactivity and democratization of content:

  • Crowd Curation of Content (Voting content up/down)

Don’t use apps with individuals choosing what you see!

  • Crowd-Sourced & Curated Feedback: (A Comment Section with votes)

Don’t read articles without comment sections! (It’s just one viewpoint)

  • Support Algorithms that Combat Confirmation-Bias

Don’t use apps that create tribalism; demand tools for diversity (a rough sketch of what such ranking could look like follows this list)

  • Transparency (Access to all hidden or deleted content, and community-driven rules on content management)

Don’t be fooled by platform manipulation; demand transparency!

  • Accountability (Those running an information system must be accountable to those using it)

These are our systems!
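
As promised above, here is a rough sketch of what a diversity-aware, crowd-curated ranking could look like. It is purely my own illustration, not a proposal from any existing platform; the “viewpoint” labels, the vote counts, and the scoring formula are all made-up assumptions.

```python
# Illustrative sketch: rank a feed by crowd votes plus a bonus for
# under-represented viewpoints, so the result isn't a pure echo chamber.
from collections import Counter

def rank_feed(items, diversity_weight=10.0):
    """items: list of dicts with 'title', 'viewpoint', 'upvotes', 'downvotes'."""
    viewpoint_counts = Counter(item["viewpoint"] for item in items)

    def score(item):
        vote_score = item["upvotes"] - item["downvotes"]
        # rarer viewpoints in the pool earn a larger bonus
        diversity_bonus = diversity_weight / viewpoint_counts[item["viewpoint"]]
        return vote_score + diversity_bonus

    return sorted(items, key=score, reverse=True)

feed = [
    {"title": "Story A", "viewpoint": "left",   "upvotes": 120, "downvotes": 30},
    {"title": "Story B", "viewpoint": "left",   "upvotes": 110, "downvotes": 25},
    {"title": "Story C", "viewpoint": "right",  "upvotes": 88,  "downvotes": 5},
    {"title": "Story D", "viewpoint": "center", "upvotes": 60,  "downvotes": 10},
]

for item in rank_feed(feed):
    print(item["title"])  # -> Story A, Story C, Story B, Story D
```

With these made-up numbers, Story C outranks Story B despite slightly fewer net votes, because its viewpoint is under-represented in the pool. The crowd still decides; the algorithm just resists collapsing the feed into a single tribe.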

Truth is, we are the ones who hold the power.

We just don’t know it yet.
