I love ideas. I always have. But I often wonder: is it worth spending a significant amount of time arguing for ideas that I consider to be true and important? After all, what is the point? Can ideas really change the world? Most people do not seem to be interested in ideas, and even fewer seem open to changing their minds.

I have spent a lot of time in my life trying to engage with other people’s ideas. With decades of experience, I now have to conclude that the return on this investment is very poor. It is very difficult to change people’s minds as long as they like an idea and have an investment in it.

As a consequence, my enthusiasm for arguing with people has greatly diminished. There are only two reasons why I continue engaging with ideas at all. While I have not changed the minds of many people, engaging with ideas has at least improved one mind, which is my own. I like to believe that the constant exploration of theories has given me a better understanding of the world, which ultimately improves my life. The other reason I continue to argue is that I simply enjoy testing ideas. It is a geeky passion of mine, even though it is one that not many people seem to share.

However, it can be frustrating to discover over and over again that most people have no interest in correcting false beliefs they hold. Minds change only under rare circumstances.

The good news is that, in principle, the human mind is very capable of being rational. When I say rational, I mean that we are not really capable of holding two contradictory viewpoints at the same time. Contradictions always have to be resolved one way or another. Humans are therefore not free to simply believe whatever they want.

That means that if I can get someone to listen to an argument, and he understands the argument, then there is a chance that he will eventually change his mind. It is not guaranteed, but there is a chance.

The reason it is not guaranteed is that humans are also capable of lying to themselves. We can deceive ourselves that what we want to believe really is the truth, and that the critics are wrong. Still, being rational is a fundamental feature of our mind. Accepting a lie, therefore, does not come easy. It takes a lot of effort to make it work.

That is true for every lie, not just the ones we tell ourselves. For example, if I wanted to come up with a believable excuse for being late, not every deception would work equally well. If I said, “Sorry I am late, but shortly before I wanted to leave, I got a phone call that I had won an Oscar for a cat video I posted on Facebook, and I had to quickly pick it up before I came”, no one would believe it. It simply makes no sense.

A more believable lie would be: “Sorry I am late, but just before I wanted to leave I got a phone call from my mother that my dad had to go to hospital, and I had to discuss the situation with her.” This lie works because it seems entirely possible, and it would be mean to lie about such a thing.

Both excuses can be equally false; they can both be lies. However, the second is an effective deception, while the first would never work. A lie needs to be a believable story. And it is only believable if it has some anchor in the real world.

Because a lie needs to be anchored in the real world, one could be tempted to conclude that humans are entirely rational, constantly testing whether their ideas conform with reality.

But that would be a false conclusion. As already discussed, we are certainly capable of being rational, no doubt. That, however, does not mean that whenever we come across something that contradicts a theory we consider to be true, we will automatically change our minds. The human mind has developed a number of tools that allow it to work around such contradictions. These deceptions are the reason why it is so difficult to change people’s minds.

Let us go through some of the tricks that allow us to believe what we want to believe, and to keep our minds unchanged if we so wish.

One very simple trick is to create a counter-fact to a fact that is inconvenient to our beliefs. An easy way to do that is to have a group of people disagreeing with the unwanted data point. Humans are not just rational; we are also herd animals. We have an inherent assumption that our herd cannot be wrong. This is not in itself irrational, as the herd is probably right more often than it is wrong. However, the herd can be wrong.

Let’s say we have two things: a theory, and an observable fact in the real world that contradicts that theory. In addition, we have another fact: the group we trust continues to believe that the theory is correct, despite its contradiction with the observable fact. Now we have two facts against which we can test our theory. One is an observable fact that contradicts it; the other is the fact that the group thinks the theory is correct.

Suddenly, we have a reasonable choice. We can now decide which of these facts is more important. We have created the possibility of neutralising the inconvenient fact. It might seem strange to some iconoclastic people, but the vast majority tends to find the herd more believable than even indisputable simple facts. This has been shown in experiments.

In the 1950s, Solomon Asch performed an experiment in which a person was seated in a room with a group of other people. All the others were part of the experiment, a fact of which the sole real participant was not aware. The group was shown two pictures. One had a line of a certain length drawn on it. The other had three lines, A, B and C, all of different lengths. One of the lines, line C, was clearly visibly the same length as the line in the first picture.

All members of the group were then asked to decide which line in the second picture had the same length as the one in the first picture. While the answer was clearly line C, all the members of the group who were in on the experiment pretended that it was B. The goal was to test whether the real participant would go with the group’s answer or with the judgement of his own eyes. The result was that the vast majority of participants ended up agreeing with the group instead of their own eyes.

This, remarkably, shows that most people go with the herd even when it is very obvious that the herd is wrong. What is more, the theory at stake here is not even an attractive one. It is a very dull claim that could easily be rejected. And yet, most people still preferred the herd to their own eyes.

Some people might object, saying that the participants probably did not really believe that B was the answer; they just went along so as not to be punished by the group. This, however, misses the point. Of course their eyes told them that C was the correct answer. But at the same time, they also clearly believed that all the people in the room could not have made a collective mistake like this. There are, therefore, two easily understood but contradictory facts here. Consequently, there is no way out of this situation without some confusion. But we can choose which confusion we find more convincing. Confronted with such a choice, most people assume the herd must be correct.

Again, it is not necessarily irrational to prefer the herd over an observable fact. After all, it is entirely possible that we are missing something. Following the herd seems to have been a good survival strategy; otherwise we would not have this inherent bias. However, it shows that simply being rational does not necessarily lead us to the truth. The fact that people are rational is not enough to convince them of new, better ideas. On the contrary, rationality can be an effective tool to lead us astray.

If an idea is attractive, it is not difficult to create a group of people who will support each other in the belief that the idea is correct. The fact that this group approves of the theory then becomes an effective counter-argument for neglecting other facts that seem to contradict it. While in the experiment participants ended up in a group whose opinion seemed strange to them, in the real world we can actively decide to join groups that hold opinions we like. By choosing to join such groups, we can effectively exercise some control over what we believe. We have used our capacity for rational thinking to deceive ourselves.

Be honest with yourself. Just like me, you have probably often tried to seek out “like-minded” people. It feels good to find a herd that supports one’s favourite beliefs. Why do we have this instinctive bias? Wouldn’t it be a better strategy to actively surround ourselves with people we disagree with, just to test whether our beliefs are correct? Yes, it probably would, and some of us might even have done this. But we do not do it instinctively. It is a lot more fun to find like-minded people.

In other words, most people, if not all, choose to create echo chambers. Shaping our environment so that we do not come into contact with inconvenient facts is a powerful strategy for deceiving ourselves.

However, this is not the only trick we have for deceiving ourselves. Obviously, having to ignore contradictory facts can still leave a bad taste for many people. Yes, the group gives us the warm comfort that our beloved theory is correct, but it would be even nicer if those contradictory facts were not there at all. There are a number of other ways to attack the disturbing facts.

The simplest one is not to listen to them. Most people do not seek out the views of those they disagree with. Instead, they tend to filter their information sources according to the ideas they like. That is why we have so many different media outlets. People prefer the ones that promote their world view. Anyone trying to run a business selling information knows that the best, and really only, way to do so is to tell people what they want to hear. People do not pay for critical information; they pay to be confirmed in their beliefs.

If, for whatever reason, we do come across someone making an argument we don’t like, we have other tools available to save our preferred deceptions. One effective tool is to devalue the argument by devaluing the person making it. This person can be portrayed as stupid, disingenuous or even evil. As a consequence, even though we cannot immediately see the flaw in the argument, we can justify to ourselves that the argument has to be false because it was made by a bad person.

If that is not enough, the devaluation of the person can lead to a more effective avoidance of the argument in the future. The person making the argument can be prevented, in various ways, from making it again. Censorship comes in various forms. A critic can be denied a platform, can be punished to deter him and others from repeating the argument, or, in the most extreme case, can be killed to make certain that he is silenced.

Most people do not seem to see anything wrong with silencing critics. Their rational thinking does not seem to go far enough to conclude that, instead of silencing someone, they should perhaps ask themselves whether they can learn something from the critic. After all, if they cannot, then why are they scared of a nutter?

Silencing critics is a constant phenomenon throughout human history, in all known societies. Societies that have made it a principle to leave critics alone are the absolute exception. Those, if I may add, have been the more successful societies. However, even the most liberal societies have routinely prevented critics from finding a platform.

It has shocked me to see that the liberal democracies, which all have freedom of speech as a legal principle, and which I thought had reached a cultural consensus of allowing everyone to speak, have in a very short time moved towards embracing censorship again. This just goes to show how much most people prefer an attractive lie to an unattractive truth, and how happy they are to accept a deception. Our tools are very good at keeping the attractive lie alive. Some deceptions, like many religious teachings, have survived for millennia that way.

Does this mean that ideas cannot change the world? No, not at all. It is a historical fact that the dominant ideas of any era have changed constantly. This change clearly came through some people arguing for new ideas. Although, one could also argue that the change we have seen is just a change in which deception we use to justify very similar false ideas. It is the whole “history rhymes” phenomenon. For example, no matter how often socialism fails, we do not seem able to kill this totally false idea. However, the form in which socialism is justified has changed many times. The most current form is climate change, and it won’t be the last.

Still, since I am attracted to ideas, I like to think that some progress in ideas has been made. Of course, it could very well be that I like this idea so much that I have deceived myself into believing a falsehood. I do, after all, tend to seek out people who are interested in ideas.

To be fair, I have seen ordinary people change their minds over time in response to my arguments. It is rare, and it usually takes many years, but it has happened. In my experience, however, people need to be motivated to change their minds. The people who changed their minds were either part of my herd in some form or actively interested in ideas. There is no point in going into a church (a church here meaning any group of believers in a theory) and confronting them with uncomfortable facts. You won’t change anyone’s mind and, for the reasons stated above, you might even get yourself into trouble.

Surprisingly, the most dogmatic church of all these days is probably academia. These people are often smart and have therefore created very sophisticated deceptions that they wholeheartedly believe in. They have built a system in which every critic is removed and in which their whole livelihood depends on their being right about their theories. But they have done this in a way that lets them still pretend to be at the forefront of truth-seeking. In addition, they can easily dismiss any critic who is not part of their profession as not worth listening to. Having mastered all the tools of deception, it is in academia that we find people truly believing in theories so detached from the real world that it boggles the mind. Don’t waste your time with these people.

Instead, let people come to you. Ignoring reality has consequences. A lot of deceptions eventually fail and lead to problems that are too large to ignore. When that happens, people become open to arguments and seek out critics. Then, and only then, does one have a chance of convincing them. Unfortunately, even then a lot of people simply seek out a different deception for the ideas they like. That is why socialism remains popular no matter how often it fails and what form it takes.

There is also a small group of people who are genuinely interested in truth and ideas. These people are willing to change their minds. Over long periods of time, this will change ideas, and can change them for the better. However, since most people believe what is attractive rather than what is true, changing the world for the better with ideas is a very slow process with very little return on investment. One has to really love doing it to keep on doing it.


Follow Liberating Thoughts on Substack to not miss out on updates: https://liberating.substack.com/