An article in the New Yorker[1] explains why we tend to not change our convictions, not even when faced with contrary evidence. 

On the one hand, the questions raised by the article’s writer, Elizabeth Kolbert, winner of the 2015 Pulitzer Prize, are natural ones. Faced with such an assertion, one of the first questions is “Why?” This is all the more the case when we are used to looking for causes before determining solutions. On the other hand, in the spirit of the healthy, rational approach that is the subject of this article, I must say that the New Yorker piece contains a bias that ultimately sterilizes its ability to provide a solution to the problem it presents. The bias is summarized in a key phrase that I will return to later.

At the beginning of the article, Kolbert presents two Stanford studies, the first of which belonged to an extensive series of studies showing that people hold on to impressions they have already formed, even when exposed to evidence that these impressions are unfounded. In the first study, subjects were asked to identify the genuine suicide notes in a mixture of authentic and fabricated ones.

During the first phase, the subjects were lied to about their results. One group was told that they had performed almost perfectly, the other that they had failed lamentably. In the second phase, the subjects found out that, in reality, everyone’s results had been similar, and they were asked to evaluate their own ability to discern the authentic notes from the false ones. The study concluded that the subjects who had been praised held on to the feeling that they were superior, even though the contrary evidence had by then been presented to them.

In a second study, the subjects received two biographies: one of an excellent firefighter who took no risks, and one of a lousy firefighter who also took none. They were then told that the biographies were invented and were asked to write down how an exemplary firefighter should react to a risky situation. Subjects who had been given the first biography expressed the belief that risk should be avoided; those who had been given the second believed that one should take risks. In other words, the biography each group had read gave birth to a generalization, insufficiently backed by rational argument, about a firefighter’s desirable approach to risky situations.

A third study showed that objective research data presented to subjects divided into two groups (one in favour of and one against the death penalty) were received and evaluated preferentially. Depending on the conviction they already embraced, the subjects appreciated the study that confirmed their belief and rejected the unfavourable one. After the subjects were told that the studies were fictitious, the researchers noticed that the subjects had in fact consolidated the convictions they held when they entered the experiment.

After this third study, the author formulates the key affirmation that I mentioned in the first paragraph: “If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias.”

The author goes on to say that although, in evolutionary terms, confirmation bias should have disappeared over time, its persistence today shows that reason evolved to help us defend ourselves from other people, not from ourselves. This is why we are very good at discovering the weaknesses in other people’s arguments while remaining blind to our own errors in judgement.

In the last part of the article, another study is presented, showing how much we rely on expert opinion. Subjects were asked to self-evaluate their understanding of some common mechanisms (toilets, zippers, etc.), then to explain how a toilet works, and finally to repeat the self-evaluation of their understanding of the mechanism. The study not only proved that people overestimate how much they understand, but also offered some hope: at the end of the study, the subjects lowered their self-evaluated level of understanding.

The author shares the researchers’ conclusion that people have an illusory perspective on knowledge because of their mutual collaboration. If an expert can make something we all use, we somehow get the feeling that we all know how that particular thing works. She then suggests that this explains why so many people agree with governmental policies: the less they know, the more they tend to agree. And the more people support an opinion, the stronger each individual’s opinion within the group becomes.

The article ends on a discouraging note. Although it advocates for the scientific method as the only one able to shield us from confirmation bias, the author reaffirms the difficulty of getting people to change their convictions through scientific arguments. Since appealing to individuals’ emotions in order to change them would not be a method worthy of a healthy scientific methodology, the conclusion is that the “challenge” of the presented problem remains. In other words, although we now know why we cannot rationally evaluate our own convictions—namely, because the reason that developed over the course of human evolution has fulfilled other roles, not this one—we cannot find a way to solve the problem. This conclusion follows naturally from the article’s argumentation.

There is, however, a turning point in the search for a solution to this problem that is worth analysing. Kolbert’s position is that if reason had been designed, rather than having evolved, then we should be able to discern the weaknesses of our own arguments just as we discern those of others. The logic of this statement is merely assumed, not obvious.

The way we function rationally today does not automatically tell us anything about the original state of things. What led to this conclusion were the author’s evolutionary convictions, grounded in a contemporary scientific discourse that predetermines the conclusion at the end of the article. If we got here by evolutionary means, there is not much to be done: it must be the same evolution, with its formidable slowness, that solves the case, if it manages to keep up with the speed at which the world we live in is changing.

If not, then all these traits incompatible with survival, which will keep piling up as society changes faster than the human capacity to evolve, will probably lead us, in the end, to self-destruction. When the characteristics unfavourable to survival become more numerous than those that favour it, the resulting imbalance will lead to the self-destruction of the human species. We need not worry, however. Such a process might last millions of years, and in the meantime our planet may no longer sustain life (before we become capable of colonizing other planets), or we might all become victims of a total nuclear war. The scenarios are so many and so varied that weighing them at this point is not worth the effort.

A creationist, Christian perspective is, conversely, not at all contrary to the conclusions of the studies presented in the article. On the contrary, the Bible speaks from its very beginning of a tendency toward self-justification, from Adam, Eve, Cain, and other characters of biblical history onwards. Even Adam’s decision to eat of the forbidden fruit shows that, although he clearly saw his wife’s mistake, he was not able to prevent his own; that is, he could not logically decide what was best to do. Nor could Eve logically defend her decision to listen to the snake. Not even Lucifer, the leader of the rebels, had rationally convincing arguments for his decision to abandon perfection for rebellion. All these initial decisions in the history of the emergence of what the Bible calls “sin” are irrational and inexplicable. In other words, the Bible acknowledges the mystery of the imperfection of our reason: the reason with which God endowed His creatures does not always judge soundly.

The Bible calls this “the mystery of sin”: the inexplicable way in which someone capable of perfectly healthy judgement can reach an irrational conclusion, however well argued it may seem, and then fully embrace it.

Moreover, the Bible talks about the human being’s continuous degradation after the original sin, and agrees with the article’s idea that humans have become more interested in defending themselves from others than from themselves. Egocentric humans see no reason to fight within themselves. The essence of the biblical message is that humans have even refused to be influenced by God to give up themselves and their harmful beliefs. After millennia of practicing this thought pattern, it is only natural that we are reaping its fruits today. To this we can add a lack of education, in general, and a lack of specific training when it comes to critical thinking.

We now know that critical thinking can be trained and that academic training plays an important role in this. Moreover, one can notice in the lives of many believers (for whom spirituality is not a superstition but an intelligent and intentional quest) that an authentic spiritual experience awakens the individual’s capacity for critical self-evaluation and for owning up to their mistakes before the divine absolute. This is an essential step in their transformation into authentic Christians.

What is certain is that, regardless of what someone believes about the origin of the human tendency to overlook the weaknesses of one’s own arguments, we are left with a state of affairs in which decisions and choices, with personal or public impact, are governed by irrational arguments. We see this in the way votes are distributed, in the support for certain measures or public policies, in positions taken on manufactured problems (like the supposed link between vaccination and autism), and in the appeal to the conspiracy theories that are always at hand. It also shows in the way interpersonal conflicts go unresolved for lack of clarity and of any acknowledgement of one’s own logical fallacies.

All these are widespread realities of our time. Reversing these tendencies depends on the strength of an argument to change the way individuals relate to themselves. We can become aware of the prejudices that affect us. However, practising such an attitude, which nowadays apparently produces more disadvantages than advantages, can only emerge against a background of confidence in the benefits and moral superiority of change. These can become more obvious through training ourselves in a spirit of critical thinking, but also through an intelligent religion that gives the human being a different motivation than the evolutionary one. It is preferable to incline towards the ideal for which one was created than to believe, against all odds, that we will soon be able to overcome our deterministic prejudices about our own way of thinking.

Norel Iacob is the editor-in-chief of Signs of the Times and ST Network.

[1] Elizabeth Kolbert, “Why facts don’t change our minds”, The New Yorker, 27 February 2017.