Plato may have been one of the first to think this way, but in the modern study of public opinion it was Walter Lippmann who made history with the idea that people do not have access to reality in all its complexity, but operate on images of that reality that they construct for themselves.

In all of human history, it is hard to find a time when this truth has seemed more evident than in the digital age of the internet and social media. There is much talk these days about the ‘bubbles’ created on Facebook, but in reality the technology has only exacerbated the ‘bubbles’ we were already living in, unawares, in our own minds.

Around 50 million people worldwide suffer from epilepsy, one of the most common neurological disorders today. Before an epileptic seizure, which is caused by uncontrolled electrical activity of nerve cells in a particular part of the brain, people with epilepsy can have strange experiences: some hear music or voices, others smell burning rubber, others see flashes of light. These warning experiences are known collectively as an aura. The neurosurgeon Wilder Penfield, who pioneered the technique of electrically stimulating the brain during surgery on awake patients in order to map the function of particular brain areas, observed that electrical stimulation produced in his patients the same phenomena associated with epileptic seizures. His observations, and those of many others, show that neural activity alone can create false experiences that we are convinced are part of reality.[1]

But even when all our senses are intact and our brains are functioning within normal parameters, we still don’t have direct access to reality in all its complexity, even though, almost daily, we put our faith in our supposedly superior ability to analyse and memorise reality exactly as it is.

The illusion of seeing everything

One of the illusions we live with is that we perceive the world around us instantly and effortlessly. In fact, by measuring perception time (the interval between the moment a stimulus reaches our senses and the moment we become aware of it), researchers have found that it is longer than we would expect: the processing that the brain performs on information from the senses takes about 100 milliseconds before the representation of an object appears in our mind.[2] We also believe that when we look at something, we perceive everything in our field of vision in full detail (unless we are visually impaired). In reality, we see in detail and in colour only what falls on the centre of the retina, the fovea centralis, where the sharpest visual image is formed. Beyond about ten degrees from the centre we distinguish little more than light and shadow, but because our eyes are constantly moving, we never notice any patches lacking colour or detail.

Apart from these biological specifics, there are other reasons, with much more serious implications, why we do not perceive everything we look at, even though we are sure that we do. Christopher Chabris and Daniel Simons conducted an experiment whose video has since gone viral. The researchers filmed a short clip of two teams of students, one dressed in white and the other in black, each passing a basketball among themselves, and then asked volunteers to count the passes made by the team in white. The final question, “Did you notice the gorilla?”, surprised many. Busy counting passes, they had not noticed that at one point someone in a gorilla suit walks into the scene, stops in the middle, looks at the camera, thumps its chest and exits on the other side, spending about nine seconds on screen in total.

The experiment has been repeated under different conditions and with different audiences, across countries, social classes and IQ levels, and the results are always the same: roughly half of the people do not see the gorilla. Even more strikingly, researcher Daniel Memmert of the University of Heidelberg used eye-tracking software to show that those who said they did not see the gorilla had looked straight at it for a full second, exactly as long as those who said they did see it.[3]

Those who pass the test tend to see this as confirmation of superior ability or intelligence, but the conclusion does not hold. There are other explanations for the phenomenon. The brain works with two processing systems that are constantly operating together. Behavioural psychologists give them different names, but basically they are automatic processes and controlled processes. Most of what we do is automatic, and most of the time we are not aware of it. Controlled actions are those that require thought, take place in stages and are at the centre of awareness. Planning to fly to a holiday destination involves conscious thought, but almost everything we do along the way is automatic.

Social psychologist Jonathan Haidt explains in The Happiness Hypothesis that while automatic processes run in parallel and are very good at handling many tasks at once, controlled processing is severely limited: we can consciously think about only one thing at a time,[4] such as counting passes between basketball players. This does not mean that the automatic processes fail to register the gorilla; it means that what they register is never brought into consciousness, except in some people.

The gorilla study dramatically illustrates the powerful influence of the illusion of attention: nobody is prepared to believe that they could miss something important simply because they were paying attention to something else.


In the late 1990s, Boston police officer Kenny Conley was sentenced to almost three years in prison after a jury refused to believe that he had failed to stop and help a police colleague who was being beaten. Conley had been chasing a suspect when he ran right past his colleague. He maintained to the end that although he passed his colleague, he simply did not see him, but, predictably, no one believed him. After the trial, one of the jurors admitted: “It was hard for me to believe that with all the chaos, he didn’t see something.”[5] Conley himself had told the jury that he should have seen his colleague and could not explain the situation.

Later, a researcher interested in the case asked Conley to take the gorilla test. Conley counted the passes accurately and also saw the gorilla, which in no way helped his case. The researcher was surprised by Conley’s reaction: even though he had been convicted because others could not understand how he could run past a colleague without noticing him, Conley himself could not understand how the other test participants could fail to see something so obvious to him. This showed the researchers just how visceral, and how dangerous, delusions about our own powers of perception and understanding can be.

Chabris and Simons point out that this kind of inattentional blindness has more potential negative consequences today, not because we have more limitations than our ancestors, but because we are exposed to a much more complex world in which many more things demand our immediate attention. This makes us far more prone to making mistakes. Yet, paradoxically, because we are exposed to so much more information, we are more confident that we are judging more correctly than ever before.

The illusion of correct memory

If the illusion described above leads us to believe that we have accurately perceived reality even when we haven’t, the illusion that we are accurately remembering what we have seen only exacerbates the problem. In another study by Chabris and Simons, 47% of respondents were confident that once they had witnessed an event and formed a memory of it, it would not change. An even higher percentage, 63%, believed that “human memory works like a video camera, accurately recording what we see and hear so that we can go back and inspect them later.”[6]

The reality is different. When we perceive something, our brain does not record a faithful and perfect copy in detail, like a photograph, but extracts meanings from what we see, smell or hear and associates them with things we already know. These associations, made by the brain’s automatic processes, help us to identify what is most important to remember and also act as signals that make it easier to access memories when we need to recall something. But the same associations can lead us to the wrong conclusions, especially if we have an inflated view of our own memory’s accuracy, the researchers warn.

What do we select from the world we come into contact with, and what do we remember? “Our canons determine greatly what we shall perceive and how,”[7] wrote the American journalist and political thinker Walter Lippmann, and most psychologists today tend to agree. The process of memory formation is one in which we make sense of what we see and experience, and this interpretation determines both what we remember and the emotional charge attached to those memories, a charge that can itself change as we change over time.

In a dramatic demonstration of this principle, psychologists William Brewer and James Treyens conducted a simple experiment. Subjects were led into a waiting room and asked to wait briefly until the test room was free. Once in the test room, they were asked to write down a list of the things they had noticed in the waiting room, which looked like a perfectly ordinary waiting room. All the subjects listed ordinary things, such as a desk, chairs and shelves. But 40% of them also said they saw books or filing cabinets, and the room was unusual precisely because it contained no books and no filing cabinets. This shows that in the process of memory formation an image was built from both reality and expectation: from what one would expect to find in such a room.[8] If we add personal experience to the picture, we can safely conclude that memories change with us over time. Each time we revisit one, the memory is reassembled like a jigsaw puzzle that may gain new pieces or lose old ones, according to our expectations of what the memory should look like, expectations that may themselves change over time, and of which we are usually unaware.


According to psychologist and behavioural economist Daniel Kahneman,[9] the mind constructs a narrative about the past in an effort to make sense of events, and when something unexpected happens, the worldview is immediately adjusted to accommodate the surprise. Learning from surprises is normal and desirable, but it can have dangerous consequences if we fail to recognise how limited the human mind is at reconstructing a previous state of knowledge, of opinions and beliefs that have since changed. Once we adopt a new worldview, we immediately lose much of our ability to remember what we previously believed and the reasoning behind it, yet we are not aware of this loss, Kahneman explains.

As a result, we tend to underestimate how surprised we were by past events. We may even convince ourselves that we knew about them beforehand. Baruch Fischhoff was one of the first psychologists to demonstrate this hindsight bias, in the 1970s. Before President Richard Nixon’s visits to China and the Soviet Union in 1972, Fischhoff conducted a survey in which respondents had to estimate the probability of 15 possible diplomatic outcomes of these state visits. After Nixon’s return, Fischhoff asked the respondents to recall the probabilities they had assigned to each event. The results were revealing: for the events that had actually happened, people exaggerated the probabilities they had initially given, while lowering them for the events that had not occurred.[10]

Another direct consequence of hindsight bias is that actions that seemed prudent before a disaster can appear irresponsibly negligent in hindsight. The worse the aftermath, the stronger the hindsight bias, warns Kahneman.[11] And while these tendencies have the positive effect of encouraging risk aversion, they also hand undeserved rewards to those who took reckless risks and got away with them. Lucky leaders are never punished for having taken a big risk; instead, they are assumed to have had the flair to anticipate success, while the sensible people who doubted them look, in retrospect, timid and weak. But the most insidious effect of hindsight bias, Kahneman says, is not that some people get credit for foresight they did not have; it is that it imposes the idea that the world is more intelligible than it really is. “The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do,”[12] the psychologist concludes.

The illusion of correct interpretation

Walter Lippmann formulated his theory of the formation of public opinion, and of the difference between the world outside and the pictures in our heads, long before any of these discoveries in psychology and behavioural economics. As early as the 1920s, he postulated that people respond not to facts as they really are, but to facts as they perceive them to be, resulting in a mismatch between actual reality and the pseudo-environment: the reflection of that reality that acts as an interface between a person and their physical environment, and of which they are unaware.[13]

First of all, we are not equipped to grasp the whole of our environment in a unified vision, with all the nuances that an accurate representation would require. In short, we are limited, and there is nothing demeaning in admitting it. Secondly, people rarely have access to unmediated reality: more often than not, information reaches them through at least one filter of interpretation, and they apply further filters of their own before passing it on. And the list of misinterpretation tendencies identified so far is overwhelming.

Most significantly, each of us relates to a situation in the spirit of the culture and time we belong to, through the mental structures that one cultural space or another has formed in us. Stereotypes prefigure our response, acting as a filter that selects information before it is even analysed.[14] Over time, reliance on stereotypes as absolute truths has shown its limitations and negative consequences, and in our globalised culture ‘stereotype’ has become a bad word. What we need to understand is that stereotypes cannot disappear completely. The brain’s automatic processes work by grouping things into categories: that is how we think about horses, dogs, refrigerators and vacuum cleaners, by storing in memory one or more representations of “normal” members of each category, explains Kahneman.[15] It is only when these categories are formed in the social domain that they are called stereotypes. And while some are wrong, hostile and in need of correction, we cannot avoid stereotypical thinking altogether. Being aware of it, however, can help us avoid its pitfalls.

Another age-old tendency is exaggeration: a positive story is glorified, a negative one takes on elements of horror, and so on. In the online world, or rather in the ‘attention economy’, where everyone fights to hold the attention of as many people as possible for as long as possible (to go viral), this tendency is itself taken to extremes. It goes hand in hand with the tendency to generalise: to believe that the latest information we have been exposed to describes phenomena that are more common than they really are. At the same time, we tend to believe information that sounds familiar over new information, even when the familiar information is not true. The fact that we recognise it in some small corner of our minds leads us to drop the filters we would normally apply to new information (much as customs officers check people they know less carefully than people they do not). That is why one of the most effective and most popular persuasion strategies in marketing and politics is the frequent repetition of simplistic messages.

Speaking of politics, you may have noticed that if you like the politics of a particular leader, you tend to like the ‘whole package’ of that leader, including the voice and the physical presence. This is the halo effect, and it applies to people as well as to situations. It is a kind of emotional generalisation: if we go to a party and feel uncomfortable, we say the whole thing was a failure; if we dislike a person for no particular reason, we call them obnoxious. Contradictions disrupt the clarity of thought and feeling, so the automatic processes keep thinking simple and coherent by exaggerating the consistency of our evaluations, on the pattern: good people do only good things, and bad people do only bad things. We therefore tend to avoid contradictions. The statement “Hitler loved dogs and little children” is shocking no matter how many times we hear it, because it violates the expectation, imposed by the halo effect, that there can be no trace of goodness in an evil person, Kahneman explains.[16] So we tend to dismiss it as a lie, even when the evidence says it is true.

Last but not least, we are guilty of the tendency to always want to be right. Contrary to the scientific rule of testing hypotheses by trying to refute them, people naturally look for data that confirms what they already believe. It is part of the strategy of thinking fast and keeping a clear head. Social psychologist Jonathan Haidt explains in The Righteous Mind that even our moral judgements are often the product of automatic processes of which we are unaware: we tend to deliver the verdict first and only then look for a rational explanation to fit it.[17]

Theoretically, one might think that in this day and age, with so much access to information of all kinds, we would have escaped this habit to some extent, but this is far from the case. Analyses of book sales on amazon.com show that right-wing readers buy almost exclusively right-wing books, and left-wing readers left-wing books, so each side gets its information from a single camp and the camps stay divided. Social media algorithms, whether deliberately or not, reinforce this tendency, grouping us into information ‘bubbles’ where we are shown what matches our existing interests, as the sketch below illustrates. Applied to over two billion people, this has led to an unprecedented polarisation of public opinion around the world. As Walter Lippmann said almost 100 years ago, we live in the same world, but we think and feel in different worlds.[18]
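To see how such a ‘bubble’ can form even from a mild preference, here is a minimal, hypothetical Python sketch of the feedback loop described above. It is not any real platform’s algorithm; the feed size, the exploration rate and the reader’s 60/40 preference are invented for illustration. A feed that serves items in proportion to past clicks, combined with a reader who can only click what the feed contains, quickly turns a mild leaning into an almost uniform stream:

```python
import random

random.seed(42)  # reproducible run

SIDES = ("left", "right")

def serve_feed(history, k=10, explore=0.05):
    """Serve k items. Each slot matches the reader's inferred side with a
    probability proportional to past clicks (a little exploration keeps
    the other side from vanishing entirely)."""
    if not history:
        return [random.choice(SIDES) for _ in range(k)]
    p_right = history.count("right") / len(history)
    p_right = min(max(p_right, explore), 1 - explore)
    return ["right" if random.random() < p_right else "left" for _ in range(k)]

def click(feed, true_bias=0.6):
    """The reader only mildly prefers 'right' items (60/40), but can click
    only what the feed actually contains."""
    preferred = "right" if random.random() < true_bias else "left"
    return preferred if preferred in feed else random.choice(feed)

history = []
for day in range(30):
    feed = serve_feed(history)
    history.append(click(feed))
    if (day + 1) % 10 == 0:
        share = feed.count("right") / len(feed)
        print(f"day {day + 1:2d}: feed is {share:.0%} right-leaning")
```

The feed composition constrains the clicks, and the clicks retrain the feed; within a few dozen rounds the reader sees almost nothing from the other side. It is the ‘bubble’ in miniature, produced by a perfectly neutral-looking rule.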

Jumping to conclusions

We all believe that we are capable of seeing correctly what is before our eyes, of remembering past events accurately, of understanding the limits of our knowledge, of correctly determining cause and effect. And, by and large, we are. The human body has miraculous internal mechanics that keep us alive, help us build real and virtual communities, and help us improve our lives and advance technologically. But it does this not because of the biases and biological limitations described above, but in spite of them. And it does not do so in a linear and steady way, but with a cyclical return of human catastrophes that we never quite seem to escape.

Even conventional economics is built on the assumption that we are rational beings who, in everyday life, can correctly evaluate all our options and make the best decision. In reality, we need constant reminders not to judge people by appearances, to put money aside, to eat properly, and so on: things that keep us from making mistakes on autopilot. Even when we realise that something is wrong, we are extremely resistant to change. The illusions discussed above affect us every time we talk on the phone while driving, convinced that we are still paying attention to the road; every time we suspect someone of lying because they cannot remember something; every time we choose the most confident and charismatic person as our leader. They shape our daily lives so thoroughly that our patterns of irrational behaviour no longer look accidental but downright systematic, repeated over and over again. We are irrational in a very predictable way, concludes psychologist and behavioural economist Dan Ariely.[19] Indeed, the assumption that people are rational, when in fact they are predictably irrational, lay at the root of the 2008 financial crash, in which Wall Street banks got rich on the backs of people who could not resist taking out bad loans.

Life is increasingly complex, with more and more forces acting on us simultaneously, and we are more and more caught up in our own narratives and less and less able to understand each other and come to a shared understanding of the reality in which we live. It is hard to escape the gut feeling that the information available to us, which confirms what we already believe, is all there is to know. With this information we build our best worldview, and if it’s a good enough story, we’ll believe it.

It is difficult, but not impossible, to question the intuitive belief that we know ourselves very well, that we know how our minds work, that we know why we do what we do, that we know how those around us think and work, and that we have learned from the past and are ready for the future. If history proves anything, it proves the opposite. Whether we look at it as consumers, business people, political leaders, or as parents, siblings, spouses, and friends, understanding how predictably irrational we are gives us a starting point for improving our judgement and decision-making and changing our lives for the better. For all of us.

Eliza Vladescu is a communications specialist and was previously part of the ST Network permanent team. She currently works as an online communications consultant.

Footnotes
[1] Chris Frith, ‘Making up the Mind: How the Brain Creates our Mental World’, Blackwell Publishing, 2007, online edition.
[2] Ibid.
[3] Christopher Chabris and Daniel Simons, ‘The Invisible Gorilla, and Other Ways our Intuitions Deceive Us’, Crown, New York, 2010, online edition.
[4] Jonathan Haidt, ‘The Happiness Hypothesis’, Basic Books, 2006.
[5] Christopher Chabris and Daniel Simons, op. cit.
[6] Ibid.
[7] Paul Dobrescu, Alina Bârgăoanu, ‘Mass media și societatea’ (Mass Media and Society), comunicare.ro, Bucharest, 2003, p. 43.
[8] Christopher Chabris and Daniel Simons, op. cit.
[9] Daniel Kahneman, ‘Thinking, Fast and Slow’, Farrar, Straus and Giroux, New York, 2011, online edition.
[10] Ibid.
[11] Ibid.
[12] Ibid.
[13] Paul Dobrescu, Alina Bârgăoanu, op. cit., p. 41.
[14] Ibid., p. 43.
[15] Daniel Kahneman, op. cit.
[16] Ibid.
[17] Jonathan Haidt, ‘The Righteous Mind: Why Good People Are Divided by Politics and Religion’, Vintage, 2013.
[18] Paul Dobrescu, Alina Bârgăoanu, op. cit., p. 44.
[19] Dan Ariely, ‘Predictably Irrational’, Harper Collins, 2009, online edition.
