The misinformation crisis affecting older adults is real, well-documented and growing. Here’s what the studies say about how to tackle it.

At every family gathering, there comes a point when one of our parents tells an unbelievable story about a politician or a natural remedy that doctors are supposedly keeping secret. We quickly assess the situation: last time we said something, the discussion dragged on until dessert and nobody enjoyed themselves. Should we say anything under these circumstances?

Millennials have the greatest power to change how their parents relate to misinformation because their parents trust them the most. However, many millennials make incorrect assumptions about their parents’ reasons for believing fake news, which is why their efforts to help are not very successful.

Are parents of millennials less intelligent because of their age? Are they less able to recognise a lie? The answer is more uncomfortable than we might expect.

What are we up against?

One of the most important studies on the impact of online networks on the spread of misinformation was conducted in the US following the 2016 presidential election. At the time, researchers found that, around the election period, Facebook users aged over 65 shared almost seven times more misinformation than those aged under 30. On Twitter, the problem appeared to be concentrated among a small group of “super-sharers,” who were responsible for around 80% of shares, and users aged over 50 were overrepresented in that group. Nearly a decade later, a RAND survey of almost 1,000 Americans aged 55 and over found that one in three checked social media daily for political news—a constant diet of algorithmic content where engagement, not accuracy, is the ultimate value.

Medical misinformation poses a particular danger. A study published in Digital Health showed that older adults with chronic conditions are extremely active consumers of medical content on Facebook. Unfortunately, the same research shows that much of what older adults find on Facebook is simply false. Separately, an analysis of 200 articles about cancer shared on social media found that one-third contained misinformation, with false articles consistently generating more engagement than accurate ones. The algorithm rewards content that spreads easily, regardless of whether it is true.

The paradox that makes the issue truly interesting is that the inability to recognise lies is not the key to understanding this phenomenon. A meta-analysis of 31 studies, published in PNAS at the end of 2024, found that older adults demonstrate a greater ability to distinguish true news from false news than younger cohorts do under neutral evaluation conditions. They know, in theory, what a lie looks like. However, this theoretical knowledge is imperceptibly influenced by elements that have only a tangential connection to reason.

We have difficulty following what we know to be good or true when what we know to be good or true conflicts with what we want to be good or true. Contemporary research into our preference for things we like rediscovers, in empirical terms, a concept that Christian anthropology identified long ago: knowledge is not neutral. What we know is always filtered through what we love and fear losing, as well as the culture to which we adhere. This describes the human condition in general, not just the older generation. This is precisely why any solution that ignores the moral and relational dimensions of the problem will be incomplete.

Carol’s journey

In 2019, researchers at Meta (the company that owns Facebook, Instagram, and WhatsApp) created a test account. They named it “Carol Smith”. Carol was described as a conservative mother from Wilmington, North Carolina, who was interested in politics, parenting, and Christianity. She had never expressed an interest in conspiracy theories. Within two days of creating the account, Facebook’s algorithm was recommending QAnon-dedicated groups.

The researchers titled the resulting document “Carol’s Journey to QAnon”. This document was among the thousands of pages of Facebook’s internal documents shared with Congress by whistleblower Frances Haugen. The conclusion was unequivocal: the platform’s recommendation engine was systematically radicalising some of its users, not because they were seeking out radical content, but because the algorithm had learned that outrage, fear, and conspiracy narratives generated more engagement than ordinary posts. In other words, the platform’s incentive structure and the spread of misinformation were not separate phenomena. They were one and the same phenomenon.

Carol Smith isn’t a real person. However, millions of real people—many of whom are parents and grandparents—follow the same path as Carol every day on platforms that have been aware of this problem for years yet have largely refused to address it. This is the environment in which our parents live. Judging them for the content they encounter online is akin to blaming someone for drowning after being pulled under by the currents.

Of course, the algorithm did not invent the needs it exploits. The need for community, the need for meaning, and the need to know the truth are profoundly human needs which are recognised as legitimate by Christian tradition and even as traces of the image of God in humanity. Yet the platform captures these real and legitimate needs and redirects them towards profitable content. From an ethical perspective, this is an act of instrumentalising the individual, a matter that Christianity treats with the utmost seriousness, regardless of the terminology employed.

The real culprit is not cognitive decline

For years, the popular explanation for older adults’ susceptibility to misinformation was cognitive decline. A decline in analytical ability caused by ageing was said to be responsible for their greater susceptibility to emotional manipulation. This perspective was condescending and, as it turned out, false.

A more convincing explanation emerged from studies such as that by Ben Lyons, a communication professor at the University of Utah, whose research was presented at Harvard University’s Shorenstein Center in January 2026. Lyons identified congeniality bias as central: the tendency to accept information that confirms our existing beliefs and reject information that challenges them.

Older adults are no less analytical. They are simply more partisan. Their partisanship has had several decades to solidify into a mental conviction that, from the inside, is indistinguishable from knowledge.

“Older adults tend to rely more on prior knowledge, as a rule, as a general finding, to reduce cognitive load. But their prior knowledge, based on this consistently stronger partisanship, at least in the political domain, is more likely to be politically biased.”—Ben Lyons, University of Utah, Harvard Shorenstein Center, January 2026

A study of almost 2,500 adults in the United States and Brazil, published in the Journal of Experimental Psychology in 2025, found the same thing. The older the respondents were, the more partisan they became, and this partisanship distorted their assessment of evidence. This effect was observed in both countries, regardless of the respondents’ political affiliation. The research also found no evidence that older adults thought less analytically; they simply applied their analytical skills to confirm their existing beliefs.

However, partisanship isn’t the only force that distorts judgement. Another study shows that, when repeated enough, even false information begins to appear true. Older adults whose news feeds have delivered the same misinformation repeatedly for years have become saturated with false beliefs. Researchers at the Harvard Kennedy School have documented that this prior-exposure effect is particularly pronounced among older news consumers. Not only were they misled the first time; they also became resistant to corrections, because they had integrated the false information as memories.

The obvious solution doesn’t work

Faced with an unending stream of false information aimed at their parents, millennials typically respond by providing targeted corrections backed by fact-checking and insistent reaffirmation of the truth. However, this strategy often fails, since correction is essentially perceived as a threat to identity. When misinformation shared by a person confirms the beliefs of their political group, informing them that it is false is perceived as a social threat at a neurological level.

Recent research has consistently found that corrections can paradoxically reinforce false beliefs, especially when the belief is partisan. Even when a correction succeeds in changing a person’s stated belief, they continue to think using the data from the initial misinformation. The brain does not fully update itself; it does not close the old file.

Christian tradition has recognised this irony for much longer than cognitive psychology has. However justified, confrontation rarely brings about change; instead, it tends to put the other person on the defensive. Thinking is not transformed by winning an argument or prevailing in a debate, but by cultivating relationships and trust. Galatians 6:1 articulates a principle that is as relevant today as it was in the first century when the letter was written: Whoever restores a fellow believer must do so with a gentle spirit, because the way you speak the truth is an integral part of the truth.

The current scale of misinformation demonstrates why this call for gentleness is a moral imperative with implications in areas that are completely unexpected, such as international security. In 2024, the World Economic Forum ranked misinformation as one of the most serious global risks. There are too few fact-checking organisations, and an MIT study showed that false information spreads six times faster than true information on social media. Such an imbalance cannot be offset by isolated corrections.

A 2024 RAND survey explains why. Older adults distrust the press, health authorities, and academia—precisely the institutions most likely to provide an accurate assessment of reality. This suspicion has been systematically cultivated by political actors and media ecosystems with a financial stake in perpetuating paranoia. Under these circumstances, institutional corrections are often not perceived as clarifications, but as further evidence of a conspiracy.

This is why the only people who can break through this wall of distrust are often those within personal relationships: children and grandchildren who have earned credibility through closeness and love that no fact-checking platform can match.

What millennials can really do

Interventions backed by real empirical evidence have one thing in common: they all focus on building a relationship rather than winning an ideological battle.

Start with curiosity, never with corrections!

The most consistent lesson emerging from research on digital literacy across generations is that curious questioning works better than confrontation. Rather than telling your parents they’re wrong, help them form a new habit: asking a question before hitting the share button. It may seem a small difference, but the habit builds into a practice whose effects far outlast any single conversation.

When your father reads a suspicious headline aloud, try not to say, “That’s not true.” Instead, ask, “Where do you think they got that number from?” or “I wonder who runs that website.” These types of questions, which do not attack the statement, are most helpful in shaping a thinking habit that, repeated often enough, will become second nature.

Focus on the principle!

One of the most significant findings in misinformation research is that psychological inoculation (building immunity in advance) is far more effective than correcting falsehoods after the fact. Researchers at Cambridge have developed a concept called “prebunking”: exposing people to manipulation mechanisms before they encounter the actual misinformation. This approach works because, although misinformation can cover countless topics, it almost always uses the same few tools: creating a sense of urgency, forcing false choices, stirring up emotions, mimicking consensus, and pointing to a convenient scapegoat. When you learn to recognise these patterns, you become harder to manipulate, regardless of the subject.

The model has already been tested on a large scale. Jigsaw, the Google unit researching ways to combat misinformation, distributed prebunking materials to hundreds of millions of users via YouTube ads. A randomised field experiment published in 2022 showed that a single exposure increased participants’ ability to identify manipulative content. The effect was noticeable across different languages, cultures, and political contexts.

In conversations with parents, for example, it is more useful to talk about the mechanism than to get bogged down in an argument about vaccines, elections, or whatever conspiracy theory is doing the rounds. It’s better to show how the text works than to just explain why it’s wrong. For example: “Have you noticed that articles like this always create the feeling that imminent danger is approaching and that you have to react immediately? This is a manipulation technique. The sense of urgency is intentionally created to cut off the reflex to stop and check.”

This is, in fact, the point. You’re not asking your parents to change their political beliefs on the spot. You’re giving them a tool for guidance. It’s a mental filter they can use next time, when you’re no longer there.

Become the most trusted source!

A study by Ofcom, the UK communications regulator, documented something that researchers in the field had long suspected: older adults considered information about COVID-19 to be significantly more credible when received via a family WhatsApp group than when received from public health authorities. For most people, family is the most trusted source of information. This is, of course, why misinformation spreads so easily through these channels—a fake news story sent by a family member carries the full weight of your relationship with them.

Millennials therefore have an advantage that no fact-checking organisation can replicate. Research on intergenerational digital literacy programmes has shown that over 80% of older adults who received digital guidance from a family member felt more comfortable using technology. Additionally, around 70% also felt that discussing the media brought them closer together.

Millennials who succeed best at this have built a track record over time of sharing interesting and accurate information, so that when they gently challenge something, their parents already trust them. Credibility must come before correction.

Use the power of pausing to think!

One of the most counterintuitive findings in misinformation research is that people don’t necessarily share fake news because they believe it; rather, they share it without even thinking about whether it’s true.

Social media platforms are designed for speed and emotional response, and the deliberate evaluation of accuracy is a costly mental process that platforms actively discourage. A study by Gordon Pennycook and David Rand at MIT showed that simply asking people to consider accuracy before sharing, rather than after, reduced the spread of misinformation by up to 51%. Without any confrontation, all it took was a pause.

Millennials can provide that pause for their parents. In the family chat, fact-check before forwarding messages. A message saying “hold on, let me check” isn’t perceived as a correction; it’s an invitation to share the habit. Over time, they’ll make it their own.

Address loneliness, not just the algorithm!

Research consistently shows that social isolation is one of the strongest amplifiers of vulnerability to misinformation. Older adults who live alone or are estranged from their families are much more likely to engage with and share emotional content, not because they are convinced by it, but because sharing is a form of social participation. For them, the Facebook feed is not merely an informational medium; in many cases, it is their primary community.

The Christian perspective adds a dimension that sociology alone cannot capture: loneliness is also a spiritual problem. It is a sign that the natural structures of community, such as the extended family, neighbourhood and faith community, have eroded. This transforms legitimate needs for fellowship into a marketplace of attention that is easily exploited by algorithms.

A study at the intersection of social media and chronic illness found that older adults with chronic health conditions were the heaviest consumers of medical content on Facebook, and were the most likely to share misinformation related to their illness. This behaviour stems less from a thirst for information and more from anxiety and the need for connection. A phone call, a visit, or a shared activity would be more effective interventions than trying to correct every incorrect medical claim. Fighting misinformation is a never-ending battle. Only when we reduce the conditions that make misinformation seem like a necessity will we stand a chance in this fight.

The trap of condescension

Condescension undermines any intergenerational media literacy conversation. Millennials who cast themselves as rational adults enlightening their gullible parents will fail, because they have misunderstood the relationship. This is perhaps the most important takeaway from the entire article.

Older adults are not digitally illiterate. In fact, the PNAS meta-analysis shows that they are often better at identifying misinformation than younger adults under neutral conditions. They are vulnerable in specific ways: they are more partisan, have a stronger emotional investment in their political identity, and have been exposed to false beliefs for longer, which have now cemented themselves as memories. None of this is a problem of intelligence. Rather, they are all consequences of a life lived in an informational environment specifically designed to exploit the human need for community and meaning.

Van Boven offers one final recommendation that’s easy to overlook: Stop unfriending people you disagree with politically! Having diverse social networks is a structural defence against information bubbles. Research consistently shows that exposure to different perspectives fosters moderation. This isn’t just a lesson for older adults, though. Millennials also create and maintain their own bubbles, which is why they should approach the subject with a bit more intellectual humility.

It’s not your mum’s fault

Taking a step back, the problem of misinformation is far too vast and deeply rooted to be solved on a family-by-family basis. Platform accountability, algorithmic transparency, regulation, and sustained investment in media literacy all play essential roles. However, none of these are happening quickly enough.

Institutional responses struggle to reach people. Yet, the most direct path to an older adult’s information habits runs through the people they love. An adult child who models healthy information habits for their parents will have a greater impact than any algorithmic intervention Facebook has ever built because the relationship isn’t peripheral to the solution—the relationship is the solution.

And Mum? She isn’t the problem. She’s just trying to navigate a system that some of the most talented engineers in history designed to lead her astray. She needs a guide. She’s known you your whole life. Start with patience! Ask more than you tell! Then come back next week as well!