“Do not fear death so much, but rather the inadequate life.” — Bertolt Brecht
“The following day no one died.” With this sentence, Portuguese writer José Saramago opens one of his most widely read novels. In an unnamed country, death abruptly suspends its work at the start of a new year, granting the population a fragile kind of immortality. Yet while no one dies, life continues to flow along its ancient channel of suffering.
At first, people rejoice with unmeasured delight. Never before had they received such a wondrous gift—the chance to enjoy life without end. Many even become convinced that sheer willpower can overcome death and that “the undeserved disappearance of so many people in the past could be put down solely to a deplorable weakness of will on the part of previous generations.”[1]
But in the weeks that follow, their joy thins into dread as they realise they cannot manage a situation with no way out. The number of those trapped at the border between life and death—frail elders barely clinging to existence, young people ravaged by incurable disease, or victims of terrible accidents—grows steadily, pushing society into crisis.
Soon, life without death loses both its taste and its meaning. People begin smuggling their loved ones across the border—sometimes on their own, sometimes with the help of the mafia—into lands where death still operates. Months later, when death resumes its work, it does so with a new method: sending violet envelopes to its next victims, each one a week in advance.
Saramago’s novel, written in his eighties, can be read through many lenses. But above all, it provokes reflection on how we view death—and life—in a world scarred by imperfection. The human longing to find meaning in a life that feels unworthy of being lived, or in a death that comes too soon to make sense, has accompanied us since the dawn of our existence.
This is one of the reasons why death has been explored not only in films and literature, but also in scientific research, which has examined its many facets—from the physiological processes that unfold in life’s final moments to the ways people think and feel when faced with death.
Over the past three to four decades, a wide range of studies have focused on unraveling a central dilemma: can the mind transcend the physical limits of the body, and does the power of will have the capacity to delay—or hasten—the moment of death?
When the will (does not) delay the inevitable
Those who have lost a loved one sometimes speak of the dying person’s will to live, which, in their view, may have played a role in keeping them alive until a meaningful event—such as a religious holiday or the arrival of a close relative whose visit had been delayed.
Although it may sound like folklore, the phenomenon has been the subject of several studies aimed at discovering whether individuals can, even briefly, exert some control over a process still shrouded in mystery.
Sociologist David Phillips has examined the relationship between the timing of death and significant life events in a series of studies. In research published in the Journal of the American Medical Association in 1990, Phillips reported that deaths among Chinese individuals dropped by 35.1% in the week leading up to the Harvest Moon Festival—one of the most important traditional Chinese holidays—only to spike by a similar margin the following week. Cerebrovascular diseases accounted for the highest percentage of these deaths, followed by heart disease and malignant tumors. The study concluded that such patterns suggest death can, at least for a short time, be delayed until a personally significant milestone is reached.
In another study, also led by Phillips, data from more than 2.7 million deaths revealed that women were less likely to die in the days leading up to their birthday. However, the likelihood of death rose significantly in the week following their birthday, making it the most dangerous time of year for them.
Analyzing data from more than 30 million deaths in the United States, researchers Mitsuru Shimizu and Brett Pelham concluded that people are more likely to die after major events—such as Thanksgiving, Christmas, New Year’s, and birthdays—than before them.
However, more recent studies have only modestly replicated these findings. Researchers have pointed out a number of limitations in earlier work and have called for a reassessment of socio-demographic factors—such as marital status, gender, age, and religious affiliation—that may influence the process.
In 2004, researchers Donn Young and Erinn Hade examined more than 300,000 cancer-related deaths in Ohio between 1989 and 2000. Their study found no correlation between the number of deaths and events considered significant, such as Christmas, Thanksgiving, or birthdays. Some specialists who work with patients have pushed back, stressing that statistics say little about the human side of the phenomenon. Daniel Loiterstein, a geriatric psychiatry expert at Rush University Medical Center, noted that the study did not take into account individual circumstances or the fact that events other than the holidays examined might hold greater personal significance for patients.[2]
Other research has shown that Christmas and New Year’s can actually be risk factors for both cardiac and non-cardiac mortality[3]. Meanwhile, a study of Catholic priests found no variation in mortality around Christmas, Easter, or the anniversary of their ordination. And the list of studies suggesting there is no robust evidence that people can delay their own death is much longer.
Although the belief in such an ability has been widespread—even among medical professionals—the reality, says Judith Skala, a psychiatry professor at Washington University School of Medicine in St. Louis, is that three decades of research have failed to provide convincing evidence: “But none of the studies have convincingly established that the time of death can be postponed through force of will or hastened by loss of the desire to live,” Skala concludes, though she does not rule out the possibility that future studies may shed new light on the connection.
While specialists remain largely skeptical of the idea that the power of the mind can fight off imminent death by postponing it, a recent study[4] has shown that the reverse of this phenomenon—observed in many people who have endured traumatic events—may be possible: individuals who lose the will to live can die even when their physical condition alone would not justify death.
Letting go of life: a common phenomenon in prison camps
The term “give-up-itis” was first applied to prisoners of the Korean War (1950–1953), who, under the terror and squalor of North Korean camps, abandoned hope and even the will to live, ultimately dying without any obvious organic cause.
One medical officer observed in his fellow captives a strange state of apathy—a kind of turning away from life—and learned to estimate with surprising accuracy how much longer a prisoner had left to live by studying the severity and combination of symptoms.[5]
Another noted that there was a widespread belief, shared even by some doctors, that these deaths were not the result of physical decline; rather, they were caused by extreme isolation in which the individual seemed “willing to accept the prospect of death rather than to continue fighting a severely frustrating and depriving environment.”[6]
The same phenomenon was later observed among American prisoners in Vietnam, and the term “give-up-itis” was retrospectively applied to inmates of Nazi concentration camps during World War II, where detainees gradually detached themselves from life.
Among the accounts of Vietnam prisoners succumbing to trauma is that of a highly skilled Marine who grew increasingly detached from everything around him. In the end, he curled up in his bed and remained in that position until he died. His final words were: “Wake me up when it’s over.”[7]
In the Soviet Gulag, inmates devised an extensive vocabulary for those who were “walking corpses,” the most common term being dokhodiaghi (“the finished ones”). “They […] did not wash—even when they had the opportunity… Nor did they bother to search for and kill the lice that sucked their blood,”[8] wrote American artist Thomas Sgovio, who recounted the shock of encountering a friend in the camp who failed to return his greeting, and did not even seem to recognise him, his face marked by a vacant expression.
Although prison camps became the breeding ground for such manifestations, similar cases of “giving up” have been recorded in other traumatic circumstances—for example, among the settlers of Jamestown, the first English colony in America; among enslaved Africans in the British colonies; or among survivors of tragedies.
Though unusual, the pattern of death brought on by the loss of the will to live has recurred many times under harsh conditions. Only recently, however, has it become the subject of scientific study. John Leach, a researcher at the University of Portsmouth, has investigated the phenomenon known in academic literature as psychogenic death—a death that occurs in the absence of psychosis, depression, or severe physical illness to explain it.
The steps of a sprint toward death
People can die when they give up the will to live, feeling trapped in a maze with no way out. The slide from life to death often unfolds in five stages, during most of which recovery is still possible. This is the conclusion of a study[9] by John Leach, the first researcher to identify the clinical markers of psychogenic death.
This withdrawal from life typically occurs in the context of trauma the individual believes they can no longer endure. According to the study, severe trauma may disrupt the anterior cingulate cortex—an area linked to motivation and goal-directed behaviour—while also reducing dopamine levels, leading to apathy and loss of drive.
“Psychogenic death is real. It isn’t suicide,”[10] Leach says, noting that death usually occurs about three weeks after the condition begins.
The first stage is marked by withdrawal from normal social interactions, a shift toward emotional numbness and indifference—patterns observed especially among prisoners of war. Motivation and initiative diminish, though cognitive function remains intact. Leach believes that withdrawal may serve as a coping strategy, allowing the individual to emotionally disengage from their environment and recalibrate their inner stability.
If this detachment from the outside world continues, the person may slide into the second stage: apathy, defined as “a quantitative reduction of voluntary, goal-directed behaviors.”[11]
A former prisoner of war described the “colossal inertia” that gripped him on arrival in the camp—even though capture was a real possibility, neither he nor his comrades had allowed themselves to imagine it; captivity was often a shock many soldiers tried to cope with alone. He remembers that even getting out of bed consumed more energy than he felt capable of managing: “I was not tired—I was just apathetic [and] every act, every decision, required an effort out of all proportion to the circumstances.”
Although apathy can serve as a defensive mechanism in traumatic situations, without external intervention a person can slide into the third stage of the syndrome, marked by abulia—a severe loss of motivation and initiative, an inability to decide, and even a loss of speech. Any interest in personal hygiene or appearance evaporates. “I began to look around me and saw the beginning of the end for any woman who might have had the opportunity to wash and had not done so, or any woman who felt that the tying of a shoe-lace was wasted energy,”[12] recalls one survivor of the Nazi camps.
In the fourth stage, intrinsic motivation (still present in the earlier phases) is gone and attempts at outside assistance typically produce no response. Akinesia[13] appears, along with a disappearance of sensations such as hunger, thirst or pain.
The final stage is death, usually occurring about three to four days after the preceding phase. Occasionally a false resurgence is observed—prisoners sometimes smoked one last cigarette, carefully saved as a kind of token—but while the behaviour may appear goal-directed, “the goal itself appears to have become the relinquishing of life.”[14] As psychiatrist Viktor Frankl noted, when an inmate in that condition lit a cigarette, fellow prisoners expected him to die within 48 hours.
The road to recovery lies in rediscovering a reason to live
Death is not inevitable for those who surrender to life’s sudden, overwhelming blows. The fading of the will to live can, in fact, be reversed, according to the study, which cites cases where victims were jolted out of their lethal apathy through methods ranging from gentle to harsh. In one instance, the “therapy” applied to an American prisoner by his fellow inmates consisted of beatings—deliberately provoking him into anger until he rose to fight back.
Because research has shown that physical activity boosts dopamine production, Leach suggests that this increase could prove beneficial in cases of surrender syndrome. The release of dopamine in the brain’s nucleus accumbens enhances motivation and goal-directed behaviour. At the same time, dopamine stimulates the prefrontal cortex, producing positive effects on cognitive function. This, in turn, allows the individual to reassess their circumstances, recognising elements of the situation that remain at least partly within their control.
The signal that a victim had broken free from the grip of apathy was the resumption of simple routines, such as shaving or undressing before bed.[15] A person who has lost the will to live is only truly out of danger once they regain a sense of control over their life and recover intrinsic motivation, because, as Leach says, “motivation is essential for coping.”[16]
Psychiatrist Viktor Frankl identified the search for meaning as central to survival under terrifying conditions. Life in the camps, he recalled, was so grotesque that prisoners often refused to wake someone from a nightmare, convinced that reality was far worse than any dream. One inmate confided to Frankl that the march from the train station to the camp felt like attending his own funeral.[17] What deepened the despair was the uncertainty about how long imprisonment would last. Normal life, glimpsed beyond the barbed-wire fence, appeared like an untouchable phantom reality.
Having endured the concentration camps of Auschwitz and Dachau, stripped of everything and reduced to a number—prisoner 119104—Frankl discovered a strategy for survival: constructing meaning amid the wreckage of broken dreams. Citing the struggles of the unemployed or those battling relentless illness, he argued that not only in the camps but in everyday life, people face suffering, injustice, and absurdity—and need to find meaning if they are to endure without collapsing into despair.
His camp experience taught him that “any attempt to restore a man’s inner strength in the camp had first to succeed in showing him some future goal.”[18] For this reason, Frankl believed Nietzsche’s dictum (“Those who have a ‘why’ to live, can bear with almost any ‘how’”) should guide all therapeutic efforts to help trauma survivors. Frankl found his own “why” in the hope of seeing his loved ones again and in rewriting the manuscript that the Nazis had confiscated.
It should be noted that finding a meaning and a hope that transcended the circumstances helped some camp inmates avoid clinging to life “at any cost,”[19] as Solzhenitsyn said, noting that, all too often, preserving life meant brutalizing the one who possessed it.
Describing the terrible plight of women in the Soviet camps—where squalor and the harshness of daily existence pushed them into submitting to the sexual harassment of guards and supervisors—Solzhenitsyn bitterly records that even the most resolute were often forced to give in. Moral degradation was furthered “by the fact that there was no meaning, no purpose, left in life”[20] (aside from the aim of wringing some small relief from an environment that was, in no metaphorical sense, hell itself). And once life lost its purpose—shattered into petty objectives, often illegitimate or immoral—the choice at the crossroads of camp existence became tragically simple: “if you go to the right, you lose your life, and if you go to the left, you lose your conscience.”[21]
The camps did not succeed in corrupting everyone, the Russian writer notes, recalling the integrity of the nuns and of all the “genuine religious believers” he encountered in detention—figures who, in his words, advanced in “their self-confident procession through the Archipelago—a sort of silent religious procession with invisible candles.”[22] They were the ones who could have confirmed, without words, the haunting title produced by a survivor of the Nazi camps: If This Is a Man.[23]
Stripped of everything they loved, those who preserved their humanity—in life and even in death—left behind a message as unyielding as the backbone of their stance: there is hope, there is dignity, and there is Refuge, even when the windows to the world and to one’s fellow human beings close, one after another.