Michael Paterniti is the man who crossed America in 1997 carrying a jar containing Albert Einstein’s brain in the trunk of a rented Buick. The journalist is not the only man who can brag about this memory, because riding shotgun was Thomas Harvey, the pathologist who had stolen the great physicist’s brain in the hope of understanding his mind.

When I first read about the journalist’s adventure (in the book Life Without Limits, written by Clifford Goldstein), I took it as an inspired illustration of the idea the author wanted to convey. It never crossed my mind that I could look for the article Goldstein was referring to. And I would have missed out, had I not done it recently. Paterniti’s article is still where it first appeared, in Harper’s Magazine, which secured it an obscure place in the magazine’s archive. Probably wanting to rescue it from oblivion, Paterniti also wrote a book about his research experience, which I haven’t read, although it is available on Amazon. The Harper’s Magazine article, though, is a true legend: one so tasty and empathetically written that I’m surprised it wasn’t turned into a movie. It probably will be someday.

In 1955, Thomas Stoltz Harvey was 42 years old and the chief pathologist at the prestigious Princeton Hospital, where Albert Einstein had been hospitalized due to an aneurysm of the abdominal aorta, which he had refused to have operated on and which caused his untimely death. Einstein’s body had to go through the autopsy protocol, but the pathologist appointed to it, Dr Harry Zimmerman, a former professor of Harvey’s and an acquaintance of Einstein’s, was unable to attend, so he let Harvey handle it. After determining the cause of death, Harvey took the liberty of harvesting the brain of the genius physicist for study.

Without asking anyone’s permission, Harvey weighed the brain and found that its weight was within normal parameters (1,230 grams), so he proceeded to cut it into 170 pieces. The process took three full months, because he then wanted to cut the 170 parts into slices that could be analysed under a microscope. He created 12 sets, each containing hundreds of brain sections, and distributed them to some of the most famous pathologists in the world at the time, keeping two sets for himself.

Albert Einstein—described by Paterniti as the “first trans-global super celebrity”, who became world-famous just before images of Michael Jordan or Marilyn Monroe reached many parts of the world—had expressed his desire to be cremated after death, to avoid his remains being idolized by fans. Neither Einstein nor his family gave their consent for his brain to be preserved. However, when they learned that this had happened and that the doctors intended to study the brain, Einstein’s descendants decided to give doctors permission to store the brain, provided the results were published strictly in academic journals, not in tabloids.

What happened to all those slices of Einstein’s brain? Not much, Paterniti concluded, after comparing Harvey’s stories about the select group of famous pathologists (by reading their papers) with the accounts of those pathologists themselves (many of whom had contacted Harvey on their own). One of the studies that led Harvey to say that Einstein’s brain was more unusual than people thought is a paper claiming that Einstein’s brain had more glial cells. However, Larry Kruger, professor of neurobiology at UCLA, says that the “meagre findings” are “laughable” because the paper “means absolutely nothing.”


So this is what all the fuss is about?

For decades, illegally and undisturbed, a piece of the father of relativity’s brain sat in Harvey’s basement. Harvey got into Paterniti’s car because the journalist, who had been following him for a long time, offered to drive him to one of Albert Einstein’s granddaughters, to whom he wanted to take the last pieces of the physicist’s brain.

Einstein’s granddaughter’s reaction is what makes Paterniti’s story so delightful: “So this is what all the fuss is about?” she allegedly said, clutching the container that held a few pieces of her grandfather’s brain. The journalist sets out to build an absurd picture, and it works, because it has all the ingredients: he crossed a continent from one side to the other, together with a fugitive from justice, carrying a totemic relic of a genius to the genius’s unexpectedly atypical successor.

Evelyn, Einstein’s granddaughter, was almost everything you might not expect her to be: raised in a kind of exile in Switzerland, then educated at the University of California, Berkeley, she went through a failed marriage, lived on the streets for a whole year, then with members of a religious cult, and eventually got a job as a policewoman. According to Paterniti’s account, she forces herself to refer reverently to her grandfather’s remains, but the whole text shows the strange character of the “reunion” of the two Einsteins: a genius and an anonymous person. And what makes the article brilliant is that it insinuates a question the reader can no longer escape, although the journalist never asks it directly: What made the difference between the brilliance of the physicist and the apparent ordinariness of his descendant? This is just another way of asking the question to which Thomas Harvey hoped to get an answer: What about Einstein’s brain made him brilliant?

Goldstein, in turn, wonders: “How did cells and chemicals, entombed in darkness by tissue and skull—and thus never coming into contact with the external world except through the subjective filters of Einstein’s eyes, nose, mouth, ears, and skin—how did those cells and chemicals, nevertheless, produce such profound insights into the external world? Tissue never exposed directly to a single ray of light could still theorize fascinating things about that light.”

The whole story is illustrative of the idea that Goldstein wants to convey: “I don’t know,” he answered, “and decades later, no one does.” No one knows how quanta of chemicals, going from one neuron to another, translate into thought and sensation. “They have no clue, actually. Well, they have clues, lots, but the clues shrivel up and die at the mystery of how biological and chemical tissue—having more in common with Chinese take-out than a hard drive—can create in us not just qualia (sensation), but transcendental logical thought that can ask questions about the very object, the brain, that poses those questions to begin with.”

The mistakes of those who fix your mind

A pragmatist might, of course, accuse Goldstein of philosophizing on topics with an unfortunate reputation. But the author’s obvious predilection for philosophical thinking has a surprisingly current practical applicability and is more relatable than we might imagine.

In “Psychiatry’s Incurable Hubris”, a troubling essay he wrote for the April issue of The Atlantic, Gary Greenberg reviews the many failures that psychiatry, as a medical specialty, has gone through since it got started. Greenberg reviews Anne Harrington’s book, Mind Fixers: Psychiatry’s Troubled Search for the Biology of Mental Illness, and endorses her findings. After decades of studies and experiments that today would be considered, without reservation, barbaric, psychiatry is in a prolonged stalemate regarding its very foundations.

“Exactly what mental illnesses can be said to exist?” Pliny Earle, a renowned 19th-century psychiatrist, was asked after 50 years of experience in the field. The man asking was Clark Bell, the editor of the New York Medical Society’s journal. And despite Earle’s reputation as the man who designed the first American university course on psychiatric disorders, co-founded the first association of psychiatrists, and opened the country’s first private psychiatric practice, the prestigious physician responded in a way that could not meet the editor’s expectations: “In the present state of our knowledge, no classification can be erected upon a pathological basis, for the simple reason that, with slight exceptions, the pathology of the disease is unknown.” Anne Harrington’s thesis, presented in her book published on April 16, is that, unfortunately, the history of psychiatry is not one of progress, but rather of promising paths that have turned out to be dead ends.

From the Nobel Prize awarded in 1949 to Egas Moniz, the inventor of the lobotomy, to the infamous work of psychiatrist Walter Freeman, who travelled across the United States performing, at the cost of the lives of patients in psychiatric asylums, experimental operations on brains affected by disease, to the induction of insulin comas to treat psychosis and the deinstitutionalization of mentally ill patients in America in the 1960s and 1970s, many of the milestones of international psychiatry are actually memorials of pain.

Harrington continues the list of fiascos, recalling the 1973 vote by which the American Psychiatric Association declared that homosexuality was no longer a mental illness. Of course, such an assertion invariably provokes political and social reactions. However, the author’s point is not a moral one about homosexuality, but a scientific one: What is scientific about a discipline that settles, by vote, an issue with such far-reaching consequences? The Association’s response, as read from the 1980 edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM), was that the list of diseases had been revised to retain only those defined by atheoretical symptomatology, removing taxonomies based on preconceptions (such as Freudian theories).

However, Harrington disputes the honesty of this measure, claiming that this too is a theory: more precisely a theory that claims that mental illness is nothing but a pathology of the brain. “They believed that biological…markers and causes would eventually be discovered for all the true mental disorders. They intended the new descriptive categories to be a prelude to the research that would discover them,” writes Harrington.

“The DSM-III’s gesture at science proved sufficient to restore the reputation of the profession, but those discoveries never followed,” Greenberg notes bitterly. “The pathological basis of almost all mental disorders remains as unknown today as it was in 1886,” he adds, echoing the question Bell once put to Earle. Consequently, the National Institute of Mental Health in the United States announced that it would be “re-orienting its research away from DSM categories.”

Harrington gets even closer to a discussion relevant to the average person when she attacks the issue of “chemical imbalance,” which is thought to be the main cause of some highly prevalent mental disorders, such as depression. The author argues that the idea was introduced in the early 1950s, after scientists were able to demonstrate the principles of chemical neurotransmission, and it was shown that drugs such as LSD alter the reception of serotonin and other neurotransmitters. It took four decades for the idea that mental illness is caused by chemical imbalances to reach the general public, with advertising campaigns targeting the end consumer in an attempt to sell prescription drugs, antidepressants in particular. These campaigns, Harrington says, were meant to assure potential customers that the medication did not work like recreational drugs, but was repairing an underlying biological problem.


This thinking is so last-century

In the 1990s, while the public was convinced that depression was caused by a chemical imbalance, scientists reached a consensus that “this theory is deeply flawed and probably outright wrong.” But its echo still resonates today in the attitudes of ordinary people towards mental illness. In 2007, a survey of a cohort of students at Cleveland State University found that 84.7% of students were convinced that depression was very likely caused by a chemical imbalance. “Chemical imbalance is sort of last-century thinking,” said neurologist Joseph Coyle of Harvard Medical School. “It’s much more complicated than that.”

“Indeed, it is very likely that depression stems from influences other than neurotransmitter abnormalities. Among the problems correlated with the disease are irregularities in brain structure and function, disturbances in neural circuitry, and various psychological contributions, such as life stressors,” noted Hal Arkowitz and Scott Lilienfeld for Scientific American. “Of course, all these influences ultimately operate at the level of physiology, but understanding them requires explanations from other vantage points,” the authors said.

Other psychiatrists, such as Richard A. Friedman, also lament the lack of empirical knowledge about the physiology of mental illness. Friedman wrote in The New York Times that, “despite a vast investment in basic neuroscience research and its rich intellectual promise, we have little to show for it on the treatment front. With few exceptions, every major class of current psychotropic drugs—antidepressants, antipsychotics, anti-anxiety medications—basically targets the same receptors and neurotransmitters in the brain as did their precursors, which were developed in the 1950s and 1960s.”

Now is probably the best time to mention that none of these descriptions of the crisis that psychiatry is going through today should be a reason to abandon the drug treatment of mental illness. They are, however, warning signs about the limits of a medical branch that enjoys prestige and significant financial investment and that, as Friedman said, draws on limited research budgets that could otherwise be better spent.

Friedman, a professor of clinical psychiatry at Weill Cornell Medical College, cited a study by Dr John C. Markowitz that demonstrated the unexpectedly high effectiveness of interpersonal psychotherapy compared to the more popular exposure therapy in patients with post-traumatic stress disorder. Psychotherapy research, Friedman said, “deserves a much larger share of research dollars than it currently receives,” especially given that, among psychiatrists, “no one thinks breakthrough biological treatments are just around the corner.”

Even a complete understanding of neurobiology would be unlikely to elucidate the complex interactions between genes and the environment that lie at the heart of many mental disorders, the psychiatrist said. “Anyone who thinks otherwise should remember the Decade of the Brain, which ended 15 years ago without yielding a significant clue about the underlying causes of psychiatric illnesses.”

The return to morality

“Don’t get me wrong,” Friedman warns his readers, “I am all for cutting-edge neuroscience research. But we are more than a brain in a jar.”

We are more than a brain in a jar. No matter how much we run away from the philosophical or even the moral side of the discussion, we’ll still end up here. I could not convey the idea any better than Greenberg did in his review for The Atlantic: “[Psychiatry], far more directly than other medical specialties, implicates our conception of who we are and how our lives should be lived. It raises, in short, moral questions. If you convince people that their moods are merely electrochemical noise, you are also telling them what it means to be human, even if you only intend to ease their pain. In this sense, the attempt to work out the biology of mental illness is different from the attempt to work out the biology of cancer or cardiovascular disease.”

We best understand how important our perspective on what it means to be human, and what it means to live this life, is when we find ourselves in a desperate situation. Such was the tragic case of young Emily, who, in her twenties, asked to be euthanized for “terminal depression,” and a commission of psychiatrists gave her what she asked for. However, if we truly love those around us, and feel responsible for our own lives and the lives of at least a few others, we will also take responsibility for what we tell others about the value of life, despite not having the right units of measurement to calculate it. For our wellbeing and the wellbeing of those around us, however, declaring it incalculable is enough. We should admit, modestly, not in conceit, that we are more than the sum of the electrical impulses in us, no matter what map they follow.