If science were a religion, how violent would it be compared with Christianity?
The Crusades, the Inquisition, and so-called holy wars are all things anyone who wants to accuse Christianity of violence can point to. Some have gone so far as to say that violence is, in fact, intrinsic to any religion because, like politics, religion is merely one face of a coin called “power”.1 They argue that the Enlightenment freed humanity from the grip of religious dogma and illuminated the masses with a natural, not supernatural, power: reason.
The Age of Enlightenment has had such a global impact that its good reputation continues to convert many to scientific positivism—the view that the only way to authentic knowledge is through science. With the reverence of a practiced believer, many today hope for an inimitable, unidirectional evolution of science, convinced that the truth can and will be known in all its fullness, sometime in the future.
With the same force with which they reject religion, some followers of scientific positivism embrace the belief that one day the mystery of existence will be fully unravelled, and that the universe will become accessible enough for us to understand its secrets.
Often, those who disparage religion display an equally strong optimism about the ability of science to produce a better world. But just how reasonable is this optimism? To take the question seriously, we must first rule out scientific denialism, not because it lacks high-ranking exponents (consider South African President Thabo Mbeki, who throughout his eight-year term denied that HIV causes AIDS2), but because a balanced discussion must recognise the real problem and expose extremist rhetoric.
Although it cannot be calculated precisely, the impact science has had on quality of life (at least in the developed world) cannot be denied. The ranks of those willing to throw out decades of scientific research, especially in medicine, in favour of dubious and unverifiable visions seem to be growing; but so are the ranks of those who place blind confidence in the uninterrupted progress of science. Ignorance of science’s violent failures contributes significantly to this misplaced trust.
The history of science is littered with tragic episodes that are rarely talked about. But a brief foray into the recent yet dark past of science will reveal truths not only about science, but also about people.
At the beginning of the twentieth century, the asylums that housed patients with various mental disorders could no longer cope with the growing number of admissions. In prosperous Europe, therapeutic nihilism had infected medical specialists: an appalling consensus that doctors were essentially unable to cure suffering, even with the drug treatments considered most advanced at the time. Therapeutic nihilism was so prevalent that in some countries it became a political statement. The Croatian-Austrian philosopher Ivan Illich, one of its best-known proponents, argued that because European societies were saturated with doctors, who performed too many surgeries and prescribed too many drugs, both malpractice and disease had increased.3
But in the middle of the century, a burst of surgically based therapeutic innovation dealt this nihilism a deadly blow. This is the context in which one of science’s cruellest episodes took place: the emergence of the lobotomy. Conceptualised by the Portuguese neurologist Egas Moniz, the lobotomy entered the medical sciences under a category Moniz also created, that of “psycho-operations”. Moniz was not the first to suggest that psychiatric patients’ problems could be solved by physical intervention on their brains. But he was the first to move from theory to practice: inspired by what he considered the remarkable effects of similar operations on chimpanzees, in 1935 he performed the first “leukotomy” (the old name for a lobotomy) on a human patient.
The principle that justified the operation was that mental illness could be alleviated, or even cured, by destroying the neural connections in the prefrontal cortex. This destruction, which took place through the power of the scalpel, reduced the “complexity of the patient’s psychic life”, which psychiatrists such as Maurice Partridge considered to be a symptom of mental disorder.4
After the operation, however, the patients’ mental presence, self-awareness, self-control and spontaneity were much reduced. Post-operatively, patients were inert, emotionally numb, and intellectually narrowed. But all of these were considered insignificant losses compared to the risks of living with mental illness. “I fully realize that this operation will have a minimal effect on her mental illness,” noted the legal guardian of the patient ‘Helaine Strauss’ on the consent form. The note (discussed by Jack Pressman in his monumental history of medicine, Last Resort) continues: “But I am willing to consent, in the hope that she will reach a more comfortable position and it will be easier to take care of her.”5
But what many people received in return for this hope was completely different. The operated patients were, for the most part, unable to function independently. Many were socially uninhibited (no longer following the rules of common sense), had lost the ability to empathise (owing to reduced cognitive capacity), and were confused and incontinent. Some developed an increased appetite and gained a great deal of weight. Others were reduced to what the American neurologist Walter Freeman called “surgically induced childhood.” According to Freeman, the operation left patients with a “childish personality” that needed to be re-educated, an objective for which he recommended a system of rewards (ice cream) and punishments (blows to the body).6
Freeman was an ardent admirer of Egas Moniz, but he was only able to operate on a single patient, who died on the operating table. After the tragedy, his right to perform surgery was revoked. His passion for the new field of lobotomy, which he considered extremely promising, pushed him into an arrangement with the neurosurgeon James Watts, who would operate in his place, under his direction.
Under this arrangement, Freeman and Watts operated on Rosemary Kennedy, the younger sister of future U.S. President John F. Kennedy. The operation, which took place in 1941, when Rosemary was only 23 years old, ended very badly, and the whole story was covered up by the Kennedy family until 1987. According to biographer Ronald Kessler, Rosemary had an intellectual disability which her family identified early in her childhood. Her parents made sure that she attended special schools and was mentored by private teachers in boarding schools, which the Kennedy family supported generously. But in the last special-education institution in which she was enrolled, Rosemary could not adapt: she would slip out secretly at night and had violent breakdowns in response to frustration.
Worried that rumours of his daughter’s promiscuity could jeopardise the Kennedy brothers’ political careers, Rosemary’s father decided to subject her to a lobotomy without even consulting his wife. Given the diagnosis, doctors needed only the father’s consent to proceed with the operation. The intervention, as the biographer describes it, was horribly barbaric. Rosemary was tied to the bed and given a mild tranquiliser; then Dr. Watts pierced her skull with a drill. A two-centimetre incision allowed a surgical instrument to enter the brain while the doctors talked to Rosemary. They asked her to say the Lord’s Prayer, to sing the U.S. national anthem, and to count backwards. “We were advancing depending on how well she responded, and we stopped when she became incoherent.” This was Dr. Watts’s account, given in the only interview he would ever grant.7
After the lobotomy, Rosemary’s intelligence was reduced to that of a two-year-old child. She could no longer walk and became incontinent. Her father admitted her to a medical institution, and her mother did not visit her for two decades after the operation. It was not until her father’s death that the Kennedy family began visiting Rosemary, who died in 2005 at the age of 86 in the same institution where she had spent most of her life. Rosemary was said to have been the first of the Kennedy siblings to die of natural causes.
What made it possible
A study of all patients who underwent a leukotomy in the UK between 1942 and 1954 looked at 10,365 patients who had had a single operation; another 762 underwent two or more such operations. Of all these patients, 9,284 were examined post-operatively: 41% had recovered considerably, 28% had recovered to a minimal extent, 25% showed no improvement, 2% were in a worsened condition, and 4% had died.8
Many clinicians with impeccable reputations were convinced that lobotomy was a viable treatment. Studies conducted at that time managed to convince elite doctors in rich countries by reporting success stories that were merely incidental. Neurologists today, who look at the percentages provided at the time, are more than skeptical. One of Britain’s foremost neurosurgeons, Henry Marsh, told the BBC that lobotomy was simply bad science: “Wrong medicine, wrong science, because it was clear that patients undergoing this procedure would never recover properly. If you saw the patient after the operation, they looked good: they would walk and talk and say, ‘Thank you, Doctor.’ The fact that they were completely ruined as social human beings probably didn’t matter.”9
But because the lobotomy arrived at a time when scientific elites no longer believed psychiatric patients could be cured, the medical community clung to it as an opportunity worth taking. Besides, thanks to the procedure, patients could finally be discharged from overcrowded asylums.
The discovery of this new battlefield, or perhaps the call to arms against mental illness, brought Egas Moniz the Nobel Prize in Medicine in 1949. Moniz died in 1955, when the lobotomy fashion was already fading and new psychiatric therapies with neuroleptic drugs were taking its place. However, the page dedicated to Egas Moniz on the website of the Nobel Prize Foundation still displays an article signed by Bengt Jansson, Swedish professor of psychiatry and member of the Nobel Medical Assembly between 1976 and 1997, in which he gives a very favourable review of the legacy of the Portuguese neurologist: “I think there is no doubt that Moniz deserved the Nobel Prize.”10
To refuse a genius
The Nobel Committee was less favourable towards Dmitry Mendeleev, who, by formulating the periodic table of the elements, provided the alphabet for every statement in chemistry. From 1869 (the year of the discovery) to 1906 (the year the Nobel Committee debated Mendeleev’s candidacy), his contribution was passed over for nearly four decades, despite its seismic impact. Not even at the eleventh hour did committee members vote to recognise what the whole world now recognises as the foundation of chemistry. Owing to the animosity that two highly influential members of the committee harboured toward Mendeleev, he was “omitted” from the 1906 award, and in 1907 a severe bout of influenza ended the chemist’s life at the age of 72.
The Guatemala experiment
If one could try to justify lobotomy fever by pointing to the good intentions of the doctors who practiced it, the same cannot be said of the doctors who took part in the Guatemala experiment, one of the most infamous scientific experiments that took place in a free country during peacetime.
Conducted over two years, between 1946 and 1948, in Guatemala, a series of experiments led by the American doctor John Charles Cutler deliberately infected several thousand Guatemalans from vulnerable social categories with syphilis and other sexually transmitted diseases. The subjects (soldiers, prostitutes, prisoners, psychiatric patients, children from state schools, institutionalised orphans, and peasants) gave no consent to participate in the research. The experiments were designed to test the effectiveness of penicillin in treating these diseases; to that end, the experimenters left one group of subjects completely untreated, at risk of death, in order to observe how the disease evolved in the absence of any medication.
In 1946, just as Cutler was beginning this experiment, 23 Nazi doctors and officials were being brought to trial in Nuremberg to account for the inhuman treatment of concentration camp detainees during World War II. Out of those proceedings came the Nuremberg Code, an international convention mandating the voluntary consent of participants in scientific experiments.
It was not until October 2010 that the U.S. president officially apologised to Guatemala for what had happened in those years. The gesture came shortly after an American university researcher discovered information about the experiments in the archives of Dr. John Cutler (who died in 2003). Dr. Amy Gutmann, chair of the Presidential Commission for the Study of Bioethical Issues, called the experiment “a shameful piece of medical history.” “Those involved in the study did not show even the minimum level of respect for human rights and morality,” Gutmann said.
But the lesson is hard to learn, and developing countries continue to attract medical researchers like a magnet. In 2010, even as President Barack Obama was apologising for what happened in Guatemala, the U.S. Department of Health and Human Services found that, in 2008, nearly 80 percent of applications for new drugs had relied on clinical studies conducted in other countries.
In 2015, nearly 800 people, many of them survivors and descendants of the victims, sought material compensation and filed a complaint against Johns Hopkins University, accusing its researchers of having dominated the panels that approved the federal funding for the Guatemala experiment. The university’s representatives denied any guilt and instead accused the plaintiffs of trying to profit financially from the scandal.
In all the cases detailed above, science was the immediate cause of atrocities committed against innocent people. But we can easily cite cases in which science has been an indirect source of violence. One example is scientific racism: the idea that there is empirical evidence of a racial hierarchy and, consequently, a justification for racial discrimination. Scientific racism received validation from the scientific community from the 1600s until after the Second World War, when it was officially renounced by UNESCO.11 Its historical impact therefore cannot be realistically assessed in numbers of victims. But we can get an impression of the scale by looking at the number of victims of slavery in the United States, or in the states that colonised other peoples.
We can also recall the scientific community’s hesitation in denouncing tobacco consumption as extremely harmful, ever since the first rumours circulated in the medical world that inhaled tobacco smoke could cure various diseases.12 Worldwide, smoking kills 7 million people each year, and for every person who dies from smoking, another 30 live with a disease associated with tobacco use.13
A drop in the ocean
Set against this brief tally, the Crusades and the Inquisition, taken together, no longer feature so imposingly in the history of man’s cruelty towards man. We, as humankind, are cruel: sometimes without any explicit motivation, other times driven by an unscrupulous hunger for scientific knowledge. But even if we could tabulate all the tragedies generated by false or distorted religion alongside all the tragedies of false or distorted science, the comparison would not really solve the underlying problem. Moral corruption cannot be ranked; it admits no degrees of comparison.
In reality, neither Christianity nor science is, in itself, a generator of violence. In both cases, it is human involvement that can lead to a destructive or tragic end. Perhaps a more honest position would be to admit that greed for knowledge coexists with greed for faith, and that both are parasitic diversions from the purposes for which science and religion exist. And if science is dearer to us, we should, in its spirit, note objectively that just as we should not discard science because of its dark episodes, we cannot renounce Christianity despite its well-known historical sins. We need both. This is not unbridled greed, but simply the human condition struggling to find its footing.
Alina Kartman is a senior editor at ST Network and Semnele timpului.