In 1941, the future president John F. Kennedy’s younger sister, Rosemary, disappeared. With her brothers off at naval bases preparing for the coming war, the 22-year-old Rosemary attended a school in Washington, DC, where she was known as a troublemaker. In fact, she had been the family black sheep for the previous decade due to her erratic and extreme mood swings. Later in life, doctors reviewing Rosemary’s medical history concluded that she likely suffered from bipolar disorder (then called manic depression) or a similar condition. At the time, though, treatment for mental illness was poor.
Joe Kennedy, Rosemary’s father, feared that his daughter’s behavior could throw a wrench in his plans for his sons’ political careers, so he sought a permanent solution. Just years earlier, a Portuguese doctor had devised a new way to treat mental health conditions: the lobotomy. The procedure, which involved drilling holes in the patient’s skull and severing connective brain tissue, left Rosemary in a toddler-like state. She forgot how to speak, became incontinent, and had to be taught to walk again. Rosemary lived the rest of her life hidden away on the grounds of a school for young girls, under constant care from a group of nurses. She never saw most of her family again.
This may seem like something out of a horror story, but for several decades in the middle of the 20th century, lobotomies were seen as a justifiable treatment for mental health disorders. Doctors performed tens of thousands of them throughout the developed world, most on patients who never consented to the operation. The best-case scenario typically left the patient in a permanent childlike state, yet this was considered a medical success. In retrospect, the lobotomy looks like one of modern medicine’s worst errors, but some doctors still defend it as a sound method for its time. A modern lens reveals the perfect storm of circumstances that led to the operation’s approval, popularity, and, eventually, demise.
The lobotomy is the most famous example of psychosurgery: operating on the brain to treat mental illness. The first modern account of psychosurgery dates to 1888, when the Swiss doctor Gottlieb Burckhardt treated six patients struggling with various forms of mental illness. Burckhardt opened their skulls and removed portions of the cerebral cortex, the brain’s information-processing powerhouse.
According to the surgeon’s own analysis, two patients were unchanged, two became quieter, one experienced convulsions and died a few days after the operation, and one improved. Burckhardt claimed a 50 percent success rate and presented his findings at a medical conference in Berlin, where he was met with intense opposition and criticism. Many of those present at the meeting spoke with disdain about the doctor’s apparent lack of morals and embrace of unsound theory. In his own defense, Burckhardt said, “Each physician has a different nature. One believes in the principle: do no harm. The other says: better a dangerous remedy than nothing. I lean toward the second category.”
Despite persistent pushback from physicians who prioritized “doing no harm,” operating on the human mind remained an attractive idea. Other “heroic” physical remedies became commonplace during the period, including malarial therapy for syphilis, insulin shock therapy, and electroconvulsive therapy. Each reflected Burckhardt’s ethos: the treatments carried a high risk of severe harm to the patient but were used when nothing else was available. Given the lack of medication for mental illness, heroic therapy spread rapidly among neurologists.
In 1935, a Portuguese neurologist named Egas Moniz attended a medical conference in London. There, a Yale physician named John Fulton presented an experiment on two chimpanzees demonstrating the vital role of the brain’s frontal lobes. The chimps had long exhibited frustrating behavior, like throwing tantrums when they weren’t rewarded for completing tasks. Fulton removed both chimps’ frontal lobes, causing extreme behavioral changes: both creatures appeared calm and happy, and the tantrums ceased entirely. Moniz, looking on in awe, asked Fulton whether the surgery could be performed on humans suffering from mental illness. Fulton conceded that it was possible in theory but that the operation was too invasive and risky for human subjects.
Years later, Moniz claimed that he had conceived of the lobotomy on his own before hearing Fulton’s presentation. His journals show he believed that mental illness was caused by repeating circuitry in the brain, whereby negative signals looped endlessly to and from the frontal cortex. Regardless of when he got the idea, Moniz initiated the first lobotomies in November 1935, a few months after the London conference. Lacking surgical training himself, he enlisted the help of a neurosurgeon named Pedro Almeida Lima.
The methods changed over the first dozen or so operations. At first, Lima injected alcohol into the patient’s frontal cortex until they became catatonic. Moniz declared the first operation a success, the patient cured of her depression, although she remained confined to the hospital until her death. In subsequent attempts, though, it sometimes took many injections before the patient responded, so Moniz changed tactics. He and Lima would drill a hole in the patient’s skull and insert a device called a leucotome. With this tool, the surgeon made six lesions, or cuts, severing the fibers that connected the frontal cortex to the rest of the brain.
In the first four months of practice, Moniz oversaw 20 lobotomies. The long list of complications included fever, vomiting, incontinence, apathy, lethargy, disorientation, kleptomania, extreme hunger, and akinesia, or loss of voluntary movement. Despite never observing patients for more than a week after the operation, Moniz declared these side effects temporary and his operation a success. He recorded his results and sent assistants around Europe to present the findings, claiming a 70 percent success rate.
However, a colleague at Moniz’s hospital countered that the patients actually showed dramatic deterioration of brain function and personality. Surgeons attacked the method for its lack of grounding in clinical observation, resting instead on fringe theory and cerebral mythology. Despite this criticism, Moniz’s questionable data spread throughout the western world, and within five years the lobotomy had become a popular treatment in several countries.
Lobotomies caught on fastest in the United States. The first American operation was performed in September 1936 by James Watts and Walter Freeman, the men who would eventually operate on Rosemary Kennedy. Freeman had met Moniz at the London conference in 1935 and was struck by the Portuguese physician’s genius. When Moniz published the results of his operations the following year, Freeman requested more details on the tools and procedure. Working together, Freeman and Watts performed dozens of lobotomies throughout the US, but Freeman chafed at the intense surgical skill the operation demanded. After all, surgeons, operating rooms, and anesthesia were all difficult to come by at the time. So he set out to develop a new version of the procedure suited to practitioners without surgical training.
Inspired by the work of an Italian psychiatrist named Fiamberti, Freeman determined that the best approach was to reach the brain through the eye socket with a small, ice-pick-like tool. In 1945, having already performed dozens of traditional lobotomies, he began testing his new method, practicing first on grapefruits and cadavers. Within a few short months, he claimed to have perfected his craft.
Freeman began by lifting the eyelid and inserting a thin, sharp instrument called an orbitoclast. He then used a mallet to drive the tool deeper into the eye socket and through the thin layer of bone that protects the brain. Once the instrument was about 5 centimeters, or 2 inches, into the brain, it was rotated, cutting into brain matter. Then, after returning to the neutral position, it was pushed in another 2 centimeters and turned again. Each rotation cut the white fibrous matter connecting the thalamus to the frontal cortex. The surgeon then removed the instrument from the first eye and repeated the process through the other. This new technique, known as the transorbital lobotomy, was first used on a patient in 1946.
Following Freeman’s innovation, his partner, Watts, quit in protest, disgusted that Freeman had turned a surgical operation into an “office procedure.” From 1940 to 1944, about 700 lobotomies were performed in the US. Thanks to Freeman’s promotion of his new technique, more than 5,000 were completed in 1949 alone, and by 1951, almost 20,000 Americans had been lobotomized. By then, the procedure had spread throughout much of the western world. Though lobotomies took longer to catch on in Northern Europe, by the 1950s, Sweden, England, and Denmark led the world in lobotomies per capita. Altogether, around 100,000 lobotomies were performed worldwide.
Perhaps most shocking of all were the reported results of these operations. While some outside observers disparaged the method, many claimed the results were overwhelmingly positive. Even the more critical reports called the outcomes “mixed,” a seemingly ridiculous claim given the host of complications and side effects that followed. Countless patients were plagued by seizures for the rest of their lives, and other forms of brain damage were commonplace. In the 1940s, 5 percent of all lobotomized patients died within days of the operation. The risks were enormous, yet many caretakers and families of patients still chose the procedure. Why?
At the beginning of the 20th century, the medical field became more aware of mental ailments. Without any effective way to manage mental illness, most sufferers were contained rather than treated, and asylums popped up throughout the western world. At one point in the interwar period, more than 55 percent of all American hospital beds were occupied by psychiatric patients. Conditions in these hospitals and asylums were horrible and inhumane, and most doctors, along with patients’ families, knew it. No one wanted to see more people institutionalized. Yet without proper treatment for schizophrenia or severe manic depression, many families could not adequately care for their relatives. So the goal of the procedure became turning the patient into an immobile, quiet shell of their former self.
In the words of one doctor, describing an upcoming lobotomy, “I fully realize that this operation will have little effect on [the patient’s] mental condition but am willing to have it done in the hope that she will be more comfortable and easier to care for.” This is perhaps the most explicit statement of the reasoning behind the lobotomy.
Unfortunately, many patients remained institutionalized after surgery; most required weeks, months, or even years of intensive therapy and training. Walter Freeman called the post-lobotomy period a “surgically induced childhood,” which called for parents to teach proper conduct by rewarding good behavior with ice cream and punishing bad behavior physically. The aim was to reteach basic skills like walking, using a bathroom, and eating an appropriate amount of food.
Remarkably, a select few lobotomy patients managed to overcome the mental obstacles and lead something resembling an everyday life. The most famous example is Howard Dully. Diagnosed with childhood schizophrenia, Dully became, in 1960 at just 12 years old, one of the youngest people ever to receive a transorbital lobotomy. The procedure left him in a stupor; he was institutionalized for the rest of his adolescence and fell into a pattern of antisocial and addictive behavior. Eventually, Dully got sober and worked diligently to put his life back together. He earned a degree in information systems and held down a job as a driving instructor, and in later years he published a book and gave several interviews about his experience. Even today, Dully believes that his brain doesn’t work like most people’s, yet he remains remarkably upbeat and optimistic. To some, though, he is a symbol of just how terrible the lobotomy was: even its “survivors” suffered immense consequences.
Thankfully, some countries responded quickly to the horrors of lobotomy. In Germany, only a handful of the operations were ever performed, and the USSR banned them on moral grounds in 1950. Most countries finally forbade lobotomies in the 1960s, 70s, and 80s, but for many of these nations, abandoning the procedure had little to do with moral concerns. Instead, lobotomies fell out of style when an alternative arose. In 1955, a drug called chlorpromazine was approved in a handful of countries to treat psychiatric disorders, with remarkable effect. Though far from a perfect treatment, it was judged to be 70 percent effective, and not in the way Moniz had considered lobotomies 70 percent effective: chlorpromazine made patients more manageable without rendering them unconscious or unaware.
In 1967, with chlorpromazine on the rise, Freeman performed the last transorbital lobotomy in American history. His patient, Helen Mortensen, died of a brain hemorrhage shortly after the procedure.
With the end of an era came a stream of “I told you so”s from physicians. Many called for Moniz to be stripped of the Nobel Prize he had won in 1949 for inventing the lobotomy. To this day, the Nobel organization has stood firmly by its decision, arguing that Moniz deserved his award because there were no alternatives at the time for treating schizophrenia and similar disorders. As absurd as it may sound, some still share this opinion.
Today, lobotomies are essentially unheard of; advances in the treatment of mental health conditions have eliminated the need for such drastic methods. Still, there is disagreement over whether the operation was ever justified. Proponents continue to point out that no other options were available and that mental health facilities and professionals were seriously strained. Opponents counter that many people were lobotomized without permission, including those below the legal age of consent. In every country that performed lobotomies, the majority of patients were women, and many cases resembled Rosemary Kennedy’s, in which a father or husband chose to lobotomize his daughter or wife without her approval. Through a modern lens, the procedure seems like complete madness, but a small contingent of medical professionals continues to defend its former use.
What do you think? Were lobotomies justified given the lack of other treatments? Which ethic is more important: “do no harm,” or “better a dangerous remedy than nothing”? Should risky procedures be available with a patient’s consent, or should they be outlawed to protect patients?