In November 1941, Rosemary Kennedy, the eldest sister of future American president John F. Kennedy, was admitted to the George Washington University School of Medicine to undergo a radical procedure. The then 23-year-old Rosemary had for many years exhibited erratic behaviour, mood swings, and mild learning difficulties that left her high-profile parents exasperated and publicly embarrassed, until her father, Joseph P. Kennedy Sr., finally decided to do something about it. But the Rosemary who came out of the hospital was not the same Rosemary who went in. Her mental capacity had been reduced to that of a 2-year-old. Once vivacious and exceptionally charming, she could no longer speak coherent sentences and had become incontinent. Horrified, her parents locked her away in an institution and did not speak of her or her condition for nearly two decades. Rosemary Kennedy had been subjected to one of psychiatry’s most infamous treatments: the lobotomy. For more than three decades, this crude form of psychosurgery was the go-to method for treating a wide variety of mental illnesses, from depression to schizophrenia, with nearly 40,000 performed in the United States alone. But it was a horrifically imprecise and dangerous method, which, far from curing patients, occasionally left them totally incapacitated or even dead. So who invented the lobotomy, and how did such a life-shattering procedure come to be so widely accepted by the medical community and endure for as long as it did despite its often horrific side effects?
In the early 20th Century the field of psychiatry was in crisis. While advances in neurology and the new practice of psychoanalysis had begun to shed some light on the origins of mental illness, the discipline was no closer to finding cures than in the days of bloodletting and ice baths. As their colleagues in other medical fields surged ahead with new discoveries like germ theory, surgical anaesthesia, and antibiotics, psychiatrists remained little more than caretakers, presiding over crowded asylums with few ways of actually treating their patients. Thus, any new breakthrough that promised to change this was met with great enthusiasm – too much, in some cases. It is in this climate of desperation that procedures like insulin shock – and later electroshock – therapy emerged; for more on this, please check out our video “Who Invented Shock Therapy and How Does It Work?” But some saw such methods as too indirect, and wondered whether mental illness couldn’t be treated at its source, by cutting directly into the brain.
The first systematic study of psychosurgery was conducted by Swiss psychiatrist Gottlieb Burckhardt at the Préfargier Asylum in 1888. Like many psychiatrists at the time, Burckhardt believed that individual mental processes were localized in discrete areas of the brain, and that by selectively damaging or removing these areas a patient could be relieved of some of their symptoms. His results were mixed, to say the least. Of the six patients operated on, two remained unchanged, two became quieter, one experienced mild improvement, and one suffered violent epileptic seizures and died shortly after the operation. Several of the survivors went on to develop severe side effects, including epilepsy and aphasia – the inability to understand or produce speech. Other doctors were horrified by these experiments, and Burckhardt’s work was widely condemned. Thus, while sporadic attempts at psychosurgery were made over the following decades by surgeons such as Ludvig Puusepp and Vladimir Bekhterev, it would not be until 1935 that the practice became mainstream.
In that year, Portuguese neurologist António Egas Moniz developed the leucotomy, a procedure which involved drilling holes in either side of the skull and making a series of incisions to sever the frontal lobes from the rest of the brain. The frontal lobes had long been suspected to be the seat of personality and behavioural regulation, a fact dramatically illustrated by the infamous case of Phineas Gage. On September 13, 1848, Gage, an American railroad labourer, was packing gunpowder into a blasting hole near Cavendish, Vermont, using a metre-long tapered metal rod known as a tamping iron. The iron accidentally struck the rock, creating a spark that ignited the gunpowder and launched the tamping iron directly at Gage’s face. The iron entered Gage’s head just below his left cheek and exited through his left temple, landing some 25 metres away. Amazingly, Gage was not even knocked unconscious, and was able to speak coherently and sit upright as he was transported by oxcart to the local doctor. Over the next seven months Gage made a near-complete physical recovery and regained all his mental faculties, though he was no longer the man he once was. Previously described as a pleasant and genial man, following the accident Gage became, in the words of Harvard Professor H.J. Bigelow, “…gross, profane, coarse, and vulgar, to such a degree that his society was intolerable to decent people.” Indeed, his closest friends declared that “Gage… was no longer Gage.” While more recent studies have revealed that these personality changes were largely temporary – indeed, following the accident Gage had a long and successful career as a long-distance stagecoach driver in Chile – the case did much to convince psychiatrists of the central role of the frontal lobes in controlling personality and behaviour. This idea was further supported by cases of WWI soldiers who received wounds to the frontal lobes, and patients with large tumours in the same region.
An even more dramatic illustration of the principle was an experiment conducted by Yale neuroscientist John Fulton. In the early 1930s Fulton performed frontal lobectomies on a pair of chimpanzees, Becky and Lucy, who exhibited aggressive behavioural tendencies. The results were astonishing, with both animals becoming remarkably placid and cooperative. When Fulton presented his results in 1935 at the Second International Congress of Neurology in London, Egas Moniz, who was in attendance, asked whether the procedure could be applied to humans. Stunned, Fulton responded that while it was theoretically possible, the procedure was “too formidable” an intervention to use on humans.
Moniz was nonetheless inspired, and in November of that year carried out the first leucotomies on psychiatric patients at the Hospital Santa Marta in Lisbon. At first Moniz used targeted injections of ethanol to kill nerve fibres and form a barrier of dead tissue behind the frontal lobes, but soon developed a wire-loop instrument called a “leucotome” which, when rotated, carved out a 1-centimetre spherical section of brain matter. These initial surgeries were performed on twenty patients suffering from a variety of disorders including depression, mania, schizophrenia, and catatonia, and according to his later report the results were promising, with 35% of patients experiencing significant improvement, 35% mild improvement, and 30% no change. However, “improvement” in this case appears to be a relative term, as the patients experienced a variety of severe side effects including extreme lethargy, confusion, and incontinence, and few were ever actually discharged from the hospital. But as none had died or suffered observable cognitive impairment, Moniz considered the experiment a success and published his findings in 1936. As with Burckhardt before him, Moniz’s work initially received a hostile reception from the medical community, with many decrying the procedure as unnecessarily reckless. But by this time the demand for a method – any method – to pacify patients and alleviate overcrowding in asylums had become so great that the leucotomy was gradually adopted throughout the world. In 1949, Egas Moniz was awarded the Nobel Prize in Physiology or Medicine for his discovery.
But the heyday of the lobotomy would not come until the 1940s, thanks to one of the most controversial figures in American medicine: Walter Freeman. Born on November 14, 1895, Freeman met Egas Moniz at the same 1935 conference where John Fulton had presented his chimpanzee study, and was instantly taken with the great neurologist’s genius. It was to prove a fateful encounter. Though he had long pondered the possibilities of psychosurgery, Freeman later claimed that had he not met Moniz in person he would likely never have pursued the subject, informing Moniz: “…having your authority I expect to go ahead.” On his return to the United States, Freeman and his colleague, neurosurgeon James W. Watts, set out to perfect Moniz’s method, performing the first American leucotomy on September 14, 1936 at George Washington University. Their patient was a 63-year-old Mrs. Hammatt from Topeka, Kansas, who was brought in by her husband suffering from agitated depression and insomnia. According to Freeman’s account, the surgery was a great success:
“She survived five years, according to Mr. Hammatt the happiest years of her life,” Freeman noted in his autobiography. “As she expressed it, she could go to the theatre and really enjoy the play without thinking what her back hair looked like or whether her shoes pinched.”
Encouraged by these results, Freeman and Watts set up a private practice in D.C. and together performed nearly 1000 leucotomies between 1936 and 1945. But while the “Freeman-Watts” method, which involved drilling six small burr holes across the patient’s skull, was less invasive and more precise than Moniz’s original procedure, to Freeman it was still too elaborate. Freeman had worked at St. Elizabeth’s State Hospital in Washington D.C, and knew from first-hand experience that such institutions were often overcrowded and underfunded and rarely had operating rooms, anaesthesia, or surgeons available. Freeman thus set out to develop a procedure that could be quickly and easily performed on an outpatient basis by non-surgical staff.
Freeman found his answer in the obscure work of Italian psychiatrist Amarro Fiamberti, who in 1937 developed a procedure called the transorbital lobotomy in which the brain is accessed through the thin bones of the eye socket. Practicing on grapefruits and then cadavers, in 1945 Freeman adapted and refined Fiamberti’s method. Many of these early experiments were carried out using an ordinary ice pick from Freeman’s kitchen, hence the common name “ice-pick lobotomy” for the procedure.
In Freeman’s version of the transorbital lobotomy, the patient was rendered temporarily unconscious using an electroconvulsive therapy, or electroshock, machine. Their eyelid was lifted and a long, thin instrument known as an orbitoclast placed between the top of the eyeball and the eye socket. A mallet was then used to punch the orbitoclast through the thin orbital bone and into the brain. Once the instrument had been driven in around 5 centimetres, it was swept side-to-side, making a 40-degree incision and severing the frontal lobe. The procedure was then repeated on the other eye socket. From beginning to end, the procedure took no more than 10 minutes to complete.
Freeman carried out his first transorbital lobotomy on January 17, 1946, and soon threw himself into evangelizing the procedure. Driving a van he dubbed “the lobotomobile,” he travelled thousands of miles to more than 55 hospitals across the country to demonstrate and promote his new procedure. A natural showman, Freeman would often show off by performing two lobotomies at once, or by using his old kitchen ice pick instead of a proper orbitoclast. He had a reckless disregard for medical procedure, often chewing gum while operating and rarely wearing gloves or washing his hands. On one occasion he even invited the media to watch a procedure, which ended in death when Freeman’s hand slipped and plunged the orbitoclast too deeply into the patient’s brain. This cavalier behaviour disgusted his colleague James Watts, causing the two to part ways in 1947. But the psychiatric community was impressed, and the use of lobotomies soared; between 1945 and 1949 the number of lobotomies performed annually in the United States grew from 150 to nearly 5,000. The procedure also proved popular in the United Kingdom, where thousands more were performed. Lobotomies were prescribed for all manner of psychiatric conditions, however mild, and performed on patients of all ages. Freeman’s youngest-ever patient was 12-year-old Howard Dully of San Jose, California, who underwent the procedure on December 16, 1960. Freeman’s notes describe Dully’s behaviour as follows:
“He is clever at stealing, but always leaves something behind to show what he’s done. If it’s a banana, he throws the peel at the window; if it’s a candy bar, he leaves the wrapper around some place… he does a good deal of daydreaming and when asked about it he says, “I don’t know.” He is defiant at times – “You tell me to do this and I’ll do that.” He has a vicious expression on his face some of the time.”
Though Freeman diagnosed him as schizophrenic, Dully’s behavioural problems were mild and typical of a boy his age, and today would likely be attributed to the sustained abuse he suffered at the hands of his stepmother. Incredibly, though he suffered from mental confusion and drug addiction and spent years in and out of institutions and halfway houses, Dully eventually made a near-complete recovery and suffers few lingering effects from the surgery.
Most others, however, were not so lucky. Though Freeman claimed a success rate of around 50%, many lobotomy patients were reduced to a vegetative or child-like state, unable to care for themselves and requiring life-long institutionalization. Those who retained their faculties often exhibited a “flattening” or “blunting” of affect, becoming passive and apathetic, while others suffered from seizures or other neurological side effects. A lucky few like Howard Dully experienced no change at all. The greatest benefit of the procedure was often not to the patient themselves but their families and hospital staff, as it made patients more docile and easier to handle. As Freeman enthusiastically reported:
“The noise level of the ward went down, ‘incidents’ were fewer, cooperation improved, and the ward could be brightened when curtains and flowerpots were no longer in danger of being used as weapons.”
Disturbingly, the use of lobotomy as a means of social control went even deeper. In all countries where it was adopted, the procedure was performed disproportionately on women, echoing the 19th Century practice of diagnosing difficult or disobedient women with the catch-all disorder of “hysteria.”
But the days of the lobotomy were numbered. By the mid-1950s the vast numbers of patients left permanently disabled by the procedure could no longer be ignored by the medical community, and the use of the lobotomy began to taper off, with several countries such as Germany, Japan, and the Soviet Union banning it altogether. The introduction of antipsychotic drugs like Thorazine also made such radical surgical interventions increasingly unnecessary, and sparked the mass de-institutionalization of psychiatric patients that Freeman had long dreamed of. But Freeman remained stubbornly devoted to his procedure, and after moving to California in 1954 continued to perform lobotomies both in state hospitals and in his own private practice. In 1967, Freeman operated on Helen Mortensen, who had been lobotomized twice before but whose symptoms kept returning. During the procedure Freeman accidentally severed a blood vessel in Mortensen’s brain, leading to her death three days later. Herrick Memorial Hospital, where the surgery was performed, revoked Freeman’s surgical privileges; he never performed another lobotomy. Walter Freeman died of cancer on May 31, 1972 at the age of 76, having personally lobotomized 3,439 patients.
The golden age of the lobotomy was brief but tragic. Though the use of the procedure had largely ended by the late 1960s, it continued to be performed in countries such as France well into the 1980s – but only as a last resort when all other interventions had failed. All told, more than 40,000 people were lobotomized in the United States alone, with 490 dying and countless others being permanently disabled. Though born of good intentions and the desire to free the mentally ill from overcrowded hospitals, the lobotomy became an embodiment of social control, institutional abuse, and medical hubris, and its dark legacy still haunts the field of psychiatry to this day. Though many experimental psychosurgical techniques have been developed in recent years to treat disorders such as Parkinson’s disease, obsessive-compulsive disorder, and epilepsy, the looming shadow of Walter Freeman and his cavalier methods has discouraged many researchers from pursuing this line of work – and government institutions from funding it. As Bart Nuttin, a researcher at the Catholic University of Leuven, Belgium, explains:
“I definitely believe that there is a very important public stigma attached to surgical treatments for psychiatric disorders, and that this is for good reasons. I am convinced that in the past this kind of surgery has been abused. My greatest fear is that some surgeons would start using [new techniques] in a less controlled way than we have. There remains a need for strict official control of this kind of treatment.”
Similarly, according to University of Michigan neuropsychologist Elliot S. Valenstein:
“I think they’re really concerned about the reaction to the [perceived] notion that the government is supporting brain operations and that there may be a resurgence of lobotomies in this country.”
As for Howard Dully, who suffered so much at the hands of Walter Freeman, his opinion of the controversial doctor is remarkably magnanimous:
“I don’t think Freeman was evil. I think he was misguided. He tried to do what he thought was right, then he just couldn’t give it up. That was the problem.”
If you like this article, you might also enjoy:
- Mind Control: From the Inside Out
- The Curious Case of Terminal Lucidity
- One of the Most Shocking CIA Programs of All Time: Project MKUltra
- Who Started the Lizard People Conspiracy Theory?
References
Gordon, Meryl, “‘Rosemary: The Hidden Kennedy Daughter’, by Kate Clifford Larson”, The New York Times, October 6, 2015, https://www.nytimes.com/2015/10/11/books/review/rosemary-the-hidden-kennedy-daughter-by-kate-clifford-larson.html
Nicholas, Elizabeth, “Rosemary Kennedy and the Legacy of Mental Illness”, VICE, October 5, 2015, https://www.vice.com/en/article/mvb4pq/rosemary-kennedy-and-the-legacy-of-mental-illness-511
“Walter Freeman: The Father of Lobotomy”, Medical Bag, May 21, 2015, https://www.medicalbag.com/home/features/despicable-doctors/walter-freeman-the-father-of-the-lobotomy/
“He Was Bad, So They Put an Ice Pick in His Brain”, The Guardian, January 13, 2008, https://www.theguardian.com/science/2008/jan/13/neuroscience.medicalscience
El-Hai, Jack, “The Lobotomist”, The Washington Post, February 4, 2001, https://www.washingtonpost.com/archive/lifestyle/magazine/2001/02/04/the-lobotomist/630196c4-0f70-4427-832a-ce04959a6dc8/
Tan, Siang Yong & Yip, Angela, “António Egas Moniz (1874–1955): Lobotomy Pioneer and Nobel Laureate”, Singapore Medical Journal, April 2014, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4291941/
The post A Crisis of Minds – The Fascinating Tale of Fixing People by Destroying Their Brain appeared first on Today I Found Out.