The Zombie Theory: The Era of Medical Experimentalism (Part 2)

"The reproducibility of published experiments is the foundation of science. No reproducibility – no science." - Moshe Pritsker, Ph.D., CEO of JoVE1

By the turn of the 20th century medical science had fully embraced empiricism – the philosophy that knowledge comes from what can be observed and measured through the senses. Proof, rather than deduction or revelation, was the new measuring stick. Experiments were designed, theories created, measurements taken, successes heralded, and experimenters often rewarded with fame and fortune. As important, empiricism brought with it the process by which all modern science is evaluated: the scientific method. The formality and rigor of this process was transformational in science. It’s worth a quick review.

The Scientific Method

The 5-step scientific method is simple to describe and difficult to implement – and that is the point of this exacting process. The technique is designed to create empirical evidence – sometimes referred to as sense experience – using the tools of observation and experiment. Results must be measurable in the physical world. When done as designed, the method provides quantifiable observations to the scientist – the facts of an experiment. In turn, the scientist provides an explanation of the facts – the theory of an experiment.

Step 1 of the scientific method requires the scientist to ask a question about nature, to make detailed observations and to gather information. In Step 2 the scientist forms a hypothesis (theory) about the observations and creates specific predictions. Next, in Step 3 the scientist tests the predictions with a detailed, observable, quantifiable experiment. Step 4 requires the scientist to analyze the data, to draw conclusions, and to accept, reject or modify the hypothesis. Finally, and most importantly, Step 5 compels the scientist to provide step-by-step directions to duplicate the experiment, and a new scientist must independently reproduce the experiment and find the same results before any knowledge can be proclaimed.
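
For readers who find it easier to follow a process as something they can run, here is a minimal, purely illustrative sketch of that five-step loop in Python. The coin-flip "experiment," the 0.05 tolerance, and the seeds are all invented for illustration – nothing here comes from the history that follows – but the structure mirrors the steps above, including the requirement that an independent run reproduce the result before any knowledge is claimed.

```python
import random

def run_experiment(trials=1000, seed=None):
    """Step 3: a detailed, observable, quantifiable experiment (simulated coin flips)."""
    rng = random.Random(seed)
    heads = sum(1 for _ in range(trials) if rng.random() < 0.5)
    return heads / trials  # the measurable fact: proportion of heads

def supports_hypothesis(observed, predicted=0.5, tolerance=0.05):
    """Step 4: analyze the data and accept, reject, or modify the hypothesis."""
    return abs(observed - predicted) <= tolerance

# Steps 1-2: observe the coin and hypothesize it is fair, predicting ~50% heads.
original = run_experiment(seed=1)      # the first scientist's run
replication = run_experiment(seed=2)   # Step 5: an independent scientist repeats it

print(f"original run:    {original:.3f}  supported = {supports_hypothesis(original)}")
print(f"replication run: {replication:.3f}  supported = {supports_hypothesis(replication)}")
print("knowledge claimed only if both runs agree:",
      supports_hypothesis(original) and supports_hypothesis(replication))
```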

Turn-of-the-century medics must have been truly inspired. For the first time they could listen to – and see – telltale signs of health inside a living body. They could anesthetize their patients prior to surgery, and they used sterilized instruments in a disinfected operating room, with blood transfusions available as needed. As important, given the overwhelming success of Pasteur’s germ theory, new hypotheses were being introduced at a fast pace, each theory looking for other likely “germs” that were the root cause of so much human suffering.

Thus, 20th century medical experimentalism launched with new tools, a new paradigm and a multitude of exciting projects. To kick off this new era, two medical devices were revamped during the last decade of the 19th century, setting the stage for the incredible 100 years to follow: the microscope and the culture dish.

Let There Be Light

August Köhler2 was a student of zoology, botany, mineralogy, physics, and chemistry in late 19th century Germany. As a young, post-graduate staff member at Carl Zeiss AG (an optical systems manufacturer) he developed Köhler illumination. Köhler’s invention produced even lighting across the field of view and greatly enhanced the contrast of the light microscope. During the next 45 years Köhler contributed to numerous other innovations including fluorescence microscopy and grid illumination, a method used in the treatment of tumors.

Around the same time, Julius Richard Petri was working for the Imperial Health Office in Berlin. Lab scientists were uniformly frustrated: in order to observe cultures through a microscope the cover had to be removed, exposing the bacteria to contaminants like dust, hair, and human breath. Petri had the simple idea of placing a slightly larger clear glass dish upside down over the culture dish to protect it from the external environment and, according to one science writer, “changed medical history.”3 Petri went on to work in a German lab for the rest of his career, where he published nearly 150 papers about the spread of diseases.

“Magic Bullets”

Another German, Paul Ehrlich4, coined the term “chemotherapy” in 1900. Ehrlich theorized that toxic compounds could be created to selectively target a variety of disease-causing organisms. He predicted future chemists would produce substances to seek out these disease-causing agents, dubbing the substances “magic bullets.” Ehrlich’s forecast was accurate. “Magic bullets” began to materialize in science labs around the world. Blood types were discovered by the Austrian Karl Landsteiner in 1901, vitamins by Frederick Hopkins in England in 1906, and insulin by the Canadian Sir Frederick Banting in 1921.

It was a banner century for another “magic bullet”: the vaccine. The most celebrated was Jonas Salk’s polio vaccine. Once introduced in the United States (some may remember the March of Dimes immunization campaign in the early 1950s) the annual number of polio cases fell from 35,000 in 1953 to 5,600 by 1957. By 1961 only 161 cases were recorded in the United States. Medical science also gave us vaccines for bacterial meningitis, chickenpox, Haemophilus influenzae, hepatitis A, hepatitis B, Japanese encephalitis, measles, mumps, papillomavirus, pneumococcus, rotavirus, rubella, tetanus, typhoid, tick-borne encephalitis, whooping cough and yellow fever – saving and changing the lives of millions of people.

The Century’s Preeminent “Magic Bullet” – Penicillin

Before antibiotics (lit. against life), 90% of children with bacterial meningitis died, strep throat was often fatal, and even minor infections could lead to serious illness and death. Then in 1928, Sir Alexander Fleming5, a Scottish biologist and pharmacologist, made a fortuitous discovery in a discarded Petri dish. The mold that had contaminated an experiment turned out to produce a powerful antibiotic: penicillin. This one discovery, and the analogues to follow, have saved hundreds of millions of lives around the world. Fleming also predicted science would find many new “bacteria killers.” He was right, too. Today there are thousands of antibiotics, with more created every year.

More “Magic Bullets”

Here is a selection of “magic bullets” discovered and invented during the 20th century (there are many others):

• Arsphenamine for syphilis (1910)
• Nitrogen mustard – first cancer drug (1946)
• Acetaminophen (1948)
• Tetracycline (1955)
• Oral contraception – “the pill” (1960)
• Propranolol – first beta blocker (1962)
• Cyclosporine – immunosuppressant (1970)
• Lovastatin (Mevacor) – first statin (1987)

Procedures

There were an amazing number of new procedures created by modern medicine over these 100 years too. Here’s a list of some of the “firsts”:

• Electrocardiogram (1903)
• Stereotactic surgery (1908)
• Laparoscopy (1910)
• Electroencephalogram (1929)
• Dialysis machine (1943)
• Heart-lung machine (1953)
• Ultrasound (1953)
• Kidney transplant (1954)
• Pacemaker (1958)
• “Test tube baby” (1959)
• Liver transplant (1963)
• Lung transplant (1963)
• Pancreas transplant (1966)
• Heart transplant (1967)
• MRI (1971)
• CAT scan (1971)
• Insulin pump (1972)
• Laser eye surgery (1973)
• Liposuction (1974)
• Heart-lung transplant (1981)
• Surgical robot (1985)

Mankind has been the beneficiary of these creations, and we gratefully acknowledge and salute medical science for its wondrous contributions, across all medical specialties – save one.

Again - What About Madness?

Medical scientists addressing madness contributed four “magic bullets” of their own to this otherwise spectacular century, all introduced during its first 50 years, and each an unmitigated disaster. The first three are collectively called Shock Therapies: Deep Sleep Therapy, Convulsive Therapy, and Insulin Shock Therapy. The fourth is Psychosurgery. Here’s a review.

Deep Sleep Therapy (DST)

Jakob Klaesi, a Swiss psychiatrist, re-popularized DST in 1920 (after two failed attempts earlier in the century), using Somnifen (a sedative) for his schizophrenia patients. For the next 20 years Klaesi and his colleagues dominated the mental health hospital circuit in Zurich using DST, despite high mortality rates and never-ending doubts about efficacy. Nonetheless, DST was promoted by many eminent psychiatrists of the time, including William Sargant of Great Britain:

"All sorts of treatment can be given while the patient is kept sleeping, including a variety of drugs. . . the patient does not know how long he has been asleep, or what treatment, even including ECT, he has been given. . . a new exciting beginning in psychiatry and the possibility of a treatment era such as followed the introduction of anesthesia in surgery."

The Australian Chelmsford scandal of 1983 finally put an end to this toxic procedure. Dr. Harry Bailey was in charge of Chelmsford Private Hospital in Australia, where DST was the primary treatment for madness. Over sixteen years, 27 deaths were directly connected to DST, with another 24 reported suicides in the same year the patients received treatment. Facing condemnation from families, the general public and the government, Bailey committed suicide in 1985.6 The scandal brought about stringent new laws and regulations regarding psychiatric care in Australia.7

Convulsive Therapy

Convulsive therapy took hold quickly. In 1934 Ladislas J. Meduna, a Hungarian neuropsychiatrist known as the “father of convulsive therapy,” used metrazol (a stimulant) to induce seizures in patients with schizophrenia and epilepsy. By 1937, the first international meeting on convulsive therapy was convened in Switzerland, and by 1940 metrazol-convulsive therapy was being used worldwide.

Around the same time Ugo Cerletti, an Italian neuropsychiatrist, was using electric shocks to produce seizures in his animal experiments. He noticed that when pigs were given an electric shock before being butchered, they entered an “anesthetized state.” With his colleague Lucio Bini, he replaced metrazol and other chemicals with electricity. As a bonus, they surmised, ECT brought about retrograde amnesia, so patients had no ill feelings about a treatment they could not remember. Cheaper and more convenient, ECT replaced chemically induced convulsive therapy and by 1940 was being used in England, Germany, Austria, and the United States. (NOTE: Cerletti and Bini were nominated, though not selected, for a Nobel Prize.)

There was a marked decline in the use of ECT from the 1950s to the 1970s because the public perceived the procedure as dangerous, inhumane and overused.8 However, because ECT was convenient and cost-effective, mental health providers balked at abandoning it. In 1985, the National Institute of Mental Health (NIMH) and the National Institutes of Health (NIH) convened a conference on ECT and concluded that, while controversial, ECT was effective for a narrow range of psychiatric disorders. In 2001 the American Psychiatric Association expanded the role of ECT and, by 2017, ECT was covered by most insurance companies. This incredibly cruel and torturous “treatment procedure” is gaining popularity – again.9

Insulin Shock Therapy (or Insulin Coma Therapy)10

Dr. Manfred Sakel was a young doctor in Vienna in 1928 when he was given the task of reducing the unpleasant withdrawal symptoms of opiates. Experimenting with a newly discovered pancreatic hormone – insulin – he unexpectedly found that a large dose would send his patients into a stupor and that, once recovered, they were less argumentative, less hostile, and less aggressive. Thus, Insulin Shock Therapy (IST) was born. For the next 30 years IST was the go-to method for tens of thousands of mental health patients as IST doctors proudly proclaimed an “80 per cent cure rate for schizophrenia.”11

The actual procedure was intense. Insulin injections were administered six days a week for two months or more as the daily dose was gradually increased until hour-long comas were produced. Seizures before or during the coma were common, as were hypoglycemic aftershocks. Often patients were subjected to ECT while comatose. Given the numerous cases of brain damage – and an estimated mortality rate of 1–5%12 – IST fell out of use in the United States, and nearly everywhere else, by the 1970s.13

Psychosurgery

In 1935 Antonio Egas Moniz, a Portuguese neurologist, used the term “leucotomy” (lobotomy) for the first time to describe a surgical operation that destroys brain tissue by extraction, burning, freezing, electrical current or radiation. The objective was to sever the connections between the frontal lobes and deeper structures in the brain. Approximately 40,000 lobotomies were performed in the United States alone from the mid-1930s to 1970. By the way, the majority (nearly two thirds) were performed on women.14

Use declined rapidly due to increased concern about deaths and brain damage caused by the operation, and due to the introduction of neuroleptic drugs. By the mid-1970s the use of psychosurgery had declined to about 100–150 operations a year, and it disappeared completely by the 1980s. Remarkably, Moniz (with Walter Rudolf Hess) shared the 1949 Nobel Prize for this discovery, though not without a controversy that persists in the scientific community.15

What A Century

For medical scientists focused on physical human ailments, it was a stupendous century. Life expectancy is approaching 80 years in the United States, up from 50 years at the beginning of the 20th century. We are routinely treated with medicines and procedures for debilitating diseases that diminished, disfigured and often killed our ancestors not long ago. As important, the consistency and precision provided by empiricism and the scientific method paid off for all medical specialties during those amazing 100 years – save one.

For medical scientists focused on madness it’s been one grotesque failure after another. These scientists put us to sleep for weeks at a time, induced comas for months at a time, used chemicals and electricity to convulse us, and surgically destroyed our brain tissue, all in an effort to fix our brain diseases. Along the way all of them ballyhooed their successes, using their special brand of science and patient testimonials to convince us of the medical necessity and efficacy of their “magic bullets.”

Then in 1950, as if a reprieve for past travesties, a fifth “magic bullet” appeared: neuroleptic (lit. nerve-seizing) drugs. This new state-of-the-art medicine was primed to replace the first four fiascos. Given the track record of Psychiatric Medical Model Theory (PMMT), it’s a wonder anyone took them seriously.

Unfortunately, nearly everyone did.

NEXT TIME: Part 3: Thorazine to the Rescue

"Medicine men devised all manners of disabling methods—for three centuries—finally discovering drugs as an easy and efficient means of achieving disability."
- David West Keirsey, Disable Madmen (https://professorkeirsey.wordpress.com/2011/08/17/disable-madman-part-i/)


Endnotes
1 The Journal of Visualized Experiments (JoVE) is a peer-reviewed scientific journal that publishes experimental methods in video format. https://www.jove.com/.

2 https://en.wikipedia.org/wiki/August_K%C3%B6hler.

3 How Julius Richard Petri's Dishes Changed Medical History, https://www.medicaldaily.com/how-julius-richard-petris-dishes-changed-medical-history-246396.

4 Ehrlich shared the 1908 Nobel Prize in Physiology or Medicine with Élie Metchnikoff for their contributions to the field of immunology.

5 Fleming, Howard Florey and Ernst Boris Chain shared the Nobel Prize in Physiology or Medicine in 1945.

6 You can read more about this at https://chelmsfordblog.wordpress.com/aftermath-of-the-scandal/

7 In her book First Half, Toni Lamond described her experience at Chelmsford: "I was given a semi-private room. On the way to it I saw several beds along the corridors with sleeping patients. The patient in the other bed in my room was also asleep. I thought nothing of it at the time. Although it was mid-morning, the stillness was eerie for a hospital that looked to be full to overflowing. I was given a handful of pills to take and the next thing I remember was Dr Bailey standing by the bed asking how I felt. I told him I'd had a good night's sleep. He laughed and informed me it was ten days later and, what's more, he had taken some weight off me. I was checked out of the hospital and this time noticed the other patients were still asleep or being taken to the bathroom while out on their feet." https://en.wikipedia.org/wiki/Deep_sleep_therapy.

8 The public’s negative perception of ECT was later reinforced by the movie One Flew Over the Cuckoo's Nest.

9 Read about the Judge Rotenberg Center at https://doctorcima.com/2012/06/

10 https://en.wikipedia.org/wiki/Insulin_shock_therapy#Decline

11 https://en.wikipedia.org/wiki/Insulin_shock_therapy

12 Ebaugh, FG. (1943). A review of the drastic shock therapies in the treatment of the psychoses. Annals of Internal Medicine. 18 (3): 279–296. doi:10.7326/0003-4819-18-3-279.

13 In 1953, British psychiatrist Harold Bourne published The Insulin Myth, arguing there was no sound basis for believing that insulin counteracted the schizophrenic process. He said the treatment “worked” because patients were chosen for their good prognosis and were given special treatment. Bourne had first submitted the article to the Journal of Mental Science; after a 12-month delay it was rejected, with the advice that he "get more experience." https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/wiki/Insulin_shock_therapy.html

14 https://en.wikipedia.org/wiki/Lobotomy. In addition, in Japan the majority of lobotomies were performed on children with behavior problems.

15 There have been calls in the early 21st century for the Nobel Foundation to rescind the prize it awarded to Moniz, characterizing the decision at the time as an astounding error in judgment. To date, the foundation has declined to take action and has continued to defend the results of the procedure.
