Most misguided medical treatments of all time

By Naveed Saleh, MD, MS
Published February 26, 2020


The evidence-based medicine (EBM) movement is considered one of the most significant milestones of modern medicine. It’s described as “the conscientious, explicit, judicious and reasonable use of modern, best evidence in making decisions about the care of individual patients,” according to experts who’ve written about the approach. Essentially, EBM optimizes clinical decision-making by integrating clinical experience and patient values with the best available evidence from well-designed and well-conducted research.

The EBM movement began in 1981 after a coterie of clinical epidemiologists at McMaster University published a series of articles instructing physicians on how to evaluate the medical literature. In 1991, the term “evidence-based medicine” was formally introduced into the medical lexicon. By using EBM, physicians are able to apply their clinical experience to the latest and greatest medical research to diagnose health problems more quickly and accurately, and prescribe the most appropriate treatments for the best outcomes.

Before EBM gained traction, however, the uncertainty inherent in medical practice resulted in some truly misguided medical treatments. Here’s a closer look at five such treatments that serve as reminders that clinical practice continues to evolve: 

Early radiation therapy

Modern radiotherapy is rooted in discoveries made in the 1970s and 1980s, when novel devices capable of delivering proton beams emerged. Ion beams allowed radiotherapy to be delivered in a controlled fashion and turned it into an effective oncologic tool. In the late 1990s, stereotactic radiation therapy (ie, 3D conformal radiotherapy) was introduced, providing an even safer and more effective method of cancer treatment.

But before this era, when radiotherapy was still evolving, the adverse effects of radiotherapy interventions far outweighed their benefits. Although radiotherapy was performed using radium and x-rays in the 1910s, it wasn’t until the 1920s that scientists learned that radiation needed to be fractionated rather than delivered at “full blast.” Radioprotection, moreover, wasn’t formally recognized until the creation of the International Commission on Radiological Protection in 1928. Importantly, the ionization chamber was introduced in 1932, permitting a rudimentary form of dosimetry.

Bloodletting

Early surgeons in 13th-century England took over the practice of bloodletting from medieval barbers. The surgeons lent it a scientific veneer, using lancets of varying calibers and shapes to access veins at various anatomical levels. (Interestingly, The Lancet is named after the lancets used for bloodletting.) Blood was collected from the opened veins in basins so that it could be weighed and measured, and the surgeons would staunch the bleeding once they judged that enough blood had been let.

(If you’re wondering how much blood was let during these procedures, you may be surprised to learn that about 500 mL was taken on average—or the volume equivalent of a modern-day transfusion!)  

As late as the 19th century, patients at English hospitals were still being bled twice a year to maintain good health, and hospital wards were filled with patients convalescing after passing out from the procedure.

Very early on, physicians questioned the prudence of bloodletting because of concerns about exsanguination. George Washington, for instance, died after 4 bouts of bloodletting over 2 days for severe strep throat; his physician pointed to the removal of too much blood as a possible cause of death.

The practice of bloodletting began to taper off around 1830, and by the 20th century routine bloodletting had died out, with the United States being one of its last bastions.

Pyrotherapy

In addition to voicing concerns about the practice of bloodletting, Hippocrates observed that fever secondary to malaria could quiet epilepsy. Galen likewise noted the value of pyrotherapy, the practice of inducing fever for therapeutic purposes, remarking on a case of depression that was cured by an episode of quartan fever, a form of malaria.

In 1917, Austrian neuropsychiatrist Julius Wagner-Jauregg discovered that infecting patients with malaria could treat dementia paralytica, a manifestation of tertiary syphilis. While working in an asylum, he had observed that patients with general paralysis occasionally regained lucidity following a febrile episode. After experimenting with several artificial methods of inducing fever, he found that malaria worked best, so he inoculated affected patients with malaria. Once the syphilis had been treated, patients received quinine to cure the malaria. Dr. Wagner-Jauregg won the 1927 Nobel Prize for his discovery.

Controlled infection with malaria was no haphazard practice. Patients were admitted to the hospital, and their vital signs and laboratory tests were closely monitored. Other practitioners induced fever artificially with injections of foreign proteins or chemical substances, such as sulfur, or with parasitic infections. Patients were also immersed in electromagnetic fields, placed in hot baths or heat cabinets, or even exposed to radiation therapy for its pyretic effect.

The use of pyrotherapy to treat neurosyphilis continued into the 1950s, when it was supplanted by penicillin. Despite his questionable practice of using one infection to treat the symptoms of another, Dr. Wagner-Jauregg remained highly regarded in his field, and his approach laid the groundwork for later shock therapies in psychiatry, including insulin coma therapy and electroconvulsive therapy.

Insulin shock therapy

Three revolutionary psychiatric treatments came into existence in the 1930s: electroconvulsive therapy (ECT), leucotomy (ie, lobotomy), and deep insulin coma therapy. Only ECT persists to this day. However, deep insulin coma therapy for schizophrenia remained in practice until the 1950s.

Although other experimenters had found that insulin improved symptoms of psychosis, Austrian psychiatrist Manfred Sakel took things a step further by using insulin to induce full-blown hypoglycemic coma. Because the hypoglycemia could be reversed, he was permitted to practice his technique in Vienna. Between 1934 and 1935, he published 13 reports reporting a success rate of 88% in patients with schizophrenia treated in this fashion. He hypothesized that insulin countered the effects of adrenal hormones, which he believed drove schizophrenia. His work was lauded and adopted globally.

Critics of the therapy argued that it improved symptoms of psychosis simply because patients were receiving hours of personal attention rather than the near-total neglect that had been the norm. The practice was largely replaced by neuroleptic drugs in the 1960s.

Tobacco smoke enemas

Tobacco was introduced to Europe from the New World in the 16th century, and toward the end of the 18th century a Native American practice of blowing tobacco smoke into a patient’s rectum became popular in England and the rest of Europe.

Initially, tobacco smoke enemas were used to resuscitate drowning victims, the rationale being that the smoke not only warmed the body but also stimulated respiration. Early tobacco smoke enemas were performed with little more than a simple tube into which the practitioner blew. As the procedure became popular for a wide range of common illnesses, including headaches, hernia, abdominal cramps, and infections, practitioners who applied it to more serious conditions such as cholera risked aspirating or swallowing infected fecal matter. Later, bellows and rectal tubes were used to protect the practitioner from exposure.

In 1811, English scientist Benjamin Brodie demonstrated that nicotine was cardiotoxic, and tobacco smoke enemas soon fell out of favor.
