Medical breakthroughs that were originally rejected

By John Murphy
Published October 21, 2020

Lee Iacocca, former president of Ford Motor Company and the man behind the Ford Mustang, once said, “In times of great stress or adversity, it’s always best to keep busy, to plow your anger and your energy into something positive.”

It takes a lot of confidence and energy to fight adversity, even when you know you’re right. Take a look at the difficulties faced by the following physicians, whose medical breakthroughs were criticized in their day but have since become widely and famously accepted.

Vaccines

Anti-vaxxers have been around since vaccination was invented. In the late 1700s, British physician Edward Jenner developed the first vaccine, an inoculation against smallpox, a devastating disease at the time. He sent a paper about his experiment to the Royal Society, but it was summarily rejected. Dr. Jenner continued his experiments and was eventually published.

But his discovery was met with criticism and outright scorn. The local clergy called the vaccine unchristian because it was cultivated from an animal. For parents, the idea of actively infecting a child seemed outlandish. Many people refused mandatory vaccination because, they said, it infringed on their personal freedom. People formed anti-vaccination groups in both Britain and the United States.

Eventually, science overcame skepticism. In 1980, the World Health Assembly declared smallpox eradicated.

Antiseptic handwashing

These days, you have to constantly remind patients to wash their hands. It’s enough to drive you crazy, right? Wait until you hear what happened to the guy who started it all.

Years before Louis Pasteur revealed that germs spread infectious diseases, 19th-century Hungarian doctor Ignaz Semmelweis wondered why so many mothers in the maternity ward were dying of puerperal fever after giving birth.

When young Dr. Semmelweis realized that many of the medical students were performing autopsies as well as delivering babies, he concluded that doctors with “cadaverous particles” on their hands were infecting—and killing—their patients. He instructed the medical staff to wash their hands and instruments with both soap and a chlorine solution. As a result, the mortality rate of puerperal fever in the maternity ward dropped by 90% in less than 6 months. It was a momentous discovery.

Surprisingly, most other doctors rejected his idea of antiseptic handwashing. His discovery was both mocked and ignored. (Even today, more than half of doctors don’t properly wash their hands.) Frustrated, Dr. Semmelweis berated and belittled those who opposed him. After several years of trying to convince doctors of his idea, he suffered a nervous breakdown and was eventually committed to an insane asylum at the age of 47, where he died of sepsis only 2 weeks after being admitted.

Bone marrow transplant

The idea of a bone marrow transplant had been around for decades before it was ever achieved. The concept itself was accepted; what drew significant controversy and skepticism was how to actually accomplish it without the patient dying.

The idea became popular after World War II, thanks in part to the information about irradiation gleaned from nuclear bomb explosions. But it wasn’t until 1956 that E. Donnall Thomas, MD, performed the first successful bone marrow transplant between identical twins—one with leukemia and the other with healthy bone marrow to offer.

However, the process wasn't fully understood for another few years, not until scientists recognized that bone marrow contains two kinds of stem cells (hematopoietic and stromal) as well as human leukocyte antigens, which help the immune system determine what to accept and what to reject.

Progress on bone marrow transplantation has since proceeded rapidly: more than 23,000 bone marrow or cord blood transplants were performed in the United States in 2018 alone.

Helicobacter pylori and stomach ulcers

For most of the 20th century, the prevailing wisdom among gastroenterologists was that ulcers were caused by stress, spicy food, or too much stomach acid. But in the early 1980s, two doctors in Australia noticed something strange about their patients with ulcers—the patients’ stomachs were populated with an unknown microorganism, which they named Helicobacter pylori.

“I presented that work at the annual meeting of the Royal Australasian College of Physicians in Perth,” said Barry Marshall, MBBS, one of those Australian doctors, in an interview with Discover magazine. “That was my first experience of people being totally skeptical. To gastroenterologists, the concept of a germ causing ulcers was like saying that the Earth is flat. After that I realized my paper was going to have difficulty being accepted.”

Even while Dr. Marshall was successfully treating ulcer patients with a simple, inexpensive 2-week course of antibiotics, he faced constant rejection and criticism from the medical community. The research had to be fast-tracked, he thought, but experiments in animal models weren’t working. He needed to test it on humans. So, he cultured the bacteria, “swizzled the organisms around in a cloudy broth and drank it,” he said.

Dr. Marshall became ill within days. Within a week and a half, an endoscopy revealed his stomach was inflamed and overrun with the bacteria. Dr. Marshall had proved his hypothesis, but it took another decade before the idea finally caught on.

In 2005, Dr. Marshall and Robin Warren, MBBS, the other Australian doctor involved, were awarded the Nobel Prize in Physiology or Medicine for their discovery of H. pylori as the cause of peptic ulcers.
