Medical history is crowded with experimental treatments adopted before anyone understood their long-term effects. Many of these so-called cures were not only ineffective but downright harmful.
As we reflect on these practices, some of which persisted into the 1970s while others were already cautionary tales by then, it's crucial to remember how far medical science has come. This list explores 20 treatments that, instead of healing, often made patients worse.
1. Radium Water
In the early 20th century, radium water, most infamously the patent tonic Radithor, was marketed as a health drink promising vitality and energy. Unfortunately, this radioactive elixir was far from beneficial.
Individuals consumed it believing it would cure various ailments, unaware of the potential for radiation poisoning.
Symptoms such as headaches, dizziness, and more severe health issues were common among regular consumers. Ironically, the very product meant to rejuvenate actually sapped life away.
Today, we understand the dangers of radiation exposure, but back then, radium water was a trusted remedy. Avoiding such hazardous substances is crucial for long-term health.
2. Bloodletting Revival
Bloodletting, a practice dating back centuries, lingered in folk and fringe medicine long after mainstream physicians abandoned it around the turn of the 20th century. Practitioners believed it could cure or alleviate chronic diseases by balancing the body's humors, but the misguided practice often left patients weaker and more vulnerable to infections.
Though it was supposed to cleanse the body, it frequently led to anemia and other complications. The irony is stark—attempts to enhance health often resulted in the opposite.
Modern medicine thankfully recognizes the importance of maintaining blood volume and reserves therapeutic phlebotomy for the few conditions, such as hemochromatosis and polycythemia vera, where removing blood genuinely helps.
3. Electroconvulsive Therapy Overuse
Electroconvulsive therapy (ECT) was widely overused in the 70s, often applied without adequate understanding of mental health issues. Thought to reset the brain, ECT was frequently prescribed for conditions ranging from depression to anxiety.
However, the side effects were severe, including memory loss and cognitive dysfunction. Patients often reported feeling worse post-treatment, leading to a mistrust of mental health therapies.
While ECT is still used today, it is administered under anesthesia with muscle relaxants, closely regulated, and generally reserved for severe, treatment-resistant cases, showcasing the significant progress in psychiatric care over the decades.
4. Tobacco Smoke Enemas
Tobacco smoke enemas had their bizarre heyday in the 18th century, when they were used in attempts to resuscitate drowning victims; resuscitation kits were even stationed along the River Thames. The theory was that blowing smoke into the rectum would warm the body and stimulate respiration.
Unfortunately, this practice was not only ineffective but also potentially harmful. The introduction of tobacco into the body in this manner could cause irritation and other complications.
This peculiar use of tobacco highlights how desperation for cures can lead to unusual and hazardous practices. Such historical remedies remind us of the importance of evidence-based medicine.
5. Cocaine for Toothaches
In the late 19th century, cocaine was adopted as a dental anesthetic, capitalizing on its numbing effects; over-the-counter cocaine toothache drops were even marketed for children. However, this potent stimulant carried significant risks, including addiction and various health complications.
While it might have provided temporary relief, the side effects far outweighed any benefits. Patients often found themselves dealing with increased heart rates, anxiety, and potential dependency.
Today, safer synthetic local anesthetics such as procaine and lidocaine have long since replaced cocaine, reflecting a more informed approach to pain management in modern dentistry.
6. Lobotomies for Behavioral Issues
Lobotomies were once hailed as a revolutionary solution for behavioral and mental health issues. The procedure peaked in the 1940s and early 1950s and fell out of favor once antipsychotic drugs arrived, though isolated operations were still performed as late as the 1970s.
It involved severing connections in the brain’s frontal lobe, often leaving patients with permanent cognitive impairments.
Though intended to reduce symptoms of mental illness, the procedure often resulted in a loss of personality and autonomy.
The medical community eventually recognized the severe consequences, moving toward more humane and effective treatments. The history of lobotomies serves as a cautionary tale in psychiatric care.
7. Mercury in Diuretics
Mercury, a toxic metal, was the active ingredient in the mercurial diuretics that dominated the treatment of edema, particularly in heart failure, from the 1920s until safer thiazide diuretics displaced them in the late 1950s.
Patients inadvertently exposed themselves to mercury poisoning, leading to neurological and renal damage. The symptoms included tremors, mood swings, and cognitive impairments, which were often more debilitating than the original conditions.
As medical science advanced, the dangers of mercury became undeniable, prompting its removal from pharmaceutical use. This shift highlights the ongoing evolution of safer, more effective medication practices.
8. Thalidomide for Morning Sickness
In the late 1950s and early 1960s, thalidomide was prescribed to pregnant women to combat morning sickness and hailed as a miracle drug. It soon became infamous for causing severe birth defects and was withdrawn from most markets by 1962, in what became a global medical scandal.
The drug’s teratogenic effects resulted in thousands of children born with physical deformities, a tragedy that could have been avoided with more rigorous testing and regulation.
This incident underscored the importance of drug safety, leading to stricter pharmaceutical standards such as the Kefauver-Harris Amendment of 1962 in the United States, which required proof of safety and efficacy before approval. The thalidomide disaster remains a pivotal moment in medical history, teaching the value of precaution in drug development.
9. Tapeworm Diet Pills
Tapeworm diet pills, advertised in the early 20th century, promised effortless weight loss by hosting live tapeworms in the digestive tract. Historians debate how many such pills actually contained tapeworm eggs, but the approach itself was both ineffective and dangerous.
The idea was that tapeworms would consume calories, helping users shed pounds without effort. However, this often led to malnutrition, digestive issues, and other severe health problems.
The risks of such a method far outweighed any potential benefits, illustrating the perils of extreme dieting measures. It’s a stark reminder of the importance of healthy, balanced approaches to weight management.
10. DDT for Pediculosis
DDT, a powerful insecticide, was widely used from World War II onward to treat pediculosis, or lice infestations; dusting with DDT famously halted a typhus epidemic in Naples in 1943-44. Though effective at killing lice, DDT posed significant health risks to humans and the environment, and resistant lice soon emerged.
Prolonged exposure led to symptoms such as nausea, dizziness, and long-term neurological damage. The environmental impact was also profound, as DDT accumulated in ecosystems, harming wildlife.
Eventually, the dangers prompted bans, beginning with the 1972 United States prohibition on most agricultural uses, paving the way for safer treatments. This shift illustrates the importance of considering both human health and environmental consequences in pest control.
11. Lead-Based Treatments
In the 1970s, lead-based treatments were still used for various ailments, despite known risks. These remedies included lotions and powders, believed to cure skin conditions or enhance beauty.
However, lead is a potent neurotoxin, and exposure led to serious health issues, such as cognitive impairment and behavioral changes. The harmful effects were often more severe than the conditions they aimed to treat.
As awareness of lead poisoning grew, these products were phased out, emphasizing the necessity for safe and non-toxic medical treatments. The legacy of lead-based remedies serves as a warning against hazardous ingredients.
12. Fen-Phen for Weight Loss
Fen-Phen, a combination of fenfluramine and phentermine, became a popular weight-loss regimen in the 1990s, widely prescribed for its appetite-suppressing effects.
In 1997, however, the combination was linked to heart valve damage and pulmonary hypertension, and fenfluramine was pulled from the market. Patients often faced more health problems than they had before starting the medication.
The controversy surrounding Fen-Phen led to increased scrutiny on weight loss solutions, highlighting the need for safe and effective treatments. This episode in medical history underscores the importance of thorough testing in the development of pharmaceuticals.
13. Uranium in Cancer Treatments
During the 1970s, uranium was occasionally used in cancer treatments, believed to target and destroy malignant cells. However, the radioactive nature of uranium posed significant health risks.
Patients often suffered from radiation sickness, experiencing nausea, fatigue, and increased vulnerability to infections. The treatments could be more debilitating than the cancer itself.
Thankfully, advancements in oncology have led to more targeted and safer therapies, reflecting a deeper understanding of cancer biology. Uranium-based treatments serve as a reminder of the importance of precision and safety in medical interventions.
14. Arsenic-Based Tonics
Arsenic-based tonics such as Fowler's solution were a health staple from the late 18th century well into the 20th, marketed for supposed rejuvenating and complexion-enhancing properties. Despite arsenic's well-known toxicity, these tonics were popular among those seeking vitality.
Unsurprisingly, regular consumption led to arsenic poisoning, which manifested as abdominal pain, skin changes, and neurological symptoms. The harmful effects far outweighed any perceived benefits, proving these tonics more dangerous than helpful.
The eventual decline of arsenic-based products highlights the critical need for consumer awareness and regulatory oversight in the wellness industry, ensuring that safety takes precedence over spurious health claims.
15. Colchicine for Everything
Colchicine, primarily used to treat gout, was overprescribed in the 1970s for a range of conditions. This misuse often led to toxicity, as colchicine has a narrow therapeutic window.
Patients experienced symptoms such as diarrhea, abdominal pain, and, in severe cases, organ failure. The overreliance on colchicine illustrated a lack of understanding of its potent effects and limitations.
Today, stricter guidelines govern its use, ensuring safe and effective treatment. This history reflects the ongoing need for precise medication dosing and the dangers of overprescription in medical practice.
16. Vitamin Mega-Dosing
The 1970s saw a trend of vitamin mega-dosing, where individuals consumed large quantities of vitamins, believing in their unlimited health benefits. While vitamins are essential, excessive intake can be harmful.
Symptoms of toxicity ranged from nausea to nerve damage and liver injury, depending on the vitamin; fat-soluble vitamins such as A and D are especially hazardous in excess because the body stores rather than excretes them. The belief that more is better often did more harm than good, with many users experiencing adverse effects.
Modern understanding emphasizes balance and moderation, recognizing that excessive vitamin intake can be just as detrimental as deficiencies. This serves as a reminder of the importance of informed health decisions.
17. Insulin Shock Therapy
Insulin shock therapy, introduced in the 1930s and used widely into the 1950s, was a controversial treatment for schizophrenia involving comas induced by insulin overdoses. Like ECT, this drastic approach was thought to reset the brain.
However, the risks were significant, including prolonged comas, brain damage, or even death. Patients often emerged with worsened conditions, questioning the therapy’s efficacy.
The decline of insulin shock therapy signaled a shift toward more humane psychiatric treatments, emphasizing patient safety and dignity. This historical practice highlights the importance of evidence-based approaches in mental health care.
18. Strychnine as a Stimulant
Strychnine, a highly toxic compound, was marketed in the late 19th and early 20th centuries as a stimulant and performance enhancer; the winner of the 1904 Olympic marathon was infamously dosed with strychnine and brandy mid-race. Athletes and fitness enthusiasts used it, unaware of its deadly potential.
While it might have temporarily boosted physical performance, the side effects were severe, including muscle spasms, convulsions, and even death. The risks clearly overshadowed any benefits.
Today, strychnine is recognized for its toxicity, and safer, regulated substances have replaced such dangerous practices. This episode in fitness history underscores the importance of safety and regulation in performance enhancement.
19. Amphetamines for Schoolchildren
In the 1970s, amphetamines were commonly prescribed to schoolchildren to manage hyperactivity and attention issues. While they provided temporary behavioral control, the long-term effects were concerning.
Children often experienced side effects such as insomnia, decreased appetite, and mood swings. The reliance on such stimulants raised ethical questions about medicating young minds.
Over time, more comprehensive approaches to managing attention disorders have evolved, focusing on behavioral therapies alongside medication when necessary. This shift reflects a more nuanced understanding of childhood development and mental health.
20. Coca-Cola as a Contraceptive
For decades, a myth circulated that Coca-Cola could act as a contraceptive when used as a douche after intercourse. The baseless claim gained real traction despite being entirely ineffective; sperm reach the cervix within minutes, long before any douche could intervene.
Not only did it fail to prevent pregnancy, but it also posed risks of infections and irritation. The reliance on such myths highlights the need for proper sexual education and access to reliable contraceptive methods.
Thankfully, researchers have since formally tested and debunked the claim, and far safer, more effective contraceptive options are widely available. This tale serves as a reminder of the importance of accurate information in public health.