Should You Take Vitamins?


Check out the vitamin aisle of any drugstore and you’ll find a bottle promising to cure every problem you could ever dream of having. Reading the labels, you’d think a perfect life is just a few supplements away: heart health, weight loss, improved vision, a strengthened immune system, even male enhancement. Could a vitamin really deliver any of those things?

According to most studies, no. In an article for The Atlantic, Paul Offit writes that “nutrition experts contend that all we need is what’s typically found in a routine diet.” He cites several studies showing that most vitamin supplements are ineffective, more placebo than panacea: “at least 15 studies have now shown that vitamin C doesn’t treat the common cold.” Scientists seem to be in agreement that vitamins can’t do a whole lot, much less fulfill their lofty promises.

But it gets a little messier than that: these supplements aren’t regulated by the FDA. On fda.gov, the agency states that it is “not authorized to review dietary supplement products for safety and effectiveness.” Instead, “the manufacturers and distributors are responsible for making sure their products are safe.”

It’s hard to believe that something we ingest is left in the hands of companies whose priority is profit. Vitamins are designed to affect our bodies, which means they can just as easily affect them for the worse. While popular supplements like iron, calcium and multivitamins aren’t necessarily dangerous, it’s important to be conscious of anything you put into your body. Know exactly what the vitamins you take are meant to do, and whether they can safely be taken alongside any prescription drugs you’re on. All in all, though, science is in and vitamins are out!