Case closed! Vitamin D controversy—settled, once and for all

During weekends on the radio in the 1930s and ’40s, and then on TV in the ’50s and ’60s, it was nearly impossible to avoid “The Original Amateur Hour” (an early version of “American Idol” and other 21st-century copycats). It was even harder to avoid the ads from the show’s sponsor—the liquid iron supplement Geritol.

Indeed, “Amateur Hour” is an apt description of the mid-20th century medical approach to human nutrition. Mainstream medicine was convinced that nutrition, nutrients, and dietary supplements couldn’t be important for health in comparison to the marvels of modern medical ministrations.

Back then, iron was the only nutrient doctors really paid any attention to—based on the belief that menstruating women, regularly losing some blood, might need iron supplementation. Taking a cue from that concern, the dietary supplement industry (such as it was at the time) relentlessly promoted Geritol and the hazards of “iron-poor blood.”

In essence, because doctors knew so little about human nutrition, they busily prescribed one of the very few nutrients that really is toxic.

Unfortunately, half a century later, it’s still amateur hour at the CDC. And career science bureaucrats are still trying to argue that the solid evidence against iron supplementation isn’t real. They’re even attacking scientists who did the research on the dangers of excess iron—including a Nobel laureate.

Even worse, today’s bureaucratic ignorance about nutrition extends beyond iron. But as evidence for the importance of nutrients like vitamin D has mounted, the strategy seems to have changed. Instead of ignoring, denying, or diminishing the data and the researchers, the new strategy is to admit the evidence exists—but then try to create the impression that there is still controversy or “debate” about the data.

The same strategy is applied to drugs. When the evidence piles up overwhelmingly against drugs like statins and low-testosterone treatments (see the August issue of Insiders’ Cures), we then have to have a lengthy “debate” about the “controversy,” which, conveniently for big pharma, lasts just long enough for the patents to expire and the profits to be harvested on these harmful drugs.

But the difference between the drug debate and the supplement debate is that if you wait to take a drug until the controversy is settled, you may well be better off. But if you wait to take a critical nutrient like vitamin D…you’ll be setting yourself up for some very serious health concerns.

So today, let’s settle the current “debate” on vitamin D once and for all.

The overwhelming case for vitamin D supplementation

Back in 2006, I gave the keynote address at the annual Johns Hopkins Medical Center conference on complementary and alternative medicine. My speech was followed by a presentation from Michael Holick, MD, PhD—a world-renowned expert on vitamin D at Boston University.

He pointed out that D is important for much more than just bones, which was as far as the prevailing medical opinion went at the time. Reams of laboratory, statistical, and epidemiological research show that every organ, tissue, and cell in the body has receptors for vitamin D. Dr. Holick also discovered that D influences the regulation and expression of over 400 different genes—which means it’s involved in virtually every process inside our bodies.

Since then, I’m pleased to see that pathology laboratory doctors are also weighing in on vitamin D research. I pay attention to this source not only because pathology is my own original specialty of medicine, but also because doctors in this specialty tend to remember what we were taught about chemistry and biochemistry. They tend to be the most scientific in their approach and have the best ability to evaluate medical research in general (although, on average, they know no more about nutrition than other doctors).

Now that it’s well known that vitamin D is essential to every organ and system in the body, it’s certainly no surprise that a variety of researchers and physicians are repeatedly discovering that D plays a critical role in health and disease.

For example, research has proven that vitamin D plays a key role in preventing and treating five major types of disease.

  • Cancer. Lab studies strongly show that vitamin D influences the proliferation, differentiation, and death of cells throughout the body—making it beneficial in both preventing and treating cancer. The only debate is about how much D needs to be in the blood to have an anti-cancer effect. Sadly, there is little data on this because researchers and physicians often do not measure vitamin D levels in cancer patients (I’ll explain why a little later).
  • Cardiovascular disease. Vitamin D is found in heart muscle cells, heart connective fibers, the cells in blood vessel walls, and the cells lining blood vessels. So it makes sense that research shows that D deficiency contributes to high blood pressure and enlargement of the heart muscle—both of which can lead to heart attacks and strokes, and heart failure. Studies also show that increased levels of vitamin D may offer a safe, new therapy for congestive heart failure.
  • Kidney disease. Some research shows that low levels of D are associated with higher mortality in people with kidney disease.
  • Multiple sclerosis. Low vitamin D is strongly associated with increased risk of MS. The higher prevalence of this disease in regions where sunshine isn’t strong enough to help the body make its own vitamin D has been staring medical science in the face for decades. But, as I reported in the November 2013 issue of Insiders’ Cures, it wasn’t until the 2013 annual international meeting of MS researchers that the mainstream finally focused on the obvious role of vitamin D—but only after trying virtually everything else over the years!
  • Mental and cognitive health. Research shows that vitamin D activates receptors in the area of the brain associated with depression. And, as I note in my article on vitamin D, it is also strongly associated with preventing dementia and improving dementia symptoms in people with Alzheimer’s disease.

So why the “D-bate”?

At this point, the only controversy about vitamin D should be that too many medical research studies, and too many physicians, simply don’t measure patients’ vitamin D levels at all. And it certainly doesn’t help when an editorial in the influential British Medical Journal this past winter recommended that doctors shouldn’t bother with the trouble and expense of measuring vitamin D in their patients!1

It’s true that there are some technical challenges involved in measuring vitamin D levels in the blood. First of all, the vitamin is rapidly metabolized. And chemically, it behaves like a fat, so it doesn’t mix well with blood. There are also different forms of D that circulate in our blood, and not all lab tests measure the same forms, so there is a lack of standardization. And finally, there are two different units of measurement for vitamin D, and they’re not directly comparable.
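
(To give you a rough sense of those two units: U.S. labs typically report 25-hydroxy-vitamin D in nanograms per milliliter, or ng/mL, while many labs elsewhere use nanomoles per liter, or nmol/L. As an approximate rule of thumb, multiply ng/mL by 2.5 to get nmol/L. So, for example, a level of 48 ng/mL works out to roughly 120 nmol/L.)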

Despite the battery of automated blood chemistry tests done every time you go to the doctor, whether you need them or not (they are the cash cows for all the hospital labs), the technology does not exist to routinely include vitamin D measurement in those panels. So instead of letting health and medical concerns determine routine lab testing practices, we are letting standard (profitable) laboratory routines determine health and medical practice.

On top of the technical issues, the quasi-governmental Food and Nutrition Board of the U.S. Institute of Medicine keeps changing its position regarding the recommended daily intake of D. And it still focuses only on skeletal health, ignoring the vitamin’s crucial role in every other part of the body.

Throw in some outdated and unfounded concerns about D “overdose” among physicians (who happily dole out toxic iron supplements to millions who don’t need them), and we have the requisite “controversy” and debate that stand in the way of good nutrition, dietary supplementation, and health.

And the result of this so-called “controversy”? Creation of unnecessary obstacles that interfere with the widespread adoption of effective screening methods for vitamin D, and obscure the obvious benefits of—and need for—this crucial vitamin in all people.

As far as I’m concerned, the “debate” about vitamin D is over. And two facts are indisputable: (1) vitamin D is a highly potent health promoter, and (2) most of the population is severely vitamin D deficient.

Plenty of research shows that most people don’t (and can’t) get enough vitamin D from diet and sun exposure alone. As I discussed in the October issue of Insiders’ Cures, for people who live above the latitude of Atlanta, even “safe” sun exposure is not enough to maintain optimal D levels year-round. It is especially critical now in November, as the sun gets too low in the sky for its rays to trigger vitamin D production in the skin.

So what does all the latest science say about vitamin D levels? Serum vitamin D (25-hydroxy-vitamin D) levels below 48 ng/mL are associated with higher rates of preventable diseases. And 90 ng/mL is a good “upper limit.” (Though vitamin D toxicity is rare when blood levels are below 200 ng/mL.)

To reach optimal levels, I generally recommend taking 5,000 IU of vitamin D year round. You can also get some of this crucial vitamin from meat, fatty fish like salmon, dairy products, and leafy greens, plus 10 to 15 minutes a day in the sunshine—without sunscreen—between April and October.

Ironing out another controversy

Before we had the vitamin D debate, we had the iron debate. But unlike D, the problem with iron is that misguided government bureaucrats and doctors still, to this day, recommend too much iron supplementation.

Iron is important for our blood to be able to carry oxygen. But it does not belong outside the blood cells, and the entire body needs only a total of about 4,000 mg (4 grams). If you eat a healthy, balanced diet, you’ll easily maintain that total of 4,000 mg.

So what happens if you take excess iron? Well, it can act as an oxidant, generating free radicals that contribute to a variety of diseases. Many years ago, my faculty advisor, Nobel laureate Baruch Blumberg, understood this basic biochemistry and proposed a theory that too much iron is toxic to the liver and other organs—and could cause cancer.

When I went to work at the National Cancer Institute as a young scientist, Dr. Blumberg knew I could get access to the largest human database (taxpayer funded) that had yet been gathered, and proposed a study to test whether high iron levels eventually lead to more cancer. The NCI’s middle-management science bureaucrats didn’t like this idea—even if it did come from a Nobel laureate—and refused access to the data (let alone funding support for the study). Of course, these were the same minions who had no problem dismissing the idea of another Nobel laureate, Linus Pauling, that vitamin C was important for preventing cancer.

Finally, after I left NCI, we petitioned the Department of Energy (which fortunately has an “alternate” medical research program on the effects of ionizing radiation and reactive ions such as excess iron)—and at last got access to the data and funding to do the study. The results clearly showed that excess iron leads to more cancer, of virtually every type, in both men and women. The data was so compelling that the study was published in the New England Journal of Medicine (the gold-standard medical publication in the U.S.) as well as the International Journal of Epidemiology.

Since then, others have found that excess iron leads to more infections and to cardiovascular disease. This is Biochemistry 101, but judging by the perpetually misguided recommendations for optimal iron dosages, some public health bureaucrats and doctors must have skipped those classes or forgotten what they learned.

The bottom line, as I’ve said before: Never take any supplement that contains iron unless you’ve been diagnosed with iron-deficiency anemia by a qualified physician.

Source:

1 “Vitamin D and chronic disease prevention.” BMJ 2014; 348: g2280.