
What are systematic reviews and are they useful in vitamin D research?

Posted on: April 12, 2014   by  Brant Cebulla


A new study published in the British Medical Journal has inspired headlines casting doubt on vitamin D.

The study was an “umbrella review” of all systematic reviews and meta-analyses published on vitamin D through September 2013.

In this blog, I’ll cover what a systematic review is and what the Theodoratou et al study found. I’ll explain why they found what they found and whether systematic reviews are useful in vitamin D research.

What is a systematic review?

A systematic review is a study that finds all the research to date on a specific topic and synthesizes it into a single paper. Systematic reviews are lauded in medicine because, if well conducted, a review covers everything you need to know about its topic. Furthermore, the researchers conducting the review can make a recommendation based on all the evidence they found to help guide medicine.

Systematic reviews save researchers and health professionals lots of time. For example, if a doctor had a patient with migraine headaches, and she wanted to know whether she should recommend ibuprofen or aspirin, she would (ideally) start literature searching for evidence to help guide her and make the right recommendation for her patient.

Without a systematic review, the doctor would have to dig through all sorts of different research and form her own conclusion. This is time-consuming, and since the doctor is not a specialist in information science, she might even reach the wrong conclusion.

However, if there has been a systematic review on ibuprofen and aspirin for migraine headaches, all the doctor would need to do is look at that systematic review – the review has already looked at all the research and come to evidence-based conclusions on whether to use ibuprofen or aspirin or not. From here, the doctor could make a recommendation for her patient based on just this single systematic review.

In summary, systematic reviews are meant to save doctors, researchers and health professionals time and help them come to the right conclusions and recommendations on various topics.

When researchers conduct a systematic review asking a therapy question (e.g. does aspirin treat migraines?) or a prevention question (e.g. can aspirin help prevent heart attacks?), the researchers are looking for randomized controlled trials (RCTs) on the topic. If there are few RCTs, then the review will come to a conclusion in the realm of, “There’s uncertainty on this topic; we can’t make a recommendation at this time.”

This was the conclusion of the most recent systematic review on vitamin D by Theodoratou et al.

You might ask, “Well how helpful is that to health professionals if all they state is that there is uncertainty?”

Probably not as helpful as a definitive recommendation for or against a therapy or intervention, but stating that there is uncertainty still tells health professionals that we need more research, which helps give them perspective on the topic.

What did the Theodoratou et al paper find?

In their paper, Theodoratou et al performed a massive systematic review, collecting all the systematic reviews published to date on vitamin D, reviewing them, and compiling their findings into a single journal paper.

Since researchers are interested in vitamin D for a variety of conditions, Theodoratou et al found 107 systematic reviews and 74 meta-analyses of observational studies on vitamin D levels and 87 meta-analyses of RCTs of vitamin D supplementation. All of these reviews covered 137 different outcomes (e.g. hip fracture, type 2 diabetes, colon cancer…).

After performing their systematic review of systematic reviews, here’s what they found:

  • We have lots of observational data that clearly links vitamin D deficiency to many diseases.
  • We don’t have lots of interventional data (from RCTs) showing vitamin D supplementation can help in these diseases.

Duh, right?

The researchers summarize,

“In conclusion, although vitamin D has been extensively studied in relation to a range of outcomes and some indications exist that low plasma vitamin D concentrations might be linked to several diseases, firm universal conclusions about its benefits cannot be drawn.”

I would disagree with their first contention that vitamin D has been extensively studied. Something is not extensively studied if firm conclusions cannot be drawn.

While the Vitamin D Council does have firm vitamin D recommendations despite the lack of extensive study, our recommendations are based on equal parts evidence and logic, not solely top-tier evidence. Some nutrition and vitamin D researchers think we need a more logic-based approach to vitamin D recommendations until we get more studies. We agree, and I’ll explain a little more about this later in this blog.

First, I want to cover why we have lots of observational data and not a lot of RCT data.

The vitamin D research problem

While we do have lots of observational data on vitamin D, this data stems from cohorts that were not designed specifically with vitamin D in mind.

For example, lots of observational research comes from the NHANES cohort. NHANES was established in the 1960s to track and survey participants over time to see what kinds of baseline characteristics increase or decrease the risk of developing various diseases later in life. While we are able to look retrospectively at the relationship between vitamin D levels and the risk of developing diseases, NHANES was not designed to gain insight into vitamin D specifically, but rather into just about everything pertaining to nutrition and lifestyle habits.

So it’s important to recognize that while we do have good amounts of observational data on vitamin D, this isn’t because there has always been great interest in and money for vitamin D research. It’s only in the past 10 years that researchers have taken the time and interest to look at vitamin D retrospectively.

When researchers who are mostly versed in statistics and evidence-based medicine perform these systematic reviews, their conclusions feel like a personal attack on vitamin D researchers: the research you’ve completed to date isn’t good enough, you should be providing better data.

But how quickly can we move vitamin D research forward? Certainly 10 years’ time is not enough, at least not enough time to put together a meaningful systematic review.

RCTs are expensive, and this expense almost single-handedly explains the discrepancy between the amount of observational data and the amount of interventional data we have to date.

For example, one large randomized controlled trial underway is the VITAL trial, which is putting 10,000 participants on vitamin D and 10,000 on placebo. The trial will last five years and costs an astounding $22 million. It is very much worth it, but nevertheless expensive. This is the price we’re paying to get a little bit of data on whether 2,000 IU of vitamin D/day can reduce cardiovascular disease, cancer and more.

Compare this to the cost of an observational study. If we wanted to look at NHANES data and publish our findings, that might only cost $20,000-50,000: the cost of a statistician, a coder and access fees.

If we do the math, for every one large randomized controlled trial on vitamin D, we could perform roughly 628 observational studies (taking $35,000, the midpoint of that range, as the cost of each) – of course we have loads of observational studies and a lack of RCTs to date.
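That back-of-the-envelope figure can be checked in a couple of lines of Python. Note that the $35,000 observational-study cost is an assumed midpoint of the $20,000-50,000 range quoted above:

```python
# Rough cost comparison using the figures in this post.
rct_cost = 22_000_000   # VITAL trial: ~$22 million
obs_cost = 35_000       # ASSUMED midpoint of the $20,000-50,000 range

# How many observational studies one large RCT could fund
studies_per_rct = rct_cost // obs_cost
print(studies_per_rct)  # 628
```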

Even small randomized controlled trials are costly. A vitamin D trial of 100 participants for one year costs around $250,000. And a small trial like this wouldn’t be considered enough evidence for a systematic review to draw a firm conclusion on vitamin D.

So, as you can see for vitamin D, there are some financial barriers in doing RCTs. Vitamin D does not have the same luxury as a pharmaceutical drug; there is no one corporation funding the research. Researchers have to scrounge for funds for vitamin D. In contrast, observational studies are performed more often because they are much cheaper.

Are systematic reviews useful in vitamin D research?

It’s going to be another 5 to 10 years before an umbrella systematic review will find conclusive-ish evidence for vitamin D. Specific systematic reviews on vitamin D and specific conditions like hypertension, multiple sclerosis, lupus and more might be able to find conclusive-ish evidence a little sooner, but not much sooner.

You then have to ask the Theodoratou et al group: if you know the kind of data you’re looking for does not exist, why perform a systematic review at all? Would the money used to perform the systematic review not be better spent on primary research?

For now, systematic reviews are useful to researchers and informationists like the Vitamin D Council. They explain very succinctly where research currently stands. In fact, we use them routinely to craft our health condition summaries, to let people know where the evidence stands between vitamin D and condition XYZ.

But they probably aren’t as useful for health professionals and the public, as they’re not offering good guidance for the time being. Again, it will be another 5 to 10 years until systematic reviews on vitamin D can offer the public good guidance.

Why Theodoratou et al doesn’t influence vitamin D recommendations

In nutrition, recommendations for essential nutrients (vitamins, minerals, fatty acids and amino acids) have been set by looking at the average 20th century diet and seeing what, on average, people are getting. Through both observational studies and RCTs, public health officials have been able to fine-tune recommendations for some of these nutrients, but there is still a lot of uncertainty in nutrition.

The obvious limitation in this methodology is that you’re making the assumption that the 20th century diet is ideal. It might be. It might not be, hence the need to continue to progress nutrition research.

Using this same methodology, we would ask, how much vitamin D are people getting from food and sun exposure combined? If we looked at the average 20th/21st century American, the answer would be not much. We work indoors. We’re not exposed to sunlight much at all.

The Vitamin D Council and lots of vitamin D researchers would argue that a more rational approach is to look at how much vitamin D people are getting who are exposed to lots of sunlight. Humans evolved in an environment abundant with sunlight. For the majority of human history, we got lots of vitamin D from sunlight.

When we look at humans who still get lots of sunlight, we find that their vitamin D levels are high: about 68% of sun-dwellers have levels between 35.2 and 56.8 ng/ml, with a mean level of 46 ng/ml. This is the equivalent of about 5,000 IU of vitamin D/day.
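Those three numbers hang together if sun-dweller levels are roughly normally distributed – an assumption of mine, inferred from the figures: about 68% of a normal distribution falls within one standard deviation of the mean, and the quoted range is symmetric about 46 ng/ml. A quick sketch:

```python
from statistics import NormalDist

# ASSUMED: sun-dweller 25(OH)D levels are approximately normal.
mean = 46.0                 # mean level quoted in the post, ng/ml
sd = (56.8 - 35.2) / 2      # half the quoted range: 10.8 ng/ml

# Fraction of such a distribution falling inside the quoted range
dist = NormalDist(mu=mean, sigma=sd)
frac = dist.cdf(56.8) - dist.cdf(35.2)
print(f"SD = {sd:.1f} ng/ml, fraction in range = {frac:.1%}")  # ~68.3%
```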

This is why the Vitamin D Council recommends a vitamin D level of 50 ng/ml and an average intake of 5,000 IU/day. While we want to see more RCTs, and even more observational data, we feel that for the time being, looking at sun-dweller populations offers the most guidance, much as we set recommendations for most other nutrients.

Vitamin D is unarguably an essential nutrient. Like any nutrient, there are serious health consequences to not getting any of it. As your vitamin D level approaches 0 ng/ml, you’re at higher and higher risk of severe muscle fatigue and osteomalacia, a disease that makes your bones soft and weak.

When systematic reviews come out claiming that the health benefits of vitamin D are unclear, that’s a potentially dangerous statement. By conservative estimates, over a third of the world is deficient in vitamin D. It’s unwise to tell the public that vitamin D is not something to worry about – you’re putting a large portion of the public at risk for osteomalacia or subclinical osteomalacia.

What is uncertain are the exact benefits beyond keeping your bones and muscles strong. That’s okay. There’s always uncertainty in medicine. What is certain is that you need vitamin D. It’s an essential nutrient.

5 Responses to What are systematic reviews and are they useful in vitamin D research?

  1. Rita and Misty

    Greetings Vitamin D Council~~

    I am very saddened to learn that Brant is no longer a staff member for the VDC. Brant was a tremendous asset to this organization.

    Brant, I wish you all the best in this world, and I hope that your life journey brings you much happiness.

    In friendship,
    Rita
    203-464-5409 (this number works 24/7 for you, Brant)

  2. IAW

    Great article!
    Helps me to understand better the problems inherent in the “research system”.
    Ivy

  3. beverly@its-alimentary.com

    Thank you so much. It’s so helpful to have you shine light on the problems with these negative reports. Your very basic and easily understood analysis is very helpful in explaining to my nutrition clients why the mantra “you can get all you need from your food” is not good advice.
    Brant, I wish you well at your next endeavors!

  4. rcbaker200@comcast.net

The well publicized recent studies mostly dealt with giving subjects 500 to 800 units of vitamin D. I have drawn 25OH levels on several thousand people who were taking these amounts in multi-vitamins or calcium pills, and practically all of them were still less than 32 ng/ml – in other words, vitamin D insufficient.
    So the recent reviews were totally worthless and meaningless.
    Robert Baker MD Cherry Hill, NJ

  5. Rita and Misty

    🙂 On 6,000 iu D3 daily my level tested at 32 ng/ml.

    The only way to know your level is to test your blood.

    I need to take around 4 X 6,000 iu D3 daily to have an optimal 25(OH)D level.

    Do you think I am unique? I don’t think so.

    😉
