Researchers from the University of California, Irvine, led by Professor Andrew Noymer, recently compiled the results of 3,440,710 individual 25(OH)D measurements sent to the Mayo Clinic from all over the United States between July 2006 and December 2011. They wanted to see whether seasonal variation in 25(OH)D levels still exists in the years since sun avoidance advice became widespread, and whether that seasonality is changing over time.
Instead of the actual 25(OH)D levels, they used the percentage of samples with 25(OH)D levels above 25 ng/ml. They also took account of air mass, the length of the atmospheric path that sunlight must travel, which causes the peaks and troughs of 25(OH)D levels to lag behind the solstices by about a month. Below is their graph.
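To make the lag idea concrete, here is a minimal sketch (not the authors' code; the data below are synthetic and all numbers are illustrative) of fitting a sinusoid to monthly "% of samples above 25 ng/ml" values and recovering how many days the fitted peak trails the summer solstice:

```python
# Sketch: recover the seasonal phase lag from synthetic monthly data.
# This is an illustration of a sinusoidal seasonal fit, not the study's method.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(15, 365, 30)                 # mid-month sampling days (hypothetical)
true_peak = 172 + 30                          # summer solstice plus a ~1-month lag
pct = 60 + 15 * np.cos(2 * np.pi * (days - true_peak) / 365)
pct = pct + rng.normal(0, 1, days.size)       # add measurement noise

# Linear least squares: pct ~ m + a*cos(w*t) + b*sin(w*t), with w = 2*pi/365
w = 2 * np.pi / 365
X = np.column_stack([np.ones_like(days, float),
                     np.cos(w * days),
                     np.sin(w * days)])
m, a, b = np.linalg.lstsq(X, pct, rcond=None)[0]

peak_day = (np.arctan2(b, a) / w) % 365       # day of year at the fitted peak
lag_days = peak_day - 172                     # days behind the summer solstice
print(round(peak_day), round(lag_days))
```

Because the model is linear in the cosine and sine terms, ordinary least squares recovers the amplitude and phase in one step; the fitted peak lands close to day 202, about a month after the solstice, mirroring the lag described above.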
One of the more interesting things they found is that the seasonality of vitamin D levels appears to be decreasing over time. That makes sense, as widespread vitamin D supplementation will effectively override incidental sun exposure and abolish the seasonality of 25(OH)D levels. I’d like to think that the Vitamin D Council’s work has helped in this regard.
I am also reminded that Edgar Hope-Simpson hypothesized a "seasonal factor" in influenza and was the first to document its distinct seasonality, noting that its peak falls about a month after the winter solstice. I am still moved when I read Hope-Simpson's writing on the seasonality of influenza:
“Outbreaks are globally ubiquitous and epidemic loci move smoothly to and fro across the surface of the earth almost every year in a sinuous curve that runs parallel with the midsummer curve of vertical solar radiation, but lags about six months behind it … Latitude alone broadly determines the timing of the epidemics in the annual cycle, a relationship that suggests a rather direct effect of some component of solar radiation acting positively or negatively upon the virus, the human host, or their interaction.”
Professor Noymer’s work confirms that the seasonality of vitamin D persists, even amid sun avoidance advice and growing oral supplementation.