Arctic Skin Color and the Vitamin D Hypothesis by Race Realist

[Note from Pumpkin Person: The following is a guest article written by Race Realist and does not necessarily reflect the view of Pumpkin Person.  Please place all off-topic comments in the most recent open thread.  They will not be posted here]

The Vitamin D Hypothesis (VDH) purports to explain the range of skin colors observed between races/ethnies around the world. Since there is little UVR and even less vitamin-D-producing UVB at high northern latitudes, other ways of producing/getting ample amounts of vitamin D were imperative for survival. Locations such as the far north were uninhabited up until 12,000 years ago—the usual explanation being that populations didn’t have the culture to survive such harsh conditions (see Goebel, 1999; Bergman et al, 2004). However, a more likely reason is that there were biological limits on the production of vitamin D due to the lack of UVB rays for most of the year. In this article, I will discuss the skin color of Arctic peoples and why it does not follow the simple gradient of UVB around the world.

To overcome the biological limitation of little to no UVB throughout the year, these populations needed to supplement with foods to get ample amounts of vitamin D, covering what they did not get from the weak UVB rays. To do so, they had to consume fatty animals with ample stores of vitamin D in their tissues. These types of foods are what allowed peoples to live so far north: since there were few vitamin-D-producing UVB rays, lifestyle and culture are how humans conquered the unforgiving far north.

Peoples like the Inuit and the Saami eat diets that are high in vitamin D. The Inuit, for example, eat a diet high in vitamin D and n-3 fatty acids (Schaebel et al, 2015). This high dietary intake supplemented what they did not get from the sun, and thus allowed them to live in the unforgiving cold north (Deng and Xu, 2018). Their dark skin color can be explained in a few ways: their diet (high in vitamin-D-rich marine mammals), UVB rays bouncing off ice, snow, and water, and their recent migration to those climes, which would explain their darker skin color compared to populations that have evolved for a longer time at such latitudes (Jablonski and Chaplin, 2002).

When people look at Arctic peoples such as the Inuit, they compare their skin color with the amount of UVB they receive during the year and presume that the VDH is wrong: according to the VDH, Arctic peoples should have the lightest skin, yet their skin is relatively dark for their latitude. The answer is simple: they were able to consume enough vitamin D in their diet. A lack of vitamin D production/consumption was one barrier to living in the far north, which was then overcome with culture and the foods peoples eat.

The environment of the Arctic is dim and dark for most of the year, though during the summer, of course—when they are most active—Arctic peoples are bathed in solar radiation, which is then reflected by the snow, ice, and water.  Fresh white snow reflects about 94 percent of UVA rays and 88 percent of UVB rays. Chadysiene and Girgzdys (2008: 87) write:

The average data of experimental measurements show that maximum albedo of UVA radiation (of about 94%) was at 1 p.m. in comparison with albedo of UVB radiation of about 88% at 2 p.m. The measurements of albedo were performed on fresh snow with big crystals.

For example, Inuit populations in northern Greenland report spending up to 16 hours outdoors in the spring and summer months, and so would be exposed to UV rays bouncing off ice, snow, and water (Andersen, Jakobsen, and Laurberg, 2012). Exposure to UV rays for this extended period of time—along with eating a diet high in vitamin D—is enough to explain their skin color.

Clearly, Arctic peoples get bathed in UVB and UVA rays reflected off the snow and ice, which contributes to their darker skin color. They have the ability to tan (distinct from the American sense of “tanning”), and this tanning ability protects them from the high doses of UVR reflected from the snow, while their diet high in vitamin D contributes to their darkish skin color and allows them to remain healthy in such a harsh, unforgiving environment.

Nina Jablonski has been writing about the VDH for about 30 years. Jablonski writes in her book Living Color: The Biological and Social Meaning of Skin Color (2012: 68):

Traditional cultures of the Inuit and the Saami center on harvesting vitamin-D-rich foods. The dietary focus for both groups has compensated for the vitamin D they cannot produce in their skin. Both peoples remain healthy when they stick to their traditional diets but suffer badly from vitamin D deficiencies when they switch to Western diets that are lower in vitamin D.

Here’s the thing: when these populations move away from their natural, vitamin-D-rich diet, they suffer from vitamin D deficiencies; even today, many Inuit populations, both children and adults, suffer from vitamin D deficiency (Hayek, 2011). So the change in the Inuit diet is the cause of these deficiencies: their traditional diet was high in vitamin D, but their new (Western) diet is low in vitamin D, and since they have dark skin and UVB is so variable throughout the year, they suffer from vitamin D deficiencies (Sharma et al, 2011). Sharma et al (2011: 475) conclude that Arctic peoples are at risk for vitamin D deficiency due to lack of UVB exposure and the move away from a traditional diet high in vitamin D to a Western diet low in vitamin D, combined with their dark skin.

Frost (2012) claims that the explosion of rickets in Arctic populations is due to a change in diet (a shift away from a high-meat diet) and “increased consumption of certain reactive substances: phytic acids in commercially processed cereals; sodium bicarbonate in baking soda; and aluminum hydroxide in antacids.” The dominant source of vitamin D for the Inuit is their diet (Schaebel et al, 2015), so once they shifted away from their natural diet high in fatty fish and vitamin D and began eating a diet not ancestral to them, the maladies began. We can see this in every country/population that begins to eat a new diet full of processed foods.

Since the frequency of rickets has exploded in populations that eat a Western-like diet rather than their traditional diet, the traditional diet evidently provided enough vitamin D; when they began eating a new diet with less vitamin D, problems such as rickets appeared.

To end these deficiencies, the Inuit need to return to a traditional diet, since their traditional diets have the adequate vitamins and minerals needed to survive in their current environment (Kolahdooz et al, 2013). Higher BMI (body mass index), their skin color, and the latitude where they live all contribute to low vitamin D production. Inuit who consumed a low number of traditional food items were more likely to be deficient in vitamin D (Anderson et al, 2013), and this deficiency is seen even in Inuit schoolchildren (Hayek, Egeland, and Weiler, 2010; Singleton et al, 2015).

In sum, there is no anomaly regarding the skin color of Arctic peoples: the hypothesis is about vitamin D status, and they get ample vitamin D from their diet and from the reflection of UV rays off the snow, ice, and water. Reasons for the darkness of their skin include the fact that they are recent migrants to those locations, their diet high in vitamin D, and the reflection of UV rays from high-albedo surfaces.

The hypothesis that UVB exposure explains the observed skin-color gradients predicted a novel fact: that populations that migrated out of Africa would be found to have light skin. This occurred multiple times through three different molecular pathways, in the Neanderthals (Lalueza-Fox et al, 2007) and in Europeans and East Asians (via different molecular mechanisms; Norton et al, 2007). This was a risky, successful, and novel prediction made by Jablonski and Chaplin (2000). That the simple latitude gradient does not hold for Arctic peoples is not a blow to the hypothesis; it is perfectly explained by the reflection of UVR off high-albedo surfaces and a diet high in vitamin D. Skin color is an adaptation to UV rays.


Open thread May 13 to May 19, 2018

Please place all off-topic comments for the week here.  They will not be posted in the main thread.

This week I will blog about the GRE and maybe sex differences in IQ (per commenter Marry) but today, like every Sunday, is open thread day.


In honor of Mother’s Day, I want to mention a BEAUTIFUL interview I heard on CBC radio with indigenous writer Terese Marie Mailhot.  You can listen to the whole thing here.  The interview opens with Mailhot reading a gut-wrenching scene (from her memoir?) where she’s having breakfast with her white boyfriend.

Mailhot wants to eat both a proper breakfast with eggs and toast and another breakfast with French toast and syrup, but the white boyfriend is having none of it.  So she orders only the proper breakfast with toast.

But the toast doesn’t come.

She complains to the waitress and the toast comes cold.

She complains again and now her breakfast is cold.

Her white boyfriend looks at her with disgust.

Mailhot also talks about her childhood.  She understands why so many indigenous girls keep going missing because growing up in an indigenous community, there were always random men offering her drives to school.  They looked perfectly normal, until you looked in their eyes….

Switching gears, I watched a talk by Yuval Noah Harari.

He makes the point that what makes humans superior is not our intelligence per se, but our ability to cooperate.  He notes that one-on-one, we’re no better than a chimp.  Indeed, if he were placed on a deserted island with a chimp, he suspects the chimp would win.  But if you placed a hundred men against a hundred chimps, the men would win because we can cooperate and they can’t.

For those of us who romanticize the idea of the superior individual, this was a hard pill to swallow.

But I think humans are superior to chimps even on the individual level, if given enough time.  The chimp might dominate the man for the first few decades, but if both individuals could live for centuries, the man would eventually figure out how to build a cage and put the chimp in it.  Perhaps collectivism and culture achieve what one individual life doesn’t have time to do.

The 21st century technology we enjoy today is the cumulative collective result of the 107 billion humans to have ever walked the Earth.

Could a single stone age man living all alone on Earth have eventually achieved 21st century technology all by himself, if he lived 107 billion times as long as the average man?  By contrast, a lone chimp living the same amount of time would never get beyond the stone age.

The most pro-Jewish President of all time?



The Jewish community should give Trump an award for being the most pro-Jewish president of all time.  He took military action against Israel’s enemy Syria, and further infuriated the Muslim world by moving the U.S. embassy to Jerusalem and ripping up the Iran nuclear deal.

And if all that wasn’t pro-Jewish enough, Trump apparently raised his children to marry Jews and so all eight of the grandkids who will inherit Trump’s multibillion dollar empire are either half or a quarter Jewish.

Of course Trump is not the first rich person to give his money to a different ethnic group. Bill Gates has spent billions helping Africans, and Angelina Jolie and Madonna have adopted African children.  Such liberal behavior is a sign of high IQ, perhaps because it’s morally intelligent when rich races help poor races and reduce inequality, even though it’s dumb from a purely tribal perspective.

But it’s very rare for someone to give all their money to an ethnic group that’s richer than their own and thus increase inequality at their own group’s expense.  This seems unintelligent both from a moral and ethnocentric perspective, and thus is likely a sign of low IQ.  Not since Michael Jackson left his fortune to his three Caucasian looking kids have we seen such an extreme act of slavish conservatism.


Michael Jackson claimed to be genetically related to his kids, but skeptics say “no way”

Trump is to the Jewish community as Michael Jackson was to whites.  How long before Trump gets plastic surgery to look more Jewish? 🙂


Michael Jackson before and after his transformation

DNA predicts cloned Neanderthals would average 90 on IQ tests

[Please post all off-topic comments in the most recent open thread. They will not be posted here]


Steve Hsu once estimated that if we cloned Neanderthals and raised them in modern times, they would average IQs of 70.  Not sure how he arrived at that estimate, though he does mention their genetic affinity to humans.

I’d estimate a much higher figure:

It’s been reported that the genomes of humans and chimps are 98.8% similar, humans and Neanderthals are 99.84% similar, and that different human races are 99.9% similar.

On an IQ scale where white Americans average 100 with a standard deviation of 15, chimps score around 14.  Thus a 1.2% genomic gap may have caused an 86-point IQ difference.  Assuming a linear relationship between genomic gaps and IQ gaps, and independent genomic effects, we might expect (controlling for environment) white Americans and Neanderthals to differ by about 11 IQ points, and white Americans and other races to differ (on average) by about 7 IQ points.

Thus if we cloned Neanderthals and raised them in average white U.S. homes, we might expect them to score 89 on IQ tests normed on U.S. whites (roughly the 23rd percentile of the U.S. white distribution).
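For the curious, the back-of-the-envelope interpolation above can be reproduced in a few lines. This is only a sketch of the linear-scaling assumption the estimate itself relies on, using the genomic-similarity figures quoted above:

```python
# Sketch of the linear interpolation from genomic distance to IQ gap.
# Assumes (as the estimate does) that IQ differences scale linearly with
# percent genomic difference, calibrated on the human-chimp comparison.

white_iq = 100.0
chimp_iq = 14.0

chimp_gap_pct = 100.0 - 98.8  # human vs. chimp: 1.2% genomic gap
points_per_pct = (white_iq - chimp_iq) / chimp_gap_pct  # ~71.7 IQ points per 1%

neanderthal_gap_pct = 100.0 - 99.84  # human vs. Neanderthal: 0.16%
neanderthal_deficit = points_per_pct * neanderthal_gap_pct

print(round(neanderthal_deficit))             # ~11 points
print(round(white_iq - neanderthal_deficit))  # ~89
```

Whether IQ gaps actually scale linearly with genomic distance is, of course, the estimate’s weakest link.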

Many Neanderthals might end up in jail, since 90 has long been the average IQ of U.S. criminals, and the Neanderthals’ incredibly muscular build (enhanced by modern steroids) might give them the confidence and roid rage to engage in violent crime.

Interesting HBD controversy

[Please post all off-topic comments in the most recent open thread. They will not be posted here]

Recently Sam Harris interviewed Charles Murray on his podcast:

As a result, Harris endured severe criticism from Ezra Klein’s influential platform Vox.  The resulting controversy culminated in the two men debating.  Sadly, they just ended up talking past each other without reaching common ground.

Overall I thought Harris was more reasonable, but I doubt he’s as scientifically objective as he claims to be, especially if HBD ever threatened his own group.  He lets Charles Murray talk about blacks, but I doubt he’d ever let Kevin MacDonald come on his show and talk about Jews, even though Harris does entertain trivial Jewish stereotypes like a “gene” for materialism.

Open thread: Week April 29 to May 5

Please post all off-topic comments for the week in this thread.  They will not be posted in the main articles.

I once saw a clip from a very old episode of the soap opera The Young and the Restless, that was so powerful, I never forgot it.

A greedy landlord planned to bulldoze some low income apartments to make room for more expensive ones.  As a result, a bunch of poor senior citizens were being forced to move out.

One of them came to his office.

“You think you got it all figured out,” she said to him.  “Well one of these days, someone smarter than you are is gonna come along and give you a taste of your own medicine, and skin you right down to the bone!  Then you’ll be poor like the rest of us.  Only, you’ll be all alone.  At least we have each other!”

The rich landlord looked slightly frightened by the old lady’s prophecy.

“Because if there’s two things you learn growing old,” she said as she walked out of his office, “one is nobody ever has anything all figured out, and the other is…”

She grabbed the cardboard cut-out of the fancy buildings he was planning to build and weakly broke it in half over her knee.

“nothing lasts forever”

How fast did intelligence evolve?

[Please post all off-topic comments in the most recent open thread.  They will not be posted here]

One reason people deny HBD is they don’t believe intelligence has evolved much, or at all, since modern humans left Africa tens of thousands of years ago and diverged into the races we know today.  Earlier today, commenter Swank defended this view:

…there are GOOD REASONS to believe intelligence hasn’t ‘evolved’ since then…

with pleiotropic and polygenic traits, the mutation rate is SLOW

in fact, the mutations humanity’s forebears that changed intelligence took several hundred thousand/millions of years to occur at a time. and it just seems like they were mutations quickly went to fixation, altering brain size/cranial capacity/whatever.

homo sapiens have not been around long.

so, with regard to skin color, eye color, nose shape, whatever…..
these things are likely controlled by not so many genes and subject to new mutations adding genetic variance….and we see diversity here.

other traits? not so much.

There have been scientists who’ve agreed with Swank about intelligence evolving slowly.  For example, geneticist Spencer Wells says in his TED talk (see the 14-minute mark of the video below) that from 1 million years ago to about 65,000 years ago, there was a long period of cultural stasis in which stone tools and other artifacts show virtually no improvement.  Then, after 65,000 years ago, the archeological record suddenly shows radical improvement.  Wells believes this was because fully complex language began to appear around that time.

Scholar Richard Klein has been making a similar argument for decades, but Klein believes this major genomic change occurred 40 or 50 thousand years ago.

Both Wells and Klein argued that the change occurred in Africa and caused humans to expand beyond Africa, but they apparently disagree on when humans left Africa, which is understandable given the uncertainties in dating such ancient events.  (Increasingly, scientists believe that behavioral modernity emerged much more gradually than Wells or Klein implied, but there does seem to be a consensus that technology was largely static from over 1 million years ago until less than 300,000 years ago.)

Like Swank, Klein seems to think it was the last major cognitive change, stating:

What happened 40,000 or 50,000 years ago was the last major change in the genotype. At least the last major biological change. Evolution continues, but the evolution that’s involved in making us capable of wielding this vast variety of cultures–that probably stopped around 40,000 or 50,000 years ago and there’s been no essential change since.

Mitchell Leslie writes:

Forget about the construction of the first cities or the introduction of the internal combustion engine. The revolution that made the biggest difference occurred on the savanna of East Africa roughly 45,000 years ago, Klein and others maintain.

Stephen Jay Gould agreed with Klein, famously stating:

There has been no biological change in humans in 40,000 or 50,000 years. Everything we call culture and civilization we’ve built with the same body and brain

However scholar Greg Cochran is having none of it, stating:

I can’t think of any genuine reasons for thinking that human evolution had stopped. Some people seem to have thought that 40,000 years was small potatoes compared to the time since the chimp-human split (five or six million years), so that there wouldn’t have been much change over that time period. Of course this ignores the massive ecological changes that humans experienced over the last 40 millennia, and the resulting selective pressures. Others seem to have thought that newly clever humans instantly came up with a technological fix for any problem that arose, which would have removed the selective pressure associated with the problem. Face it, we’re not that smart. People suffered from malaria for thousands of years before figuring out that it was transmitted by mosquitoes (in 1897, by Ronald Ross) — and we haven’t knocked it out yet.

And often when we did solve problems, they didn’t stay solved. For example, whenever we came up with better methods of food production, population increased until people were hungry again. At that point you see selection for metabolic efficiency, for the ability to digest newly available foods such as milk, etc…

Speculating on why so many scientists believe there’s been little important recent evolution, Cochran states:

…Certainly some were (are) heavily invested in a vision of human sameness. I’m not sure how much of that is driven by practical payoffs: ethnic and racial differences continue to exist whether people “believe” in them or not…

…On the other hand, some certainly worry about political fallout of possible discoveries, about the impact on their NIH funding, etc….

… I think that some are genuinely confused. This is easier than you might think since very few biologists or human-science types know much about genetics and natural selection. Others simply don’t know much about human variation, while others are probably just spreading ink.

Elsewhere in the same interview Cochran states:

Cranial capacity has shrunk 10% in 15,000 years: that’s the fastest rate of change ever seen in the human fossil record, by far. Consider the number of genetic differences between humans and chimpanzees: they occurred over about six million years, from which you can determine the average rate or change. The number of genes that are apparently being replaced by new versions is much larger than you would expect from that long-term rate — something like 100 times larger.

Of course, Cochran is assuming the decline in cranial capacity is genetic.  Richard Lynn argued that, like the Holocene decline in height, it was caused by malnutrition/disease, and that its (full?) recovery in the 20th century partly caused the Flynn effect.


More data on long-term Wechsler IQ stability

[Please post all off-topic comments in the most recent open thread.  They will not be posted here]

In my previous article, I described two studies showing Wechsler stability coefficients of 0.89 and 0.9 in young and older people tested 14 and 20 years apart respectively.

In the interest of full disclosure, I found a study of Wechsler IQ that did not show such sky-high long-term stability: 24 men and 24 women aged 39 to 44 were given the WAIS circa 1969 after having already taken it in 1956.  The correlation was 0.73.


Of course, the sample suffered from range restriction, with an IQ variability much smaller than the general U.S. population’s.  This probably explains why the stability was merely high instead of sky-high.
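For readers who want to see how much difference range restriction could make, below is a sketch of the standard (Thorndike Case 2) correction for direct range restriction. The restricted SD of 10 is purely hypothetical, since the study’s actual SD isn’t given here:

```python
import math

def correct_for_range_restriction(r: float, sd_sample: float, sd_population: float) -> float:
    """Thorndike's Case 2 correction for direct range restriction."""
    u = sd_population / sd_sample  # ratio of unrestricted to restricted SD
    return (r * u) / math.sqrt(1 - r**2 + (r**2) * (u**2))

# Hypothetical: if the sample's IQ SD were 10 rather than the population's 15,
# the observed 0.73 would correspond to about 0.85 in the full population.
print(round(correct_for_range_restriction(0.73, 10, 15), 2))  # 0.85
```

So even a moderately restricted sample can knock a sky-high correlation down to a merely high one.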


There was also a study, cited by Ian Deary et al., that found a 0.78 correlation between Wechsler IQ measured at age 2 and at age 15 (incredible considering IQ is thought to be unstable before age 10, and especially before age 6), though, oddly, only a 0.47 correlation between Wechsler IQ at ages 9 and 15.



So across all the studies I could find comparing Wechsler IQ in people tested at least 13 years apart, the correlations are 0.73, 0.78, 0.88, and 0.9, with a median of 0.83, which is probably an underestimate given that most studies suffer from range restriction.
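The median quoted above is just the midpoint of the two middle values:

```python
import statistics

# The four long-term stability coefficients listed above.
long_term_corrs = [0.73, 0.78, 0.88, 0.9]
print(round(statistics.median(long_term_corrs), 2))  # 0.83
```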

If anyone knows of any other studies of long-term Wechsler IQ stability, please let me know in the comments.



The incredible long-term stability of Wechsler IQ

[Please post all off-topic comments in the most recent open thread.  They will not be posted here]

For all the talk we hear about neuroplasticity, it seems IQ, at least as measured by the Wechsler, is incredibly stable.  According to a study by Erik Lykke Mortensen and his colleagues, there was an astonishing 0.89 correlation between WISC full-scale IQ measured at age 9.5, and WAIS full-scale IQ measured at 23.5 in a sample of 26 low birth-weight kids.  That’s absolutely colossal.  To put that in perspective, when a sample of 16-year-olds (n = 80) took the WAIS and WISC within an interval of one to six weeks, the correlation was 0.88 (WAIS-R manual, pg 48).

In other words, WISC IQ measured at age 9.5 predicts a young adult’s current WAIS IQ about as well as his WISC IQ measured a few weeks ago!
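One way to frame this (my framing, not something computed in either study): under the classical correction for attenuation, if the short-term WISC/WAIS agreement of about 0.88 is treated as a reliability ceiling, the observed long-term correlation essentially saturates it:

```python
# Rough disattenuation sketch: treat the short-term cross-test correlation
# (0.88) as a reliability ceiling for the long-term one (0.89). This is a
# simplification; the textbook formula divides by sqrt(r_xx * r_yy).
observed_long_term = 0.89   # WISC at 9.5 vs. WAIS at 23.5
short_term_ceiling = 0.88   # WAIS vs. WISC a few weeks apart
print(round(observed_long_term / short_term_ceiling, 2))  # 1.01
```

A ratio at or above 1 suggests the long-term stability is as high as the tests’ own measurement error allows us to observe.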

Wechsler IQ appears to be much more stable than even height!  For example, the correlation between adult height and height at age 13 was 0.7 in a sample of Copenhagen men.

And given the moderate to high correlation IQ has with everything from lifetime income to occupational status, its long-term stability is even more compelling.  A psychologist can give a 9-year-old an hour’s worth of silly games involving cartoon pictures, jigsaw puzzles, blocks, and funny riddles, and from that predict the trajectory of his life better than his teachers and parents can.  Psychologists are modern-day prophets.  The cartoon drawings of black children on the WISC-R are their tarot cards.  Indeed, the South Asian woman who gave me the WISC-R at age 12 even dressed like a fortune teller.


And yet they’re also scientists.  Intelligence researchers were the ones who invented correlation, factor analysis, and other techniques scientists in all fields depend on, and they invented IQ tests, one of the single most stable, predictive, and fascinating measures science has ever seen.

Of course, you can dismiss the study I cited above because the sample was not large or representative enough, but Steve Hsu independently reported similar data from other tests:

From fig 4.7 in Eysenck‘s Structure and Measurement of Intelligence. This is using data in which the IQ was tested *three times* over the interval listed and the results averaged. A single measurement at age 5 would probably do worse than what is listed below. Unfortunately there are only 61 kids in the study.

age range       correlation with adult score

42,48,54 months               .55
5,6,7                               .85
8,9,10                             .87
11,12,13                          .95
14,15,16                          .95

The results do suggest that g is fixed pretty early and the challenge is actually in the measuring of it as opposed to secular changes that occur as the child grows up. That is consistent with the Fagan et al. paper cited above. But it doesn’t remove the uncertainty that a parent has over the eventual IQ of their kid when he/she is only 5 years old.

Another study of 141 adults found a near-perfect 0.9 correlation between WAIS full-scale IQ measured at ages 50 and 70 (a 20-year interval!).

Of course none of this conclusively proves IQ is as immutable as height or as solid as a rock.  It could be that society is forcing stability on the brain by giving mental stimulation only to those who show promise young.