Excellent talk on epigenetic inheritance

I really enjoyed this talk by Karen B. Michels on epigenetic inheritance (see video below):

Michels is a Radcliffe fellow (one of the top 50 artists/scholars chosen by Harvard each year), and she begins the talk by explaining that when she first got the email informing her of her acceptance, she started screaming on a bus in rural Vietnam, and the other passengers may have thought “these foreigners are soooooooo weird.”

I thought it was a funny story; her audience didn’t, but she was adaptable enough to quickly switch gears and dive right into epigenetic inheritance.

Epigenetics literally means “on top of genetics”, and the epigenome refers to the chemical tags placed on the DNA sequence to either silence or activate certain genes. Unlike our genome, which remains stable throughout our lives unless we get rare mutations, the epigenome is much more sensitive to environmental effects like smoking and diet.

The question is, does this environmental damage to your epigenome get passed on to your kids? Many studies claim it can, including a very famous study in which the children of mice that had been taught to fear a certain smell also feared that smell.

However, Michels explains that if you have kids, these epigenetic tags are removed not once but twice during fertilization, which should protect your kids from the environmental damage you did to your own epigenome.

So how does Michels explain what look like cases of environmental damage to the epigenome being passed on? A common example is a grandmother smoking during pregnancy causing obesity in her grandkids. While this is commonly interpreted as a case of epigenetic inheritance (since smoking damages your epigenome), Michels explains that the grandkid didn’t necessarily inherit a damaged epigenome from grandma; rather, smoking damaged the unborn female fetus (including its reproductive cells, thus damaging the future grandchild to boot).

Study after study proclaims “epigenetic inheritance” even though none come close to proving it (which would require four generations on the female line and three on the male line). Michels got so frustrated by the misleading use of the term “epigenetic inheritance” that she complained to Nature Genetics, and their response was NOT INTERESTED.

The audience gasped.

Michels explained that the way you get published, get tenure, and win grants is to put “epigenetics” in your titles.

Michels does not deny that epigenetic inheritance is possible; she just feels there’s no evidence for it in humans (and presumably other complex animals), with the exception of genomic imprinting.

Overall I loved the talk, but what I really wanted to know was how an epigenetic inheritance skeptic like Michels explains how mice can be born to fear smells their parents were taught to fear.

 


Lamarckism vs Darwinism

Lamarckism is the theory that organisms evolve not through survival of the fittest, as Darwin argued, but by passing on acquired traits to their descendants.  For example, you might naturally have a very scrawny build, but if you spend all your time lifting weights, not only will you become more muscular, but you will biologically pass those muscles on to your son, and if he too lifts weights, he’ll pass on even more muscle to his sons, and over many generations, we’d evolve into a race of incredible hulks.

Lamarckists believed that giraffes evolved long necks not because only the longest-necked giraffes survived (as Darwinists believe), but because by stretching their necks to reach food, they made them longer, and that extra length was somehow biologically inherited by the next generation, who would in turn stretch their necks even further, and so on.

Lamarckism was famously discredited when a scientist chopped off the tails of mice for multiple generations and the mice were still born with tails.  This convinced scientists that you could not biologically pass acquired traits down to your children.  Lifting weights every day might turn a scrawny nerd into a hulking power lifter, but his son will still be born with the same scrawny build dad had before he started lifting weights, and the only way we would evolve into a race of hulks would be for the most naturally muscular people to have the most kids each generation, as Darwinists argue.

However, with the rise of epigenetics, many people, including our very own Race Realist, have been arguing that Lamarck was somewhat right after all.  To oversimplify, epigenetics refers to chemical tags placed on the letters of our DNA sequence that turn certain genes on or off, and some believe that not only can these tags be influenced by our life experience, but they can be passed on for many generations.

However, when Race Realist tried to push this view at the West Hunter blog, scientist Greg Cochran would have none of it.  This is not surprising, because Cochran’s skepticism towards such theories is well documented (see the 15-minute mark in the video below):

Also expressing skepticism is scientist Richard Dawkins (see the 2:40 mark in the video below), though Race Realist feels this is largely because epigenetics undermines his “selfish gene” theory.

A major 2014 article by  Edith Heard and Robert Martienssen, published in the journal Cell, was every bit as skeptical as Cochran and Dawkins.  According to a summary of the Cell article by science writer Alex B. Berezow:

…characteristics many researchers assume to be the result of epigenetic inheritance are actually caused by something else. The authors list four possibilities: Undetected mutations in the letters of the DNA sequence, behavioral changes (which themselves can trigger epigenetic tags), alterations in the microbiome, or transmission of metabolites from one generation to the next. The authors claim that most epigenetic research, particularly when it involves human health, fails to eliminate these possibilities.

It is true that environmental factors can influence epigenetic tags in children and developing fetuses in utero. What is far less clear, however, is whether or not these modifications truly are passed on to multiple generations. Even if we assume that epigenetic tags can be transmitted to children or even grandchildren, it is very unlikely that they are passed on to great-grandchildren and subsequent generations. The mammalian epigenetic “reprogramming” mechanisms are simply too robust.

Therefore, be very skeptical of studies which claim to have detected health effects due to epigenetic inheritance. The hype may soon fade, and the concept of Lamarckian evolution may once again return to the grave.

 

IQ & the Bell Curve

Commenter Race Realist claimed in the comment section that IQ and many other physiological traits do not form bell curves.

A bell curve is just a distribution of scores where most people score around average, and as scores move further from the average in either direction, the number of people very gradually decreases (forming the shape of a bell).

As Arthur Jensen noted in his book, Bias in Mental Testing, many physical traits roughly form a bell curve, for example, height:

[Figure: distribution of height, approximating a bell curve]

Birth weight:

[Figure: distribution of birth weight]

Brain weight:

[Figure: distribution of brain weight]

The reason for this, as Jensen brilliantly understood even back in the early 1970s, is that these are complex polygenic traits caused by a great many uncorrelated genetic and micro-environmental effects, and thus their distribution should resemble the result of flipping thousands of coins, each giving you either bad or good genetic (and environmental) luck:

[Figure: distribution produced by flipping many coins, approximating a bell curve]
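
To make the coin-flipping analogy concrete, here is a minimal simulation sketch; the numbers of “coins” and people are arbitrary choices of mine, purely for illustration:

```python
# A sketch of Jensen's coin-flipping analogy: a trait built from many
# small, uncorrelated "good or bad luck" effects ends up roughly
# normally distributed (the central limit theorem). The numbers of
# effects and people are arbitrary choices for illustration.
import random
from collections import Counter

def polygenic_score(n_effects=1000):
    """Sum of many independent 0/1 effects (bad or good luck)."""
    return sum(random.randint(0, 1) for _ in range(n_effects))

scores = [polygenic_score() for _ in range(10_000)]

# Crude text histogram: most people pile up near the average (~500),
# with gradually fewer people toward either extreme.
histogram = Counter(score // 10 * 10 for score in scores)
for bin_start in sorted(histogram):
    print(f"{bin_start:4d} {'#' * (histogram[bin_start] // 50)}")
```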

While it’s true that modern IQ test results are forced to fit a bell curve, this is not necessarily because cognition doesn’t naturally form a bell curve, but rather because, in order for test results to naturally form a bell curve, you need what’s called an interval scale: a scale where items increase in difficulty at equal intervals.

But because it can be tricky and tedious to judge whether a certain IQ test item is 10% more difficult than another one, or 30% more difficult, IQ tests often contain abrupt jumps in difficulty, making them ordinal scales, not interval scales, which prevents the distribution of scores from being smooth and continuous.  As a result, scores won’t always naturally fit a bell curve; they must be forced to.
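
For readers curious what “forcing” scores onto a bell curve looks like in practice, here is a minimal sketch of rank-based normalization, assuming a made-up set of raw scores and the usual IQ convention of mean 100 and SD 15 (my own illustration, not any particular test publisher’s procedure):

```python
# A sketch of rank-based normalization: raw scores from an ordinal
# scale are mapped to IQs via their percentile ranks, which is what
# "forcing" scores onto a bell curve amounts to. The raw scores and
# the mid-rank convention are illustrative assumptions.
from statistics import NormalDist

raw_scores = [3, 7, 7, 9, 12, 14, 14, 15, 18, 25]  # hypothetical raw test scores

def normalized_iqs(scores, mean=100, sd=15):
    """Convert each raw score's percentile rank to an IQ on a normal curve."""
    n = len(scores)
    iq_for = {}
    for s in set(scores):
        below = sum(1 for x in scores if x < s)
        ties = sum(1 for x in scores if x == s)
        percentile = (below + 0.5 * ties) / n   # mid-rank percentile, strictly between 0 and 1
        iq_for[s] = mean + sd * NormalDist().inv_cdf(percentile)
    return [round(iq_for[s]) for s in scores]

print(normalized_iqs(raw_scores))
```

The mid-rank percentile is just one common convention; the point is only that the bell shape comes from the transformation, not from the raw scores themselves.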

However, as Arthur Jensen noted in Bias in Mental Testing, some psychometric tests are based on interval scales.  For example, the original Binet scale used the concept of age.  Since the difference between a six-year-old and a five-year-old is theoretically the same as the difference between a ten-year-old and a nine-year-old (one year), this is an interval scale, and so IQs calculated from the ratio of a child’s mental age to his chronological age closely approximated a bell curve for the middle 99% of the population:

[Figure: distribution of ratio IQs on the original Binet scale]
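
As a reminder of how those ratio IQs were computed, here is the classic formula as a tiny sketch (the ages are hypothetical):

```python
# The classic ratio IQ formula used by the original Binet scale,
# as described above: IQ = (mental age / chronological age) x 100.
# The ages below are hypothetical.
def ratio_iq(mental_age, chronological_age):
    return 100.0 * mental_age / chronological_age

print(ratio_iq(mental_age=12, chronological_age=10))   # 120.0
print(ratio_iq(mental_age=4.5, chronological_age=6))   # 75.0
```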

As Jensen explained, the departure from normality at the lower extreme is caused by major disorders that override normal polygenic variation, such as mutations of large effect, chromosomal abnormalities, birth trauma and the like.  The surplus of scores at the high extreme is less pronounced and harder to explain.

One of the best measures of IQ is Vocabulary.  Most vocabulary tests are not true interval scales, because psychologists arbitrarily pick words for people to define, and these may increase in difficulty in a non-linear way.  However, as Jensen noted, some vocabulary tests are based on selecting words from the dictionary at random, and when this is done, the total number of such words kids can correctly define approximates a bell curve:

[Figure: distribution of scores on a vocabulary test built from randomly sampled dictionary words]

Another example of an interval scale is Digit Span, because digit spans gradually increase in difficulty one digit at a time, with multiple trials at each difficulty level.  When this was scored such that each digit correctly recalled in the right order scored a point, the scores of 5,539 Navy recruits approximated a bell curve:

[Figure: distribution of digit span scores for 5,539 Navy recruits]
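
Here is one way the scoring rule described above could be operationalized, as a minimal sketch; the trials are invented, and reading “right order” as “right position” is my assumption:

```python
# One way to operationalize the digit span scoring rule: each digit
# recalled in its correct position earns one point, summed over all
# trials. The trial data are invented for illustration.
def score_trial(presented, recalled):
    """Count digits recalled in the correct position."""
    return sum(1 for p, r in zip(presented, recalled) if p == r)

trials = [
    ("3 8 6", "3 8 6"),          # all 3 correct
    ("6 1 9 4", "6 1 9 4"),      # all 4 correct
    ("5 2 7 4 1", "5 2 4 7 1"),  # two digits swapped, 3 correct
]

total = sum(score_trial(p.split(), r.split()) for p, r in trials)
print(total)  # 10
```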

Of course I’m not suggesting that all cognitive abilities form a bell curve.  Indeed, a member of Prometheus once claimed that because the human mind works in parallel, complex problem-solving speed actually doubles every 5 IQ points, which is about as far from a normal distribution as you can get.

An interesting question is why some forms of cognition (including some very g-loaded abilities like Vocabulary) form a bell curve while spatial and math talent may form an exponential curve, and whether this implies math and spatial geniuses are vastly more intelligent than verbal geniuses, since the latter are at most only about 100% verbally smarter than average, while the former would be many orders of magnitude spatially or mathematically smarter.
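
To see how extreme that doubling claim is, here is a back-of-the-envelope sketch of it; the formula is just my rendering of “doubles every 5 IQ points,” not an established psychometric result:

```python
# If complex problem-solving speed doubled every 5 IQ points, relative
# speed would grow exponentially rather than normally. This formula is
# an illustration of that claim, not an established result.
def relative_speed(iq, baseline_iq=100, doubling_interval=5):
    return 2 ** ((iq - baseline_iq) / doubling_interval)

for iq in (100, 115, 130, 145, 160):
    print(iq, relative_speed(iq))
# 100 1.0, 115 8.0, 130 64.0, 145 512.0, 160 4096.0
```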

 

IQ, biology and culture bias

An ideal study of IQ and environment might be as follows:

Find 300 Bushmen babies being raised as hunter-gatherers and randomly assign them to three groups of 100 Bushmen each:

Group 1:  Gets sent to the United States where they are raised by billionaire Ivy League PhDs and the full-blooded children of these adopted Bushmen are also raised by billionaire Ivy League PhDs.

Group 2:  Remains in their hunter-gatherer environment, but gets weekly visits from doctors and nutritionists to make sure they and their babies have the exact same First World medical care, health and nutrition as Group 1.  These health professionals are not allowed to speak to them in English or explicitly educate them in any way; their only role is to make sure the Bushmen reach their biological potential, which means doing regular health checkups and supplementing any nutritional deficiencies, especially in pregnant women.  If the health professionals do their job, we’d expect the second generation of group 2 to have the same birth weight, infant head circumference, adult height, and perhaps adult MRI brain size as group 1’s second generation.

Group 3:  Remains in their hunter-gatherer environment with no intervention at all.

Several decades later, the children of all three groups would be administered the Wechsler intelligence scales (in English for group 1, translated into a Khoe language for groups 2 and 3, though for the Vocabulary subtest they would still have to define English words, none of which they would have heard before, but they would define them in their native Khoe).

If this were done, I would expect the subtests of the Wechsler could be divided into the following categories:

Type 1:  subtests where the group 1 > group 2 gap far exceeds the group 2 > group 3 gap.  These would likely be subtests like Information and Vocabulary, which require the exposure to Western culture that groups 2 and 3 lacked.

Type 2:  subtests where the group 2 > group 3 gap far exceeds the group 1 > group 2 gap.  These would be subtests where exposure to Western culture and education matters much less than the physical development of the brain.  These would likely include some of the hard-core Wechsler performance subtests where you have to use your hands to quickly fit objects together in a spatially competent way.

Type 3:  subtests where all three groups would score about equally.  These are subtests where neither the cultural nor the biological environment matters much unless it’s pathological.  Skeptics would deny type 3 tests are even possible, but perhaps some of the Wechsler auditory short-term memory subtests might be type 3.

We don’t have to give the three subtest types names, but it’s tempting to use adjectives like crystallized, achievement, and culturally loaded to describe type 1 subtests, and fluid, aptitude, and culture reduced to describe types 2 and 3.  The difference between types 2 and 3 is that the former show more phenotypic plasticity, but for biological, not cultural, reasons.

The Raven Progressive Matrices is a test that showed enormous phenotypic plasticity over the 20th century (the Flynn effect) even though it was intended to be culture reduced.  This can be partly explained by the fact that the Flynn effect is partly biological (Richard Lynn noted that improved nutrition has increased brain size since WWI) and by the fact that the Raven is partly cultural, as James Flynn has argued.

If even the Raven is culturally biased, is a truly culture fair psychometric test even possible?  If we define culture fair tests as those where group 1 and group 2, but not necessarily group 3, score equally, these might be possible, but I think the reasons the Raven failed were a) its boring nature made it too sensitive to test motivation, which is a culturally sensitive variable, and b) as James Flynn implied, it relied too much on hypothetical thinking: people in less modern cultures only apply their intelligence to clearly defined, practical-looking problems with tangible solutions.

Some of the Wechsler auditory short-term memory subtests or hands-on spatial subtests might come a lot closer to culture fair than the Raven did.

They’re all gonna laugh at you!

A popular theory among U.S. elites:  Trump ran for President because he was so humiliated by Barack Obama at the 2011 White House Correspondents’ Dinner that it was the only way to save face.  Explaining the theory, Dan McLaughlin writes:

Despite being born to wealth, he’s lived his whole life as the nouveau riche kid from Queens whose fame, fortune, Ivy League degree, fashion-model wives, TV shows, casinos, beauty pageants, football team, political largesse . . . none of it could get his old-money Manhattan society neighbors, the smart kids, the political movers and shakers to treat him as a peer, an equal, a man of consequence.

Partly because of this, The New York Times’s Charles M. Blow argues Trump is jealous of Obama:

Trump wants to be Obama — held in high esteem. But, alas, Trump is Trump, and that is now and has always been trashy. Trump accrued financial wealth, but he never accrued cultural capital, at least not among the people from whom he most wanted it.

Therefore, Trump is constantly whining about not being sufficiently applauded, commended, thanked, liked. His emotional injury is measured in his mind against Obama. How could Obama have been so celebrated while he is so reviled?

The whole world seemed to love Obama — and by extension, held America in high regard — but the world loathes Trump.

Obama was a phenomenon. He was elegant and cerebral. He was devoid of personal scandal and drenched in personal erudition. He was a walking, talking rebuttal to white supremacy and the myths of black pathology and inferiority. He was the personification of the possible — a possible future in which legacy power and advantages are redistributed more broadly to all with the gift of talent and the discipline to excel.

Given this backdrop, when Obama lured Trump to the 2011 White House Correspondents’ Dinner to be laughed at to his face by a room full of U.S. elites and on international TV, Trump snapped, according to The New Yorker‘s Adam Gopnik:

On that night, Trump’s own sense of public humiliation became so overwhelming that he decided, perhaps at first unconsciously, that he would, somehow, get his own back — perhaps even pursue the Presidency after all, no matter how nihilistically or absurdly, and redeem himself

Explaining further, McKay Coppins writes:

On the night of the dinner, Trump took his seat at the center of the ballroom, perfectly situated so that all 2,500 lawmakers, movie stars, journalists, and politicos in attendance could see him….But as soon as the plates were cleared and the program began, it became agonizingly clear that Trump was not royalty in this room: He was the court jester. The president used his speech to pummel Trump with one punchline after another…When host Seth Meyers took the mic, he piled on with his own rat-a-tat of jokes, many of which seemed designed deliberately to inflame Trump’s outer-borough insecurities: “His whole life is models and gold leaf and marble columns, but he still sounds like a know-it-all down at the OTB.” The longer the night went on, the more conspicuous Trump’s glower became. He didn’t offer a self-deprecating chuckle, or wave warmly at the cameras, or smile with the practiced good humor of the aristocrats and A-listers who know they must never allow themselves to appear threatened by a joke at their expense.

[Photo: Trump at the 2011 White House Correspondents’ Dinner]

Instead, Trump just sat there, stone-faced, stunned, simmering — Carrie at the prom covered in pig’s blood.

[Image: Carrie at the prom]

It’s ironic that Coppins seems to hint at Trump’s lack of social intelligence in this situation, since commenters on this blog often praise Trump as one of the greatest social geniuses of our time, a reasonable opinion given that Trump beat the top politicians in America at their own game despite having no political experience.  Perhaps Trump was just too angry to display his social skills that night, or perhaps his type of social savvy can’t adapt to upper-class environments.

More interesting, given it’s Halloweek, was Coppins’s reference to Stephen King’s first novel.  In Carrie, after being lured to the prom by the elite kids only to be publicly laughed at, a high school senior takes her revenge by becoming the most powerful girl in the world (destroying the school with her telekinetic powers).

Similarly, after being lured to the White House Correspondents’ Dinner by U.S. elites to be publicly laughed at, Trump got his revenge by becoming the world’s most powerful man, displacing the President who mocked him.

He who laughs last, laughs best.

Did modern humans evolve from killer apes?

Homo erectus

In honor of Halloweek, I thought I’d share a terrifying little tidbit I learned from a great lecture by Professor Henry Gilbert.  At the 1 hr 14 min mark in the video below he mentions a theory that the really thick crania observed in Homo erectus may have been an adaptation to the fact that they were bashing each other’s heads in.  This is wildly speculative, but it might also help explain the extreme selection for brain size we see in erectus, since 1) winning fights requires brain functions like intelligence and physical coordination, 2) head-butting people requires a large cranium, 3) some research claims big brains can absorb more insults, though this is disputed, and 4) selection for brain size was paralleled by selection for height, which is also useful in combat, especially head-butting.

Another point Gilbert makes is that the huge brow ridges of Homo erectus might be explained by their robust crania combined with small frontal lobes (compared to modern humans).

In my last article I discussed the opening scene of 2001: A Space Odyssey, which really emphasized Raymond Dart’s killer ape theory, popular in the 1960s but since fallen out of favour.  The theory’s decline is a bit surprising given that 1) violence is an obvious selection pressure for intelligence, 2) humans are incredibly violent creatures, and so are our ape relatives the chimpanzees, and 3) genetic evidence confirms that anatomically modern humans rapidly replaced all other “human” species with only minimal admixture.  On the other hand, there isn’t much evidence the replacement was violent, other than a controversial claim that we ate Neanderthals and the fact that caves occupied by Neanderthals were often taken over by modern humans quite rapidly.

One problem with the killer ape theory is that a recent paper claimed that, contrary to Gilbert, cranial thickness was not exceptionally extreme in Homo erectus.

2001: A Space Odyssey

In honor of Halloween eve, I thought I’d show an eerie clip from one of my favorite movies of all time (2001: A Space Odyssey):

What I love about this scene is just the haunting beauty of life before proto-humans evolved higher intelligence.  Even though the scene is very violent, there’s just something extremely peaceful about it: for millions of years, generations of African apes just lived their simple, repetitive, pointless lives under the quiet glare of the rising and setting sun, and then died, allowing their kids to repeat the same pointless cycle.

The film implies it took an alien intervention for apes to evolve into humans, which, as former commenter “Race Realist” has noted, is a common conspiracy theory, perhaps because the human mind is such a complex entity that it’s hard for people to imagine how a blind process like natural selection could have produced it.

Indeed, commenter “Mug of Pee” often cites the failure to beat Secretariat’s records as an example of how even artificial selection for relatively simple traits like horse racing speed eventually hits a brick wall where selection runs out of mutations to exploit.  This may explain not only Stephen Jay Gould’s punctuated equilibrium theory, but also the hundreds of thousands of years in the fossil record where the genus Homo showed no progress in tool making.  It’s also an argument HBD deniers like commenter “Swank” have used to argue that all races must have equal intelligence, at least at the high end, because intelligence is so complex that any mutation that would raise its upper limit would take hundreds of thousands of years to appear.  This is in sharp contrast to scholars like Greg Cochran and Henry Harpending, who argued Ashkenazi Jews were selected for high-IQ mutations that emerged only about a thousand years ago.

Even though the theme of aliens intervening in human evolution sounds kind of cheesy, it’s handled deftly by director Stanley Kubrick, who never shows us the aliens, instead focusing on the featureless black monolith that mysteriously enhances the IQ of the apes who touch it.

Some claim Kubrick, who was Ashkenazi Jewish, had an IQ of 200, and even assuming the old age-ratio IQ scale that gave ridiculously inflated results, this is probably a huge exaggeration.  Nonetheless, he may have been the smartest major director of his generation, so it’s interesting to note that, like “America’s smartest man” Chris Langan, he was quite interested in a scientific definition of “God”, and that’s what the aliens were meant to be in his film.

Readership hits record highs as Charles Murray comments on my blog

A few hours ago I noticed an explosion of traffic on my blog and rushed to Twitter to see where it was coming from.  It seems one of the most influential people on the planet had recently commented on a blog article I wrote.

[Screenshot: Charles Murray’s comment on the blog]

Murray co-authored the most famous IQ book of all time (The Bell Curve), and his 1984 book Losing Ground revolutionized U.S. social policy.  What a great honor it is to have the man whom Bill Kristol called America’s leading living social scientist commenting on my blog.

So everyone please be on your best behavior.  You never know who’s reading.

 

The Paleolithic Black-White IQ gap

One reason people think the black-white IQ gap is at least partly genetic is its durability over time.  The roughly one standard deviation IQ gap (15 points) between blacks and whites living for centuries in the United States was first observed during World War I, a time of extreme racism.  It was thought that after decades of racial progress in civil rights, the IQ gap might diminish, but the most recent high-quality IQ data shows the adult racial gap remains over 15 points (though the gap has narrowed to 12 points in children).

As Arthur Jensen noted, what makes the consistency of the U.S. black-white IQ gap especially striking is that it has endured over a period of such extreme environmental change that the entire U.S. population is now performing as much as two standard deviations higher on IQ tests, because of some combination of increased schooling and media making folks more test-savvy, and improved health and nutrition causing brain size and function to improve.  So even though Americans of all races today score some 30 points higher than their great-grandparents did in WWI, the U.S. gap between black and white adults is still 15 points!
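
As a quick check of that arithmetic, here is a tiny sketch using the standard convention that one IQ standard deviation is 15 points (the figures are simply those cited above):

```python
# A quick check of the arithmetic above, using the convention that
# one IQ standard deviation is 15 points.
SD = 15

flynn_gain_points = 30   # rough secular gain since WWI cited above
adult_gap_points = 15    # adult black-white gap cited above

print(flynn_gain_points / SD)  # 2.0 standard deviations of secular gain
print(adult_gap_points / SD)   # 1.0 standard deviation gap, unchanged
```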

Of course one could argue that even a century of IQ gaps proves little, because even though the environment for black Americans has improved dramatically since WWI, they continue to lag way behind white Americans on most measures of socio-economic well-being.

What is needed is data going back much further in time and space.  Obviously, we can’t get in a time machine and return to the paleolithic to give IQ tests to the ancestors of today’s blacks and whites, but what we can do is check the archeological record for evidence of prehistoric intelligence.

On page 134 of his landmark 2007 book Understanding Human History, Princeton astrophysicist Michael Hart documents some of the greatest achievements of the Upper Paleolithic.

[Table: major inventions of the Upper Paleolithic, from Hart (2007), p. 134]

Hart notes that with the exception of pottery, all of these inventions were made by people dwelling in Europe.

On page 135 he writes:

None were made by Negroids,  nor by any other group living in tropical regions.

These facts are consistent with–and most easily explained by–the hypothesis that the groups that were living in cold climates had already evolved higher intelligence by 40 kya…

Critics dismiss IQ as just a score on a silly little test with no relevance to real-world intelligence.  However, if racial differences in IQ predict real-world creativity tens of thousands of years ago, this suggests the tests are measuring differences that are very real, very important, and very ancient and genetic in origin.

One problem with Hart’s book is that he credits the bow and arrow to Europeans.  As commenter Jm8 likes to remind us, archaeologists working at South Africa’s Pinnacle Point cave site found evidence that humans had already invented the bow and arrow 71,000 years ago, likely before the major races had diverged.  However, preeminent paleoanthropologist Richard G. Klein finds the evidence for this unconvincing (see the 28:06 mark in this video).

Another problem is Hart’s exclusion of the Ishango bone from his list of important paleolithic inventions, as some believe this 20,000-year-old African object preserves the earliest known example of math.  However, skeptics believe the notches on the bone “may in fact be meaningless, simply scratched in to create a better gripping surface.”

When did humans become smart enough to survive the cold?

What sets humans apart from the great apes? The ability to survive the cold.

Humans are an African primate.  Darwin inferred that Africa was the cradle of mankind because it was the land of our closest relatives: the great apes.  All living hominoids, including us, evolved in the tropics, and with the exception of modern humans, no living hominoid is capable of surviving the cold.  Gorillas, bonobos and chimps all live in Africa, and orangutans, gibbons and siamangs live in Southeast Asia.

Apes first appear in the fossil record 25 million years ago in sub-Saharan Africa, so the hominoid body has had tens of millions of years to become perfectly, exquisitely well suited to tropical life.  Any hominoid that dared to leave Africa and face the bitter cold of the ice age therefore needed to be incredibly adaptable to survive an environment so opposite to the one his ancestors spent 25 million years specializing in.  It’s likely that such rapid adaptation could not occur until the hominoid brain reached a certain size, giving us a high capacity to learn, invent, and create culture.

This was the transition from ape to man.  Indeed, the ability to survive the freezing cold seems to be what separates humans from the apes.  For centuries people have speculated about a giant bipedal ape surviving in the Pacific Northwest, but the fact that Sasquatch is just a myth further shows that apes can’t survive the cold.

When in our evolutionary history did we become smart enough to do so?

“Humans” first entered Europe 1.8 million years ago, but there’s no evidence we were smart enough to survive Northern Europe until 780,000 years ago, when the climate was similar to that of today’s southern Scandinavia, and it’s only within the last 40,000 years that humans have proved able to survive the Arctic.

Of course, even once humans evolved the intelligence to survive the cold, some could survive it more efficiently than others, and as commenters MeLo and Phil78 have pointed out, competition may have been the decisive variable.  But competition may have been especially intense precisely because it was cold and thus there were fewer natural resources, while in the tropics, selection pressures were more relaxed because there was less need for shelter and more food to go around.

A 2010 article in The Guardian describes archaeologist Brian Fagan’s view that Neanderthals lacked the cognitive ability to adapt to the cold as creatively as modern humans:

This meant, says Fagan, that we learned to use local materials – antler, bone and ivory – in ways Neanderthals simply could not imagine. In one case, this resulted in “one of the most revolutionary inventions in history: the eyed needle, fashioned from a sliver of bone or ivory,” he adds. While Neanderthals shivered in rags in winter, humans used vegetable fibres and needles – created by using stone awls – to make close-fitting, layered clothing and parkas: the survival of the snuggest, in short.

In 2012, paleoanthropologist Rick Potts said:

Whenever glacial habitats invaded Europe and Asia, it appears that the Neanderthals moved south, into Iberia and the Italian peninsula, to take advantage of the warmer places. Overall, their bodies show evidence of cold adaptation. Yet during one cold period, when the Neanderthals retreated, populations of Homo sapiens began to infiltrate the cold regions. How could they do this, especially since these populations were dispersing from tropical Africa? The difference is that these early populations of our species had developed the ability to invent new tools, like sewing needles that were useful in producing warm, body-hugging clothing.

In a 2013 BBC article, Oxford University professor Robin Dunbar is quoted as saying the following about Neanderthals:

They were very, very smart, but not quite in the same league as Homo Sapiens. That difference might have been enough to tip the balance when things were beginning to get tough at the end of the last ice age

In 2014 paleoanthropologist Chris Stringer of the Natural History Museum in London told National Geographic how much harder it was for humans to survive in freezing Eurasia compared to warm Africa:

If temperatures drop 5-10 degrees in Africa, you’re not going to die; there may be changes in rainfall and desert and forest and so forth, but that temperature drop probably won’t kill you.

In Britain, or Siberia, these populations were constantly under pressure. When it was really cold, they were surviving in pockets in the south—in the Iberian Peninsula, the Italian peninsula, the Balkans, maybe in India and Southeast Asia. All the area to the north would empty of people. Then when it warms up, people would start to expand north and grow their numbers. But often they only had 3,000 years before the temperature dropped all the way back again. So I think it is the climate that was shutting down the diversity of those populations; they couldn’t maintain large numbers because of the climate wearing them down.

In March 2015, Chris Stringer told Oxford University that when the ice age got really bad, all humans in Britain simply died out (see the 17:20 mark of the Oxford podcast) and Britain had to be recolonized.

And as I noted back in July, the BBC wrote in 2016:

…Neanderthals, with their shorter and stockier bodies, were actually better adapted to Europe’s colder weather than modern humans. They came to Europe long before we did, while modern humans spent most of their history in tropical African temperatures.  Paradoxically, the fact that Neanderthals were better adapted to the cold may also have contributed to their downfall.

If that sounds like a contradiction, to some extent it is.

Modern humans have leaner bodies, which were much more vulnerable to the cold. As a result, our ancestors were forced to make additional technological advances. “We developed better clothing to compensate, which ultimately gave us the edge when the climate got extremely cold [about] 30,000 years ago,”…