Today all human populations have the genetic potential to average IQs in the 85 to 105 range, with a species mean of, say, 97 (Euro norms). Scientists claim living humans have an average cranial capacity of around 1350 cc, but I believe this is way off, because unlike our well-nourished ancestors, most modern humans have been malnourished since the Neolithic transition. Even today, virtually everyone outside the developed world has suboptimal nutrition.
Perhaps the best estimate of living human brain size under First World conditions is the average brain size in the U.S., since the racial diversity of this country mirrors the species as a whole.
The average young American has a cranial capacity of at least 1418 cc, with a within-sex standard deviation of 91 cc. The average is probably closer to 1438 cc, since the data come from the army, whose recruits tend to be smaller than their civilian counterparts.
The correlation between IQ and brain size (among members of the same sex and country) is anywhere between 0.25 and 0.4, with 0.32 being my current best guess (it changes all the time). This means that for every one standard deviation (91 cc) difference in brain size, IQ differs by 0.32 standard deviations on average (the IQ scale has a standard deviation of 15).
If we extrapolate this logic to chimps, whose cranial capacity averages 400 cc (1032 cc, or 11.34 SD, less than the average human under First World conditions), we'd expect an IQ that is 0.32(11.34 SD) = 3.63 SD, or about 54 IQ points, below 97 (the human mean under well-nourished conditions).
In other words, chimps should have an average IQ of 43.
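The extrapolation above can be sketched in a few lines. Note the human mean of 1432 cc is back-inferred from the figures in the text (400 cc plus the stated 1032 cc gap), not stated directly:

```python
# Sketch of the brain-size extrapolation above (all figures from the post;
# the human mean of 1432 cc is back-inferred from 400 + 1032).
HUMAN_MEAN_CC = 1432      # implied First World human mean cranial capacity
HUMAN_SD_CC = 91          # within-sex standard deviation
CHIMP_MEAN_CC = 400       # average chimp cranial capacity
R_IQ_BRAIN = 0.32         # assumed IQ x brain-size correlation
HUMAN_MEAN_IQ = 97        # species mean under good nutrition (Euro norms)
IQ_SD = 15

z_brain = (HUMAN_MEAN_CC - CHIMP_MEAN_CC) / HUMAN_SD_CC   # about 11.34 SD
iq_deficit = R_IQ_BRAIN * z_brain * IQ_SD                 # about 54 IQ points
expected_chimp_iq = HUMAN_MEAN_IQ - iq_deficit
print(round(expected_chimp_iq))   # -> 43
```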
And yet that’s not what we find. Last March I wrote:
In 2007 there was a fascinating study that compared human 2.5-year-olds to chimps and other apes on a battery of intelligence tests. With the exception of social intelligence, where the human toddlers were way ahead, the apes and toddlers had the same intelligence.
In other words, chimps have the same intelligence as a 2.5-year-old (white) human.
What adult IQ does a mental age of 2.5 equate to? The question is a lot trickier than it seems. One could define adult mental age as 16+ and then use the age-ratio method to conclude that since 2.5 is 16% of 16, a mental age of 2.5 equates to an adult IQ of 16. The problem with this method is that it assumes intelligence develops linearly with age, which is an oversimplification.
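The age-ratio method above is just the classic ratio-IQ formula, mental age over chronological age times 100, with chronological age capped at 16 for adults:

```python
# Ratio-IQ (age-ratio) method from the paragraph above:
# IQ = mental age / chronological age * 100, with adult age capped at 16.
mental_age = 2.5
adult_age_cap = 16
ratio_iq = mental_age / adult_age_cap * 100   # 15.625, i.e. roughly 16
print(round(ratio_iq))   # -> 16
```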
What is needed is an actual intelligence test that’s been given to both adults and to toddlers and one where scores increase on an interval scale.
One such test is digit span, which shows virtually no Flynn effect. Since the earliest days of intelligence testing it's been known that by the age of three, a white child can repeat two digits, which probably means a 2.5-year-old can repeat one digit.
By contrast, U.S. adults average a forward digit span of 6.645 with a standard deviation of 1.35, and since races in the U.S. differ very little on forward digit span, this can be taken as the white adult distribution. This means that an adult who performs like a 2.5-year-old (digit span of 1) is (6.645 - 1)/1.35 = 4.18 standard deviations below the white adult average.
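That z-score is a one-line calculation with the figures above:

```python
# Digit-span arithmetic from the paragraph above (figures from the post).
adult_mean_span = 6.645
adult_sd_span = 1.35
toddler_span = 1          # one digit at age 2.5 (the author's inference)
z = (adult_mean_span - toddler_span) / adult_sd_span
print(round(z, 2))   # -> 4.18
```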
If we assume that most cognitive abilities are like digit span, then chimps (who score like 2.5-year-olds on most tested cognitive functions) perhaps average 4.18 standard deviations below white adults on the average test.
Does this mean their IQs average 4.18 SD below the average white adult? No, because if you score 4.18 SD below white adults on the average test, your composite score on a battery of tests is actually much lower. Why? Because it's much rarer to average an extremely low score across a battery of tests than it is to score that low on any one test. Indeed, based on the intercorrelation of WAIS-IV subtests, someone who is 4.18 SD below average on the average subtest would be 5.73 standard deviations (86 IQ points) below average on the composite score. Thus my best guess for the average IQ of chimps is 14 (white norms).
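The paragraph above doesn't show the calculation, but a standard formula for the composite z of n equally weighted subtests with average intercorrelation r is n times the average subtest z, divided by the standard deviation of the sum, sqrt(n + n(n-1)r). Here's a sketch assuming the 10 WAIS-IV core subtests and an average intercorrelation of 0.48 (a value back-inferred to reproduce the post's 5.73 figure, not taken from the WAIS-IV manual):

```python
import math

# Composite z-score for n equally weighted subtests, given the average
# subtest z and the average intercorrelation r_bar among the subtests.
def composite_z(avg_subtest_z, n, r_bar):
    # The sum of n standardized subtests has variance n + n(n-1)*r_bar.
    return n * avg_subtest_z / math.sqrt(n + n * (n - 1) * r_bar)

# n = 10 WAIS-IV core subtests; r_bar = 0.48 is a back-inferred assumption.
z = composite_z(4.18, n=10, r_bar=0.48)
print(round(z, 2))          # -> 5.73 SD below the mean
print(round(z * 15))        # -> 86 IQ points below the mean
print(100 - round(z * 15))  # -> chimp IQ of 14 (white norms)
```

The lower the subtests' intercorrelation, the more extreme the composite becomes relative to the average subtest score, which is why the composite deficit (5.73 SD) exceeds the per-test deficit (4.18 SD).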
So contrary to what I've said in the past, the regression line predicting IQ from brain size in humans overpredicts chimp IQ by 29 points!
Their expected IQ is 43, but their actual IQ is 14.
This is because apes are not just small-brained humans, any more than humans are big-brained apes. No matter how big a chimp's brain gets, it will never be organized as efficiently as a human's.
This is because the human brain is spherical, which, as commenter pumpkinhead noted, is a uniquely efficient shape because it minimizes the distance between any two points, thus maximizing communication between neurons.
This explains why Homo heidelbergensis (600 ka to 300 ka), despite having a relatively large brain of 1280 cc, was virtually incapable of innovation. In their book The Rise of Homo sapiens: The Evolution of Modern Thinking, Frederick L. Coolidge and Thomas Wynn write:
…nothing much changed in Africa and Europe between 1.4 million years ago and 300,000 years ago. Hominins made the same types of stone tools they always had: hand axes and cleavers and a range of flake tools…
So despite having a near-human-sized brain, Homo heidelbergensis was still an ape in my opinion, so his expected IQ would fall on the ape regression line, not the human one, and thus his IQ was 29 points lower than that of an equally big-brained human.
And similarly for Neanderthals.
It was not until 300 ka that incipient forms of H. sapiens appeared, and only then do we start to see a gradual transformation from ape-shaped brains to the globular brains of fully modern humans, paralleled by a few isolated signs of human culture. It's not until 100 ka to 35 ka that the brain transformation is complete and we get the Upper Paleolithic revolution, the Neolithic transition, and the ability to finally leave Africa and rapidly colonize the whole world and beyond.
The simplest explanation for this is that IQ genetically increased 29 points over this period, as we leaped from ape-shaped brains to human-shaped ones.