Some of my commenters seem to think the IQ scale works like a percentage: that is, if you have 90% as many neurons as the average person, your IQ is 90.  Nothing could be further from the truth.


The IQ scale was originally based on the ratio of mental age to chronological age (IQ = mental age ÷ chronological age × 100).  So if a child was mentally functioning at 50% of his chronological age, he had an IQ of 50, and if he was functioning at 150% of his chronological age, he had an IQ of 150.
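To make the ratio concrete, here is a minimal sketch of that classic ratio-IQ formula (the function name is mine, for illustration):

```python
def ratio_iq(mental_age, chronological_age):
    """Classic ratio IQ: mental age as a percentage of chronological age."""
    return 100.0 * mental_age / chronological_age

# A child functioning at half his chronological age scores 50:
print(ratio_iq(5, 10))   # → 50.0
# A child functioning at 150% of his chronological age scores 150:
print(ratio_iq(15, 10))  # → 150.0
# An adult (ceiling age 16) who scores like an eight-year-old:
print(ratio_iq(8, 16))   # → 50.0
```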

So an adult (age 16 to 19) who scored like an eight-year-old would have an IQ around 50.  The problem is that, as the above chart shows, an eight-year-old does not have 50% of the brains of an adult, but more like 95%.  I realize there's much, much more to intelligence than just overall brain size, but assuming the brain size growth curve is typical of the growth of other brain properties, the IQ scale is wildly distorted: it makes the moderately retarded mind seem only half as developed as the average mind, when in reality it's more like 95% as developed as the average mind.

If the IQ scale were based on the actual linear growth of absolute neurological development, instead of having a mean of 100 and a standard deviation of 15, it would have a mean of 100 and an SD of maybe 1.7.
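One plausible reconstruction of how an SD of roughly 1.7 falls out of the numbers above (the anchor IQ of 55, about 3 SD below the mean, is my assumption; the post doesn't show its arithmetic):

```python
mean_iq = 100.0
sd_iq = 15.0

anchor_iq = 55.0                    # assumed anchor, ~3 SD below the mean
z = (mean_iq - anchor_iq) / sd_iq   # 3.0 standard deviations on the standard scale

# If that mind is actually ~95% as neurologically developed as average,
# the gap on a development-based 0-100 scale is only 5 points:
development_gap = 100.0 - 95.0

# Spreading those 5 points over the same 3 SD gives the rescaled SD:
rescaled_sd = development_gap / z
print(round(rescaled_sd, 2))        # → 1.67, i.e. "maybe 1.7"
```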

This explains why I was having so much trouble mapping chimps to the human IQ scale.  Chimps are about as smart as human toddlers, and as the above chart shows, the neurological development during the toddler years is far greater than in childhood and adolescence combined.  But because the concept of IQ is based on linear mental-age units, these huge low-level differences get compressed, while differences above the human toddler level get extremely exaggerated.