IQ stands for intelligence quotient because originally IQ was calculated as the ratio of mental age to chronological age. If you were a six-year-old who cognitively functioned like the average six-year-old, you had an IQ of 100, because you were functioning at 100% of your chronological age. By contrast, if you were a six-year-old who functioned like a four-year-old, your IQ was about 67, because your development was proceeding at only about two-thirds of the expected rate, and you were sent to what were then called EMR classes.
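To make the arithmetic explicit, here is a minimal sketch in Python (the function name and the rounding are my own choices, not part of any historical scoring manual):

```python
def ratio_iq(mental_age: float, chronological_age: float) -> int:
    """Old-style ratio IQ: mental age divided by chronological age, times 100."""
    return round(100 * mental_age / chronological_age)

print(ratio_iq(6, 6))  # 100: a six-year-old performing like the average six-year-old
print(ratio_iq(4, 6))  # 67: a six-year-old performing like the average four-year-old
```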

This was a beautifully elegant concept, but it had a few problems. The first is that all of us are about 0.75 years older than we think we are, since we spent roughly nine months growing in the womb. The age-ratio method would have made more sense if 0.75 had been added to both the chronological and mental ages, and I suspect the resulting distribution would have been more normal.

The other problem is that cognitive growth is not a linear function of age throughout the entire maturation process.

“Some guy” writes:

Does it really matter if it’s not linear though? If someone scores as the average 10-year-old then it indicates they have the drawing IQ of a 10-year-old, which seems more useful than a subjective number.

Which is more useful information about a man’s height: that he’s as tall as the average 10-year-old, or that he’s about 1.3 feet shorter than the average man? Both are useful, but the advantage of creating a scale that is independent of age is that it has a much higher ceiling. On the old Stanford-Binet, scores stopped increasing after age 15, so how do you assign a mental age to someone who is smarter than the average 15-year-old?

The old Stanford-Binet got around this problem by arbitrarily extending the mental age scale beyond 15, which is how Marilyn vos Savant was able to claim an IQ of 228: she scored a mental age of 22.8 at age 10, even though there was no such thing as a mental age of 22.8 on a test where mental growth peaks at 15.
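The arithmetic behind that 228, using the same ratio formula as above:

```python
# A mental age of 22.8 at a chronological age of 10, plugged into the ratio formula
print(round(100 * 22.8 / 10))  # 228
```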

This makes about as much sense as telling a 19-year-old seven-footer that he has a height age of 92, and therefore a Height Quotient of 484. After all, average male height increases by only about 0.2 inches from age 19 to 20, so if height never plateaued, at that rate the average man wouldn’t reach seven feet until his early 90s.
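To make the parody concrete, here is a sketch of both numbers; the 69.5-inch figure for average adult male height is my assumption, the rest follows the post:

```python
# "Height Quotient" by analogy with ratio IQ: height age / chronological age * 100
print(round(100 * 92 / 19))  # 484

# Where a "height age of 92" comes from if growth never plateaued
avg_adult_height_in = 69.5   # assumed average adult male height, in inches
growth_per_year_in = 0.2     # the post's figure for growth between ages 19 and 20
seven_feet_in = 84.0
years_to_seven_feet = (seven_feet_in - avg_adult_height_in) / growth_per_year_in
print(19 + years_to_seven_feet)  # 91.5, i.e. seven feet in his early 90s
```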

“Some guy” continues:

Presumably they still used this system to see if people scored averagely for their age, but first had to figure out what the average for each age was anyway.

A related question: Is the mental age concept still applicable to modern IQ tests even though they’re not based on it? Let’s say a 10-year-old scores 130 on the WAIS. 2 SD above the mean on a 16-SD mental age test would be 132. Can that child be assumed to have the same IQ as the average 13.2-year-old?

Put it this way: if a 10-year-old scored like an average 13-year-old on every subtest of the WISC-R, he’d get a full-scale IQ of 126, which is similar to the 130 you’d expect from the age-ratio formula. On the other hand, if a 10-year-old scored like a 15-year-old on the WISC-R, he’d get a full-scale IQ of 134, which is much less than the 150 you’d expect from age ratios.

And yet if a six-year-old scores like a nine-year-old on the WISC-R, he gets a full-scale IQ of 143. So the same ratio IQ equates to different deviation IQs depending on the age at which it’s obtained (or the test it’s obtained on), which makes it a problematic index.
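Putting the post’s numbers side by side (a sketch; the deviation IQs are the WISC-R figures quoted above, only the ratio IQs are computed):

```python
def ratio_iq(mental_age: float, chronological_age: float) -> int:
    return round(100 * mental_age / chronological_age)

# (chronological age, mental age, deviation IQ quoted in the post)
cases = [(10, 13, 126), (10, 15, 134), (6, 9, 143)]

for ca, ma, dev in cases:
    print(f"age {ca} scoring like a {ma}-year-old: ratio IQ {ratio_iq(ma, ca)}, deviation IQ {dev}")

# age 10 scoring like a 13-year-old: ratio IQ 130, deviation IQ 126
# age 10 scoring like a 15-year-old: ratio IQ 150, deviation IQ 134
# age 6 scoring like a 9-year-old: ratio IQ 150, deviation IQ 143
```

The last two rows show the same ratio IQ of 150 landing on two different deviation IQs, which is exactly the discrepancy described above.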

Ratio IQ probably agrees best with deviation IQ when both the chronological and mental ages are no lower than 4 and no higher than 12, since that’s probably the most linear period of cognitive development.