A recent study used genomic predictors of cognitive ability, education, and reaction time, each derived from a UK sample, to predict test scores in Scottish samples.

As we can see, genomic predictors of VNR scores in a UK sample explained 3.59% of the variance in age-70 Moray House IQ scores in a Scottish sample, implying a correlation of 0.19. However, if we assume that the VNR has a g loading of only 0.45, and further assume that the correlation between two tests is the product of their g loadings (Jensen, 1998), then dividing 0.19 by 0.45 tells us that a polygenic score based on a perfect measure of g would correlate 0.42 with Moray House scores at age 70. And if the Moray House test were itself a perfect measure of g, the correlation would rise above 0.5, explaining more than 25% of the variance, since variance explained equals correlation squared.
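The arithmetic above can be sketched in a few lines. The 0.45 VNR loading is the assumption stated in the text; the Moray House g loading is not given, so the 0.8 used here is purely a hypothetical placeholder to illustrate the second correction step:

```python
import math

# Variance explained by the polygenic score in Moray House scores
r2 = 0.0359
r = math.sqrt(r2)  # observed correlation, about 0.19

# Assumed g loadings (0.45 from the text; 0.8 is hypothetical)
vnr_g_loading = 0.45
moray_g_loading = 0.8

# Correct for the VNR being an imperfect measure of g
r_perfect_pgs = r / vnr_g_loading  # about 0.42

# Further correct for the Moray House test being imperfect
r_both_perfect = r_perfect_pgs / moray_g_loading  # rises above 0.5

print(round(r, 2), round(r_perfect_pgs, 2), round(r_both_perfect, 2))
```

Any Moray House loading below about 0.84 pushes the final corrected correlation past 0.5, which is all the claim in the text requires.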

And keep in mind that we are only looking at common additive genetic variance. Gail Davies et al. write:

SNP-based estimates of heritability for general cognitive function are about 20–30% [13]. However, these estimates might increase to about 50% when family-based designs are used to retain the contributions made by rarer SNPs [14]. To date, little of this substantial heritability has been explained, i.e., only a few relevant genetic loci have been discovered

Heritability of 50% (implying genomic predictions of up to 0.71) might be an overestimate, because many of the variants might not be causal. On the other hand, it could be an underestimate, because we are still only talking about additive variants; we haven't even begun to look for gene-gene interactions.
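The 0.71 figure follows directly from the rule that variance explained equals correlation squared: the maximum correlation a genomic predictor could reach is the square root of the heritability. A minimal check:

```python
import math

# Maximum correlation implied by a given SNP heritability,
# since variance explained = correlation squared
for h2 in (0.20, 0.30, 0.50):
    print(h2, round(math.sqrt(h2), 2))
```

So the 20–30% SNP-based estimates cap genomic prediction at correlations of roughly 0.45–0.55, while the 50% family-based estimate caps it at about 0.71.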

I think it’s neither an overestimate nor an underestimate, but roughly correct, because even the most extreme critics of twin studies pegged heritability at about 45%.