It’s not an exact science for starters, so a +/- 5 point margin seems likely, but I guess that’s just inside one “deviation”. 125 would be at +1 and so on (just making up example numbers here).
Which is fucking pointless to point out, because 80% falls into that 15-point deviation.
If you want a true average, a 5-6 point deviation is more useful. Especially given the vast difference between someone with an IQ of 86 and someone with an IQ of 114 - the former will be failing most of their classes whereas the latter will be excelling in most (not accounting for other personality traits that affect scholarly results, of course).
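For a rough sense of scale (assuming the textbook normal model with mean 100 and SD 15, not the scoring of any particular test), here's what a ±5-6 point band around 100 would actually cover:

```python
from scipy.stats import norm

# Textbook IQ scaling: normal distribution with mean 100, SD 15 (assumed model).
mean, sd = 100, 15

# Share of the population inside a +/-5 and +/-6 point band around the mean.
for band in (5, 6):
    share = norm.cdf(mean + band, mean, sd) - norm.cdf(mean - band, mean, sd)
    print(f"within +/-{band} points: {share:.1%}")
# within +/-5 points: 26.1%
# within +/-6 points: 31.1%
```

So even the wider 5-6 point band only takes in roughly a quarter to a third of people.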
Technically not even below! The average is 85-115 IQ.
Isn’t the average 100 by design?
And also by definition of what an average is?
No, ~68.3% fall within that deviation.
Given that tests only approximate the real value, and that there are errors and biases, being too exact wouldn’t serve any purpose.
But I think you are right that a 30-point spread from the lower to the upper limit of “average” IQ is pretty big.
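A minimal check of that figure, again assuming the standard normal model with mean 100 and SD 15:

```python
from scipy.stats import norm

mean, sd = 100, 15  # standard IQ scaling (assumed model)

# Share of the distribution within one standard deviation of the mean,
# i.e. IQ 85-115 on this scaling.
within_one_sd = norm.cdf(mean + sd, mean, sd) - norm.cdf(mean - sd, mean, sd)
print(f"{within_one_sd:.1%}")  # 68.3%
```

Which matches the ~68.3% above rather than the 80% claimed earlier.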
A range can’t be an average