Ya, it would be like taking the ACT and the SAT as classes. Both give bell-curve results, though they're different beasts by nature. Imagine if 50% of students scored above whatever 87.5% translates to on the SAT scale...I mean, that's a good argument for keeping scores normalized, IMO.
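If it helps to see what I mean by normalized, here's a rough Python sketch. The numbers are made up on my end (say grades were *meant* to center at 75 with a standard deviation of 10, mapped onto an SAT-style scale with mean 500 and SD 100); the point is just the mechanics:

```python
from statistics import NormalDist

def to_normalized_scale(raw_grade, raw_mean=75, raw_sd=10,
                        scaled_mean=500, scaled_sd=100):
    """Re-express a raw percentage grade on a curved, SAT-style scale.

    The means/SDs here are hypothetical placeholders, not real SAT parameters.
    """
    z = (raw_grade - raw_mean) / raw_sd      # distance from the intended average
    return scaled_mean + z * scaled_sd       # same relative position, new scale

# If grade inflation pushes the real median up to 87.5%, a raw transcript
# hides it, but the curved score shows how high that really sits:
print(to_normalized_scale(87.5))             # 625.0 on the curved scale
print(NormalDist().cdf((87.5 - 75) / 10))    # ~0.89, i.e. roughly the 89th percentile
```

The curved number tells you where you stand relative to everyone else, no matter how inflated the raw grades get.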
See, dammit RedLeader, you had to go and add more fuel to my soapbox fire.
The good old SAT.
Back in the early '90s, it didn't matter how shabby your high school grades were; if you cranked out a 1500, you pretty much had your pick of any Ivy League institution (an episode of Saved by the Bell comes to mind...).
But NOW, if you get a 1500 (or whatever the equivalent on the three-part version is...2250?), you would be VERY lucky to get into ANY Ivy, even if you were valedictorian.
Not to derail the thread into high school standards, but the same principle applies: if you make B+ the average score, then there are more ways to differentiate among below-average students (i.e. F, D-, D, D+, C-, C, C+, B-, B) than above-average students (who are either A-, A, or A+).
Following this line of reasoning, I actually think it would be better to make D+ the average, and thereby essentially turn the tables.
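Just to make the arithmetic explicit, here's a toy Python sketch of that grade-band count (assuming the usual thirteen-step letter scale, no E):

```python
GRADES = ["F", "D-", "D", "D+", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+"]

def bands_around(average):
    """Count how many letter grades sit below and above a chosen 'average' grade."""
    i = GRADES.index(average)
    return len(GRADES[:i]), len(GRADES[i + 1:])

print(bands_around("B+"))  # (9, 3): nine bands to sort the below-average crowd, three above
print(bands_around("D+"))  # (3, 9): tables turned
```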
And by all means, I'm well aware that this probably wouldn't be a wise move from the "business end" of universities, because...lol...who would want to apply to a college where the average grade (which I imagine correlates fairly well with the average student's allowance as a percentage of their parents' disposable income) is a D+?
