I went to Saint Anselm College in New Hampshire. It's a liberal arts college with about 2,000 students. They explain to all the incoming freshmen (and their parents!) that they don't do grade inflation. An average student is a "C" student. The bookstore sells t-shirts that say "C+ Better than Average!"
This is the way it should be everywhere. Grade inflation is nothing more than a way to beef up a resume. GPA is a number like a credit score: I feel it has quite a bit of bias behind it and doesn't tell you much about a person's character or ability as a student.
I believe the cause of this may be the use of student evaluations. Students don't like hard professors, and give them bad evaluations. To some extent, a professor's continued employment depends on having good evaluations. Hence, grade inflation.
In my personal experience, you can lose 20-30% of your evaluation based on hard grading. My reason for believing this? When I was a TA (i.e., "I don't write the tests, but I'll help you pass them"), I got typical evals of about 4.5. When I became a postdoc and started writing my own tests, my evals dropped to about 3. I don't think I became a worse teacher over the course of a single year, so I'm guessing my low evals resulted from not giving out easy tests.
(Not to mention that several comments on the evals claimed the tests were too hard. E.g., "bad professor, highest grade on midterm was 75 out of 110!")
This is an agency problem that is fairly easy to solve: standardize exams and evaluate professor quality based on VAM (value-added measures) instead of student opinion.
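(For the unfamiliar: VAM asks how much a professor's students outperform what their incoming scores would predict. A toy sketch of the idea; all numbers and function names here are made up, not anything standard:)

    # Toy value-added sketch: a prof's score is how far their students
    # land above or below the post-test score predicted from the pre-test.
    import numpy as np

    def value_added(pre, post, prof_ids):
        # Fit post ~ a*pre + b by least squares over all students.
        a, b = np.polyfit(pre, post, 1)
        residuals = post - (a * pre + b)
        # A professor's VAM is the mean residual of their students.
        return {p: residuals[prof_ids == p].mean() for p in np.unique(prof_ids)}

    pre   = np.array([60, 70, 80, 65, 75, 85])
    post  = np.array([70, 78, 88, 64, 72, 80])
    profs = np.array(["A", "A", "A", "B", "B", "B"])
    print(value_added(pre, post, profs))  # A's students beat the fit, B's lag it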
I believe the cause of this may be the use of student evaluations.
I would posit that evaluations aren't the problem, per se. The real problem is one of incentives: professors are hired and fired (otherwise called "denied tenure") based almost entirely on publication at most big public universities. Teaching plays a nominal role in hiring and promotion decisions. This gives professors an enormous incentive to optimize what they're going to be judged on -- research.
Evals are a problem because they only look at what students think of the prof. A more useful measure would plot eval scores against the class's average/median grade, compared against others teaching similar courses. Then you could correct for "easy graders" and "hard graders."
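A quick-and-dirty version of that correction, assuming you have a mean eval and a median grade per prof for similar courses (all numbers invented):

    # Regress evals on median grade across profs in similar courses;
    # the residual is the eval score with grading leniency factored out.
    import numpy as np

    evals         = np.array([4.5, 4.2, 3.1, 3.8, 2.9])  # mean eval per prof
    median_grades = np.array([3.8, 3.6, 2.9, 3.4, 2.7])  # median grade per prof

    slope, intercept = np.polyfit(median_grades, evals, 1)
    adjusted = evals - (slope * median_grades + intercept)
    print(adjusted)  # positive = better-liked than their grading alone predicts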
standardize exams and evaluate professor quality based on VAM instead of student opinion.
A lot of disciplines don't have standardized content, especially in the humanities and social sciences. Even in science and math, a lot of profs emphasize different topics and ways of doing things. Standardizing might create problems worse than the one it's trying to solve, without addressing the incentive issue.
Those of you waiting for a proposed solution from me are going to be disappointed: the only solution I can see is "incorporate teaching into hiring and tenure decisions," but I don't see a way to get universities to do that.
I agree with you that teaching matters little at Research I universities, at least for full profs. It does, however, matter to the army of adjuncts working there, as well as to profs anyplace besides Research I schools.
At Research I schools, the incentives for inflation are mainly about avoiding non-research work. An actual conversation I was involved in:
Student: "But how can you fail me? I'm graduating, and I won't get my math minor!"
Me: "A math minor is a signal to future employers that you can multiply two matrices by hand when asked, and perhaps even write a proof. You haven't demonstrated any ability to do those things."
Student's father: "How can you fail my daughter? She worked so hard for this!"
The dean's office: "Some student's father is complaining about you! Give her a retest!"
Student's father, about a month after she failed the retest: "When are you going to give my daughter a retest and a passing grade?"
Not all Research I universities: at MIT, a lot of full professors take teaching very seriously (in my admittedly limited experience, the ones not interested in undergraduates were the exception rather than the rule, although of course not all were able to turn that into great teaching). And there are no adjuncts to speak of, all are tenure track or tenured and all classes are taught by them (I know of one grad student exception in the '80s who proved the rule).
However, your point about research is spot on WRT the incentives: at MIT the undergraduates are potentially in the graduate students' ecological niche, and there's a formal Undergraduate Research Opportunities Program (UROP) that allows professors to hire them for credit or a "UROP minimum wage" ... pay that doesn't have overhead levied upon it (the program is used mostly for January and summer).
Also, a professor who's truly bad or impossible at teaching will not get tenure. And I've watched, or rather overheard (sort of an open-office setup), a department head force a tenured professor you've probably heard of to read every evaluation form submitted for a particular required foundational class, all but one of which (a special case) were highly negative, and then tell the professor he'd never be allowed to teach that course again.
As for the general topic, MIT's big grade inflation period seems to track pretty well with the Vietnam War and the draft: http://www.gradeinflation.com/MIT.html.
standardize exams and evaluate professor quality based on VAM instead of student opinion.
As a student I'm fine with standardized tests if you can absolutely guarantee that the tests test core understanding of the concepts and not just rote application of memorized algorithms and formulas. In other words, you have to guarantee that "teaching to the test" is a good thing and not a sign of poor priorities.
I can't guarantee that, but I also can't make that guarantee about non-standardized tests. Plenty of professors make crappy tests which are just rote memorization - that's a pretty easy way to improve your evaluations.
Forcing students to think on a test will only hurt your scores.
I guess the main issue is that if half the colleges increase grades, the other half looks bad. Maybe a new measure should exist, like your GPA relative to the school average.
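Something like a z-score would do it; a tiny sketch (the normalization choice is mine, not any standard):

    # Hypothetical school-relative GPA: distance from the school's mean,
    # in units of the school's standard deviation.
    def relative_gpa(gpa, school_mean, school_std):
        return (gpa - school_mean) / school_std

    print(relative_gpa(3.5, 3.4, 0.4))  # 0.25 at an inflated school
    print(relative_gpa(3.5, 2.8, 0.4))  # 1.75 at a tougher school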
Although there is clearly grade inflation, the correlation between student selectivity and average GPA should also be noted:
"As a rough rule of thumb, the average GPA of a school today can be estimated by the rejection percentage of its applicant pool:
GPA = 2.8 + Rejection Percentage / 200 + (if the school is private, add 0.2)
Non-selective public schools (typically with 15 percent rejection rates or less) with GPAs in the 2.8 range or less tend to have only modest grade inflation. Some have none."
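To make that rule of thumb concrete, just plugging in (my illustrative numbers):

    # The quoted rule of thumb as a function.
    def estimated_gpa(rejection_pct, private):
        return 2.8 + rejection_pct / 200 + (0.2 if private else 0.0)

    print(estimated_gpa(15, False))  # non-selective public: 2.875
    print(estimated_gpa(90, True))   # highly selective private: 3.45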
I went to one of the public schools where this data was gathered from. My experience with grade inflation is that it's mostly about money. If a class is deemed hard, students don't want to take it. If students don't take the class, the department can't make any money, and therefore can't pay the bills.
That, and a lot of professors just don't care anymore. They don't want to be bothered with teaching, so they just pass everyone without a second thought.
The gap between grade inflation at colleges and universities could represent a gap in the energy, motivation and determination of students to complain about their grades.
There's an asymmetry between the (nearly limitless) time and energy students have for complaining about their grades and the (extremely scarce) time professors have for addressing those complaints. This is anecdotal, but I should note that the trendline is inversely proportional to the demands on professors' time over the past few decades.
In the MIT chem department, one professor's tactic was to hold the student evaluations right before dropping the hammer with a tough final exam.
Carnegie Mellon's Average GPA is still a 2.8. Hasn't changed in a long time.
One thing I have noticed that could be interesting, though: is it possible that students are getting better at the material over time?
I'm a TA, so I have access to exams from past years. It is most definitely the case that the exams just keep getting harder. Exams from 5 years ago are complete jokes compared to exams these days, but students seem to be keeping up. It may be that they are just exposed to the material before college now. Any comments?
In the UK we are getting more PhD candidates than there were 10-20 years ago because the stipend that a student is paid is actually a living wage. It used to be that the PhD stipend covered the university fee only, and that to continue as a PhD student you needed some other income.
It's now seen as a reasonable career move - boost your skills, do something interesting, travel the world and get paid at a rate that most undergraduate students consider luxurious.
Makes sense. 10 years ago I hired a PhD. I was ashamed to make the offer because of the limitations of my budget at the time. Yet, it was still three times more than she was able to get in academia.
There is an interesting chapter in _Excellence Without a Soul_ by Harry Lewis suggesting that grade inflation is less of a problem than widely believed. You could look it up.
"On the other hand, Dye and Reck's meta-analysis (1988) revealed that limited variations of undergraduate GPA (e.g., GPA for the last two years, or for courses in the major field) can be more valid predictors of performance than overall GPA. In addition, Colarelli, Dean, and Konstans (1987); Wise (1975); Harrell and Harrell (1974); and Weisbrod and Karpoff (1968) all found positive relationships between grades and performance. "
"Although GPA has been widely analyzed, the research has produced inconsistent results. Some meta-analyses suggest that grades have relatively low validity as a predictor of job success (Bretz, 1989; Hunter and Hunter, 1984). Individual studies by Ferris (1982) and Schick and Kunnecke (1982) support the meta-analysis findings; these studies found no relationship between grades and performance evaluations."
In other words, the state of research on this question, back in the early 90s when this article was written, was "inconsistent".
I considered including that paragraph but decided against it, since your question was simply "Anybody know of any studies correlating GPA with job performance?" This article quotes five studies that show a positive relationship.
My point wasn't so much that the article supports the relationship, just that there are studies that do. Maybe I should have linked to the studies instead.
GPA is a measure of how well you do in school. The question is whether doing well in school correlates with job performance; obviously, you'd measure that by correlating GPA with some measurement of job performance (open question what that measurement would be).
OK, I'll bite ... the y-axis starts at 2.6, not 0, which visually exaggerates the differences. Regardless of the merits of this analysis, this graph is disingenuous.
Not all graphs should start at 0. In this case it might be a good idea, but if your data is all above 5,000 and moves a couple of points, you probably don't want to show 0, because it will be hard to see what is happening. A one-point change might be very significant.
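For what it's worth, here's the tradeoff in a minimal matplotlib sketch (made-up data):

    # Same made-up series, two axis choices: from 0 the change looks
    # negligible; truncated at 2.6 it looks dramatic.
    import matplotlib.pyplot as plt

    years = [1990, 2000, 2010]
    gpa   = [2.85, 3.0, 3.1]

    fig, (ax0, ax1) = plt.subplots(1, 2)
    ax0.plot(years, gpa); ax0.set_ylim(0, 4.0)
    ax1.plot(years, gpa); ax1.set_ylim(2.6, 3.2)
    plt.show()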