The idea that studying in Higher Education brings more opportunities, job stability, and income over one's lifetime is not to be dismissed.
That said, if the predominant value of education is in the pursuit of money, then I think using the Brookings Institution model (which The Economist references), which includes two-year and vocational education entities, is a much more pragmatic and practical approach. That is, if money is the goal, only focus on money as the outcome. Conflating money with prestige is, to me, foolish. A University degree is a prestige degree, insofar as there's a pursuit of knowledge, in theory, to produce a well-rounded, educated person...in theory.
I say this as a graduate of two generally highly ranked Universities, who occasionally gets the feeling that I'd be making more money if I'd taken my education budget, got trained in HVAC service/repair/sales, and started my own company.
Even if the predominant value of education is the pursuit of money, the Economist still has the more useful model. It allows you to differentiate between schools whose graduates make money because they are full of people who are going to try to make money, and schools whose graduates make money because the school brings something to the table that helps them above other options.
Look at CalTech for example. It does well in the Brookings model because it attracts a lot of people who want to be engineers. It does poorly in The Economist's model because its engineers are not making as much in 10 years as would be expected given a variety of factors that they control for. (Now that said, CalTech may be hurt because they graduate a lot of people who take side tracks through grad school and presumably would make more 15-20 years out. But you have to work with the data you have, and all they had was 10 years.)
Now if I'm a student planning to go into engineering, the Brookings model merely confirms that I'm making an excellent choice. The Economist's model suggests that if I'm capable of going to CalTech, maybe that is not the best school for me.
After reading through your response, I believe you missed my point. What I was referring to is that for a large population of potential students - those who wish to simply make more than if they do not go to a higher education program - going to a two-year or vocational school is exceptionally smart. Leaving that out of consideration is...not smart...highly biased...as in, how many readers of The Economist would consider sending their kid to become a diesel mechanic?
From the research I've absorbed over the years, students who can afford to go to prestige universities often come from well-to-do, college-educated families - these are strong indicators of future success. A much larger swath of the population trying to advance their stake in life are first-generation students, and they are often not well-served by taking on debt and attending a large University environment. Thus, many Universities are, practically speaking, very bad at ROI for a large contingent of students who simply want to make more money - vocational training? An excellent prospect.
What remains to be seen is how much more earning potential there is in professions that require vocational training to be licensed or successful (e.g. welding, machining, air traffic control, plumbing, electrical contracting) as the generation currently working retires / dies from the workforce, thereby spiking demand, and in fields where a "traditional" University degree is, well, useless.
Read the article again. The Economist controlled for a variety of factors. Family socioeconomic status was among them.
That said, one of the shortcomings of their metric is that they only had data on students who took out federal loans. How predictive this result is for others is open to question.
Well, then, it's even more glaringly bad form to leave two-year and vocational schools out of their approach if socioeconomic status is a factor, now isn't it? My perspective is yes, it's bad. It seems we'll just keep talking around each other, but I'll stand by my thumbs up toward the Brookings version and give the hairy eye to The Economist's method.
As a former Caltech undergrad it's hard to really disagree with the conclusions. I wouldn't be surprised, given what I've seen / read, if this changes dramatically over the next decade based on changes that have already taken place.
Median is also possibly not what an incoming Caltech (or MIT, or Harvard) student should optimize for though given that many of them will have relatively safe downsides. Mean for Caltech might be quite high (I don't know for sure). But given that I overlapped with multiple people who have certainly made large sums of money since graduation (about 10 years ago), I'm thinking it might look favorable for Caltech with a small student body.
Worth noting, based on a paper linked from the pypy 4.0 thread[0], that maybe the fairest approach would be a geometric mean of salaries instead of an arithmetic mean.
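Roughly, with invented salary numbers, the difference looks like this (a sketch, not the paper's method):

    import numpy as np

    # Invented salary sample for one school's graduates.
    salaries = np.array([42_000, 55_000, 61_000, 78_000, 95_000, 450_000])

    arithmetic_mean = salaries.mean()
    # Geometric mean = exp(mean(log(salaries))): one outlier salary pulls
    # it up far less than it pulls up the arithmetic mean.
    geometric_mean = np.exp(np.log(salaries).mean())

    print(f"arithmetic: {arithmetic_mean:,.0f}")  # dominated by the $450k outlier
    print(f"geometric:  {geometric_mean:,.0f}")   # closer to the typical salary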
Yeah, and the UT San Antonio nursing school is gangbusters! That totally makes sense to me - cheap to attend / finance, extremely high earning potential in context.
This tool would be a lot more useful if it allowed for filtering by chosen course of study. For example, University of Washington has a median earnings that is slightly below expected, but I can pretty much guarantee that their computer science graduates are earning well above median.
That was my thought too. To make a simplistic example, if a university had both, say, an engineering school and an art school, it might presumably do worse than a university with only an engineering school. So this metric might favor smaller, focused schools which happen to concentrate on education areas with high median salaries...
> So this metric might favor smaller, focused schools which happen to concentrate on education areas with high median salaries...
I don't think that part's necessarily true. If a school focuses on an area with high median salaries, the model will take that into account in the predicted salaries, so the school will have to have even higher actual salaries than typical for the field (and its input demographics, SAT scores, etc.) to get a positive value-add. See Caltech for an example of a STEM-focused school that does badly by this measure: from its SAT scores, demographics, and heavy concentration of STEM majors, the regression analysis predicts that it should produce graduates with a median salary of $82k. But the actual median is $74k, so its value-add is taken to be -$8k.
Some of the schools that do well are in areas with poor salaries, but score highly because they do better than you'd expect (or than the model would expect, anyway) for that area and student demographics. Otis College of Art and Design has a predicted salary of $29k from the regression analysis, but actual median is $42k, so implied value-add +$13k.
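To make the mechanics concrete, here's a minimal sketch of that kind of value-add calculation with invented school-level numbers (The Economist's actual covariate list and regression are more elaborate):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Invented school-level table standing in for Scorecard-style inputs.
    schools = pd.DataFrame({
        "name":     ["Tech A", "Art B", "State C", "Ivy D",
                     "Poly E", "Faith F", "City G", "Design H"],
        "earnings": [74_000, 42_000, 48_000, 83_000, 69_000, 37_000, 51_000, 44_000],
        "sat":      [1530, 1080, 1150, 1510, 1380, 1010, 1190, 1100],
        "pct_stem": [0.95, 0.02, 0.30, 0.45, 0.80, 0.05, 0.25, 0.03],
    })

    # Regress actual median earnings on student-body covariates, then treat
    # the residual (actual minus predicted) as the school's "value-add".
    fit = smf.ols("earnings ~ sat + pct_stem", data=schools).fit()
    schools["expected"] = fit.fittedvalues
    schools["value_add"] = schools["earnings"] - schools["expected"]
    print(schools[["name", "earnings", "expected", "value_add"]])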
Interestingly, having gone to Caltech, I suspect that its extreme focus on STEM research actually hurt it here. Mostly because that focus results in a very large portion of undergrads going on to grad school (much larger % than any other university), and grad students don't earn very much.
Hi there, Mike! Care to expound on any solutions to make Tech more industry oriented? Are there things you'd rather have been exposed to more in your undergrad education given where you're at now?
Not all the majors at Caltech are going to be value adds, and of those that are, they are generally leading to bench/engineering rather than business dev/marketing/executive positions.
It assumes you select a school that is focused on your field of choice, so if you pick an art school, your preference is for art, and if you pick an engineering school, your preference is for engineering.
The issue is that you can't select the University of Washington's engineering school. You can only see University of Washington which has both an engineering and an art school, and so it doesn't tell you what the data will be if you only look at engineering students.
Any time data is shown in an article, it's best to provide access to the raw data so anyone can analyze it from their own angle. When I see static graphs and analysis, I smell distorted facts. (Of course, in this case you can do a search, but giving access to the raw data would be more useful.)
Something feels "off" about a methodology that basically ranks all of the "top" state schools as poor values.
Take the top-3 state schools in Virginia (by most other rankings, that's UVA, W&M, and VT). All three are nationally recognized. And all three are competitive enough that you need very good grades and a solid set of extracurricular activities to be accepted.
UVA ranks the lowest of the three, yet has the highest actual earnings. VT is ranked significantly higher with the middle earnings value.
What does a student do with this? Cost of attendance at the three is similar. Should they attend VT with its higher ranking, despite lower average earnings?
Similar comments can be made about UNC-CH, UW-Seattle, UT-Austin.
Also lacking from this analysis seems to be the loan burden borne by students. Georgetown and Villanova both rank very high in this list. But, both are insanely expensive to attend. Even with high actual earnings, it could take a decade or more for many students to pay off a potential six-figure loan.
I don't think the Economist's ranking can be used for choosing which school to attend. It tells you how good a job schools do at boosting the earnings of the people who attend there, conditioned on who attends. In other words, they tell you how much the school does for a typical member of its student body. But if you're not representative of that student body (and you certainly won't be representative at many of the schools, especially the outliers!), this will not tell you how much the school would do for you.
I guess I just find it extremely hard to believe almost all of the nation's "top" state schools have a negative impact (actual earnings less than expected) on the students that attend.
Maybe I need to dig into the source of the expected earning figures. I'm definitely biased to some degree, being a graduate of one of the state schools with a poor ranking - I'm sitting here wondering where else I could have gone to get a better value.
The methodology is trying to remove the admissions department's effect on student success, leaving behind everything else (faculty, peers, course structure, university prestige, money vs prestige focus, etc).
Income is a poor measure of college education. The problem is that the valuable benefits of college are difficult to quantify; income has importance and is quantifiable, so I understand the temptation to use it as a metric, but it doesn't represent the value of a college. It's like using the number of lines of code as a metric to represent the value of a piece of code - it's quantifiable, significant in some ways, but not representative of the code's value.
I much prefer the Times Higher Education model, especially their reputation survey. They survey 10,000 tenured and published academics worldwide, using what looks like well-designed methodology [1]. These are people in a position to have expert knowledge about the qualities of various universities. Yes it's imperfect and those people have bias too, but I can't think of a better model:
One excellent alternative is Washington Monthly's rankings. Their approach:
We rate schools based on their contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country). More here:
The methodology feels odd. The results are essentially just errors in their prediction model. They're assuming that the model errors represent the school's contribution to median income. They're also assuming that those contributions are normally distributed around zero with constant variance.
I understand why they made those assumptions but they're almost certainly not true, which isn't exactly ideal. Then again I can't quickly think of a better way to do it without more granular data. Maybe use a mixed effects model with multiple years of data from each school....
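For what it's worth, a random-intercept-per-school model on a multi-year panel might look something like this (synthetic data, invented column names, just a sketch of the idea):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic multi-year panel standing in for several cohorts of
    # Scorecard-style earnings data per school.
    rng = np.random.default_rng(42)
    rows = []
    for i in range(30):
        school_effect = rng.normal(0, 5_000)      # the "value-add" to recover
        sat = rng.uniform(1000, 1550)
        for year in range(2011, 2016):
            rows.append({
                "school": f"school_{i}",
                "sat": sat,
                "earnings": 20_000 + 30 * sat + school_effect + rng.normal(0, 2_000),
            })
    df = pd.DataFrame(rows)

    # Random intercept per school: with several cohorts per school, the
    # school-level deviation is a shrunken random effect rather than a
    # one-shot residual read off a single year of data.
    mixed = smf.mixedlm("earnings ~ sat", data=df, groups=df["school"]).fit()
    value_add = {s: eff.iloc[0] for s, eff in mixed.random_effects.items()}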
There seems to be a major issue with cost-of-living adjustments as well. Several rather obscure California colleges are highly ranked, and I assume this is because both wages and costs are higher in California, where many of their graduates stay after college.
Absolutely. Cost of living in an area will have a massive impact on exit salaries. The Economist did not realize that the South and Midwest have the lowest cost of living when they wrote the following:
> Its upper tiers are dominated by colleges that emphasise engineering (such as Worcester Polytechnic) and attract students with high SAT scores (like Stanford). The lower extreme is populated by religious and art-focused colleges, particularly those in the south and Midwest. This number represents the benchmark against which we subsequently compare each college’s alumni earnings to produce the rankings.
> There seems to be a major issue with cost-of-living adjustments
Yep yep. Otis College of Art and Design "overperforming" for $42k salary is ridiculous. The title Junior Graphic Designer pays around $40k* in Los Angeles, where Otis is located. Therefore, the average for graduates is as-expected. *Source: Glassdoor.
I'm not so sure. My generally-highly-regarded college in California (bay area, no less!) is almost bang-on average ($-33) despite being in a very high cost of living area.
To reply to myself, the comments include this mention from "DR" of The Economist:
-----
Geography (both city and state) are variables included in our model. Colleges in the Bay Area are not rewarded for the high salaries available there--they have to surpass a higher "bar" of expected earnings. That is why our eighth-ranked college is in West Virginia, and the ninth in Laredo, Texas.
------
Now, I don't know how they compensate for this. Cost of living at the place where the student ends up? Or where they study?
If you study in a high cost of living area but move to Podunk, Iowa, your return will be far lower than expected, if they're only controlling for CoL at the college's geographical location.
If they control for CoL at the student's new home, then okay.
The Brookings study [1] mentioned by the Economist offers this interesting analysis:
1. Graduates of some colleges enjoy much more economic success than their characteristics at time of admission would suggest. Colleges with high value-added in terms of alumni earnings are often focused on training for high-paying careers in technical subjects. ...
2. Four college quality factors are strongly associated with higher earnings for alumni:
Curriculum value: The amount earned by people in the workforce who hold degrees in a field of study offered by the college, averaged across all the degrees the college awards;
STEM orientation: The share of graduates prepared to work in STEM occupations;
Completion rates: The percentage of students finishing their award within 1.5 times the normal time (three years for a two-year college, six years for a four-year college);
Faculty salaries: The average monthly compensation of all teaching staff
3. Value-added measures are fairly reliable over time. ...
(Personally, I think the most valuable things gained in college or in any education don't happen to have much monetary value.)
Well, I'm glad my shitty school beat Yale (almost last) so handily. Really, I don't think they could have come up with a worse metric. If you simply took each school's equivalent of this:
So basically, the methodology is that they do a very complex analysis of the student body to predict what their average earnings would be if they went to "college in general" and then compare that with what the graduates actually make.
There's one thing that jumps out at me when I see the actual rankings:
At least three of the eight schools that received a perfect 100 score -- W&L, Babson, and Bentley -- have a very, very high number of students who go to work for lucrative family businesses immediately upon graduation. In the case of W&L (which provides a great liberal arts education mind you) it's mostly southern good ol' boy/girl types. At Babson / Bentley (which provide great undergrad business educations), a huge chunk of the student body are the scions of the economic elite in developing countries, who are getting schooled up so they can be ready to be put in charge of something at a young age.
My hunch is that the Economist's "expected earnings" methodology wasn't granular enough to take these idiosyncratic attributes into account, and that its r-squared of .85 might not be a rigorous number.
Note that the data set they had only includes students who got federal loans. I strongly doubt that students who go to work for lucrative family businesses immediately upon graduation will be in the data set.
But having classmates who look like that seems to be a really good economic choice! Which is a factor that is not apparent in normal college rankings.
This is hilariously bad. Schools that produce academics as opposed to those who go into high-paying professions are penalized, when in fact (at many of the top institutions) producing academics is one of the main goals.
That doesn't make this ranking list bad. The rankings system explicitly explains, in painstaking detail, that the system is designed to measure return on degree, empirically. That means they must use empirical data.
Conversely, schools with a monomaniacal focus on industry do rather well. I'm specifically thinking of Rose-Hulman - an undergraduate-only engineering school - ranked at 23rd.
Also, a very similar argument to what you're making explains why I don't think that the gender wage gap is a big issue.
But it is a serious flaw in the methodology. It says nothing about the quality of education or the boost in earnings that a college will provide, if you are specifically going to a school to do so. At the very least they need to somehow account for individuals that go to the top schools but aren't looking to go into lucrative careers.
CalTech is a common example being cited here. It is basically impossible to deny realistically that going to CalTech is a great way to make a lot of money. But it is in the bottom percentages of these rankings, likely because so many CalTech alums go into academia.
The metric is supposedly showing which colleges can potentially increase your earnings the most, but it just isn't doing that. It's a great idea, it just hasn't been implemented fully.
I think the money thing is paralleling something else entirely - given the supposed caliber of student that comes into Caltech, is Caltech really making them better scientists than if they had gone elsewhere (or at least better scientists than they were turning out in the 1970s)? That to me is an open question.
Yes, but Caltech students have some of the highest rates of going into academia. Which pays well on a general scale, but compared to what engineers and developers go on to earn, it drags the stats down.
They do admit that the metric only gauges financial success and is imperfect when it comes to accounting for alternative priorities. But do you seriously believe that CalTech provides one of the worst "financial bumps" even accounting for the quality of their admissions? I find it incredibly hard to believe.
It's not at all clear that every Caltech student is going to be an engineer or developer. Recall Jim Simons getting booted out from Princeton because he sucked at programming?
If I ranked colleges based upon something silly such as the length of the college's name, that would also be "one of many ways to rank colleges" but it would also be "bad" in any traditional sense of the word as applied to our problem of ranking colleges based upon educational quality.
This analysis doesn't make any sense to me. Even by their own intent (i.e., best added value based on salary), their methodology is nonsensical.
For example: Why should CalTech get hurt for being near L.A.? They're basically arguing that you're better off going to a school in the middle of nowhere because "hey, for being in such a crappy location, you did pretty well!". In an absolute sense you are better off going to CalTech, it's just that they might not leverage their advantage as well as some other schools.
Not that they even show the last point -- it seems unlikely that the true model is linear (I'm guessing they used linear regression). For example, if the true model is closer to a sigmoid, then schools at the high end suddenly get unfairly penalized and schools near the low end get unfairly boosted.
Finally, the statistical indicators are equally misleading. I can obtain an R^2 of 1.0 just by including indicators I[is COLLEGE_NAME] for each college. While that might not give you significance, the point is that getting good prediction is meaningless.
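To illustrate that point with toy numbers: a per-college indicator term fits school-level earnings exactly, so the "prediction" is perfect and the residuals carry no information at all:

    import pandas as pd

    # One row per college, invented earnings. Predicting each college with
    # its own indicator just returns its own observed value: R^2 = 1.0,
    # every "over/under" residual is zero, and we learn nothing.
    df = pd.DataFrame({
        "college":  ["A", "B", "C", "D", "E"],
        "earnings": [74_000, 42_000, 91_000, 38_000, 55_000],
    })
    pred = df.groupby("college")["earnings"].transform("mean")  # per-college dummy fit
    resid = df["earnings"] - pred
    r2 = 1 - (resid ** 2).sum() / ((df["earnings"] - df["earnings"].mean()) ** 2).sum()
    print(r2, resid.tolist())  # 1.0, all zeros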
I think what they really want is to restrict to predictors about the students. So, given that you're a straight A student with a 2400 SAT score, what would you expect to make coming out of each school? This at least tells me something about the added value to me of going to a certain school. (This approach is still prone to bias, but in the opposite direction -- there's a chance that the straight A student with a 2400 SAT score going to community college may have been smart but unmotivated, which might correlate with lower salary.)
Edit: Here's another concern. They're claiming to have a model for "expected" earnings:
earnings = A * (college covariates) + b + error
but they can't distinguish between model error (i.e., error because their model is misspecified) vs. the school variation that they are trying to capture.
> They're basically arguing that you're better off going to a school in the middle of nowhere because "hey, for being in such a crappy location, you did pretty well!".
Well, no, not exactly. It's a subtle distinction, but what it's actually ranking is how well that school exceeds expectations, not best outcomes. This is not necessarily a list that will give a student the best school to go to, but rather (what it says on the tin) a scorecard for how well those schools are doing, given their resources.
That's my point -- who decides what expectations are? Their results are incredibly dependent on the model specification. I imagine if they changed which indicators they used, the results would vary widely.
Here's another way to see my concern. Suppose you had a classifier that achieves 1.0 R^2; then since it perfectly predicts each school's expected value, it'll assign each school a score of 0. I'm pretty suspicious of an approach where the results get worse with better predictive power.
Even if you want to do "exceeds expectations", I think you shouldn't include variables that are school specific, only variables that are student specific. In other words, for my expected outcomes, which school is best?
> Here's another way to see my concern. Suppose you had a classifier that achieves 1.0 R^2; then since it perfectly predicts each school's expected value, it'll assign each school a score of 0. I'm pretty suspicious of an approach where the results get worse with better predictive power.
This depends on your view of the importance of undergraduate education, and what worse is. From my point of view, undergraduate education is an institutional obligation used to fund or justify faculty's personal objectives: research.
The reason that the model counts location is simple: universities tend to place candidates locally. I'm pretty sure the recruiters attending fairs at Stanford and Berkeley have higher starting wages than the ones at University of Kansas, and that a lot of that difference is simply regional cost of living. If you don't factor that in, you risk a bad school in an expensive place ranking higher than a good school in a cheap place.
> Suppose you had a classifier that achieves 1.0 R^2; then since it perfectly predicts each school's expected value, it'll assign each school a score of 0. I'm pretty suspicious of an approach where the results get worse with better predictive power.
If I'm understanding correctly, that result would indicate a world where the college you attend has no effect on your earning power. ie. choose any college you want, because you'll earn the same amount regardless of which one you choose.
This would only apply to colleges that people in your demographic group actually attended though. If the dataset doesn't contain any information about people like you who went to Harvard, then maybe Harvard would indeed increase your earning potential if there was a way for you to actually go there.
I'm not saying that each college you attend has no effect on earning power. It's just that I can perfectly predict the effect of each college on your earning power. Does that make sense? If I have an oracle that tells you
"if you got to Harvard, you will make $80,000, if you go to MIT, you will make $86,000",
and the oracle is exactly correct, then under this model, The Economist assigns every college a score of 0.
I think you are missing the key ingredient in the analysis.
The Economist is attempting to build such an oracle via statistical regression. HOWEVER, the Oracle is intentionally limited in input to a specific list of things: SAT scores, sex ratio, race breakdown, size, public or private, earning power in the city where it is located, etc.
The things that are omitted constitute the actual value the University brings to the table: quality of teachers, instruction, organizations on campus, etc. (1)
So however far off the model is for two given Universities must be explained by all the missing inputs, i.e. largely how "good" the University is.
If the Oracle was able to perfectly predict your earning power given that limited set of inputs, then it would basically mean that a University is completely defined by SAT scores of students, sex ratio, race, etc. and there's absolutely no value they add or subtract beyond that. That would be a very, very interesting result. But you can see why it's unlikely.
Hopefully this makes sense?
(1) Of course it's possible that there are factors like "how many trees on campus" or "how many vowels are in the name" which might also affect earnings. But we can probably agree that it's less likely to be important than the aforementioned ones ("quality of instruction", etc.).
> and the oracle is exactly correct, then under this model, The Economist assigns every college a score of 0.
If what you are saying is true, then I agree that the ratings are nonsensical. But I don't think you are correct. Their methodology consists of comparing two numbers:
1. the observed earnings
2. the estimated earnings if the same students had studied elsewhere (presumably some average)
So if we had your oracle, the numbers 1 and 2 would be different for the two colleges, not the same.
Does anyone know of a good DIY college ranking system? One where you can pick the factors that actually matter to you. I'm always happy to see options other than U.S. News but I'd love to be able to tweak the algorithm myself.
I think these one size fits all rankings are all flawed by their inherent nature.
The economist article linked here references (and uses in their rankings) data from https://collegescorecard.ed.gov/, which is put out by the US Dept of Ed. Looks like there's some additional metrics in there that may be useful.
If I were a prospective student interested in making a lot of money out of college, surely I'd still just follow this very simple model:
Take all the schools I could get into, look up their actual median earnings, and go to the one which is highest (let's ignore issues of cost of attendance).
If it turns out that the one which is highest isn't ranked high on this particular list, I don't see why that would suggest I still shouldn't pick that school. Imagine these are my options, for instance:
(A) A school with expected earnings of $90k and actual earnings of $75k.
(B) A school with expected earnings of $50k and actual earnings of $55k.
(B) will rank vastly higher than (A) in this study, but I'd pick (A) over (B) every time if I just wanted money.
In short, I don't actually see the value add here from this list. How am I supposed to act on these rankings? How are these rankings supposed to change any idea I might have about which school I should attend?
It seems if you want to know which school to attend based on earnings we already have much more reliable data for that: actual earnings data.
The problem with your method (which this analysis is attempting to correct, though it may not succeed) is that the difference in actual earnings between the schools may be due to the quality of the students admitted to the school rather than to any increase in earnings that the school causes. For example, if your options are
(A) A school that admits 1000 people with IQ of 200 who go on to earn an average of $100,000 and 100 people with IQ of 50 who go on to earn an average of $1,000
(B) A school that admits 100 people with IQ of 200 who go on to earn an average of $1,000,000 and 1000 people with IQ of 50 who go on to earn an average of $10,000
(A) has median earnings of $100,000 while (B) has median earnings of $10,000, so by your criteria (A) is much better. However, (B) has much better outcomes for both groups of students.
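The medians are easy to check with a couple of lines (using the hypothetical admit groups above):

    import numpy as np

    school_a = np.array([100_000] * 1000 + [1_000] * 100)
    school_b = np.array([1_000_000] * 100 + [10_000] * 1000)

    print(np.median(school_a))  # 100000.0 -- looks "better" by raw median
    print(np.median(school_b))  # 10000.0  -- looks "worse"
    # Yet school B's graduates out-earn school A's within both ability
    # groups; the raw median rewards selective admissions, not value added.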
>"The government generated the numbers by matching individuals’ student-loan applications to their subsequent tax returns, making it possible to compare pupils’ qualifications and demographic characteristics when they entered college with their salaries ten years later."
So the dataset excludes students who do not apply for loans? (i.e., this analysis penalizes schools who admit the folks most likely to make lots of money, and the schools that have the lowest expected student contribution.)
>"...based on a simple, if debatable, premise: the economic value of a university is equal to the gap between how much money its graduates earn, and how much they might have made had they studied elsewhere."
If we are only to look at financial incentives, a more reasonable analysis would be to compare the expected future earnings _distribution_ as opposed to just a central tendency statistic like the median.
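A quick sketch of what comparing distributions rather than medians could look like, with simulated salaries (the distributions and parameters are invented for illustration):

    import numpy as np

    # Simulated salary draws for two hypothetical schools.
    rng = np.random.default_rng(0)
    school_x = rng.lognormal(mean=11.1, sigma=0.4, size=5000)
    school_y = rng.lognormal(mean=11.0, sigma=0.8, size=5000)

    for q in (10, 25, 50, 75, 90):
        print(f"p{q:>2}: X={np.percentile(school_x, q):>9,.0f}  "
              f"Y={np.percentile(school_y, q):>9,.0f}")
    # Two schools can have similar medians while offering very different
    # downside risk and upside potential, which a single median hides.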
> So the dataset excludes students who do not apply for loans? (i.e., this analysis penalizes schools who admit the folks most likely to make lots of money, and the schools that have the lowest expected student contribution.)
It penalizes (or at a minimum, undersamples in a nonrepresentative way) schools with their own, non-loan, aid programs.
This probably very badly hurts schools like Caltech, since it means that there (unless Caltech no longer has the essentially full-coverage need-based aid they had when I went -- unsuccessfully, I graduated elsewhere) they are only counting people who are making the unwise choice of taking loans when they have no need given their current resources.
That this will skew the data badly should be obvious.
Axlines were suspended some time ago. I think they are still proactive with need-based aid, but with their tuition skyrocketing they can't possibly be as gracious as they once were.
I am a proud alumnus of Rice University and I see that it is on page 64, fifth from the bottom. Also, even more surprisingly, very near the bottom is Caltech.
This is obviously wrong. If their methodology says that Caltech is in the bottom 2% of US colleges, then one concludes that their methodology is worthless, or at least that what it's predicting is not very closely related to the quality of the education provided. I suspect that many Caltech undergraduates decide to pursue grad school and academia, which is not an especially lucrative career, and which is probably not highly correlated with political leftism or "reefer madness".
I can also tell you that my friends at Rice who were looking to make a lot of money after they graduated, by and large succeeded.
"The analysis goes against my gut feeling, therefore it's wrong."
Did you bother to read the article?
The Economist’s first-ever college rankings are based on a simple, if debatable, premise: the economic value of a university is equal to the gap between how much money its graduates earn, and how much they might have made had they studied elsewhere.
That's what they did, and those were the results. They even say it's debatable. Using a different analysis or metric, the results would be different.
>In summary: Bullshit.
As opposed to the anecdotes regarding your Alma Mater and rich friends.
I think he has a point, though. This study doesn't seem to control for an incredibly important variable: student interests.
If one school tends to attract students interested in making a lot of money, and another school tends to attract students interested more in academics and graduate school, can one really conclude that the former adds more value? Perhaps, but certainly not without taking into account the different student interests and inclinations towards money-making.
They do control for field of study, but I don't think this is sufficient. A school that attracts a lot of STEM majors who go to graduate school instead of industry is pretty much going to get destroyed in this study.
Yes, I did. In suggesting that their methodology is "debatable", they are indeed hedging their bets. But they are still advancing the theory that, after looking at the data, one could plausibly accept their model.
My "anecdotes" consist of four years as an undergraduate at a university (Rice) rated in the bottom 0.5%. I know damn well that Rice prepared the vast majority of students to be extraordinarily successful in the career paths they chose. I also know that my education was tremendously valuable for its own sake, and that the same was true for my classmates.
And as an academic now teaching at the University of South Carolina, I can only wish that we offered our students what was offered at Rice.
It is true, as a Rice graduate I am biased. But nevertheless I do not believe that I am stretching when I assert with confidence that, according to any metric which measures anything worth measuring, Rice lies somewhere in the top 99.5% of universities in the US.
I think the premise of the analysis is good (at least if you are optimizing for income), but the actual implementation is questionable simply because they don't have access to enough data about the students.
Rice is definitely on the way up - of any school that has a claim on being Harvard of the South they are the one that has made the most progress in terms of endowment and snagging promising researchers.
As a Caltech grad, I'll give my two cents. They are afflicted with the worse version of the same problem MIT has - "working for Harvard men". Who besides Adam d'Angelo has made it really independently big in the tech world from Caltech? They are turning out somewhat eggheadish, very good Google engineers, but without the reorientation towards entrepreneurship that MIT at least has with Sloan, the Media Lab, etc. Over the last 10 years, student life has become progressively more circumscribed as the administration seeks to make campus a tamer version of Princeton. Caltech's purpose was to produce scientific leaders in the mold of Oppenheimer, Mark Wrighton, et al. - people who could both do research at a high level and manage institutions, while providing a useful counterweight to the institutional drift that characterizes NSF and NIH review panels. It's sad that the college that prides itself so much on the elite and supposedly self-selected nature of its student body, and touts the supposed freedom and creativity offered, doesn't trust its students. It may be that the faculty collectively take them less seriously than in the past, I don't know.
I highly doubt the kind of people who would flourish at Caltech care what the Economist thinks. Which is a pity in some sense, because they should care about what the stereotypical Economist reader thinks.
"The bar is set extremely high for universities like Caltech, which are selective, close to prosperous cities and teach mainly lucrative subjects. If their students didn’t go on to extremely high-paying careers, the college would probably be doing something gravely wrong."
Isn't that an inherent flaw in the system? It is also based on the assumption that everyone prioritizes money over everything else. If I worked in the Yale marketing department I could easily spin this as being caused by the high character of Yale graduates, who are more willing to dedicate their lives to lower-paying but higher-impact jobs like academics or public service, while those greedy kids from Harvard only ever chase money.
Every other publication's college rankings prioritize "everything else" over money. The Economist just came up with a different ranking based on expected value.
It's not a problem that their ranking is based on optimizing for income, since they state upfront what they are ranking on. However, it could be a problem that not all of the students included in the data are optimizing for that. A school that was very good at helping students increase their potential income, but that attracted mostly students who aren't interested in maximizing income, would be ranked low even though it should receive a high rank.
I dunno, how about pursuing careers as scientists rather than as hacks for the finance industry? Is that "gravely wrong"?
And as far as being close to a prosperous city -- that is because Caltech is near the world hub of the entertainment industry. A Caltech education is mostly irrelevant to finding employment there.
That's not what the rankings are ranking schools by. It's entirely about value v. expected value. For example, say College State University had an expected income of $0, and the median income of their alumni was $15K. That would be a top 10 school by this methodology, despite no sane person thinking that it would represent an upgrade over going to a school with an expected income of $75K and a median income of $74K.
Think of it as a supplementary rankings system, rather than a definitive one.
The biggest problem with the methodology, to me, is it doesn't consider average cost of attendance to students at all.
Yale is also in the bottom bracket. I imagine at least Yale and Caltech have similar problems: attracting people who could be bankers (Harvard) and sending them into liberal arts (Yale) or engineering (Caltech).
It's important to distinguish that this system is not rating graduates, but institution performance. Sort the table by expected earnings, and Rice is on the front page. Students admitted to Rice are smart and successful people the moment they step into classes on day one.
The question they're trying to answer is: once a smart and successful person chooses a college, how much value does that college add? Looking at that same table sorted by expected earnings, the actual earnings are quite varied. MIT graduates earn $8k more than expected, and Rice graduates $10k less. That's quite a difference!
There are many reasons this dataset is flawed, but not for any of the reasons you mentioned:
1. Financial aid applicants are not a representative sample of students.
2. Highly selective schools have a very high bar to exceed.
3. Students may prefer other opportunities that pay less, like civil service, or politics.
4. Earnings are only tracked for ten years (standard loan repayment).
I also question how earnings are calculated. For example, because I max out both a 403b and a 457b retirement account, my earnings are taxed as if I earned half of what I actually do. Does my double savings opportunity affect the calculation?
I definitely agree with you. I wasn't trying to explain how I think the article is flawed; rather, I was just trying to argue that its conclusions can be dismissed immediately.
In particular, I believe that your (3) describes what I see as the biggest flaw with their methodology: that (at least for elite colleges) what they are largely measuring is the proportion of students who choose to go into high-paying professions.
I just find your kneejerk response of "Anything that rates my school poorly is wrong" and pocket dismissal is counterproductive to actually rating school performance. Selection bias is an actual, demonstrable problem in school rankings, and coming up with some way of separating the effect of a school's selectivity from its educational results is critical if we wish to improve the productivity of educated professionals.
I actually don't think that 3 has much explanatory power. Rather, I think the bulk of the variance is beyond 10 years. Yale's strongest suit is likely professional programs like law and medicine. These are high paying professions, but they require additional years of education beyond a 4 year bachelor's, and that subtracts earning years from the 10 year cutoff.
I don't know much about Rice, but I suspect its particular problem is being a highly selective school in a low cost-of-living area. If an average Rice CS student were randomly selected into other schools with comparable SAT scores, they'd be going to a much more expensive region, and their controlling for location may not sufficiently model how a CS degree differs from community college nursing.
> I just find your kneejerk response of "Anything that rates my school poorly is wrong" and pocket dismissal is counterproductive to actually rating school performance.
I see what you are saying. If their methodology ranked my school in the middle of the pack (instead of in the top 2%, which I believe it deserves), saying in effect that the advantages of a Rice education were those that its students brought with them, then I would be annoyed but I can see how that could be fair.
But the bottom 0.5%? I went there and I know better. (And as an academic I know a lot about how universities work and how they differ from each other.) I do feel that I am justified in calling bullshit immediately.
> instead of in the top 2%, which I believe it deserves
You would have to know how undergraduate education works in far more institutions than the average academic attends to have experiential data affirming that hypothesis. This is a problem in consulting academics about undergraduate college rankings: they know the researchers far better than the graduating classes, and generally rate one another that way.
> an academic I know a lot about how universities work and how they differ from each other
Which means you helped bring the average down, because grad school is 4 or 5 years of crap earnings.
As a long-time Texas resident and qualified applicant for Rice University who likely could have attended (if I hadn't refused taking SAT IIs), I'd also like to note your esteemed alma mater is likely the most picky University in all of Texas, and, if US News' numbers are correct, just a shade less picky than Johns Hopkins. In short: Selection bias is not to be ignored.
If you sort by Median Earnings, Caltech shows up near the top.
we ran the scorecard’s earnings data through a multiple regression analysis
So they ran some machine learning algorithm; the Over/Under column just shows the errors in their final model. If their choice of variables were perfect, then every school would have an over/under of $0. Some metrics are just not linear and not easy to get (school reputation, networking power?).
In the end this is not a very valuable metric; who cares if they think a Caltech graduate should make $82,126 when they still make $74,000? What is more interesting to me would be how much money that graduate spent on school to get there.
> who cares if they think a Caltech graduate should make $82,126 when they still make $74,000
Mostly someone who wants to measure the effectiveness of their undergraduate curriculum. As a student, there's a strong case to be made for ignoring this data and seeking the most selective alma mater possible. But if the law surrounding Griggs v. Duke Power ever gets overturned and IQ / SAT scores are allowed in screening job applicants, you definitely want to know how much of a school's function is the actual learning and placing bit.
I wonder how Rice's presence in Houston affected their expected earnings for Rice students. I could easily see Houston really throwing off their calculations if they weren't very careful. Houston's the fourth largest city in the US, but isn't anywhere near as expensive as its peer cities. Similarly, outside of a few select industries, Houston salaries are lower than salaries in similar cities as well.
It would be very cool if you could sort by percentage differential between the over/under and expected earnings. I think sorting by that parameter would produce some really interesting results.
Agreed. Ranking by raw dollar values means that schools in very expensive areas (or that feed into very expensive areas) will have exaggerated scores and end up either really highly ranked or really low.
Another interesting study would be to see whether a child's college attendance correlates with a better or worse standard of living than their parents'. Maybe look at the father's salary at age 30 and the child's salary at age 30, and see if the attended college is significant. The question being: if you're born rich, you tend to stay rich, and if you're born poor, you tend to stay poor. Does college choice matter?
My university asked for my earnings only one time. The categories maxed out at 70k+. They might be higher on that list if they took a more precise survey.
The data used here does not come from the universities, but from the government. Specifically, from matching federal student loan applications (from the Department of Education) to tax returns (from the IRS).
The data makes more sense if you sort by expected earnings or median earnings and then compare within localized clusters of universities. For example, MIT and Caltech have similar expected earnings, but MIT wins on median earnings.
If a college has a huge communications program and a smaller engineering program, it would be ranked lower because of pay disparities. This methodology is simply inaccurate.
I tried to look up several four-year U.S. institutions just now, including one major research university, and none were in the list.
From the article, apparently they excluded universities when the Scorecard dataset was missing one or more factors; it would be useful if the tool at least showed the university anyway as a placeholder and indicated what data was missing.