Well shoot. We have literally been intending to talk about this for 1-1/2 years.
We had posted a question for all y’all way back in November 2017, looking for input on why Duke was ranked at a certain spot on the BusinessWeek list of best business schools. We got just one response at the time (hi rwelch9439!), and then another BSer popped in and gave additional insights a couple weeks later (hey buffalo!), and we had planned at the time to do a follow-up post of our own, to mostly agree with what was already said and hopefully continue the conversation about rankings.
But that didn’t happen.
Round 2 2017 took over. And then summer 2018. And then Round 1 2018. And, well, you know. Life.
But rankings are really something we want to discuss.
We would tweet stuff like:
BW ranked @NYUStern P/T MBA #35, after U of Louisville? (U of who??)
Here’s a question about the methodology used to gather some of the US News data
And we would remember the many posts in the ‘snarchives where we called out different publications, like this long-ago post about BusinessWeek’s data being unreliable.
And we would privately be scoffing and throwing things and rolling eyes and all sorts of aghastedness when we saw that School X was ranked higher than School Y on such-and-such a listing.
We really wanted to do the deep dive for you, to demystify rankings and help you understand them, so that you’re a more informed consumer of them as you go about your school research tasks.
(Because in case you’re not already aware, EssaySnark is not so much a fan of these rankings thingies.)
We planned to go into detail on points like this:
One factor in The Economist’s ranking is the percentage of students finding jobs through the school’s career services. What are the implications when a school tailors its admissions practices in order to score better on a ranking methodology? Does it discourage schools from admitting family-business or sponsored students? What if your goals are off the mainstream, so that you’ll by definition be on your own for recruiting? In some respects that could be seen as a positive factor in a profile, conveying to the adcom that you’re self-sufficient and ready to carve your own path in life. But could it also be a headwind against you, if a school cares about maximizing its points under how this publication scores them? The Economist also used years of pre-MBA work experience as part of its “student quality” measure, which is really tough to get your head around. Why is years of experience a quality factor? Not getting it.
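To make that incentive effect concrete, here’s a minimal sketch of how a weighted composite score could reward a school for optimizing one metric. The factor names, weights, and numbers below are all invented for illustration — they are not The Economist’s actual formula.

```python
# Toy sketch of a weighted composite ranking score. The factor names and
# weights below are INVENTED for illustration; they are not The
# Economist's real methodology.

def composite_score(school: dict, weights: dict) -> float:
    """Weighted sum of normalized factor values (each assumed in 0..1)."""
    return sum(weights[factor] * school[factor] for factor in weights)

# Hypothetical weights: career-services placement counts heavily here.
weights = {
    "pct_jobs_via_career_services": 0.40,
    "pre_mba_experience_normalized": 0.30,
    "other_factors": 0.30,
}

# School B admits more sponsored and family-business students, who by
# definition recruit (or return to their jobs) outside career services.
school_a = {"pct_jobs_via_career_services": 0.90,
            "pre_mba_experience_normalized": 0.60,
            "other_factors": 0.70}
school_b = {"pct_jobs_via_career_services": 0.60,
            "pre_mba_experience_normalized": 0.80,
            "other_factors": 0.70}

print(round(composite_score(school_a, weights), 2))  # 0.75
print(round(composite_score(school_b, weights), 2))  # 0.69
```

Under these made-up weights, the school whose graduates mostly recruit through career services comes out ahead even though the two schools are otherwise comparable — exactly the kind of distortion that can nudge admissions policy toward what the formula rewards.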
These methodologies are obviously a very complex topic, and rather than continue to punt it down the road into some never-to-arrive future slot in our editorial calendar, we’re going to offer a very condensed version today, consisting mostly of links to a few other resources that we encourage you to explore.
Then if you want to come back with your discussion points or questions or observations or theories of how a particular publication’s rankings methodology might have unintended consequences in the business school marketplace, we’re all ears.
You can start here: the GMAT people break down the rankings with a useful interactive chart. The original link (http://www.mba.com/us/plan-for-business-school/choose-a-school/evaluate-schools/users-guide-to-mba-rankings) is now dead, but it’s thankfully preserved on the Wayback Machine.
Or for a very simple overview, here’s a description of how to use the rankings from a now-defunct admissions directors’ blog targeted to military candidates.
More recently, there’s a cheat sheet to the rankings from Darden.
Obviously, whenever a school is touting its appearance on a particular ranking, that’s part of its marketing and promotional effort. You know that the schools study this stuff closely. Rankings matter to ALL of them, and sometimes a school will bring in a new dean with an explicit mandate to improve its standing. It’s a rigged system: not rigged by any one party, but overall, not serving any particular constituent to any true degree of value. The only entities that profit from rankings are the publications that produce them, because rankings drive sales of their guidebooks and traffic to their sites. That’s it. The consumer (you, the student, the supposed audience for all this) does not benefit, except at the very gross level of drawing a line in the sand between schools like Michigan Ross and “U of Who?” Many schools are catering to their own colleagues, rather than to students, in some of their policies and promotions.
The fact that some publications use the opinions of other deans as a ranking factor is just ridiculous. How could a dean at one school know what’s going on at another school? How could Dean Wharton evaluate Dean Stanford? Sure, they’re professional peers, they likely interact in a variety of professional settings, and hopefully they’re following what their competition is doing in the marketplace, in terms of big new programs or initiatives being launched, but don’t hold your breath. Most deans have trouble staying on top of what’s going on at their own school, much less following the changes at the five-ish schools they consider to be direct competitors, let alone the 15-ish schools that are broadly considered to be “the best.” (Pro Tip: If you ever hear an admissions person claim that their school is “the only one who” has such-and-such a program, they’re almost always wrong.) How can a ranking put ANY weight on what deans think of the other schools? It’s all just who’s got a puffed-out chest and is making noise and attracting attention. A school like NYU, which in the recent past has had more low-key deans who don’t go out there strutting their stuff, yet who are very busy making all sorts of improvements to their programs, expanding to new geographies, and focusing on their culture: those schools don’t get noticed. They don’t have the “reputation,” yet these are incredible schools that offer a high-value, high-impact educational experience. But you wouldn’t know it based on many of these rankings systems.
So, the main message today is, caveat emptor as far as rankings go. When a school is touting how they are #1 on this scale or #3 on that one, take it with a grain of salt. There are now so many different rankings and categories of rankings and subrankings that every good school is bound to end up in a respectable position on at least one of them. Heck, EssaySnark even posted our own rankings before!! Based on how easy it is to get into one school versus another. If you’re really focused on rankings, seems like that would be the most important one of all, no?