New Yorker Writer Malcolm Gladwell Trashes Rankings

by John A. Byrne

New Yorker staff writer Malcolm Gladwell makes no bones about it: he hates college rankings--maybe even more than business school deans.


Malcolm Gladwell does not like college rankings. In fact, he abhors them.

The bestselling author and New Yorker staff writer turns his attention to the subject in the latest edition of the magazine. The result of his exploration into this highly controversial topic: a scathing six-page critique of the value and intellectual honesty of all ratings of universities. Gladwell, known for subtlety and finesse, instead takes a sledgehammer to U.S. News in particular and to rankings in general. "Who comes out on top, in any ranking system, is really about who is doing the ranking," he concludes.

It’s something we at Poets&Quants have been saying ever since we launched six months ago (see “Turning the Tables: Ranking the Rankings”). And as far as B-school rankings go, U.S. News is far more transparent than most and it measures factors that are more directly related to program quality than rankings by either The Financial Times or The Economist. U.S. News, for example, uses average GMAT and GPA scores to assess the quality of incoming students and post-MBA compensation and employment data to assess the opportunities an MBA program affords its graduates.

Writing in the Feb. 14 & 21 issue of The New Yorker under the headline "The Order of Things: What College Rankings Really Tell Us," Gladwell takes a deep dive into the college rankings published by U.S. News & World Report. He doesn't specifically address business school rankings by U.S. News or any other media outlet. But his conclusions are just as applicable to the B-school ranking game as they are to U.S. News' general college rankings.

His major conclusions:

  • Determinations of ‘quality’ turn on relatively arbitrary judgments about how much different variables should be weighted.
  • Asking deans and MBA program directors to rate other schools is less a measure of a school’s reputation than it is a collection of prejudices partly based on the self-fulfilling prophecy of U.S. News’ own rankings.
  • It’s extremely difficult to measure the variable you want to rank because of differences in how a specific metric can be compiled by different people at different schools in different countries.
  • Rankings turn out to be full of ‘implicit ideological choices’ by the editorial chef who cooks up the ranking’s methodology.
  • There is no right answer to how much weight a ranking system should give to any one variable.

“The first difficulty with rankings is that it can be surprisingly hard to measure the variable you want to rank,” writes Gladwell, “even in cases where that variable seems perfectly objective.”

The writer then trots out a ranking of suicide rates per hundred thousand people, by country, to prove his point. At the top of the list, with what would seem the highest suicide rate of any country, is Belarus, with 35.1 suicides per 100,000 people. At the bottom of this top ten list is Sri Lanka, with 21.6 suicides per 100,000 people.

“This list looks straightforward,” believes Gladwell. “Yet no self-respecting epidemiologist would look at it and conclude that Belarus has the worst suicide rate in the world. Measuring suicide is just too tricky. It requires someone to make a surmise about the intentions of the deceased at the time of death. In some cases, that’s easy. In most cases, there’s ambiguity, and different coroners and different cultures vary widely in the way they choose to interpret that ambiguity.”


What does this have to do with college rankings? “The U.S. News rankings suffer from a serious case of the suicide problem,” believes Gladwell. “There’s no direct way to measure the quality of an institution—how well a college manages to inform, inspire, and challenge its students. So the U.S. News algorithm relies instead on proxies for quality—and the proxies for educational quality turn out to be flimsy at best.”

Consider the most important variable in the U.S. News methodology—accounting for 22.5% of the final score for a college: reputation. In U.S. News’ business school ranking, it’s the single most important metric as well—given a weight of 25% of the final ranking for a B-school (the next most important variable in the B-school survey is its highly flawed survey of corporate recruiters which accounts for 15% of the final ranking). This so-called “peer assessment score” comes from a survey of business school deans and directors of accredited master’s programs in business. They are asked to rate programs on a scale from “marginal” (1) to “outstanding” (5). A school’s score is the average of all the respondents who rate it. About 44% of the deans and directors surveyed by U.S. News responded to it in the fall of 2009.
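Gladwell's point about arbitrary weights can be made concrete with a toy calculation. In the sketch below, all school names, metric scores, and the non-reputation weights are invented for illustration (only the idea of a reputation-heavy composite echoes U.S. News' approach); it shows how the same schools can swap places when nothing changes but the editor's choice of weights:

```python
# Hypothetical illustration (not U.S. News' actual data or methodology):
# the same schools, the same metrics, two different weightings.

# Each school's metrics, already normalized to a 0-100 scale (invented numbers).
schools = {
    "School A": {"reputation": 90, "selectivity": 70, "placement": 85},
    "School B": {"reputation": 75, "selectivity": 95, "placement": 80},
}

def composite(metrics, weights):
    """Weighted average of normalized metrics; weights must sum to 1."""
    return sum(metrics[name] * w for name, w in weights.items())

# Weighting 1: reputation-heavy (loosely in the spirit of making the
# peer-assessment score the single biggest factor).
w1 = {"reputation": 0.50, "selectivity": 0.25, "placement": 0.25}

# Weighting 2: selectivity-heavy, an equally defensible editorial choice.
w2 = {"reputation": 0.20, "selectivity": 0.55, "placement": 0.25}

rank1 = sorted(schools, key=lambda s: composite(schools[s], w1), reverse=True)
rank2 = sorted(schools, key=lambda s: composite(schools[s], w2), reverse=True)

print(rank1)  # ['School A', 'School B']
print(rank2)  # ['School B', 'School A']
```

Neither weighting is "wrong," which is precisely Gladwell's complaint: the winner is determined by whoever picks the weights.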

  • Sunny

    I was looking for an article that compared the various rankings – US News, Business Week, Financial Times – and explored the merits of each (or lack thereof) and I didn’t find it – I found this instead. This article is interesting in that it questions the efficacy of this annual rite – always a good thing – whatever we choose to do with the information.

    What is really intriguing, though, is the way the schools themselves use these rankings, presumably the very rankings they help generate, to spin their message. For example, the University of California Irvine's MBA program jumped 15 spots in 2010 to 36 in US News' rankings, dropped back to 40 in 2011 (also by US News), and was ranked 27th nationally and 53rd internationally by the Financial Times in 2011. Of course, their promo for 2010 stated that they rose to 36. Their hook for 2011 touts the 27 ranking, which makes it look like they jumped up again, never mind that they actually fell back to 40 and the 27 comes from a totally different publication. Really? Do they all do this?

  • John A. Byrne


    Here’s the article you were looking for in the first place: Ranking the Rankings.

  • Architect

    Depending on what a student wants to study, there are many factors that are completely irrelevant. So wouldn't it be more useful to have a website in which the user selects and weights each factor to create a personalized college ranking?

  • Kumar

    Couldn’t agree more! Rankings are indeed a lot of hogwash. I personally know someone who went to IE Business School in Spain based on rankings, only to realize how shallow the school and its placement cell really were. He discontinued and returned to attend an Ivy League U.S. school. So beware of all these second-tier non-US schools that show up highly in rankings.

  • George

    Spearhead, I’m impressed by the methodology that you employed. For those who don’t want to get into the same level of detail, a simpler alternative would be to approach the firms you think you would like to work for post-graduation, and ask them which b-schools they recruit from.

    If they recruit from a particular school, you are in the game. This opens the door. It’s then down to you, how you impress them at application and interview stage. They will be judging YOU against all of the other individuals that they’re interviewing that cycle. They won’t be comparing your school badges. Your school brand may open the door of opportunity, but whether or not you are ultimately successful in clinching the dream job is in your own hands.
