New Yorker Writer Malcolm Gladwell Trashes Rankings

by John A. Byrne

New Yorker staff writer Malcolm Gladwell makes no bones about it: he hates college rankings, maybe even more than business school deans do.

Malcolm Gladwell does not like college rankings. In fact, he abhors them.

The bestselling author turns his attention to rankings in the latest issue of The New Yorker. The result of his foray into this highly controversial topic: a scathing six-page critique of the value and intellectual honesty of all university ratings. Gladwell, known for bringing subtlety and finesse to his subjects, takes a sledgehammer to U.S. News in particular and to rankings in general. “Who comes out on top, in any ranking system, is really about who is doing the ranking,” he concludes.

It’s something we at Poets&Quants have been saying since we launched six months ago (see “Turning the Tables: Ranking the Rankings”). And as far as B-school rankings go, U.S. News is far more transparent than most, and it measures factors more directly related to program quality than the rankings of either The Financial Times or The Economist. U.S. News, for example, uses average GMAT scores and GPAs to assess the quality of incoming students, and post-MBA compensation and employment data to assess the opportunities an MBA program affords its graduates.

Writing in the Feb. 14 & 21 issue of The New Yorker under the headline “The Order of Things: What College Rankings Really Tell Us,” Gladwell takes a deep dive into the college rankings published by U.S. News & World Report. He doesn’t specifically address business school rankings by U.S. News or any other media outlet, but his conclusions apply just as well to the B-school ranking game as to U.S. News’ general college rankings.

His major conclusions:

  • Determinations of ‘quality’ turn on relatively arbitrary judgments about how much different variables should be weighted.
  • Asking deans and MBA program directors to rate other schools is less a measure of a school’s reputation than it is a collection of prejudices partly based on the self-fulfilling prophecy of U.S. News’ own rankings.
  • It’s extremely difficult to measure the variable you want to rank because of differences in how a specific metric can be compiled by different people at different schools in different countries.
  • Rankings turn out to be full of ‘implicit ideological choices’ by the editorial chef who cooks up the ranking’s methodology.
  • There is no right answer to how much weight a ranking system should give to any one variable.

“The first difficulty with rankings is that it can be surprisingly hard to measure the variable you want to rank,” writes Gladwell, “even in cases where that variable seems perfectly objective.”

The writer then trots out a ranking of suicides per hundred thousand people, by country, to prove his point. At the top of the list, with what would seem to be the highest suicide rate of any country, is Belarus, at 35.1 suicides per 100,000 people. At the bottom of this top-ten list is Sri Lanka, at 21.6 per 100,000.

“This list looks straightforward,” writes Gladwell. “Yet no self-respecting epidemiologist would look at it and conclude that Belarus has the worst suicide rate in the world. Measuring suicide is just too tricky. It requires someone to make a surmise about the intentions of the deceased at the time of death. In some cases, that’s easy. In most cases, there’s ambiguity, and different coroners and different cultures vary widely in the way they choose to interpret that ambiguity.”

‘U.S. NEWS RANKINGS SUFFER FROM A SERIOUS CASE OF THE SUICIDE PROBLEM.’

What does this have to do with college rankings? “The U.S. News rankings suffer from a serious case of the suicide problem,” believes Gladwell. “There’s no direct way to measure the quality of an institution—how well a college manages to inform, inspire, and challenge its students. So the U.S. News algorithm relies instead on proxies for quality—and the proxies for educational quality turn out to be flimsy at best.”

Consider the most important variable in the U.S. News methodology—accounting for 22.5% of the final score for a college: reputation. In U.S. News’ business school ranking, it’s the single most important metric as well—given a weight of 25% of the final ranking for a B-school (the next most important variable in the B-school survey is its highly flawed survey of corporate recruiters which accounts for 15% of the final ranking). This so-called “peer assessment score” comes from a survey of business school deans and directors of accredited master’s programs in business. They are asked to rate programs on a scale from “marginal” (1) to “outstanding” (5). A school’s score is the average of all the respondents who rate it. About 44% of the deans and directors surveyed by U.S. News responded to it in the fall of 2009.
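Mechanically, that peer assessment score is nothing more than an average of coarse 1-to-5 ratings. Here is a minimal sketch of the arithmetic, with hypothetical ratings rather than any school’s actual survey responses:

```python
# Sketch of the peer-assessment arithmetic described above: each responding
# dean or program director rates a school from 1 ("marginal") to 5
# ("outstanding"), and the school's score is the mean of those ratings.
# The ratings below are hypothetical, for illustration only.
ratings = [4, 5, 3, 4, 4, 5, 3]  # one entry per respondent

peer_assessment = sum(ratings) / len(ratings)
print(f"Peer assessment score: {peer_assessment:.1f}")  # 4.0
```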

  • http://poetsandquants.com/members/llavelle/ Louis Lavelle

    I read Gladwell’s piece the other night and was not impressed either, John, and I’m a big fan of his. The piece really didn’t break any new ground in terms of what critics of rankings have been saying for years. What I find most offensive about all the criticism is the implicit idea that educational institutions somehow defy ranking – that they differ so much from each other and are so infinitely complex that any ranking methodology that attempts to pierce the veil is doomed to failure. No ranking is perfect, and you can argue about how educational institutions should be ranked. But the idea that they can’t be ranked in any kind of legitimate way is ridiculous.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • http://poetsandquants.com/members/jbyrne/ John A. Byrne

    Thanks, Lou, for weighing in.

  • http://kaiserina.blogspot.com Katy Zei

    I wonder what Gladwell’s true motivations for writing the piece were. He doesn’t look old enough to have a daughter of college age who didn’t land her first choice, so it can’t be personal like that. I haven’t read the piece yet, but judging by the abstract it seems a bit more filler than killer.

    The MBA rankings are crucial in helping me make my decision for the right b-school fit, and in giving me information on b-schools that I wouldn’t normally have considered applying to. I don’t necessarily look at the ten top unis listed by my fave business mag and base my possibilities on them. I use the information in the rankings and make up my own mind what my top ten are.

    I’ve heard that a lot of MBA candidates are very competitive, rankings-oriented people, but I think that the people who want to do an MBA for the learning experience — and not just the pay raise — are more than able to read a table of data, and take rankings for what they are: a bit of a kludge, but useful nonetheless.

    I don’t think this article will have much impact in the MBA community, though it will doubtless spawn many a heated academic discussion.

  • mary tricious

    I have always thought business school deans are a bunch of rankers. Now I know it. Still, there is a statistic that would be interesting to know: how well correlated are the deans’ rankings? If the correlation is low, that is evidence it’s all arbitrary; if it’s high, it might be a self-fulfilling prophecy.

  • Art

    Gladwell is absolutely correct. These rankings are hogwash. However, since the public deems that these rankings (i.e., highly ranked schools) lead to success, then it must be so. Go with the name, stupid.

  • http://poetsandquants.com/members/msshona/ Rishona Campbell

    Personally, I also feel that college rankings are to be taken with a grain of salt, especially when there is no consideration of the cost of tuition in the methodology. However, Art is right; since the public (and so industry) believes in these ratings, that alone gives them validity. So what it boils down to is how much weight you are willing to give to your B-school’s ‘brand name’…and, more importantly, how much you are willing to pay for it.

  • http://poetsandquants.com/members/llavelle/ Louis Lavelle

    Interesting comment Rishona. What exactly is the argument for including the cost of tuition in a ranking? There are rankings that claim to tell you which schools deliver the most bang for the buck – those specifically created with cost in mind – and there is certainly a place for those. But most of the rankings we’re talking about here are, like BW’s, rankings of the “best b-schools.” Not the best inexpensive b-schools, just the best. In my view, that kind of determination has to be made, almost by definition, without regard for cost. Think of the specific schools that would be eliminated from BW’s list if your criterion were total tuition under $75,000. The No. 1 school would be Michigan State. Does that list – without Harvard, Wharton, Stanford, and all the rest – truly represent the best?

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • Spearhead

    I finally found utility in the rankings when I decided to make my own, based upon the metrics that are important to ME. I am going into consulting, and my immediate career objectives include landing a job with a top-3 management consulting firm. However, reputation, customer satisfaction, and employer satisfaction are also critical issues that I considered. Here is how I broke down MY rankings:

    School reputation (30%):
    I took the average ranking from USN over the past 10 years and the past 5 years, giving equal weight to each. My logic here is that USN measures the prestige of and perception surrounding each institution, and I think that brand is important in the long run.

    Customer Satisfaction (30%): I considered some data points from the BW rankings to compile this information; once again, the areas that are important to ME. They included MBA satisfaction, recruiter satisfaction, career center grade (being a career switcher myself, I found this particularly important), and general management skills grade.

    Management Consulting Strength (30%): I considered the percentage of the class that goes into MC, not only for 2010 but over the previous two years as well (e.g., Wharton sent 29% of its class into consulting last year). The second metric I used was the proportion of MBAs who had offers from the top three firms (McKinsey, Bain, BCG) divided by the total number of MBAs who went into consulting (e.g., of the 29% who went into consulting at Wharton, 43% got offers from the top three).

    Overall Rankings (10%):
    I took all five major rankings (which I didn’t have to compile myself, thanks to P&Q!).

    I don’t claim to be a statistician and I know my methods are not perfect, but going through this exercise has helped me settle which schools are best for ME. (A code sketch of this arithmetic appears just after this comment.)
    Anyone can do the same thing whether they plan on IB, non-profit, GM, etc. Depending on what you value more, your rankings will shift (e.g., do you value prestige of brand far more than strength within your specific field?).

    This has been really useful for me in deciding where to attend school. Hopefully it is of some use to others.

    Good Luck!
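For anyone who wants to reproduce this kind of do-it-yourself ranking, here is a minimal sketch of the weighted-score arithmetic Spearhead describes. The weights mirror the 30/30/30/10 split above, but every school name and category score below is a hypothetical placeholder; substitute your own numbers from the published rankings.

```python
# A minimal sketch of a do-it-yourself weighted ranking in the spirit of
# Spearhead's method. All school names and category scores are hypothetical
# placeholders -- plug in your own figures from the published rankings.

# Personal weights; they should sum to 1.0 (here, the 30/30/30/10 split above).
WEIGHTS = {
    "reputation": 0.30,
    "satisfaction": 0.30,
    "consulting_strength": 0.30,
    "overall_rankings": 0.10,
}

# Per-school category scores on a common 0-100 scale (hypothetical numbers).
schools = {
    "School A": {"reputation": 95, "satisfaction": 88,
                 "consulting_strength": 92, "overall_rankings": 90},
    "School B": {"reputation": 82, "satisfaction": 91,
                 "consulting_strength": 78, "overall_rankings": 80},
    "School C": {"reputation": 75, "satisfaction": 85,
                 "consulting_strength": 70, "overall_rankings": 72},
}

def composite(scores):
    """Weighted sum of one school's category scores."""
    return sum(WEIGHTS[category] * value for category, value in scores.items())

# Print the schools ranked by composite score, highest first.
ranked = sorted(schools, key=lambda name: composite(schools[name]), reverse=True)
for rank, name in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {composite(schools[name]):.1f}")
```

The point of the exercise is the one Spearhead makes: change the weights and the order changes, which is exactly Gladwell’s argument about every published ranking.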

  • http://poetsandquants.com/members/jbyrne/ John A. Byrne

    Spearhead,
    Very impressive. I like the fact that you invested the time and the energy to take available information and craft it to your needs. That is exactly what applicants should be doing when using the rankings. Good luck.

  • http://poetsandquants.com/members/msshona/ Rishona Campbell

    @Louis,

    I can totally understand that there are those who will want to attend the best schools regardless of the cost (you see that at the primary/secondary levels as well). But from my perspective, I would like to see some clear justification for one B-school charging $75K more than another. For sure, you have the same considerations to make when looking at MBA programs that you have with undergraduate programs, such as public vs. private, urban vs. rural, etc. However, in graduate school you are more limited in regard to financial aid, and you are making more of a commitment in that you can’t just transfer to another school if you run into some sort of financial difficulty.

    The term “ROI” is thrown around a lot when talking about MBA programs, but it really is a spectrum of results that applies differently to different people. For example, Spearhead did not include tuition as a factor at all, but I gave it about a 30% weight in my final decision on where to attend.

  • http://poetsandquants.com/members/llavelle/ Louis Lavelle

    Fair enough Rishona. I’m not saying cost shouldn’t be a factor in deciding where to go–obviously it is. Best of luck in your studies and your post-MBA career.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • MBA2013

    @Spearhead, could you share the resulting list? I’d love to see what your algorithm produced!

  • Spearhead

    @MBA2013
    No problem. Here are my findings. Unfortunately, I didn’t compile a complete list because at this point I have already submitted my applications and am trying to narrow my six schools down to one final decision, as well as identify my 1st choice, 2nd choice, etc. Here is my list, based on the schools I applied to (Wharton, Tuck, Michigan, Darden, Duke, Cornell):

    1. Wharton 100
    2. Tuck 88
    3. Michigan 85
    4. Darden 84
    5. Duke 79
    6. Cornell 75

    I assigned a score to each one with my top school (Wharton) as the 100% reference point. The actual numbers don’t really mean anything except to help me see how the schools stack up relative to one another. (The rescaling is sketched in code just after this comment.)

    Hope this is useful for you in creating your own rankings. Good Luck!
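The rescaling Spearhead describes, pinning the top composite score at 100 and expressing the rest relative to it, is a one-line transformation. A minimal sketch, again with hypothetical composite scores rather than Spearhead’s actual data:

```python
# Normalize composite scores so the top school reads 100 and every other
# school is expressed relative to it. The inputs are hypothetical.
composites = {"School A": 91.5, "School B": 83.3, "School C": 76.2}

top = max(composites.values())
normalized = {name: round(100 * score / top) for name, score in composites.items()}
print(normalized)  # {'School A': 100, 'School B': 91, 'School C': 83}
```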

  • http://poetsandquants.com/members/redpoet/ Red Poet

    Agree with Rishona. And I would add: a big name on my degree won’t help me if I’m struggling to cover $200k in total educational debt while earning a third or half of that debt load in salary. That amount of debt cries golden shackles; no thanks. I have no interest in being a Wall Street banker or working for a big consulting firm.

    My plan right now is to consider in which cities or regions I’d like to spend the next decade or so (largely based on employment prospects, cost of living, and quality of life), and to aim for the best MBA in that region that I can afford. I wish I had the financial option to go to, say, Tuck or Kellogg, assuming I could achieve the GMAT score needed to even bring those schools within the realm of possibility; realistically, though, I’m not at all sure that I do. (The schools will of course say, “Look, an MBA is an investment; you have to decide whether it’s worth this price.” Well, guess what I’ve decided, in the absence of a better deal …)

    I also like Spearhead’s methodology and am planning to use a similar tool to help narrow down my final choices.

  • Mick

    Spearhead:
    I did a similar thing myself last year and had similar results:
    1. Tuck
    2. Darden
    3. Duke
    4. Cornell

    Did you apply?
    Regards

  • Arthur Dullsworthy

    Gladwell’s point is that rankings are pernicious in their overall effects and without any firm intellectual justification. The latter claim relies on 1) the possibility of creating a different ranking based on different criteria and weights; and 2) Gladwell’s own experience growing up in a country that was then immune to the rankings fetish. It’s understandable that Mr. Lavelle has a vested interest. Gladwell doesn’t.

  • http://www.bestplaces.net Bert Sperling

    I was very interested in Mr. Gladwell’s take on this subject of ranking and rating, since this has been my professional focus for the last 30 years or so. (For a good overview, see the New York Times article titled simply “The Guy Who Picks the Best Places to Live.”)

    So I was disappointed in Mr. Gladwell’s big revelation that rankings could reflect the bias of the author. Disappointed – because this notion is so obvious and basic. Everything is a reflection of its creator, whether it’s a piece of art or a piece of scientific research. Even a mathematical formula is a collection of choices, and subject to opinion.

    I could pick apart many of the examples Mr. Gladwell provides in his article. For example, he states that a list of the top ten countries by suicide rates is meaningless and useless, due to possible inconsistencies in data collection. Well, I suppose that means a social researcher should never perform an analysis or make any hypothesis, because the data may be less than perfect. (and I can assure you, it probably is.)

    The important thing to take away is that rankings are just a tool, not the ultimate answer. They are a tool for us to dig deeper and learn more, not an end in themselves. To be disillusioned that rankings are not perfect, and have flaws and biases, is naive and disappointing.

  • young_

    Gladwell’s most important point, one that has been made in the blogosphere for a while, is that the reputation scores are compiled in a completely ridiculous manner, one that renders the final ranking numbers almost meaningless given the weight accorded to those scores.

  • N-Sox

    Hi all

    First things first, let me say that I did take the various rankings of B-schools, especially that of Poets&Quants, very seriously in choosing which schools to apply to, so I personally find value in them. In fact, I was accepted to all three schools I applied to. Having said that, having read the article and your responses, I have a few observations I’d like to make.

    1) Having written a number of research projects that required statistical analysis and ranking, I tend to agree with Gladwell’s assertion that it’s difficult for a ranking system to be both heterogeneous and comprehensive. One needs to pick one or the other, make it clear why one took that stance, and get on with it.

    2) I would like to see rankings that try to rank B-schools that are similar from the outset; in other words, rankings of only private institutions, only public institutions, only well-endowed (excuse the pun) institutions, etc. I believe that when you compare like with like, the differences in the institutions’ perceived and real qualities will shine through much more strongly, and thus properly differentiate the offerings of each. When you lump together institutions with very different characteristics, comparisons tend to be less meaningful. Basically, if I understand Gladwell’s analogy, most school ranking systems are akin to ranking sports cars, SUVs, and sedans all in one list. In fact, CAR magazine in South Africa (my home country) ranks cars by segment (i.e., budget car, sedan, hot hatch, etc.).

    3) I agree with Louis Lavelle that nothing here is ground-breaking, but I’m not sure any of us can claim that Gladwell wrote the article with the intention of providing ground-breaking criticism. My sense is that he is sharing how he reasons about the flaws inherent in ranking systems. Personally, I had never really gotten my head around those flaws until I read Poets&Quants and Gladwell’s article. Some things, like data-collection inconsistencies, may appear to be an unfair criticism, as expressed by @Bert Sperling, but I don’t think Gladwell wanted to take it that far, as opposed to simply pointing out something that may seem obvious to some but not to others.

    4) I do disagree with the notion that Gladwell believes schools “can’t be ranked in any kind of legitimate way.” He does say, in relation to Car and Driver: “The magazine’s ambition to create a comprehensive ranking system – one that considered cars along twenty-one variables, each weighted according to a secret sauce cooked up by the editors – would also be fine, as long as the cars being compared were truly similar.” This alone is an admission that rankings can work, as long as there is an “apples to apples” comparison. He goes further and says that the same “sauce” should be altered in terms of the weight of the variables, depending on the type of car being compared. So if U.S. News uses seven variables, then perhaps the weightings of those variables should change depending on whether one is looking at private vs. public institutions.

    5) For me, what does come out rather clearly throughout his article is the notion of cost; he mentions it in both the car and school examples. I think this is his own ideological bias, which is fine, but he obviously needs to accept that his ideology cannot necessarily be that of others. To some degree, I do think he is aware of this.

    6) @Spearhead & @Mick, I’m very impressed with your initiative in creating your own ranking systems, very innovative indeed!

    I’ve never posted on an article before, so please accept my apology for such a lengthy post!

  • Eduardo

    @Spearhead

    I, too, am interested in using the MBA to transition to management consulting. How did you identify the proportion of MBAs with offers from the top consulting firms?

    Thanks.

  • Sunny

    I was looking for an article that compared the various rankings – US News, Business Week, Financial Times – and explored the merits of each (or lack thereof), and I didn’t find it; I found this instead. This article is interesting in that it questions the efficacy of this annual rite – always a good thing – whatever we choose to do with the information.

    What is really intriguing, though, is the way the schools themselves use these rankings, to which they have presumably contributed data, to spin their message. For example, the University of California, Irvine’s MBA program jumped 15 spots to 36 in U.S. News’ 2010 rankings, dropped back to 40 in 2011 (also by U.S. News), and was ranked 27th nationally and 53rd internationally by the Financial Times in 2011. Of course, their promo for 2010 stated that they rose to 36. Their hook for 2011 touts the 27 ranking, which makes it look as if they jumped up again, never mind that they actually fell back to 40 and that the 27 comes from a totally different publication. Really? Do they all do this?

  • http://poetsandquants.com/members/jbyrne/ John A. Byrne

    Sunny,

    Here’s the article you were looking for in the first place: Ranking the Rankings.
    http://poetsandquants.com/2010/06/28/turning-the-tables-ranking-the-mba-rankings/

  • http://www.simplesearch.com/ Architect

    Depending on what a student wants to study, there are many factors that are completely irrelevant. So wouldn’t it be more useful to have a website in which the user selects and weights each factor to create a personalized college ranking?

  • Kumar

    Couldn’t agree more! Rankings are indeed a lot of hogwash. I personally know someone who went to IE Business School in Spain based on rankings, only to realize how shallow the school and its placement cell really were. He discontinued and returned to attend an Ivy League U.S. school. So beware of all these second-tier non-U.S. schools that show up highly in rankings.

  • George

    Spearhead, I’m impressed by the methodology you employed. For those who don’t want to get into the same level of detail, a simpler alternative would be to approach the firms you think you would like to work for post-graduation and ask them which B-schools they recruit from.

    If they recruit from a particular school, you are in the game. This opens the door. It’s then down to how you impress them at the application and interview stage. They will be judging YOU against all of the other individuals they’re interviewing that cycle. They won’t be comparing your school badges. Your school brand may open the door of opportunity, but whether or not you ultimately clinch the dream job is in your own hands.
