Ranking Business School Reputations


“It’s not about aptitude,

 It’s the way you’re viewed.

 So it’s very shrewd to be,

 Very, very popular…like me!”

– Galinda Upland (“Wicked”)

Sometimes, U.S. News’ business school rankings seem like a popularity contest. Sure, the index score awarded to each MBA program integrates empirical data such as starting pay, job placements, GMATs, and GPAs. Even so, the single biggest weight – 25% – is given over to “peer assessments.”

So what exactly are these assessments? Business school deans and MBA program directors assess each other’s programs. Using a five-point scale, where one is marginal and five is outstanding, the deans and directors essentially hold the fates of their fellow business schools in their hands.


You can probably identify the flaws here: Administrators often possess only surface knowledge of the academic curriculum, students, support, and initiatives at other schools. While these administrators can opt out of scoring particular schools, that simply gives greater weight to a smaller swath of responses (only 42% of those surveyed respond to U.S. News already). And with survey respondents evaluating their competitors, there is always the possibility of mischief. Factor in recruiter surveys, which account for another 15% of the formula’s weight, and 40% of U.S. News’ rankings are based on subjective data.

In other words, U.S. News’ peer assessment score may be less about gauging the overall quality of an MBA experience and more about measuring a school’s brand. Think of it this way: Why do football teams get so pumped to play Notre Dame? It isn’t necessarily about the program, which hasn’t won a national championship since the Reagan years. When it comes to the Fighting Irish, teams are playing more than the 11 guys on the field. They’re duking it out with the lore of Knute Rockne, the Gipper, and Touchdown Jesus. Even in victory, opponents only elevate themselves slightly. In the end, people remember gold helmets, Cotton Bowl comebacks, and the “Victory March.” Notre Dame football is a brand – one that carries a century’s worth of associations with it. Eventually, that brand drowns out whatever happens in the present.

The same is true of business schools. For most, Harvard, Stanford and Wharton are the gold standard (and deservedly so). And that makes it difficult for other schools to get attention. Take Booth, for example. At the beginning of the decade, Booth was ranked #9 by U.S. News. This year, U.S. News lists it at #4. What’s more, Booth is placed at #1 by The Economist and Bloomberg BusinessWeek, #2 by Forbes, and #3 by Poets&Quants. It also boasts a higher employment rate than Harvard and Stanford.


In the academic and business communities, Booth enjoys a sterling reputation, no doubt. But ask yourself this: If you asked these same people to list the top three programs, how many would substitute Booth for Wharton? Chances are, the majority would exclude it. And that brings up another question: Is Booth excluded on merit…or perception (or what its grads might call a school’s brand – and all the popular images that go with it)?

Earlier this year, Poets&Quants published “When MBA Rankings Lag The Facts,” which applied a formula developed by U.S. News to identify business schools that were “underperformers” and “overperformers.” In other words: Which schools’ rankings matched their reputations among academics? Here’s how it worked: U.S. News ranks schools by their average peer assessment score. From there, it subtracts a school’s overall rank from its peer assessment rank. A positive number indicates that a school is an “overperformer,” while a negative number reflects an “underperformer.”

What’s the difference? An overperforming school has a higher overall rank than academic rank. In the words of Bob Morse, the director of research at U.S. News, an overperforming school has a “reputation among its academic peers [that] has not kept pace with what it has achieved in the underlying academic indicators.” Conversely, underperformers have higher academic ranks than overall ranks, which means that their rankings may be artificially inflated by peer assessments.


In other words, underperformers tend to be popular schools whose brand and reputation may be masking underlying flaws in areas like academic inputs and career outputs.
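The rank arithmetic behind the “overperformer/underperformer” label is simple enough to sketch. The snippet below illustrates the subtraction described above; the school names and ranks are hypothetical placeholders, not actual U.S. News figures:

```python
def performance_delta(overall_rank, peer_rank):
    """Subtract a school's overall rank from its peer assessment rank.

    Positive -> "overperformer": its overall rank is better (numerically
    lower) than its reputation rank, so peer perception lags its results.
    Negative -> "underperformer": peer perception outpaces its results.
    """
    return peer_rank - overall_rank

# Hypothetical example schools -- ranks are illustrative only.
schools = {
    "School A": {"overall": 4, "peer": 7},
    "School B": {"overall": 14, "peer": 9},
}

for name, ranks in schools.items():
    delta = performance_delta(ranks["overall"], ranks["peer"])
    if delta > 0:
        label = "overperformer"
    elif delta < 0:
        label = "underperformer"
    else:
        label = "in line with reputation"
    print(f"{name}: {label} ({delta:+d})")
```

Under this scheme, a school ranked #4 overall but #7 by its peers scores +3 (an overperformer), while one ranked #14 overall but #9 by peers scores -5 (an underperformer).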

Based on a comparison of 2014 and 2015 peer assessment scores, one theme is clearly evident: It is very difficult for schools to shake their peers’ preconceived notions. In fact, 59 of the top 103 schools received the same peer assessment score as last year (with three schools’ scores from the previous year being unavailable). Meanwhile, 17 schools gained just a tenth of a point (+0.1), while 18 lost a tenth of a point (-0.1). In other words, only six schools saw more than a slight change in their peer assessment scores: Emory University (Goizueta) (+0.2), George Washington University (-0.2), Thunderbird (-0.5), Claremont (Drucker) (-0.2), the University of Cincinnati (Lindner) (+0.2), and the University of Texas-Dallas (Jindal) (-0.2). This stagnant scoring, coupled with the 25% weight given to peer assessments, makes it more difficult for some programs, particularly lesser-known ones, to bolster their rankings.

Top 20: Booth Punished for Higher Performance

Looking for statistical flukes? Look no further than the top 20 schools when it comes to underperformers. Here, the U.S. News formula effectively knocks down schools for improving their overall rankings. So which schools got the proverbial short end of the stick? Check out the accompanying table for the results.

(Check out the next page for the Top 20 rankings)

  • I also wonder if Tuck’s approach to an MBA program (ONLY having a full-time two year program) hurts its image among peers. Tuck has one of the smallest MBA footprints with only ~270 total MBA grads a year.

  • Brain2

    Replaced would be a better word than juxtaposed. I’m not sure you actually juxtaposed anything.

  • Kyle

Fordham is a perpetual underperformer; given its location, that’s pretty telling.

  • 2cents

I don’t actually know a whole lot about the programs (so I’m hoping someone can clarify), but I’ve lived/worked on both coasts and internationally and don’t really understand why Berkeley ranks higher than Dartmouth and Columbia on peer reviews or in general. I took a look through the P&Q summary and it looks like the difference is the peer assessment (and for Tuck, recruiters knock them). Pretty comparable otherwise, except Tuck/CBS appear to feed more to consulting/finance (higher salary) v. tech (slightly lower). Is that the difference? Anyway, it’s nit-picking; all three are great programs, I’m sure.

  • Prateek

What are the prospects of the Alfred Lerner College of Business and Economics, University of Delaware, which is ranked at 80 in U.S. News? Poets&Quants never talks about this MBA program.
It has quite a good reputation for its business programs.

  • Jeff-Schmitt

    You are correct. I accidentally juxtaposed Ross with Fuqua (which dropped from 11 to 14). I have made the correction. Thank you for bringing this to our attention. I appreciate it. And good luck with your application! – Jeff

  • 2015Applicant

    “However, the University of Michigan (Ross) … dropped in the overall rankings, though both posted modest gains in the average peer assessment score.”

    John – didn’t Michigan Ross move up in the rankings (14 to 11) as well as see an improvement in its assessment score? Just want to make sure I am looking at the table correctly.