Did Businessweek Botch Its Latest MBA Ranking?

To construct an MBA ranking, Bloomberg Businessweek editors collect thousands of data points and crank out a multitude of calculations. The opportunity to make a computational mistake is high, particularly if the people overseeing this much-scrutinized annual accounting have little to no background in statistics. A sophisticated analysis of Businessweek's newest 2021-2022 MBA ranking suggests that the magazine made a big boo-boo on its recently published list.

The analysis, carried out by Yale School of Management Deputy Dean Anjani Jain, uncovered major errors in the ranking. Using Businessweek’s reported scores for each school, he found that applying the stated weights to the five metrics used by the magazine would lead to results that are “egregiously off-kilter when compared to the published ranking” (see his full analysis here). And he calculated that the only way to replicate the ranks published by Businessweek is to apply a very different set of weights to the five metrics.

In fact, the recalculation by Jain would change the positions of 23 of the Top 25 business schools, and in many cases the changes are so dramatic as to fuel suspicions that Businessweek’s editors may have felt the need to alter the results to preserve some sense of credibility. Applying the stated weights to the scores published by Businessweek causes MIT’s Sloan School of Management to fall uncharacteristically to 21st from seventh; the Jindal School of Management at the University of Texas at Dallas to oddly skyrocket into the Top Ten at ninth, a rise of 22 places; and Emory University’s Goizueta Business School to achieve its highest rank ever in tenth place, eight spots better than its published rank of 18th.


Jain, an expert in optimization algorithms and the analysis of combinatorial problems, does not believe the screw-up was intentional; rather, it could have resulted from grave errors in applying the ranking’s stated methodology. Bloomberg Businessweek says that its overall ranking is composed of five indexes that purport to measure five aspects of an MBA program: compensation, learning, networking, entrepreneurship and diversity. When one tries to replicate the magazine’s published results from its own reported data, however, those weights appear to have been incorrectly applied.

Instead of weighting compensation by 35.7% as Businessweek contends, the weights needed to replicate the 2021 ranking show that the compensation index was effectively given a weight of 58.5% (see table below). Though Businessweek maintains that it weighted “learning” by 25.8%, the true weighting of that component turned out to be only 7.7%. The impact on the overall ranking itself is considerable. Had Businessweek applied its stated weights, Wharton would have fallen to 28th, far below its already surprisingly low published rank of ninth. That is because the “learning” index is largely composed of student survey answers, and many of the MBA students surveyed by Businessweek were highly disappointed in the school’s response to the pandemic (see Revenge Of The COVID Cohort).
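The mechanics at issue are simple: each school's overall score is a weighted sum of its five index scores, so shifting weight from "learning" to compensation can reorder the table dramatically. The sketch below illustrates that sensitivity. Only the compensation and learning weights come from the article; the other three weights, the school names, and all index scores are made-up placeholders chosen so each weight set sums to 1.0.

```python
# Illustrative only: how two different weight sets applied to the SAME
# index scores can produce very different rankings. The compensation
# (35.7% vs. 58.5%) and learning (25.8% vs. 7.7%) weights are from the
# article; everything else here is hypothetical.

STATED = {"compensation": 0.357, "learning": 0.258,
          "networking": 0.155, "entrepreneurship": 0.115, "diversity": 0.115}
IMPLIED = {"compensation": 0.585, "learning": 0.077,
           "networking": 0.138, "entrepreneurship": 0.100, "diversity": 0.100}

# Hypothetical index scores (0-100) for three fictional schools.
scores = {
    "School A": {"compensation": 95, "learning": 40, "networking": 90,
                 "entrepreneurship": 85, "diversity": 70},
    "School B": {"compensation": 70, "learning": 98, "networking": 75,
                 "entrepreneurship": 80, "diversity": 88},
    "School C": {"compensation": 80, "learning": 80, "networking": 80,
                 "entrepreneurship": 80, "diversity": 80},
}

def rank(weights):
    # Overall score = weighted sum of the five index scores.
    overall = {s: sum(weights[k] * v for k, v in idx.items())
               for s, idx in scores.items()}
    return sorted(overall, key=overall.get, reverse=True)

print("Stated weights: ", rank(STATED))   # learning-strong School B leads
print("Implied weights:", rank(IMPLIED))  # compensation-strong School A leads
```

Under the stated weights the learning-heavy school wins; under the implied weights the compensation-heavy school vaults from last to first, which is the same kind of reshuffling Jain's recalculation produces.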

Jain poses a thoughtful explanation for the unusual rankings of both Wharton and MIT Sloan, two MBA programs that are widely considered among the best in the world. His perspective reinforces the belief that the differences between and among the schools are often so small as to be statistically meaningless, unless you add a metric with greater variation in the results. In this case, the variation appears to come from those student surveys as expressed in the “learning” index. In the past, Businessweek would smooth these results by combining them with previous student surveys. That was not the case this time, and the impact was severe. Businessweek did maintain that approach for its alumni surveys, however, with weights of 50%, 25%, and 25% on responses from the most recent survey wave to the oldest.
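The smoothing the article describes for the alumni surveys amounts to a three-wave weighted average. A minimal sketch, with made-up survey scores standing in for the real data:

```python
# Illustrative: blending three alumni-survey waves at 50/25/25, as the
# article says Businessweek did. The scores themselves are hypothetical.
waves = [72.0, 65.0, 80.0]        # most recent wave first
weights = [0.50, 0.25, 0.25]      # 50% latest, 25% each of the two prior
blended = sum(w * s for w, s in zip(weights, waves))
print(blended)  # 72.25 -- a single outlier year is damped
```

Without such smoothing, a single unhappy cohort (say, the pandemic-era class) hits the index at full force, which is Jain's point about the un-smoothed student-survey "learning" scores.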


“On the “learning” index of the ranking, Wharton and MIT are both in the bottom quartile among 84 US schools (Wharton at rank 78, MIT at 64—I note this without editorial commentary),” writes Jain. “On the other hand, UT Dallas (Jindal) has the highest “learning” score, ranking #1 on this index.  If “learning” did contribute 25.8% to the overall score, Wharton and MIT would need to have dominant scores on the other indexes (which they don’t) to rise to their high overall ranks.”

Of course, those counter-intuitive results on their own would raise serious credibility issues about the ranking. But there's more. Of the 84 U.S. business schools ranked by Businessweek, 79 of the MBA programs would be ranked differently, and 15 of them would experience double-digit changes from their published ranks. Duke University's Fuqua School of Business would plummet 14 places under the recalculation, from 15th to 29th. Cornell University's Johnson Graduate School of Business would fall 12 places, from 20th to 32nd. Ohio State's Fisher College of Business drops 14 spots to place 58th instead of 44th. Baylor University's Hankamer School of Business would jump 18 places to rank 39th, much better than its published rank of 57th.

In academic research, the ability to replicate results is critical to the credibility of both the authors and the findings that result from their research. When those results cannot be duplicated using the researchers' own methodology, it raises red flags about the validity of the study. Jain's analysis is a massive red flag, not unlike one of those over-sized American flags that fly in front of auto dealerships in the Midwest. He notes that "it is problematic (and ironic) that BBW’s published ranking of the U.S. schools cannot be replicated by applying their stakeholder-generated weights to the five indexes that make up the overall score for each school."


If Jain's analysis is right, this would not be the first time that Businessweek has published an erroneous ranking. Eight years ago, mistakes in calculating what was then a measurement of a school's "intellectual capital" led to major revisions of the ranking. A month after publication, Businessweek fixed the mistakes online, causing some schools to move by as many as 10 positions in its ranking of intellectual capital. At that time, Bloomberg BusinessWeek was not so eager to publicize its own failings. The magazine quietly revised its rankings online with little fanfare or notice (see BusinessWeek’s Big Oops Ranking Moment).

This new challenge to Businessweek's ranking is the ultimate oops because it impacts not merely one data point in the ranking but all of them. Like any solid researcher, Jain did his homework and then brought his findings directly to Businessweek for a response. What came back from Caleb Solomon, the editor in charge of the ranking, did not reassure him. In the exchange, dated Sept. 30, Solomon confidently told Jain his analysis was all wrong and then went into the weeds of the magazine's methodology.