Stanford Tops ‘Modernized’ Businessweek MBA Ranking


What can cause that kind of unusually high volatility? Certainly, the major changes in methodology are largely to blame. But as with all rankings, the underlying scores behind each numerical rank are often so close as to be statistically meaningless, so inconsequential changes, such as those resulting from the size of a survey's sample or a tiny difference in starting salaries, can cause outsized swings in rank.

Readers of rankings rarely consume the footnotes and explanations of methodology that result in a school's actual rank. And this year, the few who wander into that explanation are likely to shake their heads at the complexity of Businessweek's new methodology, an approach so convoluted that it is hard to explain why one school is up or down, even in double digits.

“Rather than assign weightings ourselves, as most rankings do, we took an approach recommended by several of the business schools we spoke with: Let the stakeholders decide,” explains Caleb Solomon, a Bloomberg senior editor. “To do this, we surveyed students, alumni, and recruiters to learn what was most important to them. Their answers determined the weightings of each of our new indexes. Then, based on our survey results and compensation data, we calculated overall rankings.”


Of the four ‘indexes’ by which schools are ranked, the clearest explanation of what Businessweek is doing comes in the compensation category, on which Businessweek places 38.5% of the overall weight (networking gets 27.9%, learning 23.1%, and entrepreneurship 10.5%). For compensation, the editors place a 25% weight on survey questions and the remaining 75% on figures provided in surveys and year-old employment reports from business schools (even though 2018 data is largely available).

The 75% component consists of median salary after graduation (weighted 30%), median alumni current salary (22.5%), percentage of students seeking employment who were employed within three months of graduation (11.25%), percentage of students reporting salary information who received a bonus (5.625%), and median sign-on bonus (5.625%). For non-U.S.-based schools, local compensation figures were converted into U.S. dollars on or near the data collection cutoff date, but salaries were not adjusted for purchasing power parity. The only one of these figures Businessweek shares in its comparative tables is median starting salary. At Stanford, which finished first in the compensation index, it was $140,000 last year.
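The arithmetic described above can be sketched in a few lines. This is purely an illustration of the stated weights, not Businessweek's actual code; the metric names and the assumption that each input arrives pre-normalized to a 0-100 scale are ours.

```python
# Hypothetical sketch of the compensation-index weighting described in the
# article. Metric names and the 0-100 normalization are assumptions.

# Survey questions carry 25% of the compensation index; the remaining 75%
# splits across five hard-data metrics (weights as stated in the article).
HARD_DATA_WEIGHTS = {
    "median_starting_salary": 0.30,
    "median_alumni_current_salary": 0.225,
    "employed_within_3_months_pct": 0.1125,
    "received_bonus_pct": 0.05625,
    "median_signon_bonus": 0.05625,
}
SURVEY_WEIGHT = 0.25


def compensation_index(survey_score, metric_scores):
    """Combine a 0-100 survey score with 0-100 normalized metric scores."""
    hard_data = sum(
        weight * metric_scores[name]
        for name, weight in HARD_DATA_WEIGHTS.items()
    )
    return SURVEY_WEIGHT * survey_score + hard_data


# Sanity check: the hard-data weights sum to the stated 75%.
assert abs(sum(HARD_DATA_WEIGHTS.values()) - 0.75) < 1e-9
```

A school scoring 100 on every input would score 100 on the index, so each sub-weight caps how much any single metric, such as median starting salary, can move a school's standing.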

This approach puts significantly more importance on pay and placement. Last year, the rough equivalent of this category placed a 30% weight on these metrics vs. this year's 38.5%: starting base salaries, adjusted for variations across industries and regions, at 10%; the increase in alumni compensation above pre-MBA levels at 10%; and job placement three months after graduation at 10%. Because Businessweek decided against using purchasing power parity and dropped last year's regional adjustment, non-U.S. schools will likely lose standing when it combines all the schools in the new global ranking in December.


To try to explain why a school might rise 30 places as Howard University did this year, you generally have to look under the hood of a ranking and compare each set of metrics against each other over two years. That exercise would be impossible this year because of the change in methodology. But Businessweek makes such investigations impossible anyway because it does not make public the vast amounts of survey data that it collects.

The publication of the index scores for the four categories somewhat helps, but no year-over-year comparison can be made there, either. Still, when a school jumps from a rank of 63rd to 33rd, you might expect that there could be some explanation. This year, however, the changes Businessweek has made and the paucity of information behind them do not permit any rational explanation.

In every ranking, of course, as many MBA programs seem underrated as overrated. Two schools that fall into the underrated category—Virginia’s Darden School and Cornell’s Johnson School—make the top ten on this list. On the other hand, Michigan Ross, the top-ten school with the best rankings momentum this year, shows up in 18th place. Even worse, Dartmouth Tuck sits at 19th. On what planet are these two MBA programs 18th and 19th?

Meantime, much has been lost in the transition to this new methodology. Businessweek no longer breaks out its separate rankings for corporate recruiter, student, and alumni satisfaction. However flawed, those were highly informative numbers. And inside each school profile, Businessweek now provides a “climate” bar where students are asked about tolerance toward women, minorities, sexual orientation and internationals. This would have been helpful if Businessweek actually supplied numbers rather than a hard-to-decipher bar graph.


Most puzzling, however, are those index scores. In ‘entrepreneurship,’ Stanford earns a perfect 100 score. But is it really 35 points better than Harvard Business School, which incidentally is behind Willamette, or even 19.5 points better than Babson? Is the University of Utah better on this measure than MIT Sloan? Is Mississippi better for entrepreneurship than Kellogg? If you think The Economist’s ranking is unreliable, this measure seems downright crazy.

But then there are the ‘learning’ index scores. Stanford merits a rating of just 77.3 on the 100-point scale. Wharton’s 61.3 and Harvard’s 54.4 are even worse. Surprisingly, the way Businessweek cooked up this metric produced unusually low scores at schools generally highly regarded for their learning experiences. MIT, Booth, Berkeley, Columbia, and Kellogg all score at a 65.5 to 73.8 clip. UVA Darden, well known for having the best MBA teaching faculty on the planet, rates just 67.5.

The best score among the Top 20 ranked schools is Carnegie Mellon at 81.2. So which MBA programs actually top the learning index? William & Mary, Utah and UT-Dallas. Carnegie Mellon is in fourth place, followed by Brigham Young, Baylor, Tampa and Texas Christian. We have no doubt that these schools provide their students with excellent learning experiences, but are these the MBA programs any thoughtful observer would put at the top?

And finally there is ‘networking.’ William & Mary is ranked ninth best in this category, UT Dallas comes in at 22nd, and Texas Christian University at 27th. Either a few schools have learned how to game the system or there is so much self-interested cheerleading in the surveys by students and alumni, who after all have a vested interest in seeing their alma maters ranked highly, that the surveys are no longer capable of delivering valuable insights.

While the editors should be commended for the vastly improved response rates on their surveys and for gathering so much data on MBA education, the result only goes to prove that more of the wrong data doesn’t lead to better outcomes. It would be tempting to urge Businessweek back to the drawing board yet again, but there probably aren’t all that many deans who would dare ask the editors to take yet another stab at it.