Financial Times 2025 MBA Ranking: 10 Biggest Surprises


A survey of business school officials at a session on rankings at last week’s EFMD Deans Conference identified perceived flaws

7) Are Schools Gaming The FT Ranking?

At last week’s EFMD Deans Conference in Lisbon, a well-attended session focused on the legitimacy of business school rankings. The 70 or so attendees at the session were asked to vote on the most significant flaws of these controversial lists. The No. 1 defect identified in the room was something of a surprise: “They are too easily influenced or gamed by schools.” That is a telling, if unscientific, criticism from a largely European crowd, the audience most obsessed with this particular ranking.

Generally, the most common complaint isn’t that a ranking can be gamed. It is more basic: that rankings fail to accurately measure the quality of the education students receive or that rankings unfairly lump all schools together regardless of their size, inherent strengths, or location, all of which impact the educational experience.

Over the years, there is no doubt that schools put the best face on the data they submit to ranking organizations. Some have been caught red-handed simply inventing favorable but fraudulent data and supplying it to a ranking. But outright manipulation of a ranking tends to be rare and less likely to affect the FT. Or is it?

The Financial Times is the only ranking organization that actually conducts occasional audits of school-supplied data. But those audits, performed by KPMG, are not exactly up to date. The majority of schools have not had an audit in five years or more. The last time Harvard, Wharton, Yale, or Tuck was audited by the FT was six years ago, in 2019. As a newcomer to the ranking this year, Tongji University’s School of Economics and Management has never been audited. The last time the FT peeked under the hood at the data supplied by Copenhagen Business School was seven years ago, in 2018. Hult International Business School was last audited ten years ago, in 2015.

By all accounts, these are light audits restricted to school-supplied data anyway. The FT does not have the resources to do extensive double-checking of its alumni surveys beyond making sure the IP addresses of respondents match the emails they were sent.

And when the FT suspects cheating, whether through an audit or other means, it does not report on the potential fraud or discrepancy. Instead, it simply takes a school off its list for a year, without informing readers of the reasons for its exclusion. That lack of public shaming effectively encourages gaming because the consequences of getting caught are not all that severe: A school simply disappears for a single year.

Of course, the best way to game the ranking is to influence those alumni survey responses. Alumni responses figure prominently in the ranking, informing eight of the 21 ranking metrics; alumni-supplied data contribute 56% of a school’s score. The FT surveys alumni three years after they complete their MBAs. For the current 2025 ranking, a total of 6,299 alumni from the Class of 2021 completed the surveys, an overall response rate of 36%. While that overall rate is impressive, in many cases schools barely clear the minimum sample size, which can give a handful of responses an outsized effect. And because of the disruption from Covid in 2021, the FT accepted schools with lower response rates this year.

Schools are explicitly told not to “select and lobby alumni to complete the survey.” But not all strictly abide by the rules, with some actively encouraging alumni to complete the surveys in a way that reflects favorably on their alma mater. There is also a more sinister way to skew the alumni surveys: selecting more successful alumni to survey. Instead of providing email addresses for an entire cohort of alumni, as the FT requests, a school could hold back the addresses of unemployed or disenchanted alums who are less likely to give positive responses. After all, the FT gives schools an out: “Please exclude those who want to opt out of our survey,” according to the FT’s own instructions.

It’s just another very good reason to treat this and every ranking with a very large grain of salt.

8) What The FT Doesn’t Count May Matter Most Of All

Every year, the Financial Times collects an intriguing data point from its alumni surveys that may well be far more important than most. Yet the FT doesn’t even include it in its ranking calculations. The newspaper asks MBA alums to rate their overall satisfaction with their MBA experience.

Readers always need to be cautious when it comes to student or alumni surveys used for rankings. That’s because most respondents will be cheerleaders on such surveys, knowing that their answers will affect the ranking of their alma maters. Some argue that this phenomenon is effectively offset because it impacts all the respondents across the board.

Nonetheless, we have found over the years that you can never trust a single year’s worth of data from graduate surveys: Such data dumps inevitably produce anomalies, due to small sample sizes or to schools’ efforts to encourage alumni to answer more favorably. That is why combining scores over several years is likely to yield a more credible result than a single set of data from a single year.

With that in mind, we crunched the numbers on overall satisfaction for the past three years. We set our hurdle high: Schools had to score 9.0 or above on a scale of 1 to 10, with 10 being the highest possible satisfaction score, and they had to do so for three years in a row. If a school missed the FT list for one year because it did not meet the alumni survey response-rate threshold, we looked back one more year to include a score. This exception allowed us to add Stanford and Michigan Ross to the list of schools with the best overall satisfaction.

Overall, only 14 MBA programs made the cut, led by Stanford Graduate School of Business. Here’s what we found: