The Absolute Worst MBA Rankings


Before we get to what arguably qualify as the worst MBA rankings on the planet, we need to make it clear that we know many people believe all university rankings are crap.

That’s become increasingly clear in recent years as law and medical schools began a crusade to stop cooperating with U.S. News & World Report. And it has come into full focus in the business school arena with academic attacks on the methodology behind the Bloomberg Businessweek ranking.

Jeremy Shinewald, founder and CEO of mbaMission, one of the leading MBA admissions firms, puts it this way: “They are all terrible in their own way, and that does include Poets&Quants, which averages out the errors of them all,” he says. “I barely do more than glance at them, because that is what they are meant for. They aren’t meant for applicants to dig in and really understand what is being measured. Instead, they are there to simplify something as complex as an institution — with hundreds of moving parts — down to a single number.

“The United Nations numerically ranks countries each year, and Switzerland, Norway and Iceland often come up as the best places to live – yet none of us are trying to emigrate on the basis of this ranking. Why aren’t we all moving to Iceland? I have been twice and it is an awesome country. The answer is that where you live — just like which MBA program you choose — is often a very complex choice that requires consideration of your own values and needs. I think we would all be better served if we ignored most rankings and engaged in independent thought.”

Truth is, however, that prospective students are obsessed with rankings, and so are current students, alumni, faculty and the leadership of a school. There’s no question that in the U.S., the leading MBA ranking is produced by U.S. News & World Report. In Europe, the most watched ranking is put out by the Financial Times. While flawed on several dimensions, those two organizations put considerable effort into getting it right, with the FT conducting third-party audits of metrics supplied by business schools for its annual list that comes out every February.


Good advice, no matter what you think about a specific ranking, comes from Matt Symonds, co-founder of Fortuna Admissions: “The least useful MBA ranking is the one that you look at without any awareness of the methodology, and how those metrics and comparisons really matter to you. From US News and Bloomberg to the FT and Forbes, each offers a snapshot based on some interesting but limited variables. Forget about which is the best business school, and think about which is the best business school for you.  I like the P&Q MBA Ranking which aggregates the results across all the major rankings to even out a faulty survey technique or flawed methodology.”

Bloomberg Businessweek, which has substantially changed its MBA ranking over the years, has come under considerable fire in recent years, mainly from Yale School of Management Deputy Dean Anjani Jain (see Did Businessweek Botch Its Latest MBA Ranking?). He was unable to replicate the magazine’s ranking using the weights the editors applied to the metrics employed to create the list. Linda Abraham, founder of Accepted, a well-known admissions consulting firm, believes the Bloomberg Businessweek ranking deserves to be on the worst list. She dislikes the ranking for four core reasons: “Surveys represent over 50% of its methodology and they can be easily gamed,” says Abraham. “Surveys, even if not gamed, are very subjective. There is relatively little reliance on actual data, whether of inputs (test scores, GPA, acceptance rates) or results (salary data, employment rate, increase in salary). As P&Q reported, the rankings and number-crunching Bloomberg claimed to have done are flawed. And Bloomberg has become rather volatile, which calls its credibility into question since schools change slowly.”

All true and valid criticisms. But at least the Bloomberg Businessweek ranking is a serious and well-intended effort to rank MBA programs, even if its execution by the editors leaves much to be desired.

So what are the worst? We consulted with MBA admission consultants and business school staffers who work on rankings, gathering the requested data for ranking organizations. Our conversations led us to focus on three rankings that lack credibility, either because the organizations that produce them measure the wrong things or because they lack transparency in a way that leads to serious questions. No doubt, different critics might well put other rankings on this list. But for our money, these three truly stand out as the worst.

Sometimes, the order of schools is so out of whack that you just know the list is a waste of your time. In other cases, the underlying methodology deployed to rank the schools is severely flawed, or the ranking organization conceals critical information that would help users evaluate the ranking more thoroughly.


Ranking & Reasons

1. QS Global MBA Ranking: Vague methodology, poor transparency, questionable results, and concerns over conflicts of interest
2. CEOWORLD Magazine: Vague methodology, very poor transparency, quirky outcomes
3. Fortune: Over-reliance on year-old pay and placement data, lack of transparency, use of two useless metrics, and an entirely U.S.-centric view

1. QS Global MBA Ranking

Published annually by QS (Quacquarelli Symonds), a U.K.-based company that runs admissions fairs, this list is awful for several reasons. But all you need to do is glance at QS’ top ten MBA programs in the world to know that something is amiss. The list is based on three surveys – of employers, academics, and business schools – and uses a total of 13 criteria, including ‘employability’, ‘entrepreneurship and alumni outcomes’, ‘return on investment’, ‘thought leadership’ and ‘class & faculty diversity’.

David White, a co-founder of Menlo Coaching, a leading MBA admissions firm, believes the QS ranking is the absolute worst. “In terms of the results,” he says, “the least useful MBA ranking is the QS Global MBA Ranking. In that ranking, HEC Paris and IE Business School are ranked significantly higher than M7 schools, including Kellogg, Booth and Columbia. This does not match reality: many applicants who are qualified for admission to HEC Paris and IE would be declined at Kellogg, Booth and Columbia, and very few applicants would accept an HEC Paris or IE offer if they also had an offer from Columbia, Booth or Kellogg.”

His reservations, however, go well beyond the listed outcomes. “QS does not show the full calculation behind each factor, but I’d speculate that the way HEC and IE did so well in the ranking is related to: 1) Strong employer perceptions in their own countries despite a weaker ability than M7 schools to place graduates into roles in other locations, and 2) Lower incoming salaries for students, which help with the ROI calculation. HEC and IE recruit more students from Europe, where salaries are lower than in the US. It is a fact (based on each school’s employment report) that Columbia, Booth and Kellogg have higher average post-MBA salaries than HEC, but HEC does better in the ROI calculation, probably because their graduates’ salaries were lower before the MBA.

“It is also curious that HEC beats Chicago Booth for “thought leadership” despite Chicago Booth’s numerous Nobel Prize winning faculty. HEC Paris and IE are great MBA programs for the right candidates, but cannot seriously be ranked above M7 schools.”


Suspicions about the QS ranking, moreover, are long-standing. Two years ago, in fact, an academic published a paper entitled “Does Conflict of Interest Distort Global University Rankings?,” citing the QS lists. The study by Igor Chirikov, director of the Student Experience in the Research University (SERU) Consortium at UC-Berkeley, examined the impact of contracting with rankers on university ranking outcomes by gaining access to universities’ advertising spending with QS. He compared the positions of 28 Russian universities in the QS World University Rankings between 2016 and 2021 with information on contracts these universities had for services from QS. The 128 QS-related contracts he discovered between 2013 and 2020 showed that these Russian universities spent $3,871,378 on QS services during that period.

His conclusion: “The study finds that frequent contracts for services from QS contribute to a substantial increase in the rankings of universities,” he writes. “Positions of Russian universities that had frequent QS-related contracts increased on 0.75 standard deviations (approximately 140 positions) more than they would have increased without frequent QS-related contracts. In a similar way, QS faculty-student ratio scores of Russian universities that had frequent QS-related contracts increased on 0.9 standard deviations more than they would have increased without frequent QS-related contracts… On average, QS ‘frequent clients’ universities also improved their faculty-student ratio scores by 14 points during the same period while QS ‘occasional or non-clients’ did not.”

Chirikov says he focused on the QS rankings over other global lists for three reasons. “First, Quacquarelli Symonds (QS) offers universities a wider array of services than other rankers. For example, it offers a fee-based rating system, QS Stars, that evaluates universities and awards them from 0 to 5+ “gold stars”. The stars indicate “quality” of the university and appear on the website next to the university name, including in the main QS World University Rankings table. Second, Russian universities had a much larger number of contracts with QS than with any other ranker. Third, QS has been frequently called out by media outlets and higher education experts for having and not reporting conflicts of interest.”

The author of the paper points out possible conflicts of interest in all rankings. “Taken together, these findings suggest that conflicts of interest may produce significant distortions in global university rankings,” adds Chirikov. “When rankers evaluate data submitted by universities, they are likely vulnerable to an unconscious self-serving bias also reported by the studies of the conflict of interest among auditors. Self-serving bias may lead to a more favorable consideration of the data coming from universities that are also frequent clients as compared to other types of institutions.”

The 2023 Top Ten According To QS

2023 Rank & School Index Score
1. Stanford Graduate School of Business 93.6
2. Harvard Business School 92.4
3. University of Pennsylvania (The Wharton School) 92.1
4. HEC Paris 92.0
5. London Business School 91.6
6. MIT (Sloan School of Management) 91.5
7. IE Business School 91.4
8. Columbia Business School 90.7
9. INSEAD 89.9
10. IESE Business School 89.8
