EVERYTHING BUT THE KITCHEN SINK METHOD OF RANKING MBA PROGRAMS
Even in a good year, The Economist's ranking is often thought to be somewhat quirky. After all, Harvard, Stanford and Wharton have never emerged winners since the list's debut in 2002. Chicago Booth has topped the list eight times, Kellogg four times, and Dartmouth Tuck once. That's why Booth's decision to hang with the M7 in this boycott made the least strategic sense: the school had the most to lose in reputational value by deciding not to participate.
The Economist takes an everything-but-the-kitchen-sink approach to ranking MBA programs. It examines business schools on the most criteria of any major ranking, 21 different metrics in all versus the 20 at the Financial Times, from the diversity of on-campus recruiters to the range of overseas exchange programs. Compensation and career placement are heavily weighted, including starting salaries, pre-MBA versus post-MBA pay increases, and the percentage of graduates who land jobs through the career management center. Pay and placement account for 45% of the methodology.
The volatility in the ranking has long been jarring because The Economist actually attempts to smooth the year-over-year changes by taking a weighted average of data from 2021 (50%), 2019 (30%) and 2018 (20%) to, in its words, “provide a rounded picture of the school over a period of time.”
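The smoothing described above is just a weighted average across three survey years. A minimal sketch of the calculation, using the weights cited in the article but entirely hypothetical school scores:

```python
# Weighted smoothing of a school's score across three survey years,
# using the weights The Economist cites: 2021 (50%), 2019 (30%),
# 2018 (20%). The per-year scores below are made up for illustration.
weights = {2021: 0.50, 2019: 0.30, 2018: 0.20}
scores = {2021: 87.0, 2019: 82.0, 2018: 79.0}  # hypothetical scores

smoothed = sum(weights[year] * scores[year] for year in weights)
print(round(smoothed, 2))  # 0.5*87 + 0.3*82 + 0.2*79 = 83.9
```

Because half the weight sits on the newest data, a single strong or weak survey year can still move a school noticeably, which is why the smoothing only partly dampens the volatility.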
STUDENT SATISFACTION LOOMS LARGE IN THE SURVEY’S METHODOLOGY
The one big difference from the Financial Times is The Economist's significant reliance on student satisfaction, gathered through an annual survey of current MBA students and recent alumni, whose results account for 20% of the ranking. Respondents are asked to rate the quality of the faculty, the career services staff, the school's curriculum and culture, the facilities, the alumni network, and their classmates. The overall methodology weights new career opportunities (35%), personal development and educational experience (35%), salary increase (20%), and the potential to network (10%). The figures are a mixture of hard data and the subjective marks given by students, who know that their answers will feed a ranking that could reflect on the reputation of their degrees.
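The four category weights combine the same way as any weighted composite. A hypothetical sketch, with made-up per-category scores on a 0-100 scale:

```python
# Composite score from the four category weights cited above:
# career opportunities 35%, personal development/educational
# experience 35%, salary increase 20%, networking 10%.
# All category scores below are hypothetical.
category_weights = {
    "career_opportunities": 0.35,
    "personal_development": 0.35,
    "salary_increase": 0.20,
    "networking": 0.10,
}
category_scores = {  # made-up scores on a 0-100 scale
    "career_opportunities": 90.0,
    "personal_development": 85.0,
    "salary_increase": 80.0,
    "networking": 75.0,
}

composite = sum(category_weights[k] * category_scores[k]
                for k in category_weights)
print(round(composite, 2))  # 31.5 + 29.75 + 16.0 + 7.5 = 84.75
```

Note how the two 35% categories dominate: a school can score modestly on salary and networking and still rank well if students rate its career opportunities and educational experience highly.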
Parsing the student opinion results yields some fascinating insights, whether flawed or substantive. The magazine's survey of alumni found that Michigan Ross provided the best overall educational experience in its MBA program. Ross was followed by IMD, IESE, Wisconsin Business School, and NYU Stern. IESE Business School came out on top based on alumni answers to the survey question on program content. After IESE, the Indian School of Business was second, followed by IIM-Ahmedabad, UNC's Kenan-Flagler Business School and IMD. Student ratings of the faculty put Texas Christian in first place, followed by UC Davis, Sun Yat-sen University in China, Penn State, and SDA Bocconi. That compares with last year's winners: Darden, Booth and Harvard.
In almost all cases, the differences between the schools on the five-point scale are tiny. Yet those fractional gaps produce outsized swings in a school's overall rank, the kind of result that would give a statistician fits, because the scores are so closely clustered that the differences lack statistical significance. Even so, the fortunes of MBA programs turn on these minute differences in The Economist survey, much to the chagrin of B-school deans and MBA directors.
The big question posed by this unusual ranking is whether it has any real credibility. With so many of the top-tier players missing from the list, the ranking fails to convey an accurate picture of the choices qualified applicants face. It's another reminder that candidates who consult rankings to decide where to apply should use them as a mere starting point in the search, and never rely on only one list even for that.