How Wharton Fought Its Way To The Top Of U.S. News’ MBA Ranking

The University of Pennsylvania’s Wharton School


Wharton is not the only outlier among peer schools when it comes to non-report rates. Columbia Business School and the University of Michigan’s Ross School of Business last year had response rates even lower than Wharton’s, at 90% and 87%, respectively. Columbia and Ross also managed to eke out rankings gains in U.S. News this year, both improving by one place, to ninth and 11th, respectively.

All of these schools meet the reporting standards set by the MBA Career Services & Employer Alliance, an organization of business school career directors and MBA recruiters. The group was created in 1994, partly in response to the proliferation of rankings and the need for apples-to-apples reporting across schools. It first adopted an 80% response-rate threshold in 1998.

“They felt that was the minimum,” recalls Megan Hendricks, executive director of the MBACSEA. “They probably didn’t want to go too high at the time because they worried that some schools couldn’t get that high. That was what seemed reasonable at the time.” The group revised the standard upward to 85% in 2005. “A strong majority of our members are reaching and exceeding that threshold. There are schools that have trouble reaching it because they either have a small staff or a large student population.”


Some career officials believe the current standard is too lax, especially in a world where it is far easier to track down graduates via LinkedIn and other social media. Eric Johnson, director of graduate career services at the Kelley School of Business at Indiana University, is in that camp. Kelley’s employment reports are routinely based on a response rate of 99% to 100%.

“I’m under the assumption that the students most likely to be non-responders are also the ones most likely to be still-seeking (jobs),” says Johnson. “Whether it’s because they’re embarrassed, or because they don’t regularly check communications or something else, I’d guess that this group’s offer rate would be lower than the 85% who reported. I have a good enough relationship with my peers that I would listen to them defend a reporting rate under 90% with an open mind, but there’s no doubt in my mind that it skews the numbers.”
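Johnson’s concern can be sketched with a toy calculation: if non-responders accept offers at a lower rate than responders, a placement rate computed from responders alone overstates the rate for the whole class. The figures below are hypothetical, chosen only to illustrate the arithmetic, and are not from any school’s report.

```python
def reported_vs_true(responders, responder_offer_rate,
                     nonresponders, nonresponder_offer_rate):
    """Compare the placement rate computed from responders only
    with the rate for the entire class (hypothetical inputs)."""
    # Schools effectively report the responder-only rate.
    reported = responder_offer_rate
    # The whole-class rate also counts the non-responders.
    placed = (responders * responder_offer_rate
              + nonresponders * nonresponder_offer_rate)
    true_rate = placed / (responders + nonresponders)
    return reported, true_rate

# A class of 100: 85 responders with a 90% offer rate, and
# 15 non-responders with, say, a 50% offer rate.
reported, true_rate = reported_vs_true(85, 0.90, 15, 0.50)
print(f"reported: {reported:.1%}, whole class: {true_rate:.1%}")
# reported: 90.0%, whole class: 84.0%
```

Even a modest gap between the two groups moves the headline number by several points, which is why a response rate in the mid-80s leaves real room for skew.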

Standards were necessary in the first place because of the pressure on school administrators to game the rankings. Many admissions directors and consultants concede that the dramatic rise in average GMAT scores at many schools is a direct result of rankings that use class scores on standardized tests to rate programs.


In recent years, Wharton and Booth have been engaged in an ever-escalating battle of higher and higher GMAT scores, which account for roughly 16% of U.S. News’ ranking. Booth has raised its class GMAT score for 13 consecutive years; over the last five, the school has upped its average to 726 from 719. Wharton, meantime, has boosted its class GMAT score by a dozen points, to 730 from 718, since 2011.

That ‘arms race’ strategy between the two schools may have run its course, however. Last year, Booth merely held on to its class GMAT score, while Wharton slipped by two points from 732, though Wharton managed to stay ahead of HBS’ 729 average by a single point. Still, even HBS and Stanford are not immune to this game. Harvard and Stanford both reported four-point gains in average class GMAT scores for their incoming 2016 cohorts, with Stanford maintaining its lead at 737.

If the purpose of a ranking is to provide useful information to applicants and other stakeholders, the industry should revisit its reporting threshold and consider raising it to 95%. Given the outsized influence of the U.S. News ranking, the magazine should also give serious thought to reporting medians rather than averages, and to adjusting the numbers for response quality, before feeding them into a spreadsheet to crank out a ranking.

The takeaway for readers? Read any and all rankings with one very big grain of salt.