The New FAKE MBA Ranking From The Princeton Review

In the past year or two, it’s been relatively common for a certain politician to call out FAKE news. More often than not, of course, people label news they simply dislike as fake, regardless of its truth or accuracy. But there are indeed FAKE news stories out there, and there are also undoubtedly FAKE business school rankings.

How do you tell the real thing from what’s simply made up? Brand reputation certainly matters. You can essentially trust such global media brands as the Financial Times, U.S. News & World Report, or Bloomberg Businessweek. They bring to their projects a level of journalistic integrity and standards that should give one confidence that the game is not rigged. That is not to say their MBA rankings are without significant flaws, or that they are always put together by people who truly know what makes a business school experience excellent.

But one of the best ways to test the legitimacy of a ranking is to look at the organization’s methodology for composing its list. A rankings approach that lacks transparency should immediately earn one’s distrust. After all, without a clear understanding of what is being ranked, it’s impossible to judge the value of the list. More than that, a vague and unspecific methodology opens the list up to foul play.

FAILING THE GRADE: THE PRINCETON REVIEW

And then there is the methodology itself, beyond its actual transparency. What is measured and how it is measured really counts. If you don’t know what a ranking measures, how can you know if it has any meaning for you? A methodology that surveys students or administrators yet fails to report a response rate should also raise doubts.

And all of these issues surface in the latest list of the best MBA programs from the Princeton Review, the test prep company that is largely in the rankings business to generate publicity for its guidebooks and prep courses. You have to presume that the editors of the Review know the higher education business. They’ve been in this game for many, many years.

But Princeton Review’s 2017 MBA ranking is a quintessential example of a FAKE ranking. First off, PR has 267 schools on its overall list, none of them with a numerical rank. By declining to rank the programs, the Princeton Review actually does its readers a great disservice. Does anyone really think that Southeast Missouri State University’s MBA program is every bit as good as the one at Washington University’s Olin School of Business? Of course not. Yet the Princeton Review list effectively gives equal standing to both schools. That is utter nonsense.

‘HIERARCHICAL RANKING LISTS OFFER VERY LITTLE VALUE TO STUDENTS’

Why would this organization not put an actual rank on a program? There’s an official explanation from Princeton Review: “We don’t have a ‘Best Overall Academics’ ranking list nor do we rank the schools on a single list because we believe each of the schools included offers outstanding academics,” the editors claim. “We believe that hierarchical ranking lists that focus solely on academics offer very little value to students and only add to the stress of applying to business school.”

Well, there is another, more plausible reason. Having a large list of unranked schools essentially allows the Southeast Missouri States of the world to boast that they are on a best list with Harvard, Kellogg, Booth and Tuck. And it more easily allows the Princeton Review to secure marketing dollars from a highly grateful school with an MBA offering that is comparatively inferior to most of the top MBA programs on the truly influential ranking lists.

Still, let’s judge the ranking on our three key criteria: brand, transparency, and methodology. The Princeton Review is not a media brand with journalistic standards for integrity and reporting. It is a test prep company, period, trying to garner greater attention for its test prep courses. Its motivation for producing these lists is just that and nothing more. So on the brand test, the Princeton Review, which has changed hands numerous times in the past ten years, doesn’t make the grade.

FLUNKING MARKS FOR TRANSPARENCY AND METHODOLOGY

But it completely flunks the test on both transparency and methodology. Here is the complete description from the Princeton Review:

“Our Best On-Campus MBA cohort includes 267 business schools which were selected using a combination of factors including institutional and student survey data. Data used includes career outcomes, admissions selectivity, and academic rigor, among others.”

When it comes to the overall ranking by the Princeton Review, that’s it. There is no disclosure whatsoever of the metrics used or the weights the Princeton Review places on any of these criteria, never mind the “among others.” And that is in total contrast to a REAL ranking.

First off, legitimate rankings are entirely transparent, spelling out exactly what weight is placed on each measured criterion. We know that the U.S. News formula puts 14% of the weight in its ranking on average starting salaries and sign-on bonuses and 16.25% on the average GMAT and GRE scores of the latest entering class. The Financial Times, on the other hand, puts a 40% weight on pay in ranking MBA programs, with 20% on the average income of an alum three years after graduation and another 20% on the percentage increase in salary above pre-MBA levels.
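To make concrete what that kind of disclosure buys a reader, here is a minimal sketch of how a transparent weighted ranking works. The 14% and 16.25% figures are the U.S. News weights cited above; the normalized school data and the catch-all remainder weight are entirely hypothetical, for illustration only.

```python
# A minimal sketch of a transparent weighted ranking formula.
# The 14% and 16.25% weights come from the published U.S. News
# methodology cited above; everything else here is hypothetical.

def weighted_score(metrics: dict, weights: dict) -> float:
    """Combine normalized metric values (0-1) into a single index score."""
    return sum(weights[name] * metrics[name] for name in weights)

# Hypothetical normalized metrics for one school.
school = {
    "avg_salary_and_bonus": 0.82,
    "avg_gmat_gre": 0.91,
    "other_criteria": 0.75,  # stand-in for everything else in the formula
}

# Illustrative weights loosely modeled on the U.S. News disclosure:
# 14% salary/bonus, 16.25% test scores, remainder on other criteria.
weights = {
    "avg_salary_and_bonus": 0.14,
    "avg_gmat_gre": 0.1625,
    "other_criteria": 0.6975,
}

print(f"Index score: {weighted_score(school, weights):.3f}")
```

Because every weight is published, a reader who cares more about salaries than test scores can simply reweight the same data. The Princeton Review discloses nothing that would allow such a recalculation.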

ALWAYS LOOK FOR A RESPONSE RATE ON A SURVEY TO JUDGE ITS CREDIBILITY

That kind of specificity allows readers to make their own decisions about the kind of discount they might place on a ranking. And when the data is clearly shared as it is in the better rankings, readers can easily parse the metrics for what’s most important to them. None of that is possible in this ranking from the Princeton Review.

Readers also don’t have a clue about the response rate for the student surveys or when the surveys were in the field. Is it even remotely possible that the Princeton Review collected opinions from 23,000 MBA students in a single year? That is doubtful. In fact, the published methodology fails to inform readers that those surveys were collected over a three-year period from three different classes.

It also is not known if there is any minimum threshold for an acceptable student survey response rate, as there is in other rankings. At the FT, where the latest response rate was 41%, no school can make its list without a minimum 20% response rate, with at least 20 fully completed responses. Such basic standards are meant to give some level of confidence in the results.
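To illustrate the point, here is a minimal sketch of that kind of eligibility check, assuming the FT thresholds cited above (a 20% response rate and at least 20 fully completed surveys per school); the function name and the example figures are hypothetical.

```python
# A sketch of an FT-style survey eligibility check, assuming the
# thresholds cited above; names and figures are hypothetical.

def survey_is_credible(responses_completed: int, students_surveyed: int) -> bool:
    """Return True if a school's survey meets the minimum bar."""
    if students_surveyed == 0:
        return False
    response_rate = responses_completed / students_surveyed
    return response_rate >= 0.20 and responses_completed >= 20

# 30 completed surveys out of 200 contacted is a 15% rate: fails.
print(survey_is_credible(30, 200))   # False
# 50 out of 200 is a 25% rate with 50 completions: passes.
print(survey_is_credible(50, 200))   # True
```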

FROM ‘BEST CLASSROOM EXPERIENCE’ TO ‘MOST COMPETITIVE STUDENTS’

Besides the overall list of 267 MBA programs, the Princeton Review also publishes numerical ranks for the top 10 schools in each of 18 categories, including “best classroom experience” and “best career prospects.”

What data is used to crank out these lists? The best classroom experience is based, according to PR editors, on “student answers to survey questions concerning their professors’ teaching abilities and recognition in their fields, the integration of new business trends and practices in the curricula, the intellectual level of their classmates’ contributions in course discussions, and whether the business school is meeting their academic expectations.”

Best career prospects is based on school-reported average starting salary and the percentage of students employed within three months of graduation, as well as student answers to survey questions “assessing the efforts of the placement office, the quality and availability of the alumni network, the quality and diversity of the companies recruiting, and the opportunities for off-campus internships and to work with mentors.”

PASSING THE SMELL TEST

That is all reasonable enough, but in both cases, there is no disclosure on the importance of each of these metrics in the formula. You have no idea how much weight PR is giving to starting salaries vs. employment or any other metric. Neither are the underlying index scores for the ranks disclosed. Those numbers, openly revealed by U.S. News and several other ranking lists, tell readers how far ahead or behind a school is from another and whether there is a meaningful statistical difference between and among the ranks.
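A quick illustration with made-up numbers shows why those index scores matter; ranks alone hide the size of the gaps between schools.

```python
# Made-up index scores for three hypothetical schools -- not real data.
index_scores = {"School A": 100.0, "School B": 99.4, "School C": 78.2}

# Sort highest first and assign numerical ranks, as U.S. News-style lists do.
ranked = sorted(index_scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (school, score) in enumerate(ranked, start=1):
    print(f"#{rank}  {school}: {score}")

# Schools A and B sit at #1 and #2 but are only 0.6 points apart --
# likely not a meaningful statistical difference -- while School C,
# at #3, trails by more than 20 points. Ranks alone cannot show that.
```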

And then there is, of course, the “smell test.” Admittedly, that is more subjective and largely syncs with existing beliefs about each of the school’s MBA programs. Consider Princeton Review’s ranking for the best classroom experience. Does anyone believe that the University of Florida, ranked fourth in this category, should be in the top ten when Harvard, MIT, Wharton, Chicago Booth, Northwestern Kellogg, Columbia, Yale, Dartmouth Tuck, and Duke, among other leading programs, fail to even make the list?

You intuitively know the answer to that question, and the answer suggests the “smell test” wasn’t passed.

Bottom line: Take a glance at these top ten lists for their sheer entertainment value (see below for the results of each). But treat them with a big grain of salt in what is pretty much a FAKE ranking.

The Best Classroom Experience

The Best Career Prospects

The Best Professors

The Most Competitive MBA Students

Most Family Friendly Business Schools

Best MBA Campus Environment

Best Administered Business Schools

Toughest MBA Programs To Get Into

