Can You Trust This Online MBA Ranking?

In its survey of administrators, did more than half a dozen people respond? You can’t tell. Did the Princeton Review weigh admissions selectivity more heavily than technological infrastructure? The Review isn’t saying. How exactly does anyone even measure “technological infrastructure”? That’s a very good, unanswered question.

In its survey of students, what weight does the Princeton Review put on student opinions of the faculty or of fellow classmates? No one knows. Did the Review require a certain number of responses from each school to ensure that its sample size was adequate? The Review doesn’t say.

In contrast, when U.S. News ranked online MBA programs in January, it spelled out its methodology with great clarity and detail. U.S. News ranked these programs on five general categories, with student engagement given a 28% weight, admissions selectivity 25%, peer reputation 25%, faculty credentials and training 11%, and student services and technology 11%. Each of these categories is explained, also in detail, by U.S. News, which publishes a wealth of the data it collects for all to see.
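To see how explicit weights like these turn category results into a single score, here is a minimal sketch in Python of a weighted composite using the published U.S. News weights. The 0–100 scale, the example scores, and the scoring function itself are hypothetical illustrations of a weighted sum, not U.S. News’ actual calculation.

```python
# A minimal sketch (not U.S. News' actual formula) of how published weights
# could combine five category scores, each on a hypothetical 0-100 scale,
# into one composite number.

WEIGHTS = {
    "student_engagement": 0.28,
    "admissions_selectivity": 0.25,
    "peer_reputation": 0.25,
    "faculty_credentials_and_training": 0.11,
    "student_services_and_technology": 0.11,
}

def composite_score(scores: dict[str, float]) -> float:
    """Weighted sum of category scores using the published weights."""
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

# Hypothetical program scores, for illustration only.
example_program = {
    "student_engagement": 90,
    "admissions_selectivity": 80,
    "peer_reputation": 85,
    "faculty_credentials_and_training": 75,
    "student_services_and_technology": 70,
}

print(round(composite_score(example_program), 1))  # prints 82.4
```

The point of such a sketch is simply that a reader can check the arithmetic; that kind of verification is exactly what the Princeton Review’s approach does not allow.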

A RANKING THAT IS SIMPLY A BIG BLACK BOX

You can argue with the general approach, the weights attached to each metric, or the sample size. You can even question how U.S. News can measure “student engagement” without surveying students in these programs (we certainly do). But you can raise those questions only because U.S. News is completely transparent about what it is doing. U.S. News reports that 104 surveys were completed by academics at schools with online MBA programs. It reports the number of schools that submitted the requested data.

With the Princeton Review, the ranking and the methodology used to come up with the list are one big black box. As a result, this ranking has no credibility or authority. It can’t be trusted or relied upon. Frankly, it’s an embarrassment.

Even its presentation raises serious questions about how accurately the Princeton Review described what it actually did. In its basic table ranking the top 25 online programs, for example, there’s a brief paragraph under the headline “Students Say.”

WOULD ANY STUDENTS EVER SAY THESE THINGS?

Under the University of Florida’s Internet MBA program, ranked sixth best, students supposedly say this: “Features of the University of Florida’s Internet MBA program include the option to complete the degree in one or two years, limited class sizes, and the opportunity to earn a concurrent master’s degree in outreach engineering management in 20 months…” That hardly sounds like something any student would say in a questionnaire; it reads more like a description of the program lifted from the University of Florida’s website.

Or how about the statement attributed to students of the Rochester Institute of Technology’s online Executive MBA program: “The prestigious Rochester Institute of Technology’s Online Executive MBA Program, within its Saunders College of Business, is specifically designed for professionals with at least six years of work experience.” Huh? Can’t you just hear students telling you that?

There are two possible conclusions to draw from this nonsense: either the writers at the Princeton Review mistakenly labeled these sections, or so few students filled out the surveys that there was nothing to quote, or at least nothing worth quoting. Either way, it’s another red flag. It’s hard not to suspect that the Princeton Review simply made up this ranking out of thin air, and not based on any of the metrics it claims to be measuring.

A SIDE-BY-SIDE COMPARISON WITH U.S. NEWS IS TELLING

So the inevitable question: How does the Princeton Review’s list compare with the more transparent approach taken by U.S. News & World Report just four months ago? Not very well. In judging the value of any list, and a school’s standing on it, you look for consensus across several different rankings. If numerous organizations rank a school similarly, you can interpret that ranking with more confidence. When numerical ranks diverge substantially from one list to another, there’s plenty of reason to doubt the authenticity of a school’s showing.

In the top six positions of the Princeton Review ranking are schools that generally have performed well enough to be at or near the top of the U.S. News list. Clearly, somebody at the Review was consulting U.S. News. There’s the University of North Carolina’s MBA@UNC online program, ranked first by both publications, while Indiana University’s Kelley Direct program is ranked second by the Review and first, in a three-way tie with UNC and Temple University, by U.S. News. The Review puts Temple University in Philadelphia in fifth place, after No. 3 IE Business School in Spain and No. 4 Arizona State University, with the latter’s standing an exact replica of its U.S. News rank.

But two things are notable here. First, the Princeton Review, in ranking 25 separate online programs, never concedes that any school is in a tie. U.S. News, on the other hand, acknowledges that three schools are tied for first, two each are tied for fourth, seventh, and tenth place, four programs are tied for 12th, two for 16th, and three for 18th. In other words, U.S. News concedes that many of these programs are too close to call, that it’s not possible to assign a meaningful numerical rank because there is no real difference among them. The Princeton Review sidesteps that inconvenient truth and imposes numerical ranks on all 25 programs on its list, with no ties at all.

DO ANY OF THESE RANKINGS MATTER AT ALL? NOT REALLY

Second, missing from the Princeton Review list are 13 of the top 25 programs ranked by U.S. News, including Carnegie Mellon’s online hybrid, Penn State, the University of Wisconsin at Eau Claire, Lehigh University, and the University of Massachusetts at Amherst. If both publications were measuring anything like the same qualities, that kind of wholesale disagreement seems rather improbable.

And finally, there are big gaps between what the Review says are the best programs and what U.S. News says. The Princeton Review puts the Rochester Institute of Technology’s online MBA in seventh place, while U.S. News ranks it 25th. The Review has Northeastern University’s online offering in tenth place vs. U.S. News’ rank of 36th for the same program.

Ultimately, of course, we don’t think online MBA rankings matter all that much anyway. Why? Because online programs largely benefit from the reputation and image of the school’s brand, which is mostly determined by how well a school’s full-time MBA program ranks. Call it the halo effect. The value of the MBA on your resume is determined by the school’s overall image and reputation, not by an online program ranking meant to get hits on the Internet. Carnegie Mellon tops the list on this score, followed by UNC, Indiana, the new online offering from the University of Southern California, Maryland, and Penn State. It’s telling that only two of these schools even make the Princeton Review’s Top 25.

We’ve said it before and we’ll say it again: Don’t waste your time looking at any ranking cranked out by the Princeton Review. It’s just not worth the paper, or the bits and bytes, it’s printed on.
