Winners & Losers In The 2012 BW Ranking

by John A. Byrne

According to Bloomberg BusinessWeek’s new ranking of the best business schools, North Carolina State’s Jenkins Graduate School of Management has the least satisfied MBA students in the world. Perhaps even worse, Jenkins’ career services office ranks dead last among the 114 business schools in the U.S., Europe, and Asia whose graduates BusinessWeek surveyed. You’d think a business school’s full-time MBA program couldn’t get more negative publicity than that.

But at least Jenkins managed to be ranked by BusinessWeek yesterday (Nov. 15) when the magazine published its 2012 ranking of the best full-time MBA programs. Jenkins is 63rd on a list of 63 U.S. schools. An even worse fate? Falling off BusinessWeek’s ranking altogether. This year, five schools disappeared from the magazine’s list of the best full-time MBA programs, including two schools that had ranked in the Top 50 the last time BusinessWeek published its list of the best business schools. Tulane University’s Freeman School, ranked 35th in 2010, didn’t warrant a mention.

Neither did William & Mary’s Mason School, which had been ranked 47th two years ago. Ditto for No. 54 Pittsburgh’s Katz School and No. 51 Case Western Reserve University’s Weatherhead School of Management. All are gone from BusinessWeek’s ranking. It’s not clear whether those schools disappeared because BusinessWeek couldn’t get a minimum response rate for them from either the Class of 2012 or corporate recruiters. In any case, it’s bad news.


Tulane, for example, would have had to fall at least 29 places to do its disappearing act. That’s quite an inexplicable plunge for a first-class university’s business school.

Every ranking brings winners, who are among the most likely to promote a newly published list and their standing on it, and losers, who quietly ignore the latest news. The new 2012 Bloomberg BusinessWeek ranking is no exception. While five schools fell off the ranking, 10 schools that had not been ranked last time cracked the 2012 list. This happy group was led by UC-Irvine’s Merage School, which placed an impressive 43rd on the list, meaning it had to rise at least 15 spots to make it. In 2010, BusinessWeek put numerical ranks on 57 U.S. schools versus 63 this year.

Other newcomers to the BW list were No. 46 Texas Christian University’s Neeley School; No. 48 the University of Florida’s Hough School; No. 53 University of Iowa’s Tippie School; No. 55 Syracuse University’s Whitman School; No. 56 the University of Missouri’s Trulaske School; No. 58 Fordham University; No. 60 University of Tennessee at Knoxville; No. 62 University of South Carolina’s Moore School; and No. 63 North Carolina State’s Jenkins.

Other significant winners and losers, of course, are those who stay on the list year after year but move up or down, especially in double-digit movements. The two biggest winners were the University of Maryland’s Smith School of Business and Vanderbilt University’s Owen Graduate School of Management. Smith jumped a remarkable 18 places to rank 24th among the best U.S. schools, up from a lowly 42nd in 2010. Owen rose 12 places to rank 25th, up from 37th only two years ago.


How did they do it? A look at the underlying results pretty much tells the story. Maryland went from a rank of 33rd on BusinessWeek’s student satisfaction surveys in 2010 to a rank of 6th this year. That’s an almost unheard-of turnaround from one survey to the next, especially because BusinessWeek combines the results of three student satisfaction polls, from 2012, 2010, and 2008, to come up with the ranking. The latest poll gets 50% of the weight, while the other two polls each get 25%. Still, rising 27 places is nearly a red-flag event. Maryland also gained considerable ground in the corporate recruiter part of the methodology, moving up ten places to rank 44th from 54th two years ago.
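For readers curious how the three survey cycles combine, the 50/25/25 weighting can be sketched in a few lines. This is a minimal illustration of the stated weights only; the poll scores below are hypothetical placeholders, not any school’s actual survey results.

```python
# Blend three student-satisfaction poll scores using BusinessWeek's
# stated weights: 50% for the latest survey, 25% for each of the
# two earlier ones. Scores here are hypothetical, for illustration.

def blended_satisfaction(score_2012: float, score_2010: float, score_2008: float) -> float:
    """Weighted average of three survey cycles (50/25/25)."""
    return 0.5 * score_2012 + 0.25 * score_2010 + 0.25 * score_2008

# A school that improves sharply in the latest survey still carries
# half its weight from the two older polls, which damps big swings.
print(blended_satisfaction(90.0, 60.0, 55.0))  # 73.75
```

The damping effect of the older polls is why a 27-place jump in the blended satisfaction rank is so unusual.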

The school is so pleased with the result that it sent out a media release yesterday (Nov. 15) to spread the news. “We’re so pleased to see our commitment to our students and our investments in our programs recognized with this ranking,” said Dean G. “Anand” Anandalingam in a statement. “We have been focused on strengthening the learning experience for our students. We have some of the best researchers in the world who truly excel at translating their work in the classroom for our students. And we’ve made significant investments to transform our career services offerings to help our students develop as strong leaders.”


Air Time - Comments
  • abe

    Tulane’s Freeman School of Business was bound to lose its ranking; misreporting key metrics to key ranking institutions, along with terrible management and administrative staff, is killing (or has killed) this school. I sincerely advise prospective students to think twice before coming to Tulane. (They might offer you a good scholarship, but New Orleans is an expensive city; it will cost you enough to make you regret your decision within the first few months. And there are hardly any jobs for MBAs here.)

  • Djooke

    “Please expound and give evidence as to why you believe Indiana to be overrated and not to be in Columbia’s league.”

    Is that a serious question? Clearly, as a Kelley grad you are biased. I don’t fault you for sticking up for your school. But even you need to live in some reality. For starters: the faculty and alumni at Columbia. Second: the recruiters that come to campus are among the most elite companies and hire in large numbers. Then there’s the overall prestige of the degree.

    Of course, Kelley is a good program and offers a great ROI. It has solid recruitment stats. But how many would choose to attend Kelley over Columbia? Columbia is an elite school: highly selective, and the caliber of students compared to Kelley is definitely better (in objective terms with regard to GMAT, GPA, work experience, international experience, etc.).

    Again, don’t view this as an attack on Kelley. It’s not personal. But please don’t feel the need to stick up for Kelley to the point that all objectivity and reason are thrown out the window. If you cite the BW rankings as your evidence, then I suggest you look at other rankings such as U.S. News, The Economist, the FT, etc. to get an overall snapshot. You cannot just pick and choose what data you wish to examine.

  • hangtime79

    “Simply put, Indiana is way overrated and definitely not in the same league as CBS.”

    Please expound and give evidence as to why you believe Indiana to be overrated and not to be in Columbia’s league.

  • Jimbo

    Why aren’t community colleges ranked here? I see only expensive private and state schools.

  • David W. Frasier

    Small issue with the information above: the University at Buffalo School of Management is still in the ranking for 2012, at the exact spot we were in for 2010 (No. 57). Please correct this major mistake. David W. Frasier, Assistant Dean, University at Buffalo School of Management.

  • llavelle

    Not at this time. Our ranking does not rely on school-supplied data to the extent that some others do, so there’s really no need for the kind of audit you’re talking about. And no auditor would be able to audit all the schools in the ranking fast enough to meet our publication schedule; we would have to delay publication for many months, or audit the schools’ data submissions after the rankings have been published, neither of which serves our readers well.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • Ayman

    Dear Mr. Lavelle,
    Some rankings have their results audited by a professional auditor. This would add great credibility. Does BW intend to do so in the future?

  • llavelle

    Surely you jest. The peer assessment survey amounts to a mechanism for measuring the reputation that the ranking itself creates; as far as I’m concerned it’s completely lacking in value. (Most of the deans who fill out the survey don’t know anything about the programs they’re rating…except for their U.S. News rankings.) Most of the rest is based on information provided by the schools. There may be perfectly legit reasons for using that data, but U.S. News is basically giving schools a giant incentive to lie, with no reliable way to check the data. At least three schools have admitted to supplying inaccurate data to ranking organizations including U.S. News in recent months, so this isn’t some theoretical concern. The U.S. News ranking is about as close as you can get to letting the schools rank themselves. Thanks, I’ll stick with the BW surveys.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • llavelle

    Thanks Steve, but BW does have its act together. Nobody’s “gaming the system.” I’ve already explained why this is virtually impossible, why there’s absolutely no factual evidence for this claim, and how there’s factual evidence that contradicts it. So I won’t repeat it. (See my response to The Kipper, above.) The letter grades are worth at most 2 percent of the final ranking; they’re based on one or two questions in the 2012 student survey, which includes more than 50 questions, and they’re allotted based on a pretty standard distribution: A+ = top 20%, A = next 25%, and so on. Information on the letter grades is in the footnote to the ranking table, the methodology, and the FAQ, so there’s no giant conspiracy to keep readers in the dark.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • llavelle

    Except that there is: we employ a team of statisticians to scour the student survey data for signs that students aren’t answering honestly. Also, students just don’t fill out the surveys that way: they do plenty of griping. We’ll be publishing comments from all the schools we surveyed in a few weeks–take a look at the comments and tell me honestly if you find them all to be glowing reviews intended to boost their school’s ranking. Finally, and I can’t believe this hasn’t occurred to anyone yet, ask yourself which schools have the biggest incentive to give these glowing reviews? The schools at the bottom of the ranking, right? Now look at the student survey ranks for the schools at the bottom of the ranking…if they were so good at manipulating the ranking, you’d think they’d somehow manage to get those ranks out of the double digits. Except that they haven’t. In a quarter century. The student surveys we use have plenty of credibility. You just need to open your eyes to see it.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek
