Winners & Losers In The 2012 BW Ranking

by John A. Byrne

According to Bloomberg BusinessWeek’s new ranking of the best business schools, North Carolina State’s Jenkins Graduate School of Management has the least satisfied MBA students in the world. Perhaps even worse, Jenkins’ career services office ranks dead last among the 114 business schools in the U.S., Europe, and Asia whose graduates BusinessWeek surveyed. You’d think a business school’s full-time MBA program couldn’t get more negative publicity than that.

But at least Jenkins managed to be ranked by BusinessWeek yesterday (Nov. 15) when the magazine published its 2012 ranking of the best full-time MBA programs. Jenkins is 63rd on a list of 63 U.S. schools. An even worse fate? Falling off BusinessWeek’s ranking altogether. This year, five schools disappeared from the magazine’s list of the best full-time MBA programs, including two schools that had ranked in the Top 50 the last time BusinessWeek published its list of the best business schools. Tulane University’s Freeman School, ranked 35th in 2010, didn’t warrant a mention.

Neither did William & Mary’s Mason School, which had been ranked 47th two years ago. Ditto for No. 54 University of Pittsburgh’s Katz School and No. 51 Case Western Reserve University’s Weatherhead School of Management. All are gone from BusinessWeek’s ranking. It’s not clear whether those schools disappeared because BusinessWeek couldn’t get a minimum response rate for them from either the Class of 2012 or corporate recruiters. In any case, it’s bad news.

TULANE’S FREEMAN SCHOOL, RANKED 35TH IN 2010, DISAPPEARED ALTOGETHER

Tulane, for example, would have had to fall at least 29 places to do its disappearing act. That’s quite an inexplicable plunge for a first-class university’s business school.

Every ranking brings winners, who are among the most likely to promote a newly published list and their standing on it, and losers, who quietly ignore the latest news. The new 2012 Bloomberg BusinessWeek ranking is no exception. For while five schools fell off the ranking, 10 schools that had not been ranked last time cracked the 2012 list. This happy group was led by UC-Irvine’s Merage School, which placed an impressive 43rd on the list, meaning it had to rise by a minimum of 15 spots to make it. In 2010, BusinessWeek put numerical ranks on 57 U.S. schools versus 63 this year.

Other newcomers to the BW list were No. 46 Texas Christian University’s Neeley School; No. 48 the University of Florida’s Hough School; No. 53 University of Iowa’s Tippie School; No. 55 Syracuse University’s Whitman School; No. 56 the University of Missouri’s Trulaske School; No. 58 Fordham University; No. 60 University of Tennessee at Knoxville; No. 62 University of South Carolina’s Moore School, and No. 63 North Carolina State’s Jenkins.

Other significant winners and losers, of course, are those who stay on the list year after year but move up or down, especially in double-digit movements. The two biggest winners were the University of Maryland’s Smith School of Business and Vanderbilt University’s Owen Graduate School of Management. Smith jumped a remarkable 18 places to rank 24th among the best U.S. schools, up from a lowly 42nd in 2010. Owen rose 12 places to rank 25th, up from 37th only two years ago.

A PEEK UNDER THE HOOD OF THE RESULTS OF TWO OF THE BIGGEST WINNERS THIS YEAR

How did they do it? A look at the underlying results pretty much tells the story. Maryland went from a rank of 33rd on BusinessWeek’s student satisfaction surveys in 2010 to a rank of 6th this year. That’s an almost unheard-of turnaround from one survey to the next, especially because BusinessWeek combines the results of three student satisfaction polls in 2012, 2010, and 2008 to come up with the ranking. The latest poll gets 50% of the weight, while the other two polls each get 25%. Still, rising 27 places is nearly a red-flag event. Maryland also gained considerable ground in the corporate recruiter part of the methodology, moving up ten places to rank 44th from 54th two years ago.

The school is so pleased with the result it shot out a media release yesterday (Nov. 15th) to spread the news. “We’re so pleased to see our commitment to our students and our investments in our programs recognized with this ranking,” said Dean G. “Anand” Anandalingam in a statement.  “We have been focused on strengthening the learning experience for our students. We have some of the best researchers in the world who truly excel at translating their work in the classroom for our students. And we’ve made significant investments to transform our career services offerings to help our students develop as strong leaders.”

DON’T MISS: BOOTH RETAINS NO. 1 RANKING IN 2012 BUSINESSWEEK RANKING

  • Bruce Vann

    Businessweek’s rankings fluctuate waaaaaay too much from year to year for anyone to justifiably take them as gospel.

  • Matt C

    Any ranking that has IU next to CBS is automatically disqualified.

  • The Kipper

    The only part of the BW rank that has credibility is the employer evaluation – it lines up with applicant perception of top schools much better. As far as student satisfaction surveys go, there is nothing keeping the students at lower-ranked programs from manipulating the survey – why wouldn’t they? They know it will boost their rank. It’s a bit suspicious when none of the conventional “top” schools are found among the 10 schools with the highest student ratings.

  • JDMBA33

    Mate, I suggest you read the research methodology on the BW website. They have mentioned that they have two professors who are experts in big data and fraud detection. They specifically examine the data to determine if there is any outside manipulation. It’s not that complicated to design algorithms to detect this type of stuff.

  • JDMBA33

    The rankings are a by-product of the methodology. You don’t rank the schools in some order and then find a methodology that fits that narrative. That is not research. You might not agree with the methodology (and there are certainly legitimate issues with that), but that is a different point. But to make a blanket statement such as the one above, is painting with a very broad brush. It’s important to understand the methodology of the rankings.

  • Kendra

    Additionally, ALL students at ALL schools want their school to be ranked well. It is not unique to one school or one subset of schools. There will always be students who try to inflate rankings, but that will happen at every school. Therefore, each school is for the most part on a level playing field when it comes to student surveys.

  • llavelle

    If those students at lower-ranked schools were so great at manipulating the ranking for their own benefit, they wouldn’t be lower-ranked schools. There’s a good reason why those conventional top schools don’t have the highest student ratings: they pay more, they expect more, and they’re easily disappointed. You could argue that that’s unfair to the top schools, that we should somehow correct for the expectations gap, but even if that were possible I’m not sure it would change the rankings in a big way.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • llavelle

    Um, not really Bruce. Every two years 3-5 schools enter the top 30, 3-5 schools drop out. The top 10 is more stable, and the top 5 even more stable than that. Top five schools move around (a little) but the composition of that group has been virtually unchanged in a quarter century. Even changes in methodology (such as the introduction of the intellectual capital measure) haven’t had much of an impact on the ranking’s stability.
    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • Diego

    The historical rankings shed more light on the “winners” like Smith and Olin. In Maryland’s case, maybe the big jump isn’t as big a story as the big drop-off from the norm in 2010. Both schools have just moved back to where they usually are. http://www.businessweek.com/articles/2012-11-15/best-b-schools-ranking-history

  • Palimo

    I am extremely happy for Olin at Washington and Owen at Vanderbilt. Those two schools deserve more and they are really underrated.

  • Raj

    Why is there no salary data for London Business School?

  • Djokovic

    Simply put, Indiana is way overrated and definitely not in the same league as CBS. No ranking is going to change that, BW included. It’s very interesting to see that a school with great placement like Emory did not get the credit it deserved, while USC, with the lousiest placement stats, suffered almost nothing. I suggest BW take placement stats into consideration for its next rankings.

  • Sami

    I agree with the statement about Emory. It is truly a hidden treasure: the same placement as the top 10 with comparable salaries, a good location, and a really solid program. The elite consulting firms know it very well and recruit heavily there. It is excellent for someone looking for an excellent education, a career boost, and a meaningful experience.

  • Dan

    No one ever thought they were bad schools but Vandy did fall on some hard times for a few years.

  • Dan

    True, but these rankings are a bit off. Every year there seem to be a few puzzling additions to and deletions from the list, and some entirely counterintuitive rankings. But it is what it is. BW remains the second most credible ranking.

  • Steve

    Louis,

    First off, does anyone name their kid Louis anymore? :) BW has always had similar rankings; it just grouped them into first, second, and third tiers. What is frustrating is that the letter grades, like the one for career services, do not have a number attached, creating a lack of transparency: what is a C worth, and what is an A? Who are the recruiters doing the measuring? I do not see how a program like TCU can get more hiring than, say, Fordham in NYC, or even Missouri. What MBA jobs are in Missouri? Salary should be a determinant.

    Seriously, we need a reliable system of ranking as opposed to cereal-box stuff. The FT is silly and very pro-UK programs (for advertising’s sake), and US News is highly manipulated now. I want to see reliability. The top schools know that giving A’s maintains their rankings; having gone to a top program, we were instructed to rank it high for career purposes! I also would like to see a delineation between two-year programs and one-year programs (which are the trend now). My view is that the degree is being watered down and resold; I for one am now skeptical about hiring an MBA, especially from a one-year program. BW needs to get its act together – the undergrad rankings are all over the map. I am tired of programs easily gaming a ranking system while others play by the book and suffer the consequences.

  • obamaniac

    Can you explain this? Are you questioning if TCU, in a top metro (20 Fortune 500 headquarters) in the top state for business can get people hired? And have you not heard of interstate travel?

  • llavelle

    The school didn’t supply any, so BW used its own surveys of LBS grads. What information we had was published here: http://images.businessweek.com/slideshows/2012-11-14/2012-best-international-business-schools#slide2

  • llavelle

    Using placement stats for rankings creates its own problems. Everybody wants a system that can’t be gamed, but using placement stats gives schools another way to game the system. And that’s assuming they can even get their numbers straight. (So far three schools have reported errors in the placement/salary info they sent us. If we used that for the rankings, they’d be worthless.) There’s also the fairness issue. If Wall Street slashes headcount and cuts back on MBA hires, and schools that place a lot of grads in Wall Street jobs report lower placement numbers, should those schools be penalized in the rankings? I think most people would say no. I’m as perplexed as you by USC: I’m really surprised that student satisfaction didn’t plummet more than it did. I guess students feel the school did the best possible job in a difficult situation.
    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • mindbullet26

    BW: I know of a way to significantly improve your rankings!! Buy a copy of U.S. News rankings, copy and paste. Just be aware of potential legal problems.

  • llavelle

    Except that there is: we employ a team of statisticians to scour the student survey data for signs that students aren’t answering honestly. Also, students just don’t fill out the surveys that way: they do plenty of griping. We’ll be publishing comments from all the schools we surveyed in a few weeks–take a look at the comments and tell me honestly if you find them all to be glowing reviews intended to boost their school’s ranking. Finally, and I can’t believe this hasn’t occurred to anyone yet, ask yourself which schools have the biggest incentive to give these glowing reviews? The schools at the bottom of the ranking, right? Now look at the student survey ranks for the schools at the bottom of the ranking…if they were so good at manipulating the ranking, you’d think they’d somehow manage to get those ranks out of the double digits. Except that they haven’t. In a quarter century. The student surveys we use have plenty of credibility. You just need to open your eyes to see it.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • llavelle

    Thanks Steve, but BW does have its act together. Nobody’s “gaming the system.” I’ve already explained why this is virtually impossible, why there’s absolutely no factual evidence for this claim, and how there’s factual evidence that contradicts it, so I won’t repeat it. (See my response to The Kipper, above.) The letter grades are worth at most 2 percent of the final ranking; they’re based on one or two questions in the 2012 student survey, which includes more than 50 questions, and they’re allotted based on a pretty standard distribution: A+ = top 20%, A = next 25%, and so on. Information on the letter grades is in the footnote to the ranking table, the methodology, and the FAQ, so there’s no giant conspiracy to keep readers in the dark.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • llavelle

    Surely you jest. The peer assessment survey amounts to a mechanism for measuring the reputation that the ranking creates–as far as I’m concerned it’s completely lacking in value. (Most of the deans who fill out the survey don’t know anything about the programs they’re rating…except for their U.S. News rankings.) Most of the rest is based on information provided by the schools. There may be perfectly legit reasons for using that data, but U.S. News is basically giving schools a giant incentive to lie, with no reliable way to check the data. At least three schools have admitted to supplying inaccurate data to ranking organizations including U.S. News in recent months, so this isn’t some theoretical concern. The U.S. News ranking is about as close as you can get to letting the schools rank themselves. Thanks, I’ll stick with the BW surveys.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • Ayman

    Dear Mr. Lavelle,
    Some rankings do have their results audited by a professional auditor, which would lend great credibility. Does BW intend to do so in the future?

  • llavelle

    Not at this time. Our ranking does not rely on school supplied data to the extent that some others do, so there’s really no need for the kind of audit you’re talking about. And no auditor will be able to audit all the schools in the ranking fast enough to meet our publication schedule; we would have to delay publication for many months, or audit the schools’ data submissions after the rankings have been published, neither of which serves our readers well.

    Louis Lavelle
    Associate Editor
    Bloomberg Businessweek

  • David W. Frasier

    Small issue with the information above: the University at Buffalo School of Management is still in the ranking for 2012, at the exact spot we were in for 2010 (No. 57). Please correct this major mistake. David W. Frasier, Assistant Dean, University at Buffalo School of Management.

  • Jimbo

    Why aren’t community colleges ranked here? I see only expensive private and state schools.

  • hangtime79

    “Simply put, Indiana is way overrated and definitely not in the same league as CBS.”

    Please expound and give evidence as to why you believe Indiana to be overrated and not to be in Columbia’s league.

  • Djooke

    “Please expound and give evidence as to why you believe Indiana to be overrated and not to be in Columbia’s league.”

    Is that a serious question? Clearly, as a Kelley grad you are biased. I don’t fault you for sticking up for your school. But even you need to live in some reality. For starters, the faculty and alumni at Columbia. Secondly, the recruiters that come to campus are among the most elite companies and hire in major numbers. Then there’s the overall prestige of the degree.

    Of course, Kelley is a good program and offers a great ROI. It has solid recruitment stats. But how many would choose to attend Kelley over Columbia? Columbia is an elite school, highly selective; the caliber of students compared to Kelley is definitely better (in objective terms with regard to GMAT, GPA, work experience, international experience, etc.).

    Again, don’t view this as an attack on Kelley. It’s not personal. But please don’t feel the need to stick up for Kelley to the point that all objectivity and reason are thrown out of the window. If you cite the BW rankings as your evidence, then I suggest you look at other rankings such as US News, the Economist, the FT, etc., to get an overall snapshot. You cannot just pick and choose what data you wish to examine.

  • abe

    Tulane’s Freeman School of Business was bound to lose its ranking. Misreporting key metrics to key ranking institutions, along with terrible management and administrative staff, is killing (or has killed) this school. I sincerely advise prospective students to think twice before coming to Tulane. (They might offer you a good scholarship, but New Orleans is an expensive city; it will cost you enough to make you regret your decision within the first few months, and there are hardly any jobs for MBAs here.)
