BusinessWeek’s Core B-School Team Out

Lou Lavelle

The two top editors of Bloomberg BusinessWeek’s business education team left the magazine yesterday (Nov. 7), Poets&Quants has learned.

Associate Editor Louis Lavelle, who has been the magazine’s business schools editor since 2005, and Staff Editor Geoff Gloeckler, who has covered management education since March 2005, are no longer employed there, according to another BusinessWeek staffer.

The pair, who together had 16 years of experience covering the business school beat, left on the day Bloomberg BusinessWeek published its latest rankings of Executive MBA and part-time MBA programs. The stories were authored by Gloeckler, who reported to Lavelle. Their departure deals a serious blow to BusinessWeek’s coverage of business education. It also follows the decision this year by long-time business education staffer Alison Damast to quit the magazine following a maternity leave that began last December. Damast had covered MBA issues for six years.

It is not clear whether Lavelle and Gloeckler quit or were dismissed. Both journalists declined to comment. But their leaving occurred only two days after Poets&Quants disclosed significant revisions in BusinessWeek’s 2012 MBA ranking. Those revisions, quietly published last month online, were made because BusinessWeek said it had erred in compiling the intellectual capital portion of its ranking.

Geoff Gloeckler

‘LOOKING AT WAYS TO REVAMP AND RE-ENERGIZE THE RANKINGS’

The departure of Lavelle and Gloeckler could signal some significant changes ahead in how BusinessWeek ranks business schools. “The Bloomberg Businessweek rankings are now 25 years old and we’re looking at ways to revamp and re-energize them,” Assistant Managing Editor Janet Paskin told Poets&Quants. “We want to make sure we’re gathering the information that matters most to prospective students and employers. We’re generating ideas here, but are also reaching out to business schools to get their critical ideas and opinions.”

Paskin did not return a phone call seeking comment on the loss of the two journalists primarily responsible for BusinessWeek’s various business school rankings; she had made the earlier remarks to Poets&Quants in an email response two days ago.

The BusinessWeek rankings are used by more MBA applicants than any other published list of the best business schools, according to a survey of last year’s applicants by the Association of International Graduate Admissions Consultants (AIGAC). Some 69% of the responding applicants said they consulted the BW list, compared with 61% for The Financial Times, 54% for U.S. News & World Report, and 41% for Poets&Quants. Some 40% said they consulted The Economist, while 31% referred to Forbes’ biennial ranking.

The shakeup occurred after the magazine’s mistaken calculations changed the overall numerical rank of a dozen schools: 10 in the U.S. and two outside it. More significantly, the “intellectual capital” rank of some schools moved by as many as 10 positions. Harvard Business School, for example, fell to 19th from 9th in intellectual capital as a result of the revision.

While something of an embarrassment for Bloomberg Businessweek, which prides itself on accurate reporting and writing, the mistake was easy enough to make. Each ranking cranked out by the magazine involves the collection and analysis of thousands of data points. It takes many months of tedious work to complete the project from start to finish.

CORRECTION MADE WITH LITTLE FANFARE OR NOTICE TO READERS

The errors that led to the revisions, moreover, were a less serious issue than the way the magazine disclosed the mistake and the corrections to readers. The magazine quietly revised its rankings online a month ago with little fanfare or notice. At the top of its corrected table, in small, hard-to-read type, is this vague statement: “(Corrects to revise 2012 overall rankings, Ranking index and 2012 Intellectual Capital rankings following errors in the calculation of the Intellectual Capital score.)”

Even BusinessWeek’s updated article on its rankings methodology failed to acknowledge the mistake. Instead, the magazine simply crossed out part of the description of the methodology.

Oversight for the way the publication handled the issue would have fallen to Lavelle’s boss, Paskin, a relative newcomer to the organization who also serves as BusinessWeek’s online editor. Paskin joined BusinessWeek from The Wall Street Journal on Nov. 12, just three days before the 2012 full-time MBA rankings were released.

  • MSV

    The way the magazine “chose” to disclose the mistake…?

  • Conrad Chua

    I don’t have the article where PQ “belittles” the FT so I can’t comment on that. I do think you need to strike a balance between large swings and stasis in rankings. If you look at the rankings data, there are sometimes big year-on-year differences in schools’ performance on some of the criteria, e.g., salaries if the majority of a school’s MBAs are recruited into one particularly volatile sector like finance. So there could be a reason for large swings. On the other hand, if you have very little change from ranking to ranking, then that breeds a sense of old clubbiness and you run the risk of ignoring new trends.
    It is true to some extent that there probably isn’t that much change in a school’s program from year to year. And to their credit, the FT does list the average ranking over three years, and they weight salary data over three years as well to smooth out the fluctuations.
    I am not sure where the criticism of small sample size comes from. Is it from the fact that the FT only requires a 20% response rate from alumni? From what I know, the average response across all schools is much higher than the minimum 20% (or 25% for smaller cohorts). At Cambridge, we have never gone below 50% and there have been years when the response rate has exceeded 70%.
    If there is a criticism of small sample size, then that should also apply to other rankings. As far as I know, no rankings publish the actual sample size. In fact, BW did not even publish the minimum required alumni or recruiter response rate.

  • Radon

    Conrad,

    Thanks for your insight. No wonder B-schools experience ranking fatigue when providers demand so much information broken down by sub-category.

    I may have a final question on the FT ranking. PQ’s John Byrne belittles the FT for its wide swings in spots, and MBAs are wary of the FT’s reliability. He argued that due to a possibly small number of responses compared with other rankings, a small change or a statistically insignificant sample can lead to wide swings from previous results. Some B-schools move 20+ spots, and this raises the issue of the FT’s quality. What is your take on it? Do you agree with or refute this claim?

  • Conrad Chua

    We don’t participate in many rankings, only the FT, BW and the Economist. It is a conscious decision because of the resources involved — we don’t want our alums to be swamped with surveys.

    So it is difficult to draw conclusions with a small sample. You could make a case that being a smaller, non-US business school could have an impact. I will throw in another aspect: we do better on the FT, which is the survey that relies least on subjective data. Now whether you agree with the criteria or not is something else.

    In terms of the workload, I think the first thing is actually to differentiate between what data is used for the ranking and what is used purely for the publication’s website. For example, the BW MBA survey has more than 100 fields, and in addition, you still have to complete the General Survey. And BW only uses a fraction of the data collected, and it is unclear whether a failure to provide some of the data affects the ranking.

    As an example of the data that is being asked for, there is a table asking for salaries broken down by job sector and region. For a small school like us, we cannot provide that sort of granular data without divulging individual salaries in some cases.

  • Radon

    Conrad,

    Thanks for your perspective. Do you think this mismatch of recruiter response rates is specifically a BW problem, or have you experienced it with other ranking providers? I notice that Judge does well in British publications such as the FT. Do you think it is a result of Judge being

    a) a small B-school
    b) non-US

    I can understand that after all that workload, it must be frustrating. What can you do to rectify the problem apart from increased communication? Good luck.

  • Mountain Ash

    Wow, Josh Tyrangiel’s destroying more people’s livelihoods. Next thing you know, we’ll read that the Pope’s Catholic.

  • Jeff Schmitt

    Louis Lavelle’s departure is a huge blow to Bloomberg Businessweek. I used to be an online columnist there. While Louis and I only exchanged a few emails, I found him to be very supportive. Both his writing and reputation at BBW were stellar. With his talent, I wouldn’t expect him to be out of work long. Whoever hires him will get a real asset.

  • Conrad Chua

    We were not included because of a low recruiter response rate. We sent BW a list of 30 recruiter contacts but the way the rankings work is that BW does not necessarily approach these recruiters. Instead, they will approach one person from that organisation who is on their database. Given that we are a small school with a geographically diverse student body who work all over the world, it is possible that the person BW approaches might not be aware of the number or quality of our students who work in that organisation, especially if the organisation has a decentralised recruitment process.
    To take a concrete example, at the moment, one of the top strategy management consulting firms is one of our top employers from last year’s class. And I will be upfront to say that because of our small class size, our top employers don’t hire large numbers of our students. But because the students are placed in offices with separate recruitment processes, I am not sure whether putting them down as a recruiter is such a good idea for next year’s BW rankings if BW is going to approach someone in that organisation, say based in the US, who was not involved in the recruitment process.
    It is quite frustrating to be excluded from the rankings because our employment stats are strong. In the 2013 FT rankings, 97% of our students were employed three months after graduation which is one of the highest among the top 50 schools. It is also frustrating given the resources needed to complete the BW rankings survey.
    I am encouraged that BW is looking to revamp and re-energise the rankings and I would be happy to share my views with BW.
    Conrad Chua
    Head MBA Admissions
    Cambridge Judge Business School

  • Lomas

    The absence of the Cambridge University business school from BW’s latest ranking is a very big question! No one has provided an answer yet!!