The fact that we compile this doesn’t matter, as we can’t affect the ranking one way or the other. We did it because the magazines weren’t doing it; we needed it for our own benchmarking purposes, and then we decided to share it with our colleagues. If we had done something that wasn’t credible, people would simply ignore it.
There seems to be no adjustment for the size of the faculty. So Wharton, with a total faculty of 459, is first largely because of the sheer size of its faculty. The smaller business schools are at a distinct disadvantage. Do you agree that this is a problem?
I wish we were able to provide a version with scaling. In fact, the first year we tried to do scaling. The problem is that many schools did not respond to us, and some provided data that was not correct. We ended up using numbers that we determined from web pages, and we received many complaints about the accuracy of the data. That is why we decided not to scale. Fortunately, we provide the full data, so if a school wants to benchmark against a particular set of schools, it can do the scaling itself. We seldom get negative feedback on this ranking; in the last five years only a handful of complaints have been directed at us. By the way, we are a relatively small school (we have only 89 tenured or tenure-track faculty members) and as such are negatively affected by the lack of scaling.
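The do-it-yourself scaling he describes is simple division of a school's raw score by its faculty headcount. A minimal sketch, using invented school names and numbers purely for illustration:

```python
# Hypothetical illustration of the per-capita scaling a school could do
# itself with the published raw data. All names and figures are invented.
raw_scores = {
    "School A": 450.0,  # total weighted article count (hypothetical)
    "School B": 120.0,
}
faculty_size = {
    "School A": 300,    # tenured/tenure-track faculty (hypothetical)
    "School B": 60,
}

def scale_per_faculty(scores, sizes):
    """Divide each school's raw score by its faculty size."""
    return {school: scores[school] / sizes[school] for school in scores}

scaled = scale_per_faculty(raw_scores, faculty_size)
# With these invented numbers, School B overtakes School A once size
# is accounted for: 1.5 vs. 2.0 articles per faculty member.
```

This is exactly the kind of adjustment the ranking itself avoids, since it depends on faculty counts that proved hard to verify.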
Who uses this ranking and what kind of feedback have you gotten so far?
The faculty of B-schools, deans, and potential PhD students are using it. If I am interviewing a young faculty candidate, I can use it to get information on his advisors – how good they are, how many publications they have, and so on.
We have received overwhelmingly positive feedback on the rankings. The ranking has been in existence for about five years now and is used worldwide. We have received some criticism too, generally about why we haven’t included a particular journal, or about including a journal that someone thinks is not good.
Who is your most vocal critic?
We really do not have a persistent vocal critic. From time to time people contact us suggesting additional journals; most recently we received requests to consider Decision Sciences, Journal of MIS and Journal of Business Venturing.
So how do you choose the journals on the basis of which you rank research?
These are really the top set of journals. If people don’t publish in these, they go to the A- journals and the B+ journals. If a school is ranked here, it is going to be ranked similarly if you expand this list by 10 or 20 more journals.
There may be a journal that is outstanding but is not on this list. That is generally because such journals are applied psychology journals or economics journals, or journals that B-school faculty publish in but that are not mainline B-school journals.
Interestingly, your database does not include popular management journals like Harvard Business Review or Sloan Management Review. How come?
That’s because we want scientific and academic journals as opposed to those that publish philosophical pieces.
The FT, which also does a B-school research ranking, uses a pool of 45 academic and practitioner journals. BusinessWeek, which also does a research ranking, measures output in 20 journals. Your ranking measures 24. How different is your ranking from theirs?
Our results are highly correlated with theirs. The shortcoming of the FT ranking is that it only reports results for the schools that apply for the ranking. When there are joint degree programs between two universities, how they calculate (the scores) is very murky. Their purpose is really to rank the degree program, not research productivity. Their ranking looks at a school’s faculty and their publications over the last three years. If I hire a professor this year, I can include his publications in the FT ranking. In our ranking, if I hire a professor this year, I only get to include his articles for this year; so the article he published last year, when he was at another school, stays with that school. FT asks the schools to report this data themselves, so it is very difficult to check accuracy. Ours is totally transparent and accurate because we don’t ask the schools to report anything. We go to the journals at the time an article is published and look at the affiliation of the authors.
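The credit rule described above — each article counts toward the affiliation printed in the journal at publication time, not toward the author's current employer — can be sketched in a few lines. The records and names below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical records: each one lists the author's affiliation *as
# printed in the journal* when the article appeared.
articles = [
    {"author": "Prof. X", "year": 2004, "affiliation": "Old School"},
    {"author": "Prof. X", "year": 2005, "affiliation": "New School"},
]

def credit_by_affiliation(records):
    """Credit each article to the affiliation at publication time,
    so a move to a new school does not transfer past credit."""
    counts = defaultdict(int)
    for record in records:
        counts[record["affiliation"]] += 1
    return dict(counts)

# Prof. X moved schools in 2005; the 2004 article stays with Old School.
credits = credit_by_affiliation(articles)
```

By contrast, a current-employer rule (like the FT's, as characterized in the interview) would assign both articles to New School.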