U.S. News Rankings: What Are Deans Really Assessing?

Rankings are a game of inches

At the same time, several other schools – Dartmouth Tuck, Virginia Darden, NYU Stern, North Carolina Kenan-Flagler, Washington Olin, and Washington Foster – have seen just 0.2 of a point separate their best and worst scores. Even then, their high or low score is a one- or two-year outlier. Washington Olin is a perfect example. In 2008, Olin nabbed a 3.8 peer assessment score. However, for the past four years, it has settled into a 3.7 rut.

What’s more, you’ll find this same trend at business schools ranked 25th to 50th by U.S. News. Minnesota Carlson? It seesaws between 3.6 and 3.5 each year. Arizona State’s Carey School of Business? You can peg it at 3.5 like clockwork. And you’ll find Penn State Smeal coming in at 3.4 year after year, too.


So does such consistency point to a sound or a deeply flawed evaluation system? Start with this premise: Schools are being evaluated by their peers. While these peers may be experts in business education, they are unlikely to be intimately versed in other schools’ curricula, students, or day-to-day life. As a result, you have to wonder what these evaluators are basing their scores on. Chances are, it is surface knowledge mixed with reputation, hearsay, brief encounters, and gut feelings. What’s more, “quality” – U.S. News’ criterion – is a nebulous term. Is it measured with inputs like student test scores, or with outputs like student accomplishments, personal growth, and academic research?

Along with its criteria, U.S. News’ peer assessments are also hampered by a scoring system that produces heavy clusters (e.g., seven schools each tied at averages of 3.5, 3.3, 3.0, and 2.9 in the 2016 rankings). In other words, the differentiation is small, and that carries over year after year, creating only a nominal distinction between schools, particularly those within the same tier.

Developments within schools also don’t make a dent in peer assessment scores. In 2011, UCLA Anderson unleashed a new curriculum that included a greater emphasis on leadership, specialty tracks, and an intensive team-based client project. In the four rankings since its implementation, UCLA has earned scores of 4.1 (2013), 4.1 (2014), 4.2 (2015), and 4.1 (2016) – barely moving the needle compared to its 4.1 scores in the preceding 2011 and 2012 rankings. In the past three years, Northwestern Kellogg has also beefed up its curriculum, rolling out 55 new courses that constitute a quarter of its electives. Although this effort reversed a one-year dip, Kellogg’s scores have since settled back into their normal 4.6-4.7 range.


Bottom line: Peer assessment scores act as anchors, seemingly creating a school caste system in which a quarter of a school’s score is almost predetermined. As a result, the ability of full-time MBA programs to move up and down is, to some extent, tempered. And this has a big impact on the so-called school “arms race,” where programs drive up tuition and costs by continuously adding new facilities, degrees, activities, and faculty, often in an attempt to climb a rung or keep pace with their peers. These days, many administrators are grumbling, ‘What do we have to do to move up in the U.S. News rankings?’ With 25% of a ranking seemingly fated, the answer is clear: Find a way to boost your reputation among a narrow cadre of fellow administrators and academics!

Alas, you’ll find some opinion shifts about schools embedded in the 2016 U.S. News rankings. Among top 10 schools, Northwestern Kellogg, Dartmouth Tuck, and Berkeley Haas each slipped a nominal 0.1 of a point among administrators and academics. More notably, North Carolina Kenan-Flagler, Rochester Simon, and SMU Cox each fell by 0.2 of a point, while Emory Goizueta tumbled 0.3 of a point – despite producing the best job offer rates for new graduates for the third consecutive year. Emory may be losing some luster among academics, but the people who really matter – employers – were sure smitten with the program.

From a historical perspective, starting with the 2006 rankings, peer assessment scores have changed little. Technically, Dartmouth Tuck, a B-school darling for perfecting culture and community, has slipped from 4.4 to 4.2 over the past 11 years. However, that comes with a caveat: Tuck lost 0.1 of a point in the 2016 rankings – and it held steady at 4.3 from 2013-2015. SMU Cox has also slipped from a 3.3 high in the 2010 rankings to 3.0 in 2016, though (again) 0.2 of that decline came in the 2016 rankings. At the same time, Georgetown McDonough (3.3 to 3.6) and Rice Jones (3.1 to 3.4) have made notable climbs in the past 11 years.


In the big picture, most schools’ peer assessment scores oscillate within a small range. So year after year, schools are left back where they started, regardless of their efforts.

What does this all mean? Simple: The U.S. News rankings have an Achilles heel: Their static and vague scoring system inadvertently keeps schools in their assigned seats. So if you’re wondering why your target school or alma mater isn’t moving up in the rankings, take a breath and look at the big picture. Chances are, your school isn’t resting on its laurels – it’s the academics who judge it who are.

To see historical trends for the top schools over the past 10 years, go to the next page. (Note: P&Q only had access to the peer assessment scores of the top 20 MBA programs in U.S. News’ 2013 rankings.)
