Rankings Under Fire: What MBA Applicants Need To Know About The NORC Report

by: Marc Ethier on June 13, 2025

A detailed report by NORC at the University of Chicago, released last fall, turns a critical spotlight on the systems that churn out college and business school rankings — and its findings should give MBA applicants pause.

Commissioned by Vanderbilt University, College Ranking Systems: A Methodological Review evaluates the integrity of five major rankings: U.S. News & World Report, Wall Street Journal/College Pulse, Forbes, Times Higher Education (THE), and QS. Its conclusions are clear: too many rankings are methodologically unsound, opaque in their purpose, and often misleading. (Full disclosure: Poets&Quants is owned by THE.)

For prospective MBA students, whose careers and six-figure investments can hinge on a school’s ranking, the report’s implications are far-reaching. Rankings may appear objective, but according to NORC, many are built on shaky foundations.

VAGUE PURPOSE, FLAWED METHODS

At the core of NORC’s critique is a fundamental question: What exactly are these rankings measuring? The nonpartisan research organization’s report finds that many systems fail to define whether they’re assessing educational quality, return on investment, or institutional reputation. That ambiguity makes it difficult for MBA candidates, among others, to interpret rankings meaningfully.

Inconsistent and unsubstantiated weightings compound the confusion. For example, U.S. News’s full-time MBA rankings rely heavily on peer and recruiter assessments — measures that reflect perception more than performance. This can disadvantage newer or more innovative programs that haven’t yet built widespread name recognition.
THE & QS RANKINGS DRAW SPECIAL SCRUTINY

Of the five rankings analyzed, Times Higher Education and QS come in for particularly sharp criticism. NORC finds that THE uses opaque weighting schemes and overemphasizes research citations and global reputation surveys — metrics that have little to do with the student experience in an MBA or any other program.

For MBA applicants, this is a serious issue. A school might score high in THE’s rankings thanks to faculty publications and institutional research strength, yet offer weak career support, outdated curricula, or poor post-graduation outcomes. In some cases, schools that perform well in THE’s rankings don’t offer robust MBA programs at all.

Adding to the concern is a lack of measurement invariance across global rankings like THE’s and QS’s. In plain terms, the report states, these systems don’t account for differences in national context or institutional mission. Schools that serve working professionals, international students, or underrepresented groups may be penalized for not fitting a narrow definition of “excellence.”

DATA PROBLEMS & PROXY PITFALLS

The NORC report is clear-eyed about the quality of the data that powers these rankings — and the news isn’t good. Most systems rely on self-reported data from schools or on federal databases like IPEDS, the Integrated Postsecondary Education Data System, a set of interrelated surveys conducted annually by the National Center for Education Statistics, part of the U.S. Department of Education. Such data are prone to errors, omissions, and inconsistencies: some schools may overstate outcomes, while others submit incomplete data or report on different timeframes, making comparisons unreliable.

Even more troubling is the use of proxy metrics — indirect indicators like faculty credentials or alumni donation rates — that may have little to do with actual teaching quality or career impact.
Rankings that reward these proxies risk painting a distorted picture, particularly for MBA candidates evaluating programs based on ROI. The report’s conclusion is blunt: Without better data and more transparent reporting, rankings risk “obscuring rather than illuminating institutional strengths.”

WHAT MBA APPLICANTS SHOULD DO

For students trying to make a smart, high-stakes decision, the NORC report offers some guardrails:

Know what matters to you. Salary increases, geographic reach, entrepreneurship support, industry placement — these may not align with the rankings’ criteria.

Don’t mistake prestige for performance. A high rank may reflect research output or historical reputation, not outcomes that matter to MBA students.

Dig into methodology. If a ranking doesn’t clearly explain how it weighs different factors, it may be more marketing tool than evaluation guide.

Use rankings as a launchpad, not a decision-maker. Supplement them with employment reports, alumni testimonials, and career outcomes.

A CALL FOR CHANGE

The NORC report outlines several concrete steps ranking organizations should take to improve the integrity and utility of their systems. Among them:

“Ranking organizations should clearly define what they are measuring and why, ensuring their methods align with those goals.”

NORC also emphasizes the need for transparency: “The use of criteria and their relative weights should be disclosed and justified so that users can interpret rankings appropriately.”

Better data collection and validation are central to NORC’s recommendations: “Agencies that produce rankings should ensure data sources are accurate, consistent, and comparable across institutions.”

NORC further calls for context-sensitive comparisons: “Rankings should account for differences in institutional missions, resources, and student populations rather than applying a one-size-fits-all model.”

See the full NORC report here.

DON’T MISS POETS&QUANTS’ ANNUAL RANKINGS OF THE TOP U.S. MBA PROGRAMS AND THE TOP INTERNATIONAL MBA PROGRAMS