As the GRE continues to gain greater acceptance in business school admissions, the folks with a vested interest in the GMAT are starting to go on the attack.
Target number one? Those conversion charts that allow test takers to compare their GRE results to the GMAT. Chris Han, head of the test development and psychometrics department at the Graduate Management Admission Council, maintains you can’t compare the results. In fact, he contends, the conversion tool could be off by as much as 200 points, so that a predicted GMAT score of 650 could be anywhere between 450 and 800.
What’s more, Han is accusing the Educational Testing Service, which administers the GRE, of publishing conversion charts that mislead business school applicants. “You can’t convert a GRE score to a GMAT score, and vice versa,” he writes in an article on the GMAC website. “The GRE and GMAT exams are different tests, measuring different content, and no conversion table or conversion tool can ever make them equivalent. The best way to know what your GMAT score would be is to prepare for and take the GMAT exam.”
GMAT OFFICIAL ALLEGES ‘MISLEADING STATEMENTS FROM ETS MAKE THE COMPARISON TOOL COMPLETELY BASELESS’
He goes further, however, questioning the statements of ETS officials and accusing the organization of everything from sampling errors in its calculations to deliberately misleading test takers and schools. “There are several critical flaws in research design and interpretation—as well as misleading statements from ETS—that ultimately make this tool completely baseless and a disservice to test takers and business schools alike,” claims Han.
Han maintains that the comparison tool’s linear regression models were built on data from 472 test takers who took both the GRE and the GMAT within a year of each other, collected between August 2011 and December 2012. “This data is outdated, and its integrity not verified,” he argues. “The GMAT testing population from ten years ago is completely different from today’s more globalized testing pool, and there are substantial differences in the score distributions for both the Quant and Verbal sections, so this outdated data isn’t relevant anymore. In addition, all the score data was self-reported from participants, and there is no evidence of data integrity.”
ETS: ‘NOT SURPRISED TO SEE THIS RESPONSE FROM GMAC AS GRE CONTINUES TO GAIN INCREASING ACCEPTANCE’
A spokesperson for ETS attributes Han’s critique to GRE’s continued success. “We are not surprised to see this response from Han and GMAC as the GRE continues to gain increasing acceptance among business schools in the U.S. and abroad,” says Kristen Mitchell, manager of public relations for ETS. “ETS remains focused on serving both institutions and test takers at a high level, with top-notch innovation, resources and tools to meet their current and emerging needs. We look forward to further growth of the GRE test among business schools and showcasing the many ways that the test and, by extension, our resources and services, are most impactful in helping students and institutions reach their goals.”
ETS says its comparison tool was updated in 2017 and promotes it as an “easy-to-use tool” that was “designed to help newer GRE score users understand and appropriately interpret GRE scores in the context of GMAT scores.” In a four-page explanation of the tool, ETS says that the GMAT exam and the GRE General Test have a high correlation with each other. “So it is reasonable to use the actual scores on one measure to predict the likely score range on the other for a given applicant,” according to ETS.
Mitchell, moreover, says that “ETS has never promoted the use of this tool for test takers nor is it intended for their use. It was always intended for business schools’ interim use as they became more acquainted with how to best use GRE scores in their admissions processes. Once institutions become familiar with using GRE scores, we encourage score users to judge applicants on the strength of their GRE scores without trying to convert them to GMAT scores. ETS’s intention in creating our score conversion tool was in response to early GRE-adopters less familiar with the GRE asking ETS for this type of tool as the GRE and GMAT share a lot of common predictive value. But again, we have always only promoted the tool for use by institutions who are newer GRE score users.”
ATTACK OCCURS AS GMAT TEST-TAKING SUFFERS MAJOR DECLINES
The tool consistently shows that business schools tend to enroll MBA students with lower converted GRE scores than their GMAT averages. By inputting Harvard Business School’s median verbal and quant scores for the latest cohort that submitted the GRE for admission, the tool says the equivalent GMAT score would be 700. Meanwhile, Harvard says that incoming students who took the GMAT had a median score that was 30 points higher, at 730 (see table below for how the tool predicts GMAT scores from GRE results).
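Under the hood, the conversion is just a linear regression on GRE section scores. A minimal sketch of how such a predictor works is below; the coefficients used here are the ones widely attributed to ETS’s published regression, and the rounding-to-10 and 200–800 clamping mirror how GMAT Total scores are reported — all of which should be verified against ETS’s own documentation before being relied on.

```python
def predicted_gmat_total(gre_verbal: int, gre_quant: int) -> int:
    """Predict a GMAT Total score from GRE section scores (130-170 scale)
    using a linear regression of the form widely attributed to ETS's
    comparison tool. Coefficients are assumptions, not official values."""
    raw = -2080.75 + 6.38 * gre_verbal + 10.62 * gre_quant
    # GMAT Total scores are reported in 10-point increments from 200 to 800.
    return max(200, min(800, round(raw / 10) * 10))

print(predicted_gmat_total(163, 163))
```

A single point estimate like this is exactly what Han objects to: the regression returns one number and says nothing, by itself, about how wide the plausible range around it is.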
Han’s critique is not merely pointed; it’s promotional. He argues that the world’s top business schools have trusted the GMAT exam because it’s specifically validated to predict performance in the first year of an MBA program. “The GMAT exam is the only admissions test designed specifically to be used for admissions to graduate business programs,” he claims. “It measures the higher order skills most relevant to succeed in a graduate business program. Its four sections target specific skills that are highly relevant to the business career path you aspire to. The GRE, on the other hand, is (by design and name) a general test and its three test sections are not related to any specific field of study.”
A SURPRISING AND UNUSUAL ATTACK FROM A TOP GMAC OFFICIAL
Counters Mitchell of ETS: “The GRE has also been ‘specifically validated’ to predict performance in MBA programs. A 2014 study by Young et al. found that GRE revised General Test scores, particularly GRE-Quantitative Reasoning and GRE-Verbal Reasoning scores, have a high degree of predictive validity for forecasting the academic performance of students enrolled in MBA programs. Both the GMAT and GRE assess higher order skills ‘relevant to succeed in a graduate business program.’ The GMAT, by design, does not measure any specific business skills because it is intended to be useful for applicants who did not necessarily have a business background—and the same is true for the GRE.”
Han’s attack on ETS is surprising and unusual. Rarely do officials from either testing organization go on the record attacking the validity of their tests. But Han’s critique occurs after years of GMAT test-taking declines (see chart below) resulting from GRE gains in market share as well as a movement in business education for schools to increasingly make standardized tests optional for admission.
In the past five years alone, GMAT test-taking has fallen by 40% to 173,176 in the testing year 2021, which ended in July of last year. That’s down from the more than 260,000 exams taken in the testing year 2016 (see chart below). The decline would have resulted in an estimated $28.3 million drop in test revenue for the Graduate Management Admission Council.
“According to what we’ve learned from candidates and test prep organizations,” adds Han, “candidates do not prepare for both the GMAT and GRE exams equally. Candidates usually try the GMAT exam first, and if they find their scores lower than expected, they sometimes decide to prepare for the GRE (or vice versa). Given this typical pattern of candidates who ended up taking both the GMAT and GRE, there is little reason to believe that GMAT and GRE scores equally reflect the candidates’ preparedness.”
GETTING INTO THE WEEDS OF 'LINEAR REGRESSION MODELS' & 'SYSTEMATIC PREDICTION ERRORS'
Much of his critique gets into the weeds of the comparison tool, taking issue with ETS' own explanation of the tool. Example: "The uses of linear regression models must have resulted in systematic prediction errors (i.e., score bias) and the difference in measurement errors between GRE and GMAT was completely overlooked," Han writes.
He also questions the tool's ability to predict an equivalent GMAT score. "On ETS’s website, they stated, as a footnote, that 'the predicted score range is approximately +/- 50 points for the total GMAT score,'" he adds. "This statement is false. The 'prediction interval' should always be presented with its confidence level, such as a 68 percent (=1 SE), 95 percent (=2 SE), or 99 percent (=3 SE) confidence level. Their statement is deceiving the public as if they were reporting 100 percent confidence ('predicted score range' means the difference between the minimum possible predicted score and the maximum possible predicted score) while their reported values were actually only for a 68 percent confidence interval."
Han contends that if a candidate got a GMAT total score of 650 or above, "which is what many schools are interested in, there is no evidence that the actual prediction error with 99% confidence interval to be any smaller than ± 200 points. It means that the true GMAT scores for individual candidates whose predicted GMAT Total scores were 650 could be anywhere between 450 and 800 99% of the time, if not worse, even in the best-case scenario where ETS’s regression model somehow did not cause any prediction biases despite the data-related issues pointed out above."