Exclusive Study: GMAT Test Prep Companies That Provide The Biggest Score Increase


One caveat should be noted. While test-takers using the Official GMAT materials reported the biggest difference between platform-only prep and platform-plus-class prep, as of now the Official GMAT does not offer classes. So all respondents who used that platform went to an additional provider for class instruction. We saw score increases of 100 points or more from students using firms such as Sharpminds, Sherpa Prep, Jamboree, and Coefficient Solutions, as well as Manhattan Prep and Veritas.

The next-highest difference came from respondents using GMAT Club, who reported an average gap of about 33 points, nearly twice the 16.9-point average across all respondents. Like Official GMAT test-takers, those using the GMAT Club platform took classes from a range of providers. For example, students whose scores increased 100 points or more reported taking classes from Sherpa Prep, Jamboree, Manhattan, Veritas, and E-GMAT.

In the case of Manhattan GMAT, Veritas, and Kaplan, we counted only respondents who took the class from the same provider as the platform. So the average difference for Manhattan GMAT represents a 30.5-point advantage for classes taught by Manhattan instructors: an average increase of 104.1 points versus 73.6 for those who used the Manhattan materials on their own.

Kaplan’s instructor value-add was the smallest, at only 5.5 points. That is, with an instructor, the average reported score increase from a Kaplan class was 74.1 points, compared to a platform-only increase of 68.6. Veritas was in the middle, with a class value-add of 14.5 points: those who took classes reported an average score increase of 80 points, while those who used the platform alone increased their scores by 65.5 points.
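The "value-add" figures here are simple differences between two averages. A minimal sketch of the arithmetic, using the averages quoted in this article:

```python
# Instructor "value-add" per the survey: a provider's average score
# increase with its own class minus its platform-only average increase.
# The figures below are the averages reported in this article.
averages = {
    # provider: (class average increase, platform-only average increase)
    "Manhattan GMAT": (104.1, 73.6),
    "Veritas": (80.0, 65.5),
    "Kaplan": (74.1, 68.6),
}

value_add = {name: round(with_class - platform_only, 1)
             for name, (with_class, platform_only) in averages.items()}

print(value_add)
# {'Manhattan GMAT': 30.5, 'Veritas': 14.5, 'Kaplan': 5.5}
```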

So Magoosh is the big outlier. Although the sample of those who reported scores after taking a class along with the Magoosh platform is relatively small, the platform on its own appears to outperform the platform combined with a class. It is also notable that Magoosh’s platform-only increase of 93.2 points is nearly as high as the overall average class increase of 93.7 points, along with the average increase with a tutor of 90.2 points.


We wanted to understand why the private-tutor increase was not dramatically higher than the increase for those who took classes. In the broad aggregate, the group who took classes increased their scores by an average of 93.7 points, compared to 90.2 for those who used a tutor. When we narrowed the data to those who used a tutor without a class versus those who took a class without a tutor, the gap grew to an average difference of 12 points, in favor of classes.

That is not to disparage tutors, and individual experiences do vary. We saw quite a wide range of tutoring firms in the survey sample, not just the big firms with large banks of private tutors, such as Manhattan, Veritas, and Kaplan. At firms large and small, including Noodle Pros, Varsity Tutors, and outfits with names such as GMAT Dudes and GMAT Ninja, students reported score increases of 100 points or more, though the sample sizes for these companies were too small to draw firm conclusions.


The lower score increase for tutors alone may simply have something to do with the number of teaching hours. We saw a very clear correlation between higher score increases and more hours of instruction, bearing in mind that one-on-one tutoring is far more concentrated than a class, whether online or in person.


Students using Manhattan Prep reported the highest average score increase, as well as the largest number of responses. Because of the robust data, we were able to see more clearly what is apparent in the aggregate data: that a class and tutor combined give the greatest score boost over going it alone. In the subset of students who took both a Manhattan GMAT class and used a Manhattan tutor, the average score increase was 118.8 points (sample size of 16), compared to the platform-only increase of 73.6 points (sample size of 43).

So who did the best of all? Students who took a Manhattan class and used a tutor scored an average of 118.8 points higher after their instruction. Among those who reported both their score increases and their hours of instruction, we found that students spent, on average, about 10 hours with a tutor and 26 hours in class.

Those who used a Manhattan tutor without a class scored only 84 points higher (compared to the aggregate tutor-only average of 79.7), with 13.9 hours of instruction. Those who took a class without a tutor saw a score increase of 98.6 points, with an average of 28.8 hours of instruction.

Despite performing well in the score-increase category, Stacey Koprince, Manhattan Prep’s Content & Curriculum Lead, wrote us after publication to point out some hesitations she has about the survey and dataset. The two broad points Koprince makes are that Poets&Quants did not publish the starting points for the GMAT score increases and that the survey is subject to participation bias. “The impressiveness of 104.1 (score increase) depends pretty heavily on your starting point — it’s far harder to improve by 100 points at the higher end of the scale,” Koprince points out, noting that a jump from 500 to 604 is less impressive than one from 650 to 754, for example.

“Imagine that, for some reason, a particular company tends to attract people who have higher average starting scores,” Koprince continues. “For instance, my company offers an Advanced GMAT Course for which we require a 650 minimum starting score to enroll. Our students in this class can literally only improve by 150 points — or fewer if they started higher than 650 — so how would you factor that into this study?”

Selection bias is something most surveys and studies experience. People are most likely to respond if they feel strongly, positively or negatively, about a topic. In this case, the dataset likely skews toward larger score increases than an unbiased sample would show. Applicants obviously shouldn’t expect the exact, or even similar, score increases published here just because they take a Manhattan Prep online course, for example.

After all this analysis of score increases, you might think the firms that delivered the biggest increases would have the happiest customers, but the correlation isn’t that simple. In the aggregate, those who gave the highest ratings to their provider also got the highest score increases. Stay tuned for the next installment!

Author Betsy Massar is the founder and CEO of Master Admissions, a leading MBA admissions consulting firm. P&Q turned to Betsy for her analytical and quant skills for this analysis of the data, which was collected by Poets&Quants’ Nathan Allen.


