08.31.11
Hello Again!
If you’ve looked around for an SAT or ACT tutor, you’ve probably noticed that many companies publish their “average score improvements.” The claims range from modest 150-point improvements to unbelievable 300+ point gains. On the surface, you might think that average score improvement would be the most important factor in choosing a tutor, but this is definitely not the case. These score improvement numbers are rarely what they seem.
The first question you should ask is “average score improvement from what?” A previous PSAT/SAT or ACT? A practice SAT/ACT? A company diagnostic test? Unfortunately, company diagnostic tests are notorious for being harder than actual SATs. That way, when a student takes the actual SAT, his or her score cannot help but “improve” from the diagnostic test. The companies know that 99.9% of students and parents have no idea how to score or judge the difficulty of a practice SAT, so these misleading diagnostic tests are hard to detect. Even if your score improvement is based on an official practice test from the College Board guide, there is no guarantee that the score you were given was calculated honestly and properly. One major company had something of a scandal regarding these practices in 2008. You can read the details here.
The moral of the story is never to trust company diagnostic tests. Score improvements should be calculated from previously taken tests or, if there are no such scores, official practice tests. If you get a score from an official practice test, please ask your tutor to show you your raw score and score conversion table to ensure that your score has been calculated accurately.
The second question you should ask pertains to score reporting. Specifically, how many students actually report their score improvements? Chances are, the number is smaller than you think. I mean, how many customer satisfaction surveys do you bother to fill out on a day-to-day basis? Furthermore, I would argue that the students who report their scores are likely the students who are proud of their scores. Students who are angry or embarrassed about their scores probably don’t want to speak with their tutor or preparation company anymore. From my personal experience as a tutor, students who do well report their scores to me as soon as they come out; I generally have had to track down the students who haven’t done as well. Unlike me, large companies don’t have the time to track down all of their students’ score results, so customer satisfaction surveys are all they have. You can read more about the return rate on these surveys here. This is also a good article on score improvements in general.
Average score improvement data has still other problems. First, the number tells you nothing about the original scores of the students who reported. If a student goes from a 2300 to a 2400, that is obviously a huge victory, but the student’s score improvement is still only 100 points. Students who already score highly are going to have smaller score improvements, but this certainly does not mean that tutoring was a failure. Furthermore, score improvement numbers often don’t tell you how many meetings the students who reported their scores completed. A student who meets with a tutor three times should not expect nearly the score improvement of a student who meets with a tutor 20 times. If you are curious about a company’s score improvement data, ask how that data relates to your (or your child’s) original score profile and potential tutoring program.
The upshot is that score improvement data is murky at best unless it is calculated from a previous official test. In that case, it is very useful information. But don’t let it get in the way of evaluating a tutor’s other qualities, particularly experience and quality of references, both of which are more important than score improvement data.
I hope you found this post helpful… There’s more to the world of tutoring than meets the eye!
-Ben