How the Washington Post misuses statistics

Bob Somerby

A minor oddity: For the record, there is a minor oddity in the way the Post has presented its data. We don’t know why the Post chose to present the data this way. But this minor oddity tends to shrink the size of the achievement gap we’re hoping to eliminate. And it tends to disguise the groaning problem with Glod’s basic type of analysis.

What is that minor oddity? The Post could have made a direct comparison between low-income kids in Montgomery County and kids of higher incomes. (“Those from middle-class and affluent backgrounds,” to use Glod’s language.) After all, the question we’re asking is fairly simple: Are low-income kids catching up to kids from more affluent backgrounds? But that isn’t the comparison the Post made in its graphic. Instead, the Post compares the passing rates for low-income kids to the passing rates for all Montgomery County students. In that latter measure, the lower passing rate of the low-income kids drags down the overall passing rate, thus obscuring the size of the gap between low-income kids and kids of higher incomes.

Here’s the problem caused by this minor oddity:

For 2008, the Post’s graphic seems to show that roughly 80 percent of Montgomery’s low-income kids passed the state reading test. (That’s a huge jump from 2003, when roughly 40 percent passed.) Since the graphic says that the gap in passing rates was 14 percentage points, that would mean that roughly 94 percent of all county kids passed the test in 2008. But uh-oh! The low-income kids are included in that latter statistic; this means that, on their own, the higher-income kids must have had an extremely high passing rate. Their passing rate must have approached (or equaled) 100 percent. And that’s where the problem is lurking.
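To make that arithmetic concrete, here is a quick sketch with round numbers. The 80 percent low-income passing rate and the 14-point gap are taken from the graphic as described above; the share of low-income students in the county is an invented assumption, since the passage doesn’t give one. The point is simply that any plausible share pushes the implied higher-income passing rate to, or past, 100 percent.

```python
# Sketch: recover the implied higher-income passing rate from the figures above.
# The overall rate and low-income rate come from the graphic as described in the text;
# the low-income share of enrollment is a hypothetical assumption for illustration.

def implied_higher_income_rate(overall, low_income, low_income_share):
    """Solve overall = share*low + (1 - share)*high for the higher-income rate."""
    return (overall - low_income_share * low_income) / (1 - low_income_share)

overall_rate = 0.94      # roughly 80 percent plus the 14-point gap
low_income_rate = 0.80   # approximate low-income passing rate from the graphic

for share in (0.2, 0.3, 0.4):  # hypothetical low-income shares of enrollment
    high = implied_higher_income_rate(overall_rate, low_income_rate, share)
    print(f"low-income share {share:.0%}: implied higher-income rate {high:.1%}")

# Output: 97.5%, 100.0%, 103.3%. A rate above 100 percent is impossible, which is
# the tell that the higher-income kids were already at or near the ceiling.
```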

Edit: Somerby received the following email from a statistician:

E-MAIL (10/6/08): Great job pointing out the flaws in the Post article. As a statistician, I cringe whenever I read articles like that. I have a few additional comments:

  1. I think you were a bit kind in your description of their comparison of ‘economically disadvantaged’ kids with ‘all students’ as “a minor oddity”. That is a huge error. No competent statistician would make such a mistake. It completely distorts the picture. In some counties there is a higher percentage of poor students than in other counties. If the percentage is high, then the ‘all students’ curve will necessarily be pulled closer to the ‘economically disadvantaged’ curve than in other counties. So the amount of distortion varies by county.
  2. I’m glad you mentioned the ceiling effect. Clearly, most upper income folks were passing the test. When you have a binary measure of success (pass/fail), you have a very limited story you can tell. If you define achievement as ‘passing the test’, then there are a lot of ways to narrow the ‘achievement gap’. As you implied, if you make the test easy enough (where everyone can pass) there will be no achievement gap at all!
  3. Achievement is probably the wrong word. I think tests like this are aimed at determining if children have obtained grade-level competence. The gap that exists in the percentages of kids that achieve this minimal level of competence might be closing (or the test got easier; or…), but that’s a much weaker claim than was made by the authors.
  4. There are ways to close achievement gaps that are not necessarily good. For example, schools could decide to use all of their resources on kids who are not meeting minimal standards, and ignore kids who are. Certainly achievement gaps could be narrowed, but at the expense of kids who started out ahead. There are two issues here. One is narrowing gaps that occur between birth and 5 years of age. Schools don’t have much influence on that, but it should be an area of public policy focus. The other is making sure all children progress in school at a reasonable rate. If low income students are improving at a slower rate than upper income kids, that is a gap the schools should be concerned about.
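The statistician’s first point can be illustrated with a small sketch using invented numbers, not data from any actual county. Hold the real gap between the two groups fixed, and watch how the gap measured against “all students” shrinks as the low-income share of the county grows.

```python
# Sketch with made-up numbers: the true gap between groups is fixed at 20 points,
# but the gap reported against "all students" depends on the low-income share,
# so the distortion differs from county to county.

low_rate, high_rate = 0.70, 0.90   # hypothetical passing rates for the two groups

for low_share in (0.1, 0.3, 0.5):  # hypothetical low-income shares of enrollment
    all_rate = low_share * low_rate + (1 - low_share) * high_rate
    print(f"low-income share {low_share:.0%}: "
          f"true gap {(high_rate - low_rate) * 100:.0f} points, "
          f"gap vs. all students {(all_rate - low_rate) * 100:.0f} points")
```

With half the county low-income, the reported gap looks half-closed even though nothing about either group’s performance has changed.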