Bias in Standardized Testing? No, but Don't Tell That to Academics

When it comes to junk science and the interpretation of it, I have news for you: the wheels have officially and totally come off the wagon.

A new study published in the Journal of Applied Psychology is calling for a revival of research into possible test bias in standardized testing. A topline read of this new report offers some truly startling and impressive-looking information:

Results based on 15 billion 925 million individual samples of scores and more than 8 trillion 662 million individual scores raise questions about the established conclusion that test bias in preemployment testing is nonexistent and, if it exists, it only occurs regarding intercept-based differences that favor minority group members.


Wow, more than eight trillion scores examined. This must be one heckofa study to have looked at all that information. The researchers used what they called “a powerful and sensitive methodology,” in search of test bias where heretofore no one has been able to find it. A provocative premise indeed.

To support this premise, the researchers engaged in some very sophisticated mathematics, like this item, one of my favorite passages, from page 653:

[Equation from page 653 not reproduced]

There’s page after page of this sort of language attempting to convince people that the reason no one can find test bias is that the way they look for it is biased in the first place.

Then there’s media coverage of this report which has thankfully been limited to only one of the usual suspects, Scott Jaschick, the well-renowned Writer-of-All-Things-Related-to-Standardized-Tests-As-Long-As-They-Say-The-Tests-Suck-And-Even-If-They-Don’t-I’m-Going-to-Write-It-That-Way-Anyhow.

Jaschick’s anti-testing biases have been well documented for a long time, but after reading through his August 2 article on this absurd study, one has to wonder whether his emotions run deeper than mere bias and lie closer to simple hatred.

In Jaschick’s classic facts-be-damned fashion, he powerfully asserts in the second paragraph of his screed:

But a major new research project – led by a scholar who favors standardized testing – has just concluded that the methods used by the College Board (and just about every other testing entity for either admissions or employment testing) are seriously flawed.

Sorry folks, that’s not what the research concludes. Read the research and write honestly about it, and it becomes clear that the report isn’t about analyzing data at all. The study claims that because the ACT and SAT don’t look for test bias the way these researchers did, the tests may harbor a bias that nobody can find.

Huh?


Okay, fine. Let’s take this at face value for a moment. But first, we should ask how the researchers conducted their analysis in the first place to determine whether there might be bias in a test. I think the best way to sum up this entire research report lies in the seven-word lead sentence under the heading ‘Limitations and Suggestions for Future Research on Test-Bias Assessment’:

Our results are based on simulated data.

I swear I am not making this up. It’s located on page 672 of the Journal of Applied Psychology issue carrying the report.

So these researchers are trying to ding standardized college testing for not finding bias in their tests because the testing organizations analyze real data like test scores, grade point averages, and so forth instead of just simulating data. This is positively Orwellian!

Not only does Jaschick ignore the fact that the study on which he’s reporting uses admittedly “simulated data,” he goes on to neglect the fact that the study involved pre-employment testing, not academic testing. Jaschick’s biases run so deep he doesn’t even bother to note what the study’s authors themselves say about their research:

Our report points to the need to revive test bias research in preemployment testing.

Sounds to me like Jaschick is using simulated facts just like the study used simulated data.

We’re talking about a severe case of drive-by reporting here. Jaschick blows past the study’s initial summation, that it’s relevant to pre-employment testing, and then repeatedly ignores the many additional citations to that effect listed elsewhere in the report, like this reference on page 648:

Few topics in industrial and organizational (I/O) psychology and human resource management have generated more media attention than bias in preemployment testing.

Or this one on page 649:

… we raise important questions and cast doubt about the established conclusions regarding test bias in preemployment testing and provide an alternative explanation for the consistent results reported over the past 40 years of research.


Or this one on page 650:

In this study, we raise questions and cast doubt on these established conclusions about test bias in preemployment testing based on methodological and substantive reasons.

Or this one on page 654:

… we show that researchers are more likely to conclude incorrectly that performance is overpredicted for members of the minority group when the mean minority group test score is lower than the mean majority group test score and test scores are measured with less than perfect reliability, which are normative conditions in the context of GMA and other types of preemployment testing.

Or how about this one, which is the title of the report itself!

Revival of Test Bias Research in Preemployment Testing

I think you get my drift here, people.
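For what it’s worth, the statistical artifact the page-654 passage describes can be reproduced in a few lines of simulated data of our own. This is a minimal sketch, not the authors’ methodology, and the group means, reliability, and sample sizes below are illustrative assumptions: when a test measures ability with less-than-perfect reliability and one group’s mean score is lower, a regression line fit to the higher-scoring group will mechanically overpredict the lower-scoring group’s performance even though the underlying model contains no bias at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True ability: the "minority" group mean is set lower (illustrative values).
ability_maj = rng.normal(0.0, 1.0, n)
ability_min = rng.normal(-0.5, 1.0, n)

# Job performance depends only on true ability -- no bias built in.
perf_maj = ability_maj + rng.normal(0, 0.5, n)
perf_min = ability_min + rng.normal(0, 0.5, n)

# Observed test scores measure ability imperfectly (reliability ~ 0.74 here).
noise_sd = 0.6
score_maj = ability_maj + rng.normal(0, noise_sd, n)
score_min = ability_min + rng.normal(0, noise_sd, n)

# Fit the majority-group regression line, then apply it to the minority group.
slope, intercept = np.polyfit(score_maj, perf_maj, 1)
residual = perf_min - (intercept + slope * score_min)

# Mean residual is negative: actual performance falls below the prediction,
# i.e., the unreliable test "overpredicts" the lower-scoring group.
print(round(residual.mean(), 2))
```

The negative mean residual appears purely because measurement error flattens the regression slope, not because anything unfair happened to either simulated group, which is exactly why conclusions drawn from simulations like this one say little about real test scores.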

Academics and university flacks desperate for media coverage to justify their increasingly tenuous positions might be cut a little slack for promoting astonishingly tortured mathematical permutations and “simulated data” to allegedly show that employment test bias cannot be definitively disproved, and cannot be found unless you look for it the way these researchers did with their simulated data.

But when media bias as absurd as what passes for news on the pages of Inside Higher Ed creeps into the discussion, it’s not only an embarrassment for the author and his publisher; it’s a disservice to honest, hard-working educators.
