Call me curious. On January 17, I took the Michigan Test for Teacher Certification (MTTC) subject area test in English. My department has asked me and my colleagues to examine the teacher certification tests in Michigan, and while my colleagues took the much more challenging Professional Readiness Exam (PRE) that all would-be teachers must pass, I took the subject area test.

Last night, I got the results from the test. I am pleased to say that I passed. But what surprised me was the paucity of data Pearson actually sent me. Here is the report in its entirety:

I scored ++++ in all four areas of the test. A key included in the report indicates that ++++ means I answered “most of the questions correctly.” It is the highest of four possible ratings. But that is it. No comparison to other test takers on that day. No specific data on which items, if any, I missed on the test. Full disclosure: I know I missed one question, and I was ready to contest what I thought was a poorly worded question. But the report did not let me see this information.

This is probably par for the course for all major standardized tests, but the truth is, Pearson has all of the data. If I wanted to compare my test results against those of all other English majors taking the test in 2014, I should have been able to do so, even by subarea, and perhaps down to the individual question, as in, “56 percent of all other English subject area test takers also got question 32 (parallel structure) incorrect.”

That at least would tell me, hypothetically, which specific kinds of questions I could study for future retakes, if I had failed. Does Pearson have this much data? Absolutely: they have reams of it on individual questions; this particular subject area test has been unchanged for at least the last five years.

On a related note, GVSU even has a subscription to Mometrix, a database dedicated to offering study tips for teacher certification tests.