
Interpreting Results of Assessments That Are Not Normed On Or Validated For Blind-Low Vision Students

Updated as of September 20, 2024.

Scenario

“My 3rd grade braille student is taking the Woodcock Johnson v. 4 in braille. Her special education case manager is administering the test. Math tests 5 and 9 are in Nemeth code but in a math format that my student has never been exposed to before. I think it is unfair to test my student in a format that she is unfamiliar with and so I wanted to do a little pre-teaching of the format before my student is tested on those 2 tests. Can you please weigh in on whether or not I should be allowed to pre-teach the format?”

IDEA Legal Requirements For Testing and Assessment

Individuals with Disabilities Education Act (IDEA)

Test Format

20 U.S.C. section 1414 addresses evaluations of students with disabilities. This section specifically requires that “assessments and other evaluation materials used to assess a child”

  • “are provided and administered in the language and form most likely to yield accurate information on what the child knows and can do academically, developmentally, and functionally, unless it is not feasible to so provide or administer;” 20 U.S.C. section 1414(b)(3)(A)(ii)

If the test, as constituted, is in a format that is unfamiliar to the student, the test would fail this requirement.

Test Validity

Federal law also requires that all evaluations “are used for purposes for which the assessments or measures are valid and reliable.” 20 U.S.C. section 1414(b)(3)(A)(iii).

While there is a version of the Woodcock-Johnson IV (WJ IV): Adapted for Braille Readers available from the American Printing House for the Blind (APH), we cannot find evidence that this adaptation has been found to be either valid or reliable for the test’s stated purpose of “evaluating strengths and weaknesses among contemporary measures of achievement, oral language, and cognitive abilities.” See Introducing the Woodcock-Johnson® IV.

Thus, even if the student receives pre-teaching of the test format, the results must be interpreted carefully: there is at least some likelihood that they will artificially understate the student’s true abilities when the student is compared to non-disabled peers.

Test Administration

The IDEA further requires that all evaluations “are administered in accordance with any instructions provided by the producer of such assessments;” 20 U.S.C. section 1414(b)(3)(A)(v).

We have not found evidence that the publishers of the norm-referenced print version of the WJ IV have produced any instructions relative to braille readers. Thus, while the test results might be helpful to some extent and might illuminate patterns of suspected disability, the results should be understood to fall short of IDEA requirements and should not be used to pigeonhole a student, particularly when the student is compared to non-disabled peers for whom the WJ IV has been found to be both valid and reliable.

Educational Research

Are we testing the student’s knowledge/skills or are we testing the student’s familiarity with the test?

Carnegie Mellon’s Eberly Center notes that students’ lack of familiarity with a particular exam format “may interfere with their ability to apply their knowledge … [and] their performance may be an underestimate of their knowledge … Or, an unfamiliar exam format may weaken students’ confidence and hence lower their performance indirectly.” See Carnegie Mellon University Eberly Center, “Students performed poorly on an exam: Format of exam was unfamiliar.”

Impact of unfamiliarity with test format

Regardless of the reason for the subpar performance (inability to apply knowledge, or anxiety and stress), an unfamiliar examination format can artificially lower a student’s exam performance.

This holds for any student: unfamiliarity with the format of a test will likely adversely affect the student’s performance on that test.

Construct validity of the assessment

Unless the test is designed to measure the student’s facility with taking tests in that format, this adverse impact constitutes “construct-validity bias”: the test no longer accurately measures the construct it was designed to measure.

In this case, instead of measuring only the student’s mathematical knowledge, the test will also be measuring the student’s lack of familiarity with the test format, even though the test is not designed to measure test-format familiarity.

Content validity of the assessment

If a braille-reading student is less familiar with the format of the test than a print-reading student, the test has content-validity bias toward print readers and against braille readers.

Even if the lack of test format familiarity is unique to the student (for example, the student has not had enough experience with tactile graphics, vertical equations, etc. to be able to read them as fluently as print-readers can), this also constitutes content-validity bias against the braille reader.

Bottom line

Either of these biases (or both) leads to predictive/criterion-related validity bias. When this type of bias occurs, the assessment loses its usefulness because the test results favor one group of students over another based upon factors other than what the assessment is intended to measure. See Test Bias, from The Glossary of Education Reform.

Conclusion

Based on both law and research into test bias, it is imperative that the student receive pre-teaching of the test format. Indeed, if this format is found in other standardized testing materials, the student needs to have regular and meaningful opportunities to practice with this format to gain familiarity with it. The key is to keep the testing format from impacting the student’s test performance.

Additionally, it is important that educational professionals recognize that these tests are neither standardized on nor normed for blind/low vision students. It is highly likely that test bias exists, and educators must take the testing results with a grain of salt. While these assessments may still be used to provide some information, they must not be used to include or exclude the student from educational opportunities or supports.

Contact the Bridges Helpdesk for More Information

This unique project is being coordinated through The IMAGE Center of Maryland, a center for independent living in Towson, and it is funded by a grant from the Maryland Department of Education Division of Special Education/Early Intervention Services.
