Chapter 1

What is mismeasurement and where does it originate?


Chapter 1 Overview

This chapter defines mismeasurement and categorizes three types of errors: errors of commission, errors of omission, and errors of judgment. Data is usually considered to be the object of analysis, but data is just a raw record of human observation; it must be built into evidence before it can be analyzed. Where do errors originate? Semantic confusion, assessment misunderstandings, logic flaws, arithmetic errors, and visual misrepresentation of evidence are among the sources. The high frequency of teachers’ misinterpretation of test results is a starting point for asking how school districts detect errors of human judgment. The chapter advances two methods of analysis that are feasible for non-technical educators: comparative effectiveness and practical benefit. Finally, the chapter examines the sources of resistance to evidence and empirical methods.

Chapter 1 Excerpt

What is mismeasurement? It’s not an everyday term in the school world, but it is commonly used in other fields and professions. When we use the term “mismeasurement,” we mean an error that occurs when gauging the quantity or quality of behavior, attitudes, or other human events. Mismeasurement includes misunderstanding the size or meaning of something, which in turn may lead to errors in judgment. Given how much data educators have collected over the past several decades, it shouldn’t be surprising that some of it has been measured incorrectly. But where in the process have those errors occurred? Who has made them? How consequential are they?

We don’t expect errors when an event is recorded in a database, or when simple student-level events such as attendance are assembled into a body of evidence (attendance statistics). We expect those steps to be routine, free of bias and free of errors. But accidents occur in the act of observation, in the recording of events, and in the act of human judgment. All three steps are full of human effort, including human bias, and all three are susceptible to error.

Our definition of mismeasurement also includes misunderstanding the meaning of something. We expect people to disagree about the interpretation of measures of organizational vital signs, especially when the reputation of a school or district depends on it. We also expect differing interpretations to emerge from the same body of data. Attendance statistics, for instance, used to be viewed simply as the percentage of days that students attend school. Now you’ll often see attendance data viewed from the vantage point of chronic absentee rates: the percentage of students who are absent more than 10 percent of school days. Other analysts reframe attendance data to isolate days missed that are adjacent to holiday breaks, or to exclude excused absences. New meaning may emerge from each reframed point of view.

So, mismeasurement as we use it is more than a recording error or the use of the wrong measuring tool. Mismeasurement also includes misinterpretation. It may be making a mountain out of a proverbial molehill, or it may be ignoring the mountain altogether. We consider mismeasurement to include errors of human judgment. Between the act of counting something and reporting that result lies a lot of reasoning. To begin examining the soundness of that reasoning, let’s create a typology of mismeasurement….
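
To make the excerpt’s point about reframing concrete, here is a minimal sketch in Python (ours, not the authors’). It assumes hypothetical student-level absence records, a 180-day school year, and an “excused” flag; none of these details come from the book. It simply shows how the same raw attendance data can be read as a traditional attendance rate, as a chronic absentee rate (absent more than 10 percent of school days), or as a count of unexcused absences adjacent to breaks.

# Illustrative sketch only: one list of absence records, three framings.
# Student IDs, the 180-day calendar, and the "excused" and
# "adjacent_to_break" flags are assumptions made for this example.

from dataclasses import dataclass

SCHOOL_DAYS = 180          # assumed length of the school year
CHRONIC_THRESHOLD = 0.10   # "chronic" = absent on more than 10% of school days

@dataclass
class Absence:
    student_id: str
    day: int               # 1..SCHOOL_DAYS
    excused: bool
    adjacent_to_break: bool # e.g., the day before or after a holiday break

def attendance_rate(absences: list[Absence], n_students: int) -> float:
    """Traditional view: percentage of enrolled student-days actually attended."""
    total_days = n_students * SCHOOL_DAYS
    return 100.0 * (total_days - len(absences)) / total_days

def chronic_absentee_rate(absences: list[Absence], n_students: int) -> float:
    """Reframed view: percentage of students absent more than 10% of school days."""
    per_student: dict[str, int] = {}
    for a in absences:
        per_student[a.student_id] = per_student.get(a.student_id, 0) + 1
    chronic = sum(1 for days in per_student.values()
                  if days > CHRONIC_THRESHOLD * SCHOOL_DAYS)
    return 100.0 * chronic / n_students

def unexcused_near_break(absences: list[Absence]) -> int:
    """Another reframing: unexcused absences adjacent to holiday breaks."""
    return sum(1 for a in absences if a.adjacent_to_break and not a.excused)

Each function reads the same list of records; only the framing changes, which is the excerpt’s point that new meaning can emerge from each reframed view of the same body of data.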

Chapter 1 References

Hattie, John. Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. New York: Routledge, 2008.

Meehl, Paul E. Clinical Versus Statistical Prediction: A Theoretical Analysis and a Review of the Evidence. Minneapolis: University of Minnesota Press, 1954.

Paulos, John Allen. Innumeracy: Mathematical Illiteracy and Its Consequences. New York: Hill and Wang, 1989.

Popham, W. James. Unlearned Lessons: Six Stumbling Blocks to Our Schools’ Success. Cambridge, MA: Harvard Education Press, 2009.

Rampey, B.D., R. Finnegan, M. Goodman, L. Mohadjer, T. Krenzke, J. Hogan, and S. Provasnik. Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results From the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014: First Look (NCES 2016-039rev). Washington, DC: U.S. Department of Education, National Center for Education Statistics, 2016. Retrieved October 30, 2018, from https://nces.ed.gov/pubs2016/2016039rev.pdf

Rankin, Jenny. “Over-the-Counter Data’s Impact on Educators’ Data Analysis Accuracy.” PhD dissertation, Northcentral University, 2013. ISBN: 978-1-303-52615-2. http://pqdtopen.proquest.com/doc/1459258514.html?FMT=ABS

Rankin, Jenny. Standards for Reporting Data to Educators: What Educational Leaders Should Know and Demand. New York: Routledge, 2016.

Wasserstein, Ronald L., Allen L. Schirm, and Nicole A. Lazar. “Moving to a World Beyond p < 0.05.” The American Statistician 73, sup1 (March 2019): 1–19. DOI: 10.1080/00031305.2019.1583913

Ziliak, Stephen T., and Deirdre N. McCloskey. The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives. Ann Arbor: University of Michigan Press, 2008.

“While it may be frightening to many parents and educators to learn that the data being used and frequently misinterpreted to determine the quality of a school is flawed, take heart. These education veterans (Jill served as a school board member for 24 years and Steve has been analyzing education test data for almost as long) have provided a path to improvement, if not enlightenment. First of all, they take the complex concepts of testing and make it comprehensible for the rest of us. Then, after clearly detailing what is wrong with education data and how it is reported, they take the time to give school administrators, other education leaders, and board members specific questions to ask which will uncover the problems and lead to correcting how the entire process of assessment is conducted. They also have provided detailed suggestions as to when and where the corrections should start; specifically in pre-service teacher training programs and all the way through the staff at State Departments of Education. Finally, they are urging continued discussions and further research, so let’s get the conversation started.”
Barbara Nemko – Superintendent, Napa County Office of Education