Thursday, May 22, 2014

The American Statistical Association says that teacher evaluation methods based on test scores should not be used for decisions that matter

by Valerie Strauss
Answer Sheet
The Washington Post
May 18, 2014

------------------------------------------------------------------------------------------------------------------------
Arne Duncan’s reaction to new research slamming teacher evaluation method he favors

"Education Secretary Arne Duncan has been a proponent of using students’ scores on standardized tests to evaluate teachers, even as a growing mountain of evidence has shown that the method now used in most states, known as “value-added measures,” is not reliable. With two recent reports released on VAM adding to warnings long given by assessment experts, I asked the Education Department whether Duncan’s position had changed.

"VAM purports to be able to take student standardized test scores, plug them into a complicated formula and measure the “value” a teacher adds to student learning. The method has been adopted as part of teacher evaluations in most states — with varying weights put on the results — but for years researchers have said the results aren’t close to being accurate enough to use for decisions that matter."

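To make that “complicated formula” a little more concrete, here is a deliberately simplified, hypothetical sketch in Python of what a value-added-style calculation looks like: regress each student’s current score on his or her prior score plus an indicator for the teacher, and read the teacher coefficients as the “value added.” The numbers, variable names, and model below are illustrative assumptions only; the models states actually use are far more elaborate.

# A deliberately simplified value-added-style calculation (illustration only).
import numpy as np

rng = np.random.default_rng(0)

n_teachers, students_per_teacher = 5, 40
teacher_ids = np.repeat(np.arange(n_teachers), students_per_teacher)

prior = rng.normal(50, 10, size=teacher_ids.size)      # last year's scores
true_effect = rng.normal(0, 2, size=n_teachers)        # "true" teacher effects (unknown in practice)
current = 5 + 0.9 * prior + true_effect[teacher_ids] + rng.normal(0, 8, size=teacher_ids.size)

# Design matrix: intercept, prior score, and a dummy column for each teacher
# except teacher 0, who serves as the baseline.
dummies = [(teacher_ids == t).astype(float) for t in range(1, n_teachers)]
X = np.column_stack([np.ones(teacher_ids.size), prior] + dummies)

coef, *_ = np.linalg.lstsq(X, current, rcond=None)

print("estimated value added (relative to teacher 0):", coef[2:].round(2))
print("true value added (relative to teacher 0):     ", (true_effect[1:] - true_effect[0]).round(2))

Even in a toy setup like this, each teacher’s effect is estimated from only a few dozen students, so the estimates carry substantial statistical uncertainty relative to the size of the effects being measured.
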
The American Statistical Association, the largest organization in the United States representing statisticians and related professionals, said in an April report that value-added scores “do not directly measure potential teacher contributions toward other student outcomes” and that they “typically measure correlation, not causation,” noting that “effects — positive or negative — attributed to a teacher may actually be caused by other factors that are not captured in the model.” This month, two researchers reported that they had found little or no correlation between quality teaching and the appraisals that teachers received using VAM.
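
The correlation-versus-causation point is easy to see in a toy simulation. The sketch below (purely hypothetical numbers, not any district’s data or any state’s actual model) constructs two teachers who by design add exactly the same value, but sends students with more outside support, a factor the model never sees, disproportionately to one of them; a naive value-added estimate then credits that teacher anyway.

# A toy illustration of the omitted-factor problem (not any real VAM).
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Two teachers who, by construction, add exactly the same value.
# Students with more outside "support" are more likely to end up in class B,
# and "support" is never measured, so the model cannot adjust for it.
support = rng.normal(0, 1, size=n)
in_class_b = (support + rng.normal(0, 1, size=n) > 0).astype(float)

prior = rng.normal(50, 10, size=n)
current = 0.9 * prior + 4.0 * support + rng.normal(0, 5, size=n)

# Naive value-added estimate: regress current score on prior score and a
# class-B indicator, with "support" omitted because it is unobserved.
X = np.column_stack([np.ones(n), prior, in_class_b])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)

print("apparent 'value added' by teacher B:", round(coef[2], 2))
# Prints a clearly positive number even though the two teachers are identical.

That spurious “effect” is precisely the kind of attribution the ASA warns about: it is caused by a factor not captured in the model, not by the teacher.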

"For years, many prominent researchers have warned against using VAM. They include a 2009 warning by the Board on Testing and Assessment of the National Research Council of the National Academy of Sciences, which stated that “VAM estimates of teacher effectiveness should not be used to make operational decisions because such estimates are far too unstable to be considered fair or reliable.” The Educational Testing Service’s Policy Information Center has said there are “too many pitfalls to making causal attributions of teacher effectiveness on the basis of the kinds of data available from typical school districts,” and Rand Corp. researchers have said that VAM results “will often be too imprecise to support some of the desired inferences.”

"These are just a few of the concerns that have emerged in recent years over this method. Still, there are economists enthusiastic about VAM, and their work has been embraced by school reformers who have opted to use it as part of teacher and even principal evaluation and who have chosen to ignore the much larger body of evidence warning against using VAM for high-stakes purposes. In fact, when Michelle Rhee was chancellor of the D.C. public school system (from 2007 to 2010), she liked VAM so much that she instituted an evaluation system in the district in which nearly every adult in a school building was evaluated in some part by student standardized test scores, including the custodial staff. The percentage of a teacher’s evaluation linked to VAM depends on the state, ranging from a small percentage up to 50 percent."

Statisticians slam popular teacher evaluation method

The most meaningless teacher evaluation exercise ever?

D.C. custodial staff were evaluated by student test scores. Really. (update)

The American Statistical Association Statement on Value Added Assessment