First published on Schools Matter on July 26, 2014
"While value-added models are intended to estimate teacher effects on student achievement growth, they fail to do so in any accurate or precise way." — Professor Bruce Baker
Back in early April I penned a piece for K12NN on The American Statistical Association's (ASA) paper on Value Added Methodologies. In it I asserted that the "document provides strong support to those who oppose this wrongheaded use of statistics to make high stakes decisions affecting the lives of students, educators, and our school communities." This week I noticed a trackback to a reprint of the article. What caught my eye was the title, which was seemingly entirely out of keeping with the spirit of the ASA's stance: Advocating for a robust value-added implementation.
I read through the VAM cheerleading piece and was gobsmacked by the deliberate manipulation of the ASA document's tenor and tone. This VAM apologetic read less like a legitimate blog posting and more like a corporate press release. Without doing much more research, I typed the following comments:
This quotation from the ASA document sums up the entire issue best: "The majority of the variation in test scores is attributable to factors outside of the teacher's control." To try, as the author above has, to frame the ASA's position as supportive of VAM phrenology takes mendaciousness to breathtaking heights. Rather than considering students as empty receptacles for "knowledge" deposited by a method that can be "measured," perhaps we can start talking about students as agents in their own pedagogical experiences, something that doesn't exist in the current regime of the profitable testing-industrial-complex.
The author's response displayed the same blatant avoidance of the issues as the original piece. In fact, it repeated some of the long-discredited claims of the VAM camp, including that favorite trope, frequently debunked by Professor Bruce Baker, about how "more sophisticated" VAMs address the issues people have with VAMs.
Thanks for your comment. The ASA statement seems to discuss primary drivers of student test scores, not student growth. It is well known that there is a strong relationship between students’ achievement (or test scores) and their socioeconomic/demographic background. However, there is typically little or no relationship between students’ growth and their socioeconomic/demographic background.
Another way to see this is that the most important factor of “current” test scores is prior test scores and, once enough prior test scores are included in the model, the socioeconomic/demographic factors become relatively small or even non-significant, despite enormous sample sizes.
That said and to your concern about considering students in the context of their own experiences, more sophisticated value-added/growth models, like EVAAS, can follow the progress of individual students over time, so that each student serves as his or her own control.
Aside from being patently wrong, the whole thing smacked of boilerplate text written in the bowels of a corporate public relations department. That's when I decided to look into who this dubious Jennifer Facciolini was and whom she wrote for. I should have done that in the first place.
Statistical Analysis System (SAS) Institute Inc. is one of the largest privately owned software companies in the world, with revenue of $3.02 billion USD in 2013. They are the developers of the wildly inaccurate but highly profitable Education Value-Added Assessment System (AKA SAS EVAAS), a VAM implementation used in many districts. The money-chasing SAS is all about big-fish government contracts; to wit, an excerpt from a recent Businessweek piece:
SAS Institute Inc. won a $6,479,583.96 federal contract from the Defense Information Systems Agency, Scott Air Force Base, Illinois, for Statistical Analysis System software licenses and support renewal.
When a huge firm like SAS, one pulling down big-dollar defense contracts, makes the education "market" its priority, you can believe it won't let things like facts and evidence discrediting VAM get in the way of selling and supporting its EVAAS phrenology kit to any and all districts infected by the neoliberal corporate reform virus.
Hiring several white, well-educated former teachers like Nadja Young and Jennifer Facciolini to shill for your product is a smart public relations investment for the VAM behemoth. Given that they were teachers in the South, chances are they are making a great deal more money at SAS by simply selling out their former profession to corporate interests. Whether the boilerplate prose in their gushing blog posts is written by them or not isn't all that important. What's important is the appearance that professional teachers might actually think that phrenology and VAM are legitimate sciences. I can only hope that the prose in these mindless corporate posts isn't written by these former teachers. Students should never be exposed to such nonsensical drivel. A selection of some of their titles should serve to numb the mind of any sentient being:
- Data-driven education books make great holiday gifts for educators. Yes, really.
- "March madness" of student course enrollment gets assist from value-added assessment
- Beyond value-added: Teachers need diagnostic data to improve their practice
- Student growth measures can be the bridge to new assessments
I'll spare readers further torment. As the preponderance of evidence against VAM pseudoscience, exemplified by the watershed ASA paper, continues to grow, expect profit-hungry firms like SAS to keep doubling down on duplicity and deception. Having an army of former teachers shilling for your defective product is a small expense compared to losing those highly profitable contracts with districts.
In its various permutations we've seen the "M" in VAM stand for Modeling, Measures, Methodologies, and others. The only honest word for the last member of the acronym would be "Mendaciousness," since phrenology by any other name is…