I will protect your pensions. Nothing about your pension is going to change when I am governor. - Chris Christie, "An Open Letter to the Teachers of NJ," October 2009

Monday, September 6, 2010

Value-Added Modeling: A Primer For Teachers

An informal poll of my colleagues at my school tells me that many are not aware of "value-added modeling" (VAM). And yet I believe that VAM and the looming pension crisis are the two most important issues teachers will face in the coming five years.

So, a brief introduction to VAM:

- I'd suggest anyone not familiar with the basics of VAM start here: David Leonhardt in the NY Times:
A few months ago, a team of reporters at The Los Angeles Times and an education economist set out to create precisely such a consumer guide to education in Los Angeles. The reporters requested and received seven years of students’ English and math elementary-school test scores from the school district. The economist then used a statistical technique called value-added analysis to see how much progress students had made, from one year to the next, under different third- through fifth-grade teachers. The variation was striking. Under some of the roughly 6,000 teachers, students made great strides year after year. Under others, often at the same school, students did not. The newspaper named a few teachers — both stars and laggards — and announced that it would release the approximate rankings for all teachers, along with their names.
- Next, take a look at the LA Times project about VAM that Leonhardt references. But be warned, the LA Times is taking an advocacy position here:
What is "value-added" analysis?
"Value-added" analysis is a statistical method that estimates the effectiveness of a teacher or school by looking at the standardized test scores of students -- in this instance, math and English scores on the California Standards Tests. Past scores are used to project each student's future performance. The difference between the child's actual and projected results is the estimated "value" that the teacher or school added (or subtracted) during the year.
- As I've written before, I think the LA Times is on dangerous ethical ground here; they are making money by publishing this data, so how can they be expected to report objectively on the controversy surrounding VAM?
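The projection-and-difference logic the LA Times describes above can be sketched in a few lines of Python. To be clear, this is a deliberately simplified toy with invented teachers and scores, using a single prior-year test as the predictor; real value-added models use several years of data and many more statistical controls, which is exactly where the controversy lives.

```python
# Toy illustration of the "value-added" idea: project each student's
# current score from a prior score, then credit (or blame) the teacher
# for the average gap between actual and projected results.
# All names and scores below are invented for illustration only.

from statistics import mean

# (teacher, prior_year_score, current_year_score) -- hypothetical students
students = [
    ("Teacher A", 520, 560), ("Teacher A", 480, 530), ("Teacher A", 600, 640),
    ("Teacher B", 510, 500), ("Teacher B", 450, 455), ("Teacher B", 590, 585),
]

# Step 1: fit a simple least-squares line projecting current from prior score
xs = [s[1] for s in students]
ys = [s[2] for s in students]
x_bar, y_bar = mean(xs), mean(ys)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

def projected(prior):
    return intercept + slope * prior

# Step 2: a teacher's estimated "value added" is the average difference
# between actual and projected scores for that teacher's students
teachers = {}
for name, prior, actual in students:
    teachers.setdefault(name, []).append(actual - projected(prior))

for name, residuals in sorted(teachers.items()):
    print(f"{name}: estimated value added = {mean(residuals):+.1f}")
# Teacher A: estimated value added = +23.3
# Teacher B: estimated value added = -23.3
```

Note how much hangs on that projection step: with only six fictional students and one predictor, a few unusual scores, a bad testing day, or an unaccounted-for factor outside the classroom swings the estimate, which is precisely the objection raised below.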

Yes, it is controversial. The Economic Policy Institute has published the most comprehensive set of objections to date:
A review of the technical evidence leads us to conclude that, although standardized test scores of students are one piece of information for school leaders to use to make judgments about teacher effectiveness, such scores should be only a part of an overall comprehensive evaluation. Some states are now considering plans that would give as much as 50% of the weight in teacher evaluation and compensation decisions to scores on existing tests of basic skills in math and reading. Based on the evidence, we consider this unwise.
The objections include:


Many of the links above are to Rutgers professor Bruce Baker's School Finance 101 blog. Every teacher owes it to himself or herself to spend a few minutes looking at Bruce's writings on VAM.

For me, the bottom line is this: while the quality of a teacher may be the most important factor in the life of a student in school, there are many, many other factors that affect learning. To discount those factors and use one annual test to make decisions about which teachers are fired and which get paid more is, in a word, insane.

And make no mistake: this stuff is coming. It's already being used in Washington DC, where hundreds of teachers were fired based on their VAM assessments.

As Bruce Baker points out:
I guess that one could try to dismiss those moral, ethical and legal concerns regarding wrongly dismissing teachers by arguing that if it’s better for the kids in the end, then wrongly firing 1 in 4 average teachers along the way is the price we have to pay. I suspect that’s what the pundits would argue – since it’s about fairness to the kids, not fairness to the teachers, right? Still, this seems like a heavy toll to pay, an unnecessary toll, and quite honestly, one that’s not even that likely to work even in the best of engineered circumstances.
Fellow teachers, heed my warning: this fight is coming to your classroom this year. If NJ had won its Race To The Top grant, nearly $50 million would have been spent to implement a computerized tracking system for student progress. What do you think NJDOE was going to do with the data?

We need to start informing each other about this issue now.


In The Know: Are Tests Biased Against Students Who Don't Give A Shit?

2 comments:

Anonymous said...

Well done. I'm glad to be able to help out in the future in any way I can. I do hope you keep up the blog. But I certainly understand how it can eat up time and cut into other productivity!

Duke said...

You know, I wrote that yesterday, and did three posts tonight! But when concert season starts...

I really can't say enough good things about your work, Bruce. Teachers have to get hip to this stuff, and I appreciate the way you take the complex and make it understandable to those who are willing to give a little effort. You and Paul Krugman share a gift (I hope you take that as a compliment!).

The little Superman logos on your last scatterplots were a riot!