Morgan serves on the Evaluation Pilot Advisory Committee (EPAC), the group charged with advising NJDOE on the new teacher evaluation system, AchieveNJ. I know some of the other members of this committee personally; they are excellent educators, and I'm sure they have the best of intentions. But the endorsement of good people is not reason enough to dismiss the very serious problems with AchieveNJ that I and others have outlined.
Teachers, administrators, parents, and students in New Jersey have to understand that this new teacher evaluation program has serious flaws that have not been acknowledged, let alone addressed. To that end - and I hope the good people on the EPAC panel, including Morgan, understand the spirit in which I offer this criticism - let me address some critical points in Morgan's op-ed:
One of the measures of AchieveNJ, the proposed system -- and the one that has received the most press attention -- is the Student Growth Percentile (SGP) score. The SGP is a measure of how a teacher’s students have grown on the New Jersey Assessment of Skills and Knowledge (NJ ASK) in comparison to those statewide who are most like them academically. As an eighth-grade language arts teacher, I am one of those teachers who will receive an SGP rating. Personally, I am very curious to see my score, and I know that it is going to give me valuable data about the students I teach and my impact on their learning.

(I'm going to forgo a lot of links in this post, because everything you need to know is here. And if that's a little too technical for you, don't worry - read this.)
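To make the idea concrete: an SGP is, roughly, a percentile rank of a student's current test score computed only against students with a similar score history. The actual methodology uses quantile regression over multiple prior years of scores; the sketch below is a deliberately simplified toy, and the peer-matching rule, score ranges, and cohort data are all invented for illustration, not NJDOE's real calculation.

```python
# Toy illustration of the idea behind a Student Growth Percentile (SGP).
# The real methodology uses quantile regression over several prior years;
# this sketch just bins students by last year's score. All data is invented.

def toy_sgp(prior, current, cohort):
    """Percentile rank of `current` among 'academic peers' -- students in
    `cohort` (a list of (prior_score, current_score) pairs) whose prior
    score was within 5 points of this student's."""
    peers = [c for p, c in cohort if abs(p - prior) <= 5]
    if not peers:
        return None
    below = sum(1 for s in peers if s < current)
    return round(100 * below / len(peers))

# A fake statewide cohort of 1,000 students:
cohort = [(200 + i % 40, 195 + (i * 7) % 60) for i in range(1000)]
result = toy_sgp(prior=210, current=230, cohort=cohort)
print(result)  # some percentile between 0 and 100
```

The thing to notice is the comparison group: a student's "growth" is defined entirely relative to peers with similar prior scores, not against any absolute standard of proficiency.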
It would be one thing if the SGP component of AchieveNJ were solely about letting a teacher "see her score"; it is not. The SGP measure is a mandatory, rigidly applied portion of the evaluation for a small subset of teachers (grades 4-8 math and language arts). The median SGP (mSGP) Morgan and her fellow teachers receive has a high-stakes consequence attached to it; it is not merely information to help instruction. How could it be? After all, Morgan's students will already be in high school by the time she gets their SGPs.
The use of SGPs is almost entirely about accountability, not instructional improvement. The NJDOE has gone on for a while now about how valuable it will be for a teacher or administrator to see how a child "grows" in comparison to his or her "peers." But because SGPs are descriptive and do not even attempt to find the cause of growth, their value as instructional assessments has been wildly overhyped. It's far more important to find out why a child is or isn't growing to her full potential than to compare her to her "peers."
As to those "peers" - the only characteristic they share is a similar history in previous scores. SGPs do not attempt to add student characteristics into their description, and we have very good evidence that this creates a bias (one not well ameliorated, by the way, by the use of value-added models, or VAMs). Given that classrooms are not assigned similar rosters of students, either within or between schools and districts, there is every reason to believe the teachers who educate the neediest children will suffer a disadvantage.
And then there's the problem of using the median SGP of a class, which is mathematically suspect: the median discards information about the rest of the class's distribution, and with a typical roster it can swing substantially on the performance of a single student. All of this adds up to a system about which any teacher should have serious concerns. So it's more than a little frustrating to read a piece where these concerns are brushed aside so easily. NJDOE owes the teachers of this state serious answers to their concerns; those answers are not to be found here.
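A quick sketch of why a class median is fragile (all numbers below are hypothetical, not real SGP data): with a roster of twenty students, the mSGP is the average of the two middle scores, so one student crossing the gap in the middle of the distribution can move the teacher's rating by several points.

```python
# Hypothetical class of 20 student SGPs, with a gap in the middle of the
# distribution. The mSGP is the average of the 10th and 11th scores.
from statistics import median

class_sgps = [10, 20, 30, 35, 40, 41, 43, 44, 45, 45,
              60, 62, 65, 68, 70, 75, 80, 85, 88, 92]
print(median(class_sgps))  # 52.5

# One student near the middle scores 16 points higher (45 -> 61) ...
shifted = class_sgps.copy()
shifted[9] = 61
# ... and the entire class median jumps 8 points:
print(median(shifted))  # 60.5
```

One student's good (or bad) testing day moves the teacher's mSGP from 52.5 to 60.5; the other nineteen scores didn't change at all.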
As to the SGOs: I would refer all readers to the work of Dr. Howard Wainer, a distinguished education researcher.
There is no research that NJDOE has put forward - none - that justifies using SGOs as 15% of a teacher's evaluation. As a long-time observer of NJDOE, I strongly suspect SGOs are being put into place in an attempt to placate critics who have pointed out that only a small number of teachers will be subject to evaluation by SGPs.
Again: I am not going to criticize the motivations of any of the members of EPAC. But this op-ed piece is not helpful: it evades addressing the central concerns with AchieveNJ, and it asks teachers to happily sign on to a system of evaluation that has no support in research. Unless and until the NJDOE is prepared to seriously and substantively answer the criticisms leveled against these rigid, top-down, unsupported regulations, no teacher should buy into what comes across as little better than propaganda.