
Monday, August 19, 2013

NJ Teacher Evaluations: Countdown to "Operation Hindenburg"

We are at around t-minus two to three weeks, depending on your district, until the launch of New Jersey's new teacher evaluation system, code name: "Operation Hindenburg." Unfortunately, just like the ill-fated zeppelin, this entire disaster could be avoided if the people in charge would just stop and think for a minute:
Superintendents in South Jersey say they are ready for a controversial evaluation system that will link student test scores to teacher performance. 
The new initiative will be implemented statewide in September, but planning began in 2010, with pilot programs running during the 2011-12 and 2012-13 school years. 
Officials with the New Jersey Education Association contend the assessment system is not yet fully developed. 
NJEA spokesman Steve Wollmer urged the state to hold off until next year because the current plan puts too much of an emphasis on standardized test scores. [emphasis mine]
But it's not just the teachers union that says AchieveNJ isn't ready - it's the NJDOE itself! As of today, 8/19/13, 11:00 AM, this graphic is still on the NJDOE's AchieveNJ page for teachers (annotation mine):
[screenshot: NJDOE AchieveNJ graphic, annotated]
It's not like I'm making this up, folks: the NJDOE doesn't even know yet how they are going to convert student test scores into data for teacher evaluation. Even Jonah Rockoff, eminent economist and the NJDOE's own expert witness before the state BOE on testing data and teacher evaluation, admitted to me that what the NJDOE proposes to do here with SGP cut scores is prone to error. How can NJDOE possibly think it's OK to move ahead without telling us how they intend to fix this serious problem?

NJ Teachers, if you haven't read Bruce Baker's take on all this, you must read it now. There is an easily digestible Q&A version at the NJ Education Policy Forum. Here's the money quote:

Q: But if SGPs can’t determine how a teacher affects a student’s test scores, why do the NJDOE and other states’ education departments want to use them in teacher evaluations?

A: Arguably, one reason for the increasing popularity of the SGP approach is controversy surrounding the use of VAMs in determining teacher effectiveness. There is a large and growing body of empirical research describing the problems with using VAMs; however, there has been far less research on using SGPs for determining teacher effectiveness. The reason for this vacuum is not that SGPs are simply immune to problems of VAMs, but that researchers have, until recently, chosen not to evaluate their validity for estimating teacher effectiveness because SGPs are not designed for this task.
And this is only one of the many problems with Operation Hindenburg. I'll be writing soon (I hope) about the Student Growth Objectives -- SGOs -- that all teachers will have to participate in this year. Until then, I have a challenge for the NJDOE: find us one piece of research that demonstrates that SGOs, as NJDOE has conceived them, improve student learning in the domains they are supposed to assess. Because I sure couldn't find it here.

And then we have these completely unsubstantiated promises about the "next generation" of tests that will be foisted on New Jersey's children:
According to the state Department of Education, the new assessment model, AchieveNJ, will measure performance on student learning, including standardized tests and instructional practices. The goal is to improve teacher effectiveness. 
“The concept that teachers will ‘teach to the test’ — or try to ‘prep’ their students to be prepared for test questions, only works on old-fashioned multiple-choice bubble tests,” said Department of Education spokesman Michael Yaple. 
“But the new tests require children to analyze, show their work, write essays. In short, the children need to be taught higher-order critical thinking skills. That’s the goal, and that can’t be achieved by ‘teaching to the test,’ ” Yaple said. [emphasis mine]
That is simply not the case, because none of these tests have been properly vetted! The NJDOE has never -- never -- released any serious research that shows any type of predictive validity for the state tests. And yet here they are, confidently vowing that these new tests will require "higher-order" thinking. How do they know? More importantly, how do we know?

We have just witnessed a huge meltdown in confidence in the state education system right across the Hudson. You would think the Broadies and the MBAs and the TFAers and the charter cheerleaders at NJDOE would at least pause for a minute and rethink their plans. Not a chance: full speed ahead! And so they follow in the wake of NYSED in spite of all the obvious, justified misgivings of parents and teachers.

Mark my words: this will be a disaster of epic proportions. When teachers get their evaluations, and when students get their new, "more realistic" test scores, there will be an uproar. NYSED has had no credible answers to its critics; neither will NJDOE.

To all of you reading this down in Trenton at the Education Department (and, yes, I know you read this blog every day): I'll admit that I've been just about your biggest critic, and you probably don't think my advice is sincere. But I'm telling you right now: this is your last chance. Pull back now, while you still can. If you go through with this, you will have no credibility at all with New Jersey's educators, parents, or students.

Consider yourselves warned.

Operation Hindenburg: completely avoidable.

8 comments:

  1. The big guys at the education department aren't worried about Hindenburgs, icebergs or anybergs because they are ultimately unaccountable. They will just move on and get a great job in the Rheeform Industrial Complex. Chris Cerf is a case in point.

  2. "There is a large and growing body of empirical research describing the problems with using VAMs; however, there has been far less research on using SGPs for determining teacher effectiveness. The reason for this vacuum is not that SGPs are simply immune to problems of VAMs, but that researchers have, until recently, chosen not to evaluate their validity for estimating teacher effectiveness because SGPs are not designed for this task."

    Hilarious--we know the product designed for the task doesn't work, so let's use the instrument that wasn't designed for the task, since we don't know whether it works or not.

  3. Even before you get to the nitty-gritty of what's missing from the actual test/process, it's important to take a step back and look at getting ed. admin buy-in to ed reforms.

    As you know, teacher evaluations have already been piloted in Camden (for 2 years at Cramer, less at other schools). However, they're virtually useless when the state views them as a means to its first objective, increasing achievement in Camden schools, and ed admins view them as a means to attach accountability to teachers and not to themselves.

    While it's not unusual for teachers in Camden to agree with reform initiatives and to agree that increasing achievement takes first priority, I think many ed admins in key positions strongly disagree. In fact, I'm pretty sure any number of other things take priority over student achievement from their point of view.

    The RACs are going to have to address the absence of administrative buy-in before real reform is possible in Camden.

  4. The RACs aren't going to address anything, as the system is poorly planned and implemented. Many years ago, a system similar to the RACs failed, and this reinvention of a failed idea is doomed to failure too.

    Replies
    1. Initiatives in Camden, released to district personnel in memos, on websites, etc., have been a re-statement of what Camden teachers said to them. Only now, they're saying it back to us. That's probably why I'm continually marvelling at the depth of their perceptions. However, getting buy-in from admins requires real leadership, and they're trying to move forward w/o addressing it. That's like trying to teach kids who aren't listening to you.

      So far, I've only seen fresh-faced 30-something RACs and retired-from-the-neck-up former-principal RACs - no military.

  5. The RACs don't know what they are doing. There are ex-military personnel on their staff who are telling educators how to do their jobs. Do they even know what they are looking at? I'm not saying ex-military aren't intelligent, but they are not educators. On another note, I plan on joining the military in a supervisory capacity when I retire from teaching; it clearly needs reform. There is too much waste, abuse, and fraud in the system.

    Like, I've seen "Saving Private Ryan," so like, I get it, yanno.

  6. Unfortunately, it removed my ending comment that stated "remove tongue from cheek."

  7. I want Superintendent Growth Percentiles (SGP) for Broadies in Charge (BIC).

    Let's start with Mike Miles, our $7,000 a day BIC in Paterson! Diane Ravitch gave us an update on Miles but not on this big $7,000 a day Paterson reform plan - http://dianeravitch.net/2013/08/06/dallas-teachers-flee-superintendent-mike-miles-under-investigation-his-family-moves-back-to-colorado/

    "Teachers flee, Mike Miles under investigation, his family moves back to Colorado"

    But we need an update on Paterson. What great reforms are in place for the $7,000 a day service he rendered? Where are all the good news stories of great accomplishment? Should Cerf bring this $7,000 a day best practice by BIC to other districts?

    What is the SGP? Do we compare Paterson to non-BIC reform districts to find the percentile?

    This is accountability all taxpayers & educators need to know.

