In my SGOs, I have to demonstrate "student growth" using a rubric approved by my supervisor, following a format informed by the state Department of Education. And like every other scheme dreamed up by Chris Cerf's NJDOE, the department insists there is a research base for all this:
Once again, NJDOE makes a claim about research backing up their plans without actually pointing us to the research they are citing. Certainly, it isn't anything that eminent education researcher Howard Wainer has ever seen:
State officials stressed that among the greatest values in the process may be teachers working with their supervisors when developing SGOs.
“The research is showing there is an inherent value in that collaboration,” said assistant commissioner Peter Shulman. “They are setting specific targets for their specific students’ needs.” [emphasis mine]
Oh dear. Well, there has to be some research that NJDOE is relying on; what about their "AchieveNJ for Teachers" webpage?
Let's take a look at these links -- but let's go out of order, starting with #2, published just this past spring:
WHAT DOES RESEARCH TELL US ABOUT SGOs?

SGOs (or Student Learning Objectives/SLOs) have been in use in many states and districts for the past 14 years. The following studies and resources offer more information about the implementation of SGOs across the country:
- Community Training and Assistance Center Report: It's More Than Money (Feb 2013) – study of Student Learning Objective (SLO) implementation starting in 2008 in Charlotte-Mecklenburg Schools (CMS)
- Regional Educational Laboratory at EDC: Overview of Student Learning Objectives: Review of the Literature
- Reform Support Network (RSN) Targeting Growth: Using Student Learning Objectives as a Measure of Educator Effectiveness
- Reform Support Network (RSN) Quality Control Toolkit for Student Learning Objectives
Yes, ladies and gentlemen, you heard that right, straight from the NJDOE itself: There is no research base that demonstrates SGOs will aid in effectively evaluating teachers in non-tested areas. This is faith-based teacher evaluation, folks; the document reiterates the point:

Because there are not yet any research-based models for evaluating teachers’ impact on student learning for non-tested groups, states are experimenting with various approaches (Goe & Holdheide, 2011). [emphasis mine]
The first link from NJDOE is a report about the implementation of SGOs in Charlotte-Mecklenburg. Understand that the report says nothing - nothing - about the validity or reliability of using SLOs/SGOs in non-tested subjects. And the report's findings about SLOs in tested subjects look like a candidate for the Mountain-Out-Of-A-Molehill-Inator* (p. 77):

Research has shown that setting student learning objectives, in general, has a positive impact on learning (e.g. Beesley & Apthorp, 2010). However, because the use of SLOs in teacher evaluation is a fairly new approach, there is not yet rigorous research available on its effectiveness. Harris (2012) describes SLOs’ “potentially attractive” qualities of allowing for teaching autonomy in setting individualized objectives and customizing instruction accordingly. However, he emphasizes that these same qualities could lend themselves to manipulation and non-comparability and that “there is essentially no evidence about the validity or reliability of SLOs” (Harris, 2012, p.5). Non-experimental studies conducted on the Austin, Denver, and Charlotte programs described above seem to indicate a positive correlation between the quality of SLOs and student achievement and between the number of objectives met by teachers and student achievement, but more research is needed on the relationship between teachers meeting SLOs and student net achievement (Tyler, 2011).
Analysis of SLO Attainment, 2009-10
Cross-sectional HLM analyses are conducted on the attainment of SLOs for elementary and middle school mathematics and reading. The findings are consistent with those on the quality of SLOs.
Translation from geekspeak: most likely, a student who met the SGOs in elementary math and reading got a few more questions right on the state tests. Shocking, I know...

There is a positive, statistically significant association between attainment of SLOs and student achievement in elementary school mathematics and reading. In terms of z-scores, attaining the SLO growth target increases the z-scores in elementary reading by 0.15 points, and elementary mathematics by 0.11 points.

The attainment of SLOs is not a statistically significant predictor of student achievement in either middle school mathematics or reading. [emphasis mine]
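To see just how small those gains are, here's a quick back-of-the-envelope sketch (my own illustration, not from the report) that converts the reported z-score gains into percentile terms, assuming test scores follow a standard normal distribution:

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Effect sizes reported in the CTAC study (in z-score units)
gains = {"elementary reading": 0.15, "elementary mathematics": 0.11}

for subject, gain in gains.items():
    # Where does a median (50th percentile) student land after the gain?
    new_percentile = normal_cdf(gain) * 100
    print(f"{subject}: a median student moves to about "
          f"the {new_percentile:.0f}th percentile")
```

In other words, even taking the statistically significant results at face value, a typical student moves from the 50th to roughly the 54th-56th percentile: a few extra questions right, as advertised.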
There is, admittedly, a correlation between whether a teacher had a "high-quality" SGO and whether that teacher got a bonus based on a Value-Added Model (VAM) that uses student test scores to judge his or her "effectiveness" (p. 58). But considering how crappy VAM models are, there's not really much for NJDOE to hang its hat on here. And, again: none of this is relevant to teachers in "untested" areas.
I want to be fair here: throughout the report, you get the sense that teachers and principals in C-M were willing to give SLOs/SGOs a fair shake. Everyone agrees that measuring student growth is a fundamental practice of good teaching. Everyone agrees that teachers ought to be judged, at least in part, on their ability to affect student growth. Everyone agrees that all professionals should look at the best practices of their colleagues. The teachers of North Carolina, like teachers in New Jersey and everywhere else, have no problems with being held accountable for their work.
But let's be very clear: there is very little research basis to back up the top-down implementation of SGOs in AchieveNJ. There's no reason to believe these metrics are valid enough, reliable enough, or precise enough to make high-stakes decisions. There's no reason to think using SGOs will substantially increase student achievement in tested or untested domains. And there's absolutely no reason to assign a weight of 15% to a teacher's SGO score.
Teachers and principals and administrators will do what they always do with mandates delivered from on high by non-educators: they will work to make them as useful as possible for professional development, and as unobtrusive to student learning as they can.
But that doesn't change the fact that there is very little reason to believe the widespread implementation of SGOs will do much of anything for teachers, administrators, or students.
Accountability begins at home.
* You mean you haven't heard of the Mountain-Out-Of-A-Molehill-Inator?
ADDING: You know, I have to wonder if Charlotte-Mecklenburg's teachers are as enthusiastic about SGOs and the like now that the teaching profession has been destroyed in North Carolina...