Teach for America got a major public relations boost in September when a federally funded study found that on average, its “corps members” slightly outperformed teachers who entered the profession after traditional teacher preparation, which can involve years of course work and supervised practice. The eight-state study by Mathematica Policy Research found that on average, the students of Teach for America corps members made more gains in math in one year – amounting to an average of 2.6 months of additional learning time. It was unclear whether the program’s selectivity in recruiting or other factors accounted for its positive results.
Sigh. Look, I know this stuff is knotty. And kudos to Brody for talking to Aaron Pallas, one of the smartest guys on the planet when it comes to this stuff. But nobody should be taking TFA's word on their "success" without really looking at what the research says. And, as I wrote in September, the Mathematica TFA study is hardly a ringing endorsement:

The report came at a time of intense debate over how to attract quality candidates into teaching, especially in high-poverty areas, and how to help them succeed in the job. [emphasis mine]
- Effect sizes matter. The study shows a difference of .13 standard deviations in high school math and .06 standard deviations in middle school math. That is equivalent to moving a student from the 27th percentile to the 30th. The study uses the "x months of learning" conversion to say that's 2.6 months of learning; I find that to be misleading at best. Having four TFA teachers in a row isn't going to mean that a student will take calculus in their senior year instead of pre-calculus: these effects aren't necessarily cumulative.

And so on. Again: I know this is tough stuff, and Brody did present an opposing view to the usual TFA propaganda. Maybe I'm just annoyed at how damn good TFA is at getting their brand out there with very little challenge to the hegemony they've created for themselves, both in the media and political spheres.
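For what it's worth, that percentile claim is just standard-normal arithmetic, and you can check it yourself. A minimal sketch, assuming test scores are roughly normally distributed (the function name is mine, not from the study):

```python
from statistics import NormalDist

def shifted_percentile(start_pct, effect_size):
    """Percentile a student at start_pct reaches after a shift of
    effect_size standard deviations, assuming normal scores."""
    nd = NormalDist()  # standard normal
    z = nd.inv_cdf(start_pct / 100.0)
    return 100.0 * nd.cdf(z + effect_size)

# The .13 SD high school effect moves a 27th-percentile student
# to only about the 31st percentile; the .06 SD middle school
# effect moves the same student to about the 29th.
print(round(shifted_percentile(27, 0.13), 1))
print(round(shifted_percentile(27, 0.06), 1))
```

Run it with any starting percentile you like; the movement stays small because the effect sizes are small.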
What really should be reported is how many more questions the TFA students got right on the final tests. And it can't be many when an effect size is that small. Statistically significant? Sure. But practically significant? Come on. And I'm not even going to deal with the scaling problems here...
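To make "it can't be many" concrete, here's a back-of-envelope sketch. The test length and raw-score standard deviation below are illustrative assumptions of mine, not figures from the Mathematica study:

```python
# Back-of-envelope: how many extra questions does a small effect size buy?
# The raw-score SD below is an illustrative assumption, NOT a figure
# from the Mathematica study.
def extra_items_correct(effect_size, raw_score_sd):
    """Raw-score gain implied by an effect size, given the raw-score SD."""
    return effect_size * raw_score_sd

# Suppose raw scores on a 40-item test have an SD of about 8 items:
print(extra_items_correct(0.13, 8))  # high school effect -> about 1 item
print(extra_items_correct(0.06, 8))  # middle school effect -> about half an item
```

Under those assumptions, the headline effect amounts to roughly one extra correct answer on the whole test.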
In addition, TFAers come from elite schools, which means they generally have good test-taking abilities. Are they able to impart those abilities to their students? And is that "real" learning, or simply learning how to beat the system?
In any case, the results here seem like a great candidate for a treatment from the Mountain-Out-Of-A-Molehill-INATOR: there's just not that much here to get excited about.

- Some randomness isn't full randomness. OK, so the kids are randomly assigned -- but the teachers weren't. The TFAers aren't going up against the entire population of "regular" math teachers; they are matched only with those teachers who:
- Teach the same subject,
- In the same school,
- To which only TFAers are assigned.

That is a very, very limited control group from which to draw broad conclusions about the effectiveness of TFA. There is plenty of reason to suspect that many districts do not distribute teachers across their schools equally; and that's not even addressing the distribution between districts. So the students may be randomly assigned, but the teachers in the control group most certainly are not. That is a big hit to the generalizability of this study. And speaking of those teachers...

- Heterogeneity of the control group. The study tries to account for some differences between the control group teachers, but it is a very limited description. That's not a criticism; it's simply pointing out the limitations of the study. Colleges are reported dichotomously as "selective" or "not selective," as if there isn't a whole world of difference between programs in "not selective" colleges (for example: state schools with scholars as faculty vs. crappy, on-line, for-profit "universities"). Interestingly, TFA teachers were more likely to have degrees in secondary math education than comparison teachers (18.8% vs. 15.9%).

I also found this telling: 70 percent of both the TFAers and the comparison teachers had 20 or fewer days of student teaching in math. That says as much, to me, about the quality of training of the comparison teachers as it does about the TFA group.
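It's easy to under-appreciate how restrictive that matching is. A sketch, with teacher records I invented for illustration (none of these names, schools, or counts come from the study), of how few "regular" teachers survive the filter:

```python
# Illustrative sketch of the study's control-group construction: a comparison
# teacher must teach the same subject, in the same school, and that school
# must be one to which TFA corps members are assigned. All records below are
# invented for illustration -- they are not data from the Mathematica study.
teachers = [
    {"name": "A", "subject": "math",    "school": "North HS", "tfa": False},
    {"name": "B", "subject": "math",    "school": "North HS", "tfa": True},
    {"name": "C", "subject": "math",    "school": "South HS", "tfa": False},
    {"name": "D", "subject": "english", "school": "North HS", "tfa": False},
]

# Schools that actually host a TFA corps member:
tfa_schools = {t["school"] for t in teachers if t["tfa"]}

comparison_pool = [
    t for t in teachers
    if not t["tfa"]
    and t["subject"] == "math"
    and t["school"] in tfa_schools
]

print([t["name"] for t in comparison_pool])  # only teacher "A" qualifies
```

Every non-TFA math teacher in a school without a corps member (like "C" above) drops out of the comparison entirely, which is exactly why these results say so little about teachers district-wide.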
But TFA isn't a "solution" to any "problem" with urban education. As Diane Ravitch has said: we're not going to staff our schools with Ivy League graduates, because the raw numbers needed are just too great. Let's focus instead on making the profession more attractive to college graduates of both private and state colleges and universities. Let's start improving teacher working conditions, which are student learning conditions.
Relying instead on the kindness of strangers who don't stay very long is not a very well-thought-out strategy.