I will protect your pensions. Nothing about your pension is going to change when I am governor. - Chris Christie, "An Open Letter to the Teachers of NJ," October 2009

Tuesday, May 1, 2018

What Do We Teach In America's Schools? "Hey, Honey, Sit Down and Shut Up!"

America, it's time to play Spot The Pattern!™

First, Chicago (all emphases mine):
Earlier this month, we posted a story about discipline practices inside Noble Network of Charter Schools, which educates approximately one out of 10 high school students in Chicago. One former teacher quoted in the piece described some of the schools’ policies as “dehumanizing.” 
Through the teacher, several students also agreed to communicate by text message. 
One described an issue raised by others at some Noble campuses, regarding girls not having time to use the bathroom when they get their menstrual periods. 
“We have (bathroom) escorts, and they rarely come so we end up walking out (of class) and that gets us in trouble,” she texted. “But who wants to walk around knowing there’s blood on them? It can still stain the seats. They just need to be more understanding.” 
At certain campuses, teachers said administrators offer an accommodation: They allow girls to tie a Noble sweater around their waist, to hide the blood stains. The administrator then sends an email to staff announcing the name of the girl who has permission to wear her sweater tied around her waist, so that she doesn’t receive demerits for violating dress code. 
Last year, two teachers at Noble’s Pritzker College Prep helped female students persuade administrators to change the dress code from khaki bottoms to black dress pants. Although their initiative was based in part on a survey showing that 58 percent of Pritzker students lack in-home laundry facilities, it remains a pilot program available only at the Pritzker campus.
Next, New York City:
A veteran city educator who said officials botched her sexual harassment case is calling out Mayor de Blasio for shaming victims — and omitting dozens of sexual harassment complaints from recently published city statistics.

The educator, who asked to remain anonymous because she fears retaliation, said she was sickened to hear de Blasio say this week that the Education Department substantiated less than 2% of complaints because of a "hyper-complaint dynamic" in the city agency.

"I'm certainly offended that Mayor de Blasio would say that," said the educator, who sued the city over her harassment by a supervisor and won a settlement.

"With a wife and daughter of his own, I was in shock," she added.

She called the city Education Department's investigation into her claims "a long, complicated, ugly process," that ultimately failed to bring her justice.

"No one would go through this if it were not true," she said. "It is a horrific experience. It upends your entire life."

City officials are scrambling to contain a growing sex harassment scandal in the city schools.

A tally of sex harassment complaints published by the city Friday omitted 119 Education Department complaints erased from the record because officials deemed them "non-jurisdictional."  
Figures published by the de Blasio administration on April 20 showed 471 cases of sexual harassment complaints in city schools from 2013 to 2017. But internal records kept by Education Department officials showed 590 complaints during the same period — a figure 25% higher than the number reported by de Blasio. 
Observers said it looks like the Education Department is trying to hide the facts about sex harassment cases. 
"That's exactly what's happening here," said New York City Parents Union President Mona Davids. "They covered things up and they squashed the complaints."
NYC teacher Arthur Goldstein has more on this.

Let's go to Washington:
At a roundtable with the nation’s top educators on Monday afternoon, at least one teacher told Education Secretary Betsy DeVos that her favored policies are having a negative effect on public schools, HuffPost has learned. HuffPost has also obtained video of DeVos expressing disapproval of the teachers strikes currently roiling Arizona.

DeVos met privately with more than 50 teachers who had been named 2018 teachers of the year in their states. As part of the discussion, teachers were asked to describe some of the obstacles they face at their jobs and were given the opportunity to ask the education secretary questions. 
DeVos also expressed opposition to teachers going on strike for more education funding, per video of the meeting obtained by HuffPost. DeVos made her comments after Josh Meibos, Arizona’s teacher of the year, asked her about when striking teachers will be listened to. In response, DeVos told Meibos that she “cannot comment specifically to the Arizona situation,” but that she hopes “adults would take their disagreements and solve them not at the expense of kids and their opportunity to go to school and learn.”

“I’m very hopeful there will be a prompt resolution there,” DeVos can be heard saying in the video. “I hope that we can collectively stay focused on doing what’s right for individual students and supporting parents in that decision-making process as well. And there are many parents that want to have a say in how and where their kids pursue their education, too.”

She continued, “I just hope we’re going to be able to take a step back and look at what’s ultimately right for the kids in the long term.”
When reading this, keep in mind that about three-quarters of America's teachers are women. So when DeVos tells teachers they shouldn't protest against receiving low wages, she's very much telling women to stop complaining that their pay is low compared to other professions for college-educated workers -- professions more likely to employ men.

It's also worth noting that DeVos is sticking to a set of talking points about the teachers strikes that she paid for.

Back to Washington:
We all know that black girls are disciplined more harshly for the same infractions as their white peers in schools (and life), but a new study shows that part of this disparity is linked to school-uniform policies.
The National Women’s Law Center recently looked at school dress codes in Washington, D.C., and found that black girls are unnecessarily and predominantly penalized under uniform rules.  
In fact, because humans in their unconscious and implicit biases are the ones who enforce rules around dress codes, it goes without saying that sexism, racism and traditional gender roles play a part.
According to the study, black girls were found to often be in violation of dress codes for so-called infractions like being “unladylike,” “inappropriate” or “distracting to the boys around them.”
Of course, no one should expect DeVos's Department of Education to investigate racial bias in school discipline anytime soon: her crew is too busy suppressing investigations. But while the intersection of sexism and racism makes these dress codes especially pernicious for girls of color, girls of all races are regularly made to feel ashamed of their bodies while in school.

Like in Florida:
Lizzy Martinez, 17, a junior at Braden River High School in Bradenton, Fla., had been swimming and tanning all weekend at a water park in Orlando. But when Monday morning came and she had to get dressed for school, Lizzy’s bra felt painfully constricting on her burned skin. 
So she ditched the bra and purposely chose to wear something dark and loose — a long sleeve, oversize, crew neck gray T-shirt — so she wouldn’t draw attention to her chest.
But around 10 a.m., about 15 minutes into her veterinary assistance class, Lizzy was called out of the classroom for a meeting with two school officials, Dean Violeta Velazquez and Principal Sharon Scarbrough. They asked her why she wasn't wearing a bra.
She said she told her school administrators about the sunburn. They insisted that she was violating the school dress code. (The 2017-2018 Code of Student Conduct does not say bras must be worn by female students.) They told her to put on an undershirt because boys were “looking and laughing” at her, a detail she later challenged. “No one said a thing to me until I got to the dean’s office,” Lizzy said. 
She was crying and wanted to go home, so Lizzy’s mother, Kari Knop, a registered nurse, was called at work. “I said, ‘Lizzy, I’m working,’” Ms. Knop said in a phone interview. “I told her, ‘Can you just put the undershirt on and call it a day?’” 
Lizzy was embarrassed and angry but she relented. When she returned wearing the undershirt, the school principal had left. The dean, according to Lizzy, instructed her to “stand up and move around for her.” 
“I looked at her and said, ‘What do you mean?’” Lizzy said. “I was a little creeped out by that.” The school has a strict disciplinary policy and she didn’t want to appear defiant. (School officials refused to comment, except in a statement.) 
The dean told her that her nipples were still showing through her T-shirt and she should use bandages to cover them up. “She told me, ‘I’m thinking of ways I could fix this for you.’ She said, ‘I was a heavier girl and I have all the tricks up my sleeve,’” Lizzy said.  
Lizzy was given four adhesive bandages from the school clinic. “They had me ‘X’ out my nipples,” she said.
Even if you have a conservative point of view on what is and isn't appropriate for students to wear at school... you can't tell me this story isn't creepy. But this is how we tell girls to think about their bodies now.

Another story from Michigan*:
With prom season in full swing, many teens attending schools with harsh dress codes are taking to social media to call them out. This week, one school in Michigan has decided to take their policies a step further with items that they’re calling “modesty ponchos,” and the students are not having it. 
Prom night at Divine Child High School in Dearborn, Michigan is set for May 12, and the school has already announced that they would be handing a colorful poncho-like piece of fabric to all of the girls who show up wearing something that the school deems too revealing, reports Fox 2 Detroit. A student told the news source that “teachers will determine whether what they’re wearing is compliant or not when they walk in the door.” She added, “I do believe the school has gone too far with this. As we walk into prom, we are to shake hands with all the teachers and if you walk through and a teacher deems your dress is inappropriate you will be given a poncho at the door.”
To be clear: I am not against schools setting some reasonable restrictions on student dress. No student, for example, should be allowed to wear clothing that has wording intended to denigrate others. Reasonable people can disagree about where the lines are. But there is, to my eye, a distinct odor of slut-shaming in many of these policies -- which goes a long way toward explaining the racist skew in how they're implemented.

So, what have we got going on in America's schools these days?

  • Girls can't use the bathroom when they have their periods.
  • Women teachers who file charges of sexual harassment are told they are "hyper-complainers."
  • Teachers -- again, most of whom are women -- are told their protests against making a pittance are "at the expense of kids."
  • Girls are told by school officials they need to cover up, because their bodies are too distracting.
America's schools are swimming in sexism. Both teachers and students suffer from the consequences of systemic misogyny.

Add to all this the hidden (and not so hidden) curricula in racism, homophobia, heteronormativity, Islamophobia, and so on...

You know, I don't know why a social conservative like Betsy DeVos is against public schools. They seem to be transmitting exactly the values she and her ilk hold so dear.

“I think that putting a wife to work is a very dangerous thing.” - Donald Trump

* OK, yes, Divine Child is a Catholic school. But it's not like the phenomenon of slut-shaming at the prom is restricted to private schools:

Prom is supposed to be the most magical night of your high school life — you get your hair and makeup done; you wear the gorgeous gown that makes your mom cry, "You're all grown up"; and you generally look flawless as you kiss good-bye to your awkward years. 
For these teens, prom was ruined when their outfits were banned. Check out their "inappropriate" and "immodest" choices to see for yourself that these girls look beautiful, no matter what their school says.
I don't have daughters, but if I did, I wouldn't have a problem with them wearing any of these outfits. Your mileage may vary, but that's the point: why is the school making these decisions? As one of the girls -- who is wearing what I would say is a very modest dress -- says:
"Maybe instead of teaching girls that they should cover themselves up, we should be teaching boys that we're not sex objects that they can look at."

By the way: #6 is infuriating. What is wrong with people?

Monday, April 30, 2018

Don't Blame Teachers For School Underfunding: A Data Tale From Jersey City

The animosity between NJ Senate President Steve Sweeney and the NJEA, New Jersey's largest teachers union, is already well-known. Add to that the rivalry between Sweeney and Jersey City mayor Steve Fulop, and Sweeney's desire to amend the state's school funding system... well, Sweeney's latest dig at Jersey City's teachers and board of education really shouldn't have surprised anyone:
In a statement issued Friday, state Senate President Stephen Sweeney blasted the Jersey City Board of Education for approving the agreement, which will increase district spending on teacher salaries by 3.31 percent during the current school year and 2.72 percent during 2018-19. The board approved the contract by a 5-1 vote Thursday night. 
Sweeney (D-Gloucester) said the Jersey City school district already receives more state funding than it should – district officials have dismissed this as untrue. Sweeney added that salary increases amid a $71 million shortfall in the district's proposed budget sends the wrong message to other schools.
"What makes it even worse is that the Jersey City Board of Education wrote a blank check that taxpayers in every other school district in New Jersey are going to have to reach into their pockets to pay," Sweeney said. "That's because Jersey City continues to get $151 million a year more in state aid than it would be receiving if the school funding formula was run fairly with the 10-year-old growth caps and Adjustment Aid eliminated." [emphasis mine]
Others have reported Sweeney claims Jersey City is over-aided by $174 million; let's stick with the lower figure for now to be conservative (you'll see why in a minute). Sweeney arrives at this figure because Jersey City, and several other districts, benefit from a provision in the School Funding Reform Act (SFRA) called "adjustment aid." This aid was included in the original 2008 law to mitigate the shock school districts might face when transitioning to the new formula; it keeps districts from falling below the level of aid they received prior to the new law. However, it has also led to some districts currently receiving more state aid than they would get if the provision weren't included.

Jersey City gets a lot of adjustment aid, which likely helps it keep its local taxes lower than they would be otherwise. To illustrate, I took this chart from the Education Law Center's website†:

There really is little doubt Jersey City should be contributing more local tax revenues toward its schools; whether it can at the moment, given the state's property tax cap, is an open question.* That said, and as ELC** points out in this brief, the district is still not getting all the funding it needs, from either the state or local sources, to provide an adequate education for its students.

Which makes Sweeney's statement even more interesting. Because his clear implication is that Jersey City is giving its teachers a big raise*** on the backs of other school districts, who don't get nearly as much state aid. But he's also claiming property taxes in Jersey City are artificially low, again because of an excess amount of state aid.

Is this possible? Is Jersey City so "over-aided" that it can afford big teacher salaries and low property taxes?

Again, I'll leave aside the question of taxes and instead focus on teacher salaries. Because I happen to have data available to take a reasonable stab at answering this question: Are Jersey City's teachers significantly overpaid compared to their colleagues in neighboring school districts? If not, is it really fair of Sweeney to call this recent contract irresponsible?

Let's start by looking at how much JC's teachers make compared to their colleagues in the other school districts in Hudson County (click to enlarge).

At first glance, when we look just at the average Jersey City salary compared to the rest of the county, it appears JC teachers are doing relatively well -- not spectacularly well, but well. Bayonne, Guttenberg, Weehawken and East Newark**** teachers seem to pay a serious wage penalty for not working in JC...

Or do they? One of the problems with simply comparing average (or even median) salaries is that it doesn't account for how teachers are paid in the real world. For example:

Like all public school teachers (and like many, many others in both the public and private sector), Hudson County teachers are paid more when they have more experience; this explains the upward slope of these lines, showing pay raises as teachers gain seniority. Jersey City (the dashed red line) gets its bump in pay slightly earlier in teachers' careers than most other Hudson County districts.

However, when JC teachers reach their 30th year, their pay is rather average. In fact, the best-paying district in Hudson County, accounting for experience, appears to be Hudson County Vo-Tech. Which, again, is interesting, given Sweeney's full-throated support for vo-tech schools.*****

Now, whether Jersey City is paying relatively more than other districts for its teachers also depends on how experience is distributed. So let's look at that next:

Jersey City does have a somewhat larger concentration of teachers with 15 to 19 years of experience; that might help explain a somewhat higher average salary for all JC teachers than other Hudson County districts.

But teacher pay doesn't just vary with experience. Earning an advanced degree leads to higher pay; living in a labor market that's more expensive, or that pays more for teachers relative to other professions, changes pay. Keep in mind: these factors are outside the control of both the Jersey City Board of Education and the Jersey City Education Association, the local union that negotiated the contract. It's ridiculous to think either party could buck trends and norms followed across the state.

So how can we determine whether Jersey City teachers are really "overpaid"? I've approached the problem using a regression model: a statistical technique that allows us to "hold things constant." Using seven years of data on every teacher in the state, I've tried to model how experience, full-time/part-time status, labor market, job description, highest degree earned, and other factors affect teacher pay (nerds, I give the details on the regression model below).

The model allows us to predict how much a teacher might earn, given all these factors. The amount above or below prediction (the residual) can't be explained by the variables in the model; we will assume, therefore, that this amount is how much each teacher is "over-" or "under-" paid, relative to other teachers in the state.

So: are Jersey City teachers way overpaid? Put simply: no, not really.
This is expressed as a ratio of actual salary over predicted salary; a ratio of "1" means the salary is exactly what the model predicts, so the teacher isn't "over-" or "under-" paid, given their experience, degree, labor market, etc.
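In case it helps to see the ratio concretely, here's a minimal sketch of the calculation. The salaries and predictions below are invented for illustration; they are not the actual staff data used in the analysis.

```python
import pandas as pd

# Hypothetical teachers: actual salary vs. what a statistical model
# predicts given experience, degree, labor market, etc.
df = pd.DataFrame({
    "district":  ["Jersey City", "Jersey City", "Bayonne", "Bayonne"],
    "salary":    [68000.0, 75000.0, 61000.0, 70000.0],
    "predicted": [66000.0, 74000.0, 63000.0, 71000.0],
})

# Ratio of actual to predicted salary: exactly 1.0 means the teacher
# is neither "over-" nor "under-" paid relative to prediction.
df["ratio"] = df["salary"] / df["predicted"]

# District-level "over-/under-payment," as in the chart.
by_district = df.groupby("district")["ratio"].mean()
```

In this toy example the Jersey City group comes out a bit above 1 (slightly "overpaid") and the Bayonne group a bit below -- the same kind of comparison the chart makes across Hudson County.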

In Jersey City in 2016-17, teachers (as a group) were paid about 3.7 percent more than prediction. That hardly makes them the most "overpaid" teachers in Hudson County: Harrison, Hoboken, Secaucus, and Hudson Vo-Tech teachers were all "overpaid" more than Jersey City school staff (again, this doesn't account for administrators, nor for staff without certificates).

Let me stop here and clarify something: I am deliberately putting "under-" and "over-" paid in quotes, because this model cannot account for many other factors that would affect teacher pay. It may be that Jersey City has to pay more to attract the same quality of teacher candidate for a variety of reasons that can't be measured. Maybe teacher candidates didn't want to teach in a district that was under state control for a quarter of a century. Maybe they've heard, as I have, that the state monitors have made staff feel unappreciated. Maybe the traffic sucks.

All I'm trying to do here is provide some sort of empirical analysis to determine whether there's evidence that Jersey City teachers are the beneficiaries of the "over-aiding" of the district. To that end: let's see what the "overpayment"****** of Jersey City teachers costs the district.

I could choose all sorts of denominators to use, but let's keep this simple: how much of the total appropriations of the Jersey City Public Schools can be attributed to the "overpayment" of teachers? About 1.3 percent.

But let's get back to Senator Sweeney's complaint: how much of the "over-aiding" of Jersey City gets gobbled up by the "overpayment" of Jersey City's teachers? About 6 percent -- that's barely a blip.
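As a back-of-the-envelope check, those two shares hang together. The three figures below come from this post; the implied budget total is my own arithmetic inference, not a reported number.

```python
# Figures from the post: Sweeney's lower "over-aiding" estimate and
# the two "overpayment" shares computed above.
over_aiding = 151_000_000        # dollars per year
share_of_over_aiding = 0.06      # ~6 percent of the over-aiding
share_of_appropriations = 0.013  # ~1.3 percent of total appropriations

# Implied dollar size of the teacher "overpayment":
overpayment = over_aiding * share_of_over_aiding  # about $9 million

# The total-appropriations figure implied by both shares together:
implied_budget = overpayment / share_of_appropriations  # about $700 million
```

In other words, we're talking about roughly $9 million in "overpayment" against a district budget on the order of hundreds of millions -- a blip either way you slice it.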

The idea that Jersey City's teachers substantially benefit from the "over-aiding" of the district is not supported by a reasonable analysis of the available data.

I'm going to run the risk of pissing off a few friends here, but let me put this on the table:

Senator Sweeney and I have a lot of disagreements. I was, like almost every other teacher in the state, extremely disappointed by his support of Chris Christie's attack on our pensions and health benefits. I think Senator Sweeney is dead wrong about the benefits -- and largely blind to the harms -- of the expansion of charter schools in Camden (call them whatever you want, they're charter schools). I also think Senator Sweeney is dead wrong on taxation.

That said: Steve Sweeney has valid concerns about New Jersey's state school aid formula. He is right to note that the growth caps have got to be addressed. He is right to state that communities like Jersey City ought to be contributing more toward the funding of their schools. He is right to champion the districts in this state that are often overlooked in the debate over school funding, yet whose students are suffering real harm due to inadequate funding.

So I'm willing to take Steve Sweeney at his word. I do believe he is concerned that there are students in New Jersey school districts who are suffering right now because they can't get adequate funding for their schools.


The idea that the students of Bayonne are being denied an adequate education because of the greed of the teachers of Jersey City is just plain wrong.

There is no evidence Jersey City teachers are wildly overpaid. There is no evidence the small bump JCEA members enjoy in their wages is a major part of the "over-aiding" of the district. I understand NJEA gave Sweeney a few bruises. But making arguments that pin the blame for the underfunding of New Jersey schools on Jersey City's teachers is not helpful in the slightest.

Look, schools cost what they cost. If you want certain outcomes, you have to pay for them (we need to have a good long talk about this idea soon...). By the state's own formula, Jersey City's schools are not over-funded.

In addition: if you want good teachers, you need to pay good wages. New Jersey actually underpays its teachers relative to the rest of the labor market. If Jersey City is paying its teachers a bit more, that's a good thing. Why come down on the district for trying to get good people to come into the profession?

Senator Sweeney, instead of slamming Jersey City's teachers for standing up for themselves and demanding decent pay...

Why don't we instead work to get all districts the funding they need to bring the best and the brightest into New Jersey's classrooms?

For the record: I am a proud NJEA member, and I am proud to stand with my fellow public school teachers in Jersey City, and everywhere else in the state.

* I really don't want to wade into this on this post, because, to be honest, I just haven't had time to look at it carefully. But some, like Jeff Bennett, argue Jersey City could increase its revenues without the state raising its property tax cap. Bennett (who, despite our policy differences, I genuinely respect) has told me Jersey City hasn't even raised its tax rates as high as it could under the current cap. I have no reason to doubt Jeff, but I haven't looked into the topic myself. 

** For the record: I have done work as a contractor for ELC in the past.

*** Something worth noting: when you see a report that teachers are getting a "... 3.31 percent during the current school year and 2.72 percent during 2018-19," understand that doesn't mean all of the teachers are getting more money. Public schools operate on salary guides, which provide a raise for every year of service up until a final "step." You need to add money into a guide like that just to maintain it. So those at the "top of the guide" might actually be getting no raise, depending on how the guide is structured.

Teacher salary guides are a really complex topic; maybe I'll try to get to it at some point...

**** Actually, the East Newark data for 2016-17 looks off because a lot of the teachers who should be 1.0 full-time equivalents are listed as 0.1 FTEs. I tried as best as I could to clean up this rather obvious mistake.

***** To be clear: I join with Senator Sweeney in supporting vo-tech programs and schools. More Vo-Tech!

I just don't understand why the senator is complaining about Jersey City teachers getting a raise when they make less than teachers at the county's vo-tech school. Why isn't he blaming them for underfunding elsewhere? (OK, he shouldn't, but you get my point, right?)

****** Yes, these quotes are stupid. You have a better idea?

The Regression Model:

I have a panel of certificated staff data from 2010 to 2017. 2013 is excluded because some of the teacher characteristics data weren't included. The model I use is:
salary = f(prior_exp_years FTE i.highest_ed_comp i.metajobcode i.lmencode i.data_year i.charter charter#data_year logEnroll)

  • prior_exp_years: Total years of experience, in and out of NJ or the district.
  • FTE: Full-time equivalency.
  • highest_ed_comp: Highest degree earned.
  • metajobcode: Job description, divided into larger categories (e.g., all science teachers bundled together)
  • lmencode: Labor market; I used counties. 
  • data_year: The year. 
  • charter: Whether the school is a charter. I know some of you might push back a bit, but the fact is a teacher suffers a wage penalty for working in a charter. Given that reality, it's not rational to expect Jersey City teachers to make charter school wages; in fact, there is a very good case to be made that JCPS teachers are propping up the city's burgeoning charter sector through wage free-riding.
  • charter#data_year: Given the volatility of the state's charter sector, interacting it over time seemed reasonable. 
  • logEnroll: OK, so this one had me thinking. We know for a fact that school districts enjoy economies of scale. It may well be those districts then use the savings to recruit more desirable teacher candidates, or make up for recruitment hardships that can't be measured. It may also, however, be that larger districts create larger teachers unions, which leverage more bargaining power. But do districts really have much control over how big they are? Hmm... Ultimately, I kept this in the model because it matters -- but I'm open to debate. In any case, removing it does up the "overpayment" ratio for Jersey City, but only to about 1.06. That's not enough to make a serious dent in the amount JC is "over-aided."
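For readers who want to see the shape of a model like this in code, here's a minimal sketch using Python's statsmodels formula interface. The variable names follow the post, but the data are synthetic, metajobcode is omitted for brevity, and this is emphatically a sketch of the technique -- not my actual analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the certificated-staff panel: the values are
# made up, but the column names mirror the model described above.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "prior_exp_years": rng.integers(0, 35, n),
    "FTE": rng.choice([0.5, 1.0], n, p=[0.1, 0.9]),
    "highest_ed_comp": rng.choice(["BA", "MA", "PhD"], n),
    "lmencode": rng.choice(["Hudson", "Essex", "Bergen"], n),
    "data_year": rng.choice([2015, 2016, 2017], n),
    "charter": rng.choice([0, 1], n, p=[0.85, 0.15]),
    "logEnroll": rng.normal(8, 1, n),
})

# Fake salaries built from known effects (experience premium, degree
# premiums, charter wage penalty) plus noise.
df["salary"] = (
    30_000
    + 1_200 * df["prior_exp_years"]
    + 20_000 * df["FTE"]
    + 5_000 * (df["highest_ed_comp"] == "MA")
    + 8_000 * (df["highest_ed_comp"] == "PhD")
    - 6_000 * df["charter"]
    + rng.normal(0, 3_000, n)
)

# OLS with categorical (the "i." prefixes) terms and the
# charter-by-year interaction, mirroring the formula above.
model = smf.ols(
    "salary ~ prior_exp_years + FTE + C(highest_ed_comp) + C(lmencode)"
    " + C(data_year) + C(charter) + C(charter):C(data_year) + logEnroll",
    data=df,
).fit()

# The residual-based "over-/under-payment" ratio for each teacher:
df["ratio"] = df["salary"] / model.fittedvalues
```

Because the synthetic data were built with a $1,200-per-year experience premium, the fitted coefficient on prior_exp_years lands near 1,200, and the average ratio sits near 1.0 -- which is exactly the "hold things constant" logic the real model relies on.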
Bad mistake in the original post: I inadvertently put Newark's LFS v. Levy chart up, not Jersey City's. Sorry about that -- correction made.

Monday, March 19, 2018

The Facts About NJ Charter Schools, Part III: Segregation By English Proficiency

This week, I'm going into detail about a new report on New Jersey's charter schools I wrote with Julia Sass Rubin. In the last post, I showed conclusively that the charters enroll proportionally fewer special education students. In addition, the classified students charters do enroll tend to have less costly learning disabilities. This puts both fiscal and educational pressure on public district schools, which are forced to subsidize charters at the same time they must provide an education to students with special needs.

One of my more tenacious commenters keeps trying to make the case that the reason charters don't enroll as many special needs students is that they declassify special education students at higher rates than public schools. But there is no empirical evidence I am aware of to support this claim. Further, as I've shown before, NJ public district schools spend much more on the support services special education students need than charters. In addition, there are more support staff per pupil in the public schools than in the charters. All the evidence suggests the student populations of charters and public district schools are different.

I don't know why anyone would be surprised by this. The entire theory of charter schools is that they will enroll students who are a good "fit." Why, then, would we be surprised that the charter student populations aren't like the public school populations? Isn't that the entire point?

Keep this in mind as we now look at the differences in English language proficiency between public and charter school students.

Year after year, New Jersey's public district schools enroll many more Limited English Proficient (LEP) students proportionally than the charter schools.

Again, you can try to make the case that this is because the charters remove LEP classification more than public, district schools. But there's no evidence to back up that claim. Further, there is a significant incentive for charters to have students retain their LEP classification, as charter schools receive more funding if they have more LEP students.

The idea that charters are so much better than public district schools at teaching LEP students that they can immediately remove their status, even at a financial disincentive, flies in the face of logic. It's also contradicted by one of the other arguments charter cheerleaders often try to advance: that the difference in LEP classification in some cities is due to the location of the charters.

It is certainly true that the charters often tend to cluster in neighborhoods with smaller Hispanic populations  -- that is likely the explanation for the difference in LEP populations in Newark. But so what? The charters chose to locate in those neighborhoods -- now the district has to pay the costs of educating a concentrated LEP population. Considering that a district like Newark has been underfunded for years while the charters are "held harmless," this remains a serious problem.

Finally, let's consider individual communities, and how their charter sectors differ from public school districts:

As I've noted before, the racial profile of Red Bank Boro -- where the disparity in LEP percentage is the greatest in the state -- is very different than the profile of the area's charter schools:

The idea that the huge disparity in LEP rates between Red Bank Boro and the students attending charters* can be explained by either LEP declassification or location of the school is very hard to defend when it's clear that far more white students proportionally attend the local charter school. The much more plausible explanation is that "choice" has led similar families to "choose" the same schools. This lines up with a growing body of evidence that shows that parents rely on their social networks to navigate a "choice" system.

All this said, look at some of the districts at the bottom of the table. In North Plainfield and New Brunswick -- communities with high rates of LEP classification -- the charter schools, as a group, actually enroll more LEP students.

As with special education classification rates, the data here show that the charter sector could be enrolling more LEP students. But why doesn't it? If charters are serving more LEP students in New Brunswick and North Plainfield, why aren't they serving at least a similar rate of LEP students in Jersey City or Morris or Passaic or Trenton?

I would suggest the data here show that it's at least possible for charters to enroll more LEP students. Where, then, has the state been for the last decade? Why isn't it demanding better from the entire sector?

I'll talk about disparities between NJ charters and public district schools in socio-economic status next.

* To be clear: the disparity chart does not only include students who attend the local charter school; it counts all students who reside in the district but attend a charter anywhere in the state. So the "Charter LEP %" figure will not be the same as the local charter school(s) percentage.

Saturday, March 17, 2018

The Facts About NJ Charter Schools, Part II: Segregation By Special Education Need

In this series of posts, I'm breaking down a new report by myself and Julia Sass Rubin on New Jersey's charter schools. State data shows one incontrovertible truth:

New Jersey's charter schools enroll far fewer students proportionally who have learning disabilities, or who are Limited English Proficient, when compared to their hosting districts.

Here's a graph that shows this quite clearly:

Oh, sorry -- this graph isn't from our research. This graph is from a report published by the New Jersey Charter Schools Association, the state's biggest charter advocacy group.

Let me clean it up a bit for you...

It is, of course, completely inappropriate to use the same scale for measures that are as different as racial composition and special education classification. I would make my grad students resubmit their work if they ever tried to pull a stunt like this.

Still, you can clearly see that, according to the state's biggest charter cheerleaders, NJ charter schools enroll far fewer students proportionally who are classified with a learning disability, or who are English Language Learners.

Let's look at this in a more appropriate way. This graph is from our report (for real this time):

In our report, we compare all of the charter school students residing in a school district to the resident students who attend the public district schools. This method allows us to compare a community's charter students to its district students -- no matter where the charter students attend school. (I'll discuss this method in more detail later in this series.)
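To make the method concrete, here's a minimal sketch of the residence-based comparison described above. The records and field names are entirely hypothetical -- this is not the actual NJDOE file layout -- but the logic is the same: group students by the district they live in, not by where their school sits.

```python
# Hypothetical student records; field names are illustrative only.
# "residence" is the district the child lives in; "sector" is the kind
# of school the child attends.
students = [
    {"residence": "Newark", "sector": "district", "classified": True},
    {"residence": "Newark", "sector": "district", "classified": False},
    {"residence": "Newark", "sector": "district", "classified": False},
    {"residence": "Newark", "sector": "charter",  "classified": False},
    {"residence": "Newark", "sector": "charter",  "classified": False},
]

def classified_pct(records, residence, sector):
    """Percent of classified students among those residing in `residence`
    who attend `sector` schools -- for charters, regardless of where in
    the state the charter school itself is located."""
    group = [r for r in records
             if r["residence"] == residence and r["sector"] == sector]
    return 100.0 * sum(r["classified"] for r in group) / len(group)

district_pct = classified_pct(students, "Newark", "district")  # ~33.3
charter_pct = classified_pct(students, "Newark", "charter")    # 0.0
disparity = district_pct - charter_pct
```

The key design choice is that the charter group is defined by residence, which is what lets us compare a community's charter students to its district students even when the charter school is in another town.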

These findings are beyond question -- and they raise some serious issues. Even Chris Christie acknowledged that it costs more to educate a child with a learning disability; this particular fiscal burden falls hard on public district schools when charterization concentrates their proportion of classified children. It also makes comparisons between the academic outcomes of charters and district schools meaningless unless this disparity is accounted for.

The problem with most attempts to do this -- like the NJ CREDO study, which was commissioned by the state -- is that the statistical models employed use data wholly inadequate to the task. These data divide students into two groups: those with a learning disability, and those without. The problem is that classified students can have very different disabilities, and, consequently, very different educational needs.

As I've noted before, some disabilities, such as speech or "specific learning disabilities" (SLDs), are relatively low-cost. Others have a much higher cost. Guess which students are more likely to enroll in the charters?

The special needs students who are enrolled in NJ charters tend to have lower-cost disabilities than those in district schools. This analysis differs somewhat from the one above (see the report for details), but it matches our previous work. We're using 2016 data here; in that year, the state did not suppress data as it has in other years.*

For a long time, charter cheerleaders have claimed -- with no empirical evidence -- that their special education rates are lower because their superior instruction and organization make special education classification unnecessary. The chart above directly refutes this. It's much less difficult to change the classification of a student with a speech or SLD disability than one with a traumatic brain injury, or blindness, or autism. If charters dissolve classified students' individualized education programs (IEPs) at higher rates than public district schools, it's only because the classified charter students have, on average, less profound disabilities than district students.

I've heard some make the case that school districts often place special needs students in specialized, out-of-district private schools, and that this is functionally no different than allowing students to enroll in charters. But that's a ridiculous argument on its face. When a school board makes a decision about an out-of-district placement, they make the decision, and they figure out how to pay for it. Charter school enrollments, on the other hand, are foisted upon school districts by the state with no ability for the district to approve or regulate the enrollment.

In other words: the state makes the decision to approve a charter school, but the district has to pay for it. Worse, if the district isn't where the charter is located, they don't even have the right to appeal the decision. If students in your town want to enroll in a charter school 20 miles away, you don't get any say in the matter -- your town has to pay for it, no matter the fiscal or educational harm.

And again: those students who enroll are less likely to have special education needs... most of the time:

This table shows the disparity between the charter population and the district population in the proportion of classified students for each population.** In North Plainfield, for example, 18.5 percent of the district's students are classified -- but none of the resident students who attend charters are listed as having a special education need. That disparity is the largest in the state.

But here's what's interesting: there are, in fact, districts where the charter and district student populations have similar proportions of special needs students. In fact, in New Brunswick, more classified students attend the charters, proportionally, than the public district schools. Keep in mind that, as we show above, the charter students in New Brunswick have less costly disabilities. This is a problem because the charter school funding formula treats all classified students, with the exception of students with a speech disability, the same in terms of the funds transferred to charters.

Still, New Brunswick shows that many of the other local charter school sectors could be enrolling more special needs students. So why don't they? Why are so many charter schools not stepping up and enrolling more special needs students -- even those with the least costly learning disabilities?

The charter sector has been promising for some time that it will start educating more children with special needs. Some charters do -- but many clearly do not. And why would they, when the state has refused for years to hold them to account? Why would they, when they could count on renewals and approvals for expansions even though it was obvious they were engaging in segregation by special education need?

During the Christie administration, the state turned a blind eye toward the segregation by special need that accompanies charter school expansion. Yet the data on this are so clear that not even the NJCSA disputes them. The Murphy administration, the NJ Legislature, and the NJDOE have got to start acknowledging this and come up with a plan to address it.

There's another student population NJ's charters have underserved, even more than special needs children: English language learners. We'll discuss that next.

* The data was suppressed in 2015 but not in 2014. I have no idea why. You can tell the data is not suppressed because there are many cells that have values between 1 and 9, even though in other years the cells were suppressed when less than 10.
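The check described in this footnote can be sketched in a few lines. This assumes the suppression rule simply masks any count below 10, which is my reading of the files, not an official specification:

```python
# Rough heuristic, assuming suppression masks any count below 10:
# if a column of raw counts still contains values between 1 and 9,
# the file was probably not suppressed.
def looks_unsuppressed(counts):
    return any(1 <= c <= 9 for c in counts)

print(looks_unsuppressed([0, 3, 42, 7]))  # small cells survive -> True
print(looks_unsuppressed([0, 12, 42]))    # no cells in 1-9 -> False
```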

** In the report, we limit the districts studied to those enrolling at least 50 students.

Thursday, March 15, 2018

The Facts About NJ Charter Schools, Part I: Prelude

This is long overdue:
New Jersey's new governor will consider changes to the state's charter school law, potentially slowing the expansion of controversial, yet in-demand schools championed by former Gov. Chris Christie.
The state on Friday announced a "comprehensive review" of its charter school law, fulfilling one of Gov. Phil Murphy's campaign promises after an era of rapid school choice growth.
The next week, Murphy clarified his position:
Gov. Phil Murphy's administration is about to scrutinize charter school law, but that doesn't mean he has it out for charter schools, Murphy said Monday. 
"I have never been nor will I be 'hell no' on charters," the Democratic governor said during a radio appearance on New Jersey 101.5-FM. "I just don't like the way we've done it." 
"If a school is high performing and kids are doing really well based on an objective set of facts, count me as all in," Murphy said. [emphasis mine]
So we need "an objective set of facts," huh? Well, Governor, I've got just the thing with which to start...

This week, Julia Sass Rubin, Professor at Rutgers University in the Bloustein School of Planning and Public Policy, and yours truly released a new report: New Jersey Charter Schools: a Data-Driven View, 2018 Update. The report was funded by The Daniel Tanner Foundation, which funded our 2014/2015 series of reports on New Jersey charter schools.

If the reaction to this latest report is anything like the reaction to the previous series, you're probably going to see some serious pushback to our work over the next few weeks. So I want to spend the next few posts here going over exactly what Julia and I did in this report, and why we both believe Governor Murphy is correct in wanting to give serious thought to overhauling New Jersey's charter school laws and regulations.

But let me start with an overview:

- A lot of money is transferred away from public district schools to New Jersey charter schools.

This graph didn't make it into the final report, but it's still instructive. Year after year, charter schools are taking a larger share of the state's total school funding. This is highly problematic, as charter schools create redundant systems of school administration. Yet the state has not bothered to take a serious look at what this means for the overall fiscal health of NJ's public school system.

- The effects of charter proliferation in New Jersey are much more widespread than commonly reported.

The discussions around New Jersey charter schools mostly focus on their impacts in places like Newark and Camden. Unquestionably, these are the communities that feel the effects of charter school growth the most -- but they aren't the only ones. There are charter schools in New Jersey that draw from over 40 different districts, which means the fiscal effects of charter growth are felt in public school districts all over the state.

- NJ charter schools do not enroll as many students with special education needs as public, district schools.

This data actually mirrors similar data presented by the New Jersey Charter School Association. It's a simple fact: the students in the charters are much less likely to be classified as having a learning disability compared to those in the public district schools. It amazes me that anyone would try to argue this point.

- NJ charter schools do not enroll as many students who are Limited English Proficient (LEP) as public, district schools.

Again, it's pointless to argue about this. This is the state's own data, and the pattern is very clear.

- There is wide variation in the differences in student socio-economic status between NJ's charter and district schools.

There are communities where the charter student population has close to the same proportion of free lunch-eligible students as the public school district. But there are many places where the charter population is very different compared to the district school population. In some places, the charters enroll many more FL students; in some places, the charters enroll far fewer FL students. Both of these situations are cause for concern.

It's also worth noting that free lunch-eligibility may be an increasingly unreliable measure of student socio-economic status. If we care about the segregative effects of charter schools, we need to start collecting better data.

Again, I'll get into these individual points over the next few posts. But let me conclude this introductory post with this thought:

In New Jersey, a local community has no say in whether it has to pay for resident students to attend charter schools. This includes many towns where charters are not located. If a resident family in your school district wants their child to attend a charter miles away in a town that isn't close to yours, your town's taxpayers must still come up with the money to subsidize that "choice."

In other words: The power to approve, regulate, and expand charter schools is not aligned with the fiscal burdens of paying for those charters.  This is a serious problem that must be addressed in any future legislative overhaul.

Much more to come -- stand by...

Tuesday, March 13, 2018

Betsy DeVos's Florida Fantasy

Betsy DeVos's incoherent mess of an interview with Lesley Stahl (who, I thought, did a good job) on 60 Minutes Sunday night has got to be one of the most embarrassing performances by a sitting cabinet member in modern times.

I'm not sure my stomach could take a complete debunking of all of DeVos's nonsense. But I would like to focus on one exchange:
Lesley Stahl: Why take away money from that school that's not working, to bring them up to a level where they are-- that school is working?
Betsy DeVos: Well, we should be funding and investing in students, not in school-- school buildings, not in institutions, not in systems.
Lesley Stahl: Okay. But what about the kids who are back at the school that's not working? What about those kids?
Betsy DeVos: Well, in places where there have been-- where there is-- a lot of choice that's been introduced-- Florida, for example, the-- studies show that when there's a large number of students that opt to go to a different school or different schools, the traditional public schools actually-- the results get better, as well. [emphasis mine]
Stahl goes on to ask about how choice has worked in Michigan -- as well she should. But I'd like to take a minute or two to examine what DeVos thinks she knows about Florida and school "choice."

Any time a policymaker talks about what "the studies show," watch out. Some of them are adept at reading and synthesizing research, but many are not; too often, they let their staffs, who tend to have cursory training in research methods (especially quantitative methods), assemble the evidence so that it matches their ideological predilections.

In DeVos's case, it's been clear for years that she supports school "choice," no matter what "the studies show." DeVos has publicly admitted that her advocacy for school vouchers is driven by her religious faith. In a way, that's no different from Milton Friedman's voucher advocacy, which also was based not on empirical evidence, but rather on ideology.

The theory behind school "choice" has largely rested on the notion that competition forces improvements in schools; therefore, if we want to improve public, district schools, we should threaten them with losses of enrollments by introducing "choice" through a market-based system subsidized by taxpayers.

So, what do "the studies show" about school vouchers in Florida? Let's ask Patrick Wolf, writing here with Anna Egalite. Wolf is well-known within education policy circles as one of the foremost advocates for school "choice" in academia:
The competitive effects of Florida’s various voucher programs have been the subject of nine studies. All of them reported that the test scores of students who remained in public schools increased as a result of school choice competition. Although these positive effects of competition on public school achievement tended to be small, they were larger when school choice increased dramatically (Forster, 2008a). [emphasis mine]
I'm going to object to that last phrase. Forster's study -- which was not peer-reviewed, and has some serious methodological weaknesses (some of which are recounted in a review of a related study here) -- is hardly enough evidence to suggest that expanding school choice leads to better outcomes in public schools. Forster's study really does nothing to control for all kinds of things aside from vouchers that might explain rising test outcomes.

That said, Egalite and Wolf are right: the effects of competition on Florida public schools were small.

Really small.

Let's take, as our best available example, the latest study on Florida's vouchers they reference: a 2014 peer-reviewed paper by David Figlio and Cassandra Hart. It's a clever piece of econometric work; not airtight by the authors' own admission (what is?), but still well worth considering. The authors exploit the fact that competition from vouchers varied considerably across Florida during the period under study: some schools, for example, have a private school nearby, while others have one further away. Some areas have a variety of private schools; some have only one type (say, evangelical).

Figlio and Hart looked at how this competition varied and correlated with outcomes for public schools. What did they find? If you're not used to reading this sort of research, it might be hard to grasp, so let's break it down:
Every mile the nearest private school moves closer, public school student test score performance in the post-policy period increases by 0.015 of a standard deviation.
The study excluded schools that were more than five miles from a private school, and the average public school was about 1.3 miles from a private school. What this study found was that if you put a private school a mile closer to a public school that already had one within five miles, the test scores at the local public school would increase from the 50.0 percentile to the 50.6 percentile.

Not impressed? Try this:
Adding 10 nearby private schools (just shy of a standard deviation increase in this measure) increases test scores by 0.021 of a standard deviation.
The average public school had 15.4 private schools within five miles. Add 10 more and you'll move the school from the 50.0 percentile to 50.8.
Each additional type of nearby private school is associated with an increase of 0.008 of a standard deviation. Adding an additional 100 churches in a 5 mile radius (a nearly one standard deviation increase) is associated with a 0.02 standard deviation rise in scores, and adding an additional 300 slots in each grade level in a 5 mile radius (just over a 1 standard deviation increase in this measure) increases scores by 0.027 standard deviations. Overall, a 1 standard deviation increase in a given measure of competition is associated with an increase of approximately 0.015 to 0.027 standard deviations in test scores. [emphasis mine]
In other words: increasing the competition measures by what is a very substantial amount results in moving test outcomes from the 50.0 percentile to between the 50.6 and 51.1 percentile. This is the most generous interpretation using this conversion.
While these estimated effects are modest in magnitude, they are precisely estimated and indicate a positive relationship between private school competition and student performance in the public schools even before any students leave the public sector to go to the private sector.
Well, yes, they are precisely estimated -- that's easy to do when you have a really big data set with over 9 million student-year observations.

But these results are not "modest" -- they are tiny. They represent no meaningful educational impact. To say, as DeVos does, that "the results get better" is just not accurate in any practical sense.
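The percentile figures above come from the standard normal CDF: start a student at the 50th percentile, add the effect in standard deviations, and see where they land. A quick stdlib sketch of that conversion (assuming normally distributed scores):

```python
import math

def sd_to_percentile(effect_sd):
    """Percentile an average (50th-percentile) student moves to after a
    gain of `effect_sd` standard deviations, assuming normal scores.
    Uses the identity Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))."""
    return 100.0 * 0.5 * (1.0 + math.erf(effect_sd / math.sqrt(2.0)))

# Effect sizes quoted from the Figlio and Hart (2014) excerpts above:
p_low = sd_to_percentile(0.015)   # ~50.6
p_mid = sd_to_percentile(0.021)   # ~50.8
p_high = sd_to_percentile(0.027)  # ~51.1
```

Run it and you get the 50.6-to-51.1 percentile range cited in the text: movements of well under one percentile point.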

Now, voucher proponents could make the case, based on this study, that there is no evidence that the schools got worse. But I think that argument fails for at least a few reasons:

First, test score gains or losses are a very poor measure of whether a public school suffers fiscal stress due to the diversion of funds. Instruction in tested subjects would be the last thing a school district cuts if it's under competitive and accountability pressure. The question is what happens to instruction and programming not related to tests: extracurriculars, arts, history, science, student support services, etc. The truth is, we just don't know.

Second, if "choice" is introduced as a substitute for things like adequate and equitable funding, the overall progress of the system will be impeded. The sad fact is that the "Florida Miracle" has been grossly oversold; the state is a relatively poor performer compared to other states that make more of an investment in public education. Can that all be attributed to policy? No, of course not... but Florida is a state that makes little effort to fund its schools.

In any case, DeVos's contention that public, district schools see improvement when there is competitive pressure is just not borne out in any practical sense by research like this. As I said in my last post, the effect sizes of things like this are almost always small. In this case, the effect is exceptionally small; in practical terms, it's next to nothing.

The idea that we're going to make substantial educational progress by injecting competition into our public education system just doesn't have much evidence to support it. I wish I could say that conservatives like DeVos were the only ones who believe in this fallacy; unfortunately, that's just not the case. Too many people who really should know better have put their faith in "choice," rather than admitting that chronic childhood poverty, endemic racism, and inequitable and inadequate school funding are at the root of the problem.

As always: I'm not saying we can't and shouldn't improve our public schools right now as best as we can. But DeVos's policy of expanding Florida-style school choice as a way of improving public schools makes as little sense as her policy of arming teachers to improve school safety. Neither policy has any empirical support, because both are clearly illogical.

School "Choice's" Best Friends

Monday, March 5, 2018

Things Economists Should Start Saying About Education Research

If there's one thing I find helpful about Jonathan Chait's work, it's that every now and then he gives us a "State Of The Reformy" piece that serves as a useful encapsulation of the current arguments among the neoliberal set for education "reform."

Chait's piece this time is especially notable because it's an explicit attempt to distance the Obama administration's education policies from those being pushed by the conservatives who have been emboldened by Donald Trump's win and, subsequently, Secretary of Education Betsy DeVos's rise to power.

Desperately, Chait wants to convince us that the agenda pushed by Arne Duncan and John King, Barack Obama's SecEds, represented some sort of middle ground between the hard-right's dream of privatizing education, and the left's indifference to the "failure" of American schools (allegedly a direct result of the vast influence and vast perfidy of teachers unions).

Chait's political argument is so silly it's almost not worth addressing: thankfully, Peter Greene, once again, does most of the work so I don't have to. The fact is that Chait makes sweeping generalizations about the left (and, for that matter, the right) that are absurd [emphases mine]:
"Left-wing policy supports neighborhood-based public schools, opposes any methods to measure or differentiate the performance of teachers or schools, and argues instead for alternatives to school reform like increased anti-poverty spending or urging middle-class parents to enroll their children in high-poverty schools."
"Unions that oppose subjecting their members to any form of measurement joined forces with anti-government activists on the right to protest Common Core and testing."
Chait, of course, gives no examples of unions not wanting to hold teachers accountable for their practice -- because the notion is nonsense. The unions have never -- never -- held the position that bad teachers should be allowed to continue to teach without any remediation or consequence. What they have insisted, quite correctly, is that there be due process in place as a check against abuses of power, so that the interests of students and taxpayers, as well as teachers, can be protected.

Now, as much fun as it might be to knock down all of Chait's straw men, I'd like to instead focus on something else from his piece. Because Chait, like all education policy dilettantes, likes to dress up his arguments with references to education research -- specifically, research conducted by economists. Throughout his piece Chait includes links to a variety of econometric-based research, all purporting to uphold his claims for the efficacy of reformy policies.

I have no problems with economists. Literally, some of my favorite people in the world are economists. And I enjoy a good regression as much as the next guy. Useful work has been done by economists in the education field. I can honestly say my thinking about things like charter schools and teacher evaluation has been shaped by my study of econometric research into those topics.


It has been my observation over the years that economists working in education have not been as forthcoming as they should be about the limitations of their work. And this has led pundits and policymakers, like Jonathan Chait -- and, for that matter, Arne Duncan and John King -- to draw conclusions about education "reform" that are largely unsupportable.

Chait's piece here is an excellent example of this problem. So allow me to take a pointed stick and poke it into the econometric beehive; here are some things everyone should understand about recent research on things like charter schools and teacher evaluation that too many economists never seem to get around to mentioning.

* * *

- Charter school lottery studies are not "perfect natural experiments." The economists who conduct these studies are often quite eager to tout them as "exactly the research we need" to make policy decisions about the effects of charter school proliferation. I am here to tell you in no uncertain terms: they are not.

The theory behind charter lottery studies is that the randomization of the lottery controls for all unobserved (better understood as unmeasured) differences between students that might account for differences in the effects being studied. In the case of charter schools, we might assume (quite correctly) that different parents approach enrolling students into charters in different ways.

Parents who care more about their child's outcomes on tests, for example, may be more likely to enroll their child in a charter school if their local public school has low test scores. These parents may be more diligent about making sure their child completes homework or attends school, which could lead to higher test scores.

The economists who conduct these studies are assuming, because assignment to charter schools is random in lotteries, that the differences in these unobserved characteristics of students and their families will be swept away by their experiment. There are some other assumptions built into this framework, but it's generally a reasonable theory...

Except it only applies to students who enter the lottery. If students who enter charter lotteries under one set of conditions differ from students who enter under another -- and there is plenty of reason to believe that they do -- we can't generalize the findings of a charter lottery study to a larger population. In other words: even if we find an effect for charter schools, we can't know that effect will be the same if we expand the system.
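The generalization problem can be shown with a toy simulation. Every number here is invented purely for illustration: suppose "applier" families enter the lottery and their children gain 5 test points from a charter, while "non-applier" families never enter and would gain nothing.

```python
import random

random.seed(42)

# Invented numbers: +5 points for appliers' children in a charter,
# 0 for non-appliers, with noisy test scores around a 500 baseline.
def score(true_effect, in_charter):
    return 500.0 + (true_effect if in_charter else 0.0) + random.gauss(0, 20)

# The lottery randomizes only among appliers...
treated = [score(5.0, True) for _ in range(10000)]
control = [score(5.0, False) for _ in range(10000)]

mean = lambda xs: sum(xs) / len(xs)
lottery_estimate = mean(treated) - mean(control)  # recovers ~+5 -- for appliers

# ...but if the sector expanded to a population that is half
# non-appliers (true effect 0), the real average effect is only +2.5.
population_effect = 0.5 * 5.0 + 0.5 * 0.0
```

The lottery estimate is internally valid -- it really does recover the effect for families who applied. What it can't tell you is what happens when the system expands to families who never would have entered a lottery in the first place.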

Further, we can only generalize the results of lottery studies to charter schools that are popular enough to be oversubscribed. In other words: if there's no lottery because student enrollment is low, we can't conduct the experiment. In addition, we are starting to get some evidence that charters, which have redundant systems of school administration and often can't achieve economies of scale, are putting fiscal pressure on their hosting public district schools.

While these points are sometimes mentioned in the academic, peer-reviewed papers based on these studies, I rarely see them acknowledged when economists discuss these studies in the popular press. Nor do I see any discussion of the fact that...

- Studies of education "reforms" often have very fuzzy definitions of the treatment. The treatment is, broadly, the policy intervention we care about. For charter schools, the treatment is taking a school out of public district control and putting it under the administration of a private, non-state entity. The problem is that there are often differences between charters and public district schools that are not what we care to study, and these differences often get in the way of the things we do care about.

Here's a completely hypothetical pie chart from an earlier post. Let's imagine all the differences we might see between a charter school and a counterfactual public district school.

We know charter school teachers generally have a lot less experience than public district school staff, which makes staffing costs cheaper. There is almost certainly a free-rider problem with this, leading to a fiscal disadvantage for public district schools. But that's good for the charters: they can extend their school days and still keep their per pupil costs lower than the public schools.

But is this a treatment we really want to study? Shouldn't we, in fact, control for this difference if what we want to measure is the effect of moving students to schools under private control? Shouldn't we control for peer effects and attrition and resource differences if what we really care about is "charteriness"?

The economists who conduct this research often refer to their treatment as "No Excuses." What they don't do, so far as I've ever seen, is document the contrast in the implementation of their treatment between charter schools and counterfactual public district schools. In other words: do we really know that "No Excuses" practice varies significantly between charters and public schools?

A lot of the research into charter school characteristics is, frankly, cursory. Self-reported survey answers with a few videos of a small number of charters in one city is not really enough qualitative research to give us a working definition of "No Excuses" -- especially when there's no data on the contrasting schools that supposedly don't adhere to the same practices.

So, no, we can't attribute the "success" of certain charter schools to their practices or organization -- at least, not based on these econometric studies. And we really need to step back and think about what we're using to define "success"...

- Test scores have inherent problems that limit their usefulness in econometric research. I keep a copy of Standards for Educational and Psychological Testing within arm's reach when I review education research that involves testing outcomes. And I have Standard 13.4 (p. 210) highlighted:
Evidence of validity, reliability, and fairness for each purpose for which a test is used in a program evaluation, policy study, or accountability system should be collected and made available.
This is a process that has been largely ignored in much of the econometric research presented as evidence for all sorts of policies. The truth is that standardized tests are, at best, noisy, biased measures of student learning. As Daniel Koretz points out in his excellent book, The Testing Charade, it is quite easy to improve test scores by giving students strategies that have little to do with meaningful mastery of a domain of learning.

Koretz also notes that multiple charter school leaders have explicitly said that improving test scores is the primary focus of their schools' instruction. As Bruce Baker and I note, there is at least some evidence that these improvements came at the expense of instruction in non-tested domains. That lines up with a body of evidence that suggests increased accountability, tied to test scores, has narrowed the curriculum in our schools.

I'm the last person to say we shouldn't use test scores to conduct research. But when the test score gains econometric research finds are marginal, we should all stop and consider whether we're seeing gains that represent real educational progress. And many of these studies show gains that are quite marginal...

- Compared to the effects of student background characteristics -- especially socio-economic status -- the effect sizes of education "reforms" are almost always small. Student background characteristics are by far the best predictors of a test score. We know for a fact that poverty greatly affects a child's ability to learn.

The claim of reformers, however, is often that education can be a great equalizer, leading to more equitable outcomes in social mobility. Over and over, the charter sector has claimed they are "closing the achievement gap," implying the education they offer is equivalent to that of the leafy 'burbs and that their students, therefore, are overcoming the massive inequities built into our society.

I made this chart a while ago; it compares the effects of charter schools, as measured by the vaunted CREDO studies, to the 90/10 income achievement gap:

The income achievement gap has actually been growing over the years; it now roughly stands at 1.25 standard deviations. No educational intervention I have seen studied using econometric methods comes close to equaling this gap. As Stanley Pogrow notes, economists seem to be all too happy to have the effect sizes they find declared practically meaningful, when often there is little to no evidence to support that conclusion.
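To put effect sizes like these in perspective, here's a back-of-the-envelope comparison. The 0.03 SD charter effect below is purely illustrative -- a hypothetical value in the neighborhood of the CREDO estimates, not a figure from any particular study:

```python
# 90/10 income achievement gap, as cited above: roughly 1.25 SD
income_gap_sd = 1.25

# Hypothetical charter school effect size, for illustration only
charter_effect_sd = 0.03

# Fraction of the income achievement gap such an effect would close
fraction_of_gap = charter_effect_sd / income_gap_sd
print(f"{fraction_of_gap:.1%}")  # prints "2.4%"
```

Even an effect several times larger would close only a small slice of the gap -- which is exactly the point of the chart.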

One of the arguments made by some researchers is that these effects are cumulative: the intervention keeps adding more and more value to a student's test score growth, so that, eventually, Harlem and Scarsdale meet up. Except, as Matt DiCarlo points out in this post, you really shouldn't do that -- or, at least, you should only do so after acknowledging that you're making an extrapolation.

This is often where economists get into trouble: trying to translate their effects into more understandable terms...

- The interpretation of effect sizes into other measures, such as "days of learning," is often highly questionable. The CREDO studies have led the way in translating effect sizes into layman's terms -- with indefensible results. As I've pointed out previously, the use of "days of learning" in this case is wholly invalidated, if only because there is no evidence the tests used in the research have the properties necessary for conversion into a time scale. And the documentation of the validation of this method is slipshod -- it's really just a bunch of links to studies which in no way validate the conversion.

Recently, a study came out about interventions in Newark's schools and their effects on test scores. As I note in the review I did with Bruce Baker (p. 27), the effect size found of 0.07 SD was compared to "the impact of being assigned to an experienced versus novice teacher." But that comparison was based on a single study, by one of the authors, which compared teachers in Los Angeles who had no experience to those who had two years. This is hardly enough evidence to make such a sweeping statement.

In another interpretation, 0.07 SD moves test scores for a treatment group from the 50th to the 53rd percentile. Small moves like these are very common in education research...
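That 50th-to-53rd conversion is just the standard normal CDF; a minimal sketch, assuming test scores are normally distributed:

```python
from statistics import NormalDist

# A student at the control-group mean sits at the 50th percentile.
# An effect of 0.07 SD moves that student to the percentile given
# by the standard normal CDF evaluated at 0.07.
effect_size_sd = 0.07
percentile = NormalDist().cdf(effect_size_sd) * 100
print(round(percentile))  # prints 53
```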

- The influence of teachers on measurable student outcomes is practically small. I am a teacher, and I think what I do matters. I think I make a difference in the lives of my students in many ways, most of which can't be quantified. 

But in the aggregate, there is little evidence teachers have anywhere near the effect on student outcomes that out-of-school factors do. As the American Statistical Association notes, teachers account for somewhere between 1 and 14 percent of the variation in test scores -- and we're not even sure how much of that is really attributable to the teacher.

One study that is cited again and again to show how much teachers matter is Chetty, Friedman, and Rockoff. It's a very clever piece of econometric work, but in no way does it show that having a "great" teacher will change your life. Its effects have been run through the Mountain-Out-Of-A-Molehill-Inator to make it appear that teacher quality can have a profound influence on students' income later in life. But what it really says is that you'll earn about $5 a week more in the NYC labor market at age 28 if you had a "great" teacher (the effect at age 30 is not statistically significant).

Am I the only one who is underwhelmed by this finding?

* * *

Again: I have no objection to using test scores as variables in quantitative research designs. I will be the first to say there is evidence that policy interventions like charter schools in Boston or teacher evaluation in Washington D.C.* show some modest gains in student outcomes. It's valuable to study this stuff and use it to inform policymaking -- in context.

But simply showing a statistically significant effect size for a certain policy is not enough to justify implementing it. Some economists, like Doug Harris in this interview, make a point of stating this clearly. In my opinion, however, what Harris did doesn't happen nearly enough -- which leads to pieces like Chait's, where he clearly has no idea about the many limitations of the work he cites.

The question is: Whose fault is that? Have the researchers who inform our punditocracy's view of education policy done enough to explain how those pundits should be interpreting their findings?

Chait and others like him have the final responsibility to get this stuff right. But economists also have a responsibility to make sure their work is being interpreted in valid ways. I respectfully suggest that it's time for them to start taking some ownership of the consequences of their research. Explaining its limits and cautioning against overly broad interpretations would go a long way toward having better conversations about education policy.

* What they don't show is that student learning improved after a new teacher evaluation system was put in place. More on this later...