New Study: Teachers Who Pass ProTeach Are Better Than Those Who Don’t

By Tom

A new study commissioned by Washington’s Professional Educator Standards Board shows that ProTeach – our teacher licensing assessment – seems to contribute to an increase in student learning. The study was conducted by UW Bothell’s Center for Education Data and Research (CEDR). It was a complicated study, but essentially they compared teachers who passed ProTeach with those who didn’t by looking at their students’ test scores using Value Added Models (VAM).

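For readers who haven’t run into value-added modeling before, here’s a rough idea of the kind of regression involved. To be clear, this is purely an illustrative sketch on made-up synthetic data: the variable names, the single prior-score control, and the binary “teacher passed ProTeach” indicator are my simplifications, not CEDR’s actual model.

    # Purely illustrative: a toy value-added regression on synthetic data.
    # The setup below is my simplification, not CEDR's actual model.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000

    prior_score = rng.normal(0.0, 1.0, n)   # last year's score (standardized)
    passed = rng.integers(0, 2, n)          # 1 if the student's teacher passed ProTeach
    noise = rng.normal(0.0, 1.0, n)         # everything else that affects scores

    # Hypothetical data-generating process with a small positive teacher effect.
    current_score = 0.7 * prior_score + 0.08 * passed + noise

    # "Value added" here is just the coefficient on the pass indicator after
    # controlling for prior achievement (ordinary least squares).
    X = np.column_stack([np.ones(n), prior_score, passed])
    coef, *_ = np.linalg.lstsq(X, current_score, rcond=None)

    print(f"estimated effect of a ProTeach-passing teacher: {coef[2]:.3f}")

The study’s actual models presumably control for far more than prior scores, but the basic idea is the same: see how much of the variation in student test scores is left over for the teacher once prior achievement is accounted for.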
The main conclusion reached by CEDR is that passing ProTeach correlates with higher student test scores, especially in reading; less so in math. It's worth noting that this is essentially the same conclusion they drew when they conducted a similar study on teachers who earned National Board Certification. The study also found the effect was greatest for teachers who scored high on Entry 2, the entry that concerns classroom management and family communication. I found that interesting; it seems more likely that Entry 3 – which is all about teaching and assessment – would be the one more closely associated with higher test scores. I guess that goes to show how important classroom management is. Overall, the results seem to indicate that ProTeach is an effective measure of teacher quality, which must make the PESB feel relieved.

Of course, we can also look at these results from another direction. Maybe they indicate that VAM is a valid measurement, at least for reading instruction. Personally, I found the entire report a little presumptuous: it strongly implies that VAM is the gold standard and that teacher performance assessments like ProTeach are valid only to the extent that they correlate with student performance measures like VAM. I may be biased (I’ve never been a big fan of VAM), but I place a lot more credence in an evaluation that focuses on what teachers are actually doing when they teach than in one that looks at student performance, which is affected by a myriad of factors, only one of which is teacher quality. So perhaps the fact that VAM results and ProTeach results are correlated shows that VAM is legitimate, at least for reading instruction.

In a larger context, it seems to me that if there’s a place for VAM, this is surely it: applied to aggregated data, as in this study. Where VAM is not appropriate is when it’s used on individual classrooms and individual teachers. That’s a complete travesty. It’s also a shame, because advanced statistical analyses – like VAM – can be invaluable for showing which instructional practices are effective and which aren’t. And I’m here to tell you: teachers across the country who are evaluated using VAM hate it with a passion you don’t often see in education. If we manage to steer clear of that mistake in Washington, I can see the day when TPEP reaches maturity, we have a large database of teacher evaluations, and researchers like the folks over at CEDR can use metrics like VAM to help us understand which teaching practices are most effective.

But for now, we’ll have to settle for what we have: a somewhat obvious conclusion that tells us that good teachers produce good students. 

One thought on “New Study: Teachers Who Pass ProTeach Are Better Than Those Who Don’t”

  1. Mark

    I am with you on this one. It is nice, I suppose, to have data to support the assessment, but the conclusion seems rather obvious: those who passed are more effective than those who didn’t. That can be said of pretty much any assessment that determines a minimum qualification. If someone cannot meet the minimum qualification, it stands to reason that they will be less effective. My worry is that this could lend supposed credibility to VAM as a means of individual teacher evaluation, as you point out.
