
Every Dot is a Child

Who could have imagined 2020 as a year of unprecedented change and uncertainty? The closing of schools and statewide quarantine orders require flexibility on the part of teachers. We're still working, albeit from home.

I've been participating in staff meetings as well as Specialist and Building Leadership team meetings through Google Hangouts (as a side note, some teachers use this online platform to meet with their classes). Many of our conversations in these meetings center on the new challenges we need to overcome in our profession; others return to the usual concerns of our field.

Your students may not be taking the SBA this year, but you will see plenty of other data on their academic performance. The data may come from iReady, DIBELS, MAP, or another assessment preferred by your district.

The push in education is toward data-informed instructional practices like the work done by John Hattie through Visible Learning. During PLCs, staff meetings, or as part of evaluations, teachers look through data (numbers, graphs, and percentages) to gauge student progress and plan for remediation or instructional changes. Now is the perfect time to analyze data and adjust instruction to accommodate the needs of our students.

But please remember: every dot is a child. 

A graph depicting my students' growth from baseline to summative assessment in vocabulary knowledge.

Data without Numbers

During the last teacher evaluation workshop I led for principals and teacher leaders, I closed with this quasi-thought experiment for them to ruminate on during the couple of weeks until our next meeting:

What if a law were passed that kept the TPEP student growth requirement but prohibited the use of any form of number or percentage as a means of showing student growth? How might a teacher demonstrate the impact of practice under such a law?

My intention is simple: How else besides charts and percentages might we talk about student growth? As an English teacher, I always wrestled with finding and using meaningful quantitative data. I did eventually find a way to reduce my students to a number in a way that I felt was valid and productive. (Further elaboration here as well.)

However, as I coach both teachers and administrators in our continued, intentional implementation of our evaluation system, it is clear that for both groups the pressure to generate numbers has remained great… and in many cases has felt hollow, if not contrived.

In our operationalized definition of data, we've come to rely upon information that is easy to communicate, sometimes at the expense of information that means a dang thing at all. A graph, a chart of figures, or a line of numbers is pretty easy to pull together if we're held more accountable for producing numbers than we are for thinking about what those numbers might communicate.

Particularly when we consider the statewide requirement that teacher evaluations include an examination of student growth data, the stakes feel oppressively high, and the worry about producing inadequate or "bad" data is palpable in many conversations I have with teachers. I do want to point this out, though: the student growth rubrics (SG3.2 and SG6.2) apply to every single classroom teacher in the state of Washington, and both state this:

PROFICIENT: Multiple sources of growth or achievement data from at least two points in time show clear evidence of growth for most students. (Source)

Sure, there are some vague words in there: “multiple,” “clear,” and “most.” What isn’t there is pretty obvious to me: A requirement that growth be represented through a number.

When I think about my career, the clearest and most convincing artifacts of my impact on student growth came during my candidacy for, and renewal of, my National Board Certificate. In both cases, the way I demonstrated growth was by contextualizing patterns of student work within my own deliberate practice, and then reflecting on the exact changes in student performance (not necessarily changes in score) that proved I had indeed contributed to student growth. This evidence included student work samples, but it was convincing because of the analytical narrative and reflection on practice that accompanied it all.

While I am a strong proponent of National Boards as a voluntary professional growth experience, I am not advocating for a National Board-like model for yearly teacher evaluations. I do believe, however, that the kind of longitudinal narrative analysis of student work I did during my candidacy and renewal was at least as convincing as any table of numbers I might have produced for the same sets of kids.

Numbers have an important place, and as I said, the right numbers can paint a meaningful picture of growth. However, numbers should not be the only conceivable (or permissible) vehicle for communicating student growth in our evaluations. We need to make room for the premise that sometimes the best way to illustrate student growth might actually be to tell our story.