
Creating Coherence

There’s a special kind of efficiency that happens when we’re able to see overlaps and connections. It is very easy to look at all of the demands upon us and see them as discrete and separate elements on a never-ending to-do list, but there is tremendous power in the pursuit of coherence.

One example: Student Growth Goals, Professional Growth Goals and Data.

We know that by law we all have to write and monitor student growth goals. I’m lucky to be in a district and building that gives us as teachers ownership of our goals, so we are empowered to design and implement growth goals that are meaningful to our students…not just for checking a TPEP box or demonstrating our compliance. In addition to student growth goals, we also have our professional growth goals we are expected to develop. If you’re on the comprehensive “all eight” evaluation (like I am), that means a small group student growth goal, a whole class student growth goal, a collaboration goal, and a professional growth goal.

Imagine if all of these things could be focused in a way that any data I gather serves to monitor all of these goals.

Here’s how I’m attempting to achieve this coherence:

I start by observing for a need. Those first weeks are critical for getting to know students as humans and as learners. Through observation and assessment, I narrow my focus to a specific, high-leverage skill that I see as a gap in my kids’ academic performance.

Before I write their student growth goal, I consider the skills I want to develop. If I want to improve my students’ skills, I need to be deliberate about the practices I employ. Sure, I have some lessons from years past, but I want to consider what learning I need to do to enhance my practice around teaching this particular skill in a way that helps all students grow and improve. I explore some strategies, extend my own learning, and select a few specific teaching moves to try out. This becomes the seed of my professional growth goal.

Here’s where the unity starts to form: If I am going to change my practice, it should result in a change in student performance. Thus, I craft my professional growth goal and my student growth goal in the same block of text.

With the core of my goal set, I nest “inward” for my small group goal. Within this skill, I have a subgroup that needs a bit more intervention. I expand my goal to address those students and identify likely interventions.

Finally, I nest “outward” for my collaboration goal. Here’s the dirty little secret about collaboration goals: lots of teachers and administrators misinterpret what it takes to have proficient goals. The assumption is that my team and I have to have the same goals, use the same data, and demonstrate how we walk in lock step toward a common destination. Not so. If you read the actual rubric for 8.1sg, it is more about “playing nicely with others” than it is about everybody having to do the same thing the same way. So, I tag onto my goal how I plan to “play nicely.”

Here’s what my goal might end up looking like…it is long, but it is accomplishing multiple jobs, all the while letting me focus on just one:

By learning about building coherence in writing, I will improve my professional practice by trying at least two different scaffolds that help students achieve more coherent analytical writing. As a result, my students will be able to select and effectively use a pattern of evidence to support a claim, as demonstrated in regular journal entries, formal literary analysis papers, and evaluation of informational text. By the end of the quarter, each student will increase by at least one level on the “Argument from Evidence” assessment scale. My subgroup will consist of the students who scored a Level One or lower on the first assessment. I will offer additional interventions (via targeted feedback and small group writing workshops) to assist these students to each increase by two levels on the scale. I will collaborate with my PLC to examine my goals during our every-other-week PLC meetings. I will share my assessments for feedback and we will examine student performance to strategize interventions as needed.

When the assessment data starts to roll in, I can use my students’ performance not only to examine their growth, but also to examine the impact that changes in my practice had on that growth. To me, that kinda seems like what the point was from the beginning. In the end, I write one comprehensive goal that represents a laser-like focus on improving my practice in order to improve student performance.



Data without Numbers

During the last teacher evaluation workshop I led for principals and teacher leaders, I closed with this quasi thought-experiment for them to ruminate on for the couple of weeks until our next meeting:

What if a law were passed that kept the TPEP student growth requirement but prohibited the use of any form of number or percentage as a means of showing student growth? How might a teacher demonstrate the impact of practice under such a law?

My intentions are simple: How else besides charts and percentages might we talk about student growth? As an English teacher, finding and using meaningful quantitative data was something I always wrestled with. I did eventually find a way to reduce my students to a number in a way that I felt was valid and productive. (Further elaboration here as well.)

However, as I coach both teachers and administrators in our continued, intentional implementation of our evaluation system, it is clear that for both groups the pressure to generate numbers remains great…and in many cases, the numbers produced feel hollow if not contrived.

In our operationalized definition of data, we’ve come to rely upon information that is easy to communicate, sometimes at the expense of information that actually means a dang thing. A graph, a chart of figures, or a line of numbers is pretty easy to pull together if we’re held more accountable for producing numbers than we are for thinking about what the numbers might communicate.

Particularly when we consider the statewide requirement that teacher evaluations include an examination of student growth data, the stakes feel oppressively high, and the worry about producing inadequate or “bad” data is palpable in many conversations I have with teachers. I do want to point this out, though: the student growth rubrics (SG3.2 and SG6.2) apply to every single classroom teacher in the state of Washington, and both of them state this:

PROFICIENT: Multiple sources of growth or achievement data from at least two points in time show clear evidence of growth for most students. (Source)

Sure, there are some vague words in there: “multiple,” “clear,” and “most.” What isn’t there is pretty obvious to me: a requirement that growth be represented through a number.

When I think about my career, the most clear and convincing artifacts of my impact on student growth came during my candidacy for and renewal of my National Board Certificate. In both of these cases, the way I demonstrated growth was by contextualizing patterns of student work within my own deliberate practice, and then reflecting on the exact changes in student performance (not necessarily changes in score) that proved I had indeed contributed to student growth. This evidence included student work samples but was convincing because of the analytical narrative and reflection on practice that accompanied it all.

While I am a strong proponent of National Boards as a voluntary professional growth experience, I am not advocating for a National Board-like model for yearly teacher evaluations. I do believe, however, that the kind of longitudinal narrative analysis of student work I did during my candidacy and renewal was at least as convincing as any table of numbers I might have been able to produce for the same sets of kids.

Numbers have an important place, and as I said, the right numbers can paint a meaningful picture of growth. However, numbers should not be the only conceivable (or permissible) vehicle for communicating student growth in our evaluation. We need to be sure to make room for the premise that sometimes the best way to illustrate student growth might actually be to tell our story.