By Tom
Today was an in-service day. One of the activities we worked on involved analyzing various assessments that we administered to our students early in the school year. We worked in small groups of four teachers. The first test we discussed was the QRI, the Qualitative Reading Inventory, which measures reading accuracy, fluency, and comprehension. Our group had mixed opinions; I found the data useful, as did two other teachers, but the fourth thought her kids read much better than the data revealed, so she disregarded it.
Next we talked about the writing assessment. Again, three of us thought the data was useful, but another (different) teacher thought her kids could write much better than the data revealed, so she disregarded it.
Then it was math. Again, we had mixed opinions on the quality of the data, and a conversation ensued. But then I pointed out that it was moot; we were instructed at the math trainings to power through the curriculum regardless of how well the students were learning it. (It's a "spiraled" curriculum.) We had essentially been trained to disregard the data. They all agreed, so we disregarded the data.
So I got to thinking. To what extent are we, as a profession, data driven?
We like to think we are. And with the amount of data that we collect, we certainly should be. But I have my doubts.
First of all, the data isn't always that good. I've seen kids take tests. Sometimes they work hard at them and sometimes, for various reasons, they don't. But then we aggregate that data and assume that it represents the true capacity of our students, even though those of us who watch that data at its conception know for a fact that it doesn't. I once had a bright fourth grade student hand me his WASL less than ten minutes after I gave it to him. He just lost interest. I once saw a third grader get stuck on a math question, and then she freaked out, completely shut down and performed well below her capacity on the entire WASL.
Secondly, education data is frequently ambiguous. Every curriculum program on the market is "research based," despite the fact that a completely different program is also based on research. Different research, but research nonetheless. Remember SRA cards? Research based. Whole language? Research based. Remember Math Their Way? Research based. Everyday Math? Research based. Using data to select curriculum can be difficult.
But there's another reason why I question our so-called Data Driven Approach to education. It has to do with basic human nature. I think people make decisions and live their lives guided more by intuition and feelings than by data, even when the data is unambiguous and even when the implications of that data are clear. Why else would people live in Baltimore? Have you been there in the summer? Or the winter? Why else would a person, given a choice (and these people are given the choice), be a Mets fan? Likewise, I've seen countless teachers stick to programs and lessons simply because they enjoy using them, regardless of the results of those lessons. And these same teachers will ignore new programs and approaches simply because they look complicated and confusing, or require time and effort to master, even when there's data showing that those methods are useful. Of course, I've also seen teachers abandon solid, proven lessons in favor of high-tech, shiny methods, simply because the new methods are high-tech and shiny.
And it's not just our profession that ignores data. Doctors, those really smart people we knew in college, also ignore data. NPR did a profound story recently on health care, concluding that the amount of health care received by patients has little correlation with patient health. It turns out that doctors, who are paid according to the number of procedures they perform, perform more procedures than necessary, which we all end up paying for.
And before you tsk-tsk, consider the most unambiguous data available. Data to which all of us have daily access. Data that has obvious implications for the way we should live our lives. I'm speaking, of course, about the data you glance at when you weigh yourself in the morning. To what extent are most Americans driven by this data?
So no; I don't think we, as a profession, are data driven. Maybe we should be, but we're not. Or maybe we are. Maybe we're just driven by the data we agree with.
I’ve rarely seen a nail’s head hit so accurately, Brian. Must be your carpentry background!
My Master’s degree is in Science Education, and we had a seminar where we discussed whether or not the scientific method could provide meaningful data for educational practices. The consensus was no. The classroom is far too complex to be able to control any variable. And there seems to be an underlying uncertainty principle at work. But that doesn’t stop us from doing “research”. Then, given the spectrum of often conflicting results, we are free to pick the ones we already knew were right.
Interesting topic. It reminds us of the debate as to whether teaching is an art or a science; whether close monitoring of prescribed curriculum and data results will really help schools make AYP, as some district administrations are now assuming.
I agree: data can take different forms, sometimes qualitative and sometimes quantitative. But regardless of the nature of the data we see, are we using all of it, or are we only using the data that confirms our hunches and intuition?
One of the problems with scores and research and data is that they are often gathered from other students, somewhere else at a different point in time than the children sitting in our rooms today. I have a hard time with that. A child’s current teachers know best where that child is and what will work for him. Besides, the 32 children in my room are at 32 places in their skills, and they have 32 different learning styles. I can handle that, if I’m allowed to do what I’m good at. I am hobbled when I’m told to use this curriculum in this way because “research shows” that it gets results.
And Tom, you’re right that we ignore data. A few hours after my dad’s cancerous lung was removed – and my dad was a smoker – I saw his surgeon, a man who removed lungs every day, lighting up in the hospital’s garden.
All measurement has a degree of error. Testing is no different. Students are not products or materials. Qualitative data are extremely valuable when making decisions regarding student progress towards a standard or level of conceptual knowledge. Teachers observing and having conversations with their students have a greater depth of knowledge of their students than someone contracted to write a test for a publisher. Teachers often need to go beyond numbers, percentages, and test scores to help their students find success in learning.
I agree, Tom, that to the extent your descriptions accurately represent teacher and physician behavior, those professions do not appear data driven. And I agree with Mark’s observation that data can result from more than one operation.
I’m curious: are you using teacher talk codes to say, “Because teachers choose not to use data to make instructional decisions that likely increase student learning rates, therefore no data hold validity for those choices”?
(Side note: the datum is; these data are.)
I think one limiter is how we use the word “data.” To me, data has always meant only quantifiable assessment to which a number can be assigned. Recently I was told that for a project I could gather anecdotal and observational data, not numerical data. To me, that was an oxymoron. How can it be data if it isn’t quantified? However, I didn’t resist…I find the anecdotal and observational data to be far more useful in shaping my practice. If I run my kids through a reading assessment, I find that data less useful than my observations of how they discuss a text when doing a small group reading protocol. The latter gives me deep, rich “data,” whereas the other is shallow and inauthentic.