So, you're working in a PBL classroom. You have a teaching partner who works with you. You are wrapping up a unit and your director (manager, principal... whatever) asks you the big question: "How are you going to assess the students?" Oh, yes - we all love hearing that question. Never mind that we've been assessing students for weeks. Still, the administration needs the data to show the state that you are actually doing something. Understand the scenario? So what can we do? Here we are doing real-world things, in a real-world setting, sometimes with real-world partnerships and sponsors - but the administration needs something to show the state. They need data.
It's not that teachers are afraid of data. When I talk to most teachers, they aren't concerned with how the data will make them look. They're truly more concerned with the amount of class time they will lose acquiring it. Our school takes the NWEA exam, the Acuity exam, the PSAT... We also give our own diagnostic tests. With all of these tests and exams, we are wearing the kids out. They are becoming "desensitized," as it were, to the tests themselves. They simply don't care.
Take our last pass through the NWEA a week ago. My teaching partner and I put the highest score up on the board with a message around it that said, "Can you beat the high score?" We appealed to their sense of competition. We also appealed to their reason, saying, "These scores help us decide teams, allocate partners, and define groups." As a final push, we appealed to their sense of empathy, saying, "We know that you have taken many tests, but the data gleaned from them is truly important. We need you to try your best."
What happened? The scores got worse. Most students actually lost points.
Why? They're tested out.
The hypothesis of the politicians, at least here in Indiana, is that teachers create a "product": student knowledge. They assume that human beings retain knowledge like computers, readily accessible at the drop of a hat. The truth is, human beings, and our minds, are elastic when it comes to retention. We remember the key things we need to remember, but we can forget things too.
There seems to be a correlation between stress and forgetting things.
Put a kid in the hot seat and watch them sputter and stammer while they try to formulate a response. I surmise that my students did poorly on the test because we are nearing the end of the semester. Students are scrambling to finish projects they have been working on, prepare for final exams, and earn last-minute points for their grades. On top of that there are basketball games - every day! - music rehearsals, concerts, drama productions, and movies coming out. This is not only the most wonderful time of the year; it's the most distracting. But never mind that... we're going to sit you down in a room in front of a computer and make you take this test. All in the name of getting data that will probably shoot us in the foot more than help us.
So, again, what can we do?
I was at the Kennedy Center for the Performing Arts in Washington, DC last year for a CETA conference. While our school is not yet a CETA school, we were looking into the possibility of becoming one. One of the presenters discussed this very topic. Her problem was that her students would create amazing art that took intense work, but the parents and visitors would say "That's nice" when they saw it. "They didn't understand what it took to create it," she told us. She wanted a way to document the amount of work the students put into their projects. What she came up with was something simple: a Word document with a table on it. The table contained pictures of the processes the students went through. Underneath each picture was a caption that explained what the students were doing and which standards they had to master to do it. The step-by-step table guided parents, and administrators, through the process of learning the students had taken. "All I had to do was have a camera handy to take shots every so often," she told us. This method works for the general populace, but how can we get the "data" that the state wants? Let's face it: they want numbers. They want something that can be quantified.
How can you quantify true knowledge? Seriously...
I'm afraid I don't have an answer. I could go back to worksheets, but that won't solve the "test" problem either. Nor does a worksheet give a clear and true account of a student's "knowledge." You want to see knowledge? Put the kid in a situation; see what they really know. I was stunned in the parking lot of my school when one student's car wouldn't start and I was standing there, like a dunce, trying to help. A kid we've labeled as "stupid" came up, said, "Oh, I see the problem...," and proceeded to get the car running. With a "That will only hold for a little while; you'll need to get it fixed permanently," he sent his classmate off while I stood there trying to figure out how he had fixed the car. That's a brand of cunning I don't have.
How is the state assessing that? Is that kid really "stupid"? Perhaps in English, but not in Engineering. He has obviously learned somewhere how to fix vehicles. I'm glad someone knows how to do it. I don't.
I guess someone smarter than me is going to have to figure this one out. I will pose the question, though: how do we, in a PBL system, produce quantifiable data for the state while still accomplishing the task of actually teaching? With all of these tests, how can we get our actual business done? When they finish collecting all of the data they need, what will the kids have to show for it? Another question: what does all of this quantifiable data amount to, anyway? What are we really getting out of it?
Until next week.