I'm having a tough time with this one.
According to data collected from Ohio’s new value-added teacher ratings, there appears to be little correlation between how much value an instructor brings to each student and how much that instructor is paid.
This is the bit that had me throwing up my hands...
In some ways, these results are no surprise: The way Ohio schools determine teachers’ salaries has nothing to do with how well they teach. It has everything to do with how long they’ve been teaching and whether they have a master’s degree.

So, were they studying value/salary or value/quality? To me, it seems silly to expect variation when you know salary is based simply on years of service. I guess it's logical to think that experience leads to quality, but, come on, that's not necessarily true. A crap teacher this year is a crap teacher next year. And a fantastic teacher will probably improve from year to year, but that teacher also has to deal with a myriad of variables.
My frustration over this article grew when I read...
The StateImpact/Plain Dealer analysis is based on the new assessment scheme that uses standardized test scores to determine how much teachers contribute to each individual student’s success. The approach seeks to give a more objective answer to the question “What makes a teacher good?”
Before my students took the Texas STAAR test this year, I told them that they weren't a number. One test, or even three, does not define them. I guess we can't say the same thing about Ohio teachers.
UPDATE: On Dr. Ravitch's blog, I found a related post, Ohio Teachers Subjected to Junk Science Ratings.