
Thursday, December 1, 2011

The failed potential of grading schools

What follows should sound very familiar. -cpg

From the New York Times SchoolBook

By Phil Weinberg

In November 2007, when our school received its first progress report grade, an A, I wrote a letter to our staff to say that I was pleased with our grade. However, I also told our staff that “I believe the progress report fails to capture the essence of a school, and it fails to measure the things that make our school such a great place.”

Four years later, with the progress report on its fifth iteration, I think my letter remains an accurate appraisal of the most important rating tool now in use by the Department of Education.

The original idea behind the progress report was truly excellent: rather than rate schools on Regents pass rates, credit accumulation and graduation rates, each of which was largely predicted by the type of students a school admitted, we would measure the progress schools made with individual students.

Such a rating tool would have focused on improvement, growth over time and aspirations. And because it was intended to set goals that all schools could work toward and ultimately achieve, it would have encouraged schools to work together to improve their practice.

Such a report might have enabled us to truly, and thoughtfully, reorganize our school system so we could better meet the needs of our young people.

Sadly, with the progress report now in its fifth year, we have never found an authentic way to measure individual student progress. Because there was a palpable urgency to rate schools, we defaulted to a progress report that relies on a convoluted set of metrics, putting schools in competition with one another in order to create ratings.

In so doing, we eliminated the essential focus on collegiality and cooperation among schools, which, if it had been implemented well, could have led to real improvement in our schools.

This year our school’s grade is derived from an Excel spreadsheet which is 1,572 rows long and 109 columns wide, and therefore includes up to 171,348 discrete pieces of data.
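That figure checks out as a simple upper bound on the spreadsheet's cells. A quick arithmetic check (a minimal sketch in Python; the row and column counts are taken from the sentence above):

# Upper bound on the number of discrete data points, assuming every
# row-column intersection of the spreadsheet could hold a value.
rows, cols = 1572, 109
print(rows * cols)  # prints 171348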

It is hard to fathom how this amount of data can be accurately reduced to a single letter grade, or how such a simple rating (A, B, C, D or F) based upon such a vast amount of data can be accurate. Yet our city seems to accept that the grades produced by the progress report are “true.”

In the progress report system’s brief existence, we have learned to refer to schools as A schools or C schools, even though few of us understand how that A or C was earned.

Parents rely on the progress report when selecting the schools to which they send their children.

All news outlets report the release of schools’ grades.

When the city’s Education Department was of the opinion that bonuses might stir principals to better performance, it awarded those bonuses based solely upon the progress report, even though the report accounted for less than a third of each principal’s formal yearly rating.

School-closing decisions have been tied directly to progress report grades.

And most important, for much of the last five years the progress report has been the clearest message schools received from Tweed about the direction teaching and learning should take in our city. Yet it is silent on what quality teaching and learning look like.

The charge we receive is very clear regarding the outcomes we are to produce, but silent on what we should teach and how we should achieve those outcomes. Because of that, the progress report may not be serving our schools well.

And even though we accept its findings with little question, a very simple review of the progress report shows that it does not reliably measure schools’ performance.

If we created a system for rating doctors, I am certain that a key statistic to include would be mortality rates. However, if we then concluded that podiatrists were better doctors than oncologists, people might not take our system seriously (or, if they did, oncologists would probably shy away from treating people with cancer).

We would never create a rating system for doctors that caused them to compete with one another because that would stifle their desire to share ideas and train one another, key aspects of any true profession.

And we certainly would not want to create a disincentive for doctors to work with those who are most at risk.

The same goals should guide the way we rate our schools. Until we create a progress report that focuses on collegial sharing of best practices rather than on competition, we will be leading our schools astray.

Philip Weinberg is the principal of the High School of Telecommunication Arts and Technology in Brooklyn.

http://www.nytimes.com/schoolbook/2011/11/30/the-failed-potential-of-the-progress-reports/
