When US News & World Report publishes "America's Best Colleges" each year, students and faculty scour the lists—but it's the tuition-paying parents who are the primary target of this massive exercise in grading the institutions that educate America's youth.
Mayor Michael Bloomberg may have had this in mind three years ago when he proposed grading and ranking every New York City school. "By next fall, we'll be sending user-friendly reports on every school to every public school parent across the city," he said in his 2007 State of the City speech. "Each school will receive a grade, from 'A' to 'F', on its year-to-year progress in helping students advance. Personally, I can't think of a better way to hold a principal's feet to the fire than arming mom and dad with the facts about how well or poorly their children's school is performing."
But the results are not what many parents might have predicted. The school that scored highest in 2009 isn't an academic powerhouse like the Bronx High School of Science. It doesn't offer Advanced Placement classes, or chemistry, or foreign languages. It has an attendance rate well below the citywide average. And many of its students reported in a city survey that they sometimes stay home because they don't feel safe at school.
Nonetheless, the High School for Hospitality Management, a small school in midtown Manhattan that opened in 2004, was ranked first in the city. It topped the list because of how well it moved low-performing ninth graders from one grade to the next and all the way to a successful graduation, on time, in exactly four years.
This, by the city's accountability measures, is rock-solid progress—and it is exactly what the Department of Education's Progress Report was designed to measure.
Each school's annual Progress Report is presented in a concise shorthand so parents, teachers and students can compare schools to one another. Yet its purpose is far more sweeping—and does not reflect traditional notions of high-performing schools.
The Progress Report is, in fact, the keystone of an accountability system that allows principals the freedom to manage their schools as they like—so long as students achieve a specific set of objectives related to year-to-year progress. The reports are used to determine which principals and faculty members will receive bonuses (up to $25,000 for principals). And they are a major factor in the chancellor's decisions about which principals to replace and which schools to shut down. They are by far the most important accountability tool in the city's school system today.
"Any school that earned a 'D' or an 'F' grade was automatically considered as a candidate for closure or other consequences, such as leader change," Department of Education (DOE) Chief Schools Officer Eric Nadelstern wrote in a memo to principals in January 2010.
Does that mean parents can rely on these reports to know which schools are best for their children? Not really, says Robert Hughes, executive director of New Visions for Public Schools. "They don't necessarily correspond to the educational experience of kids. And so I think they are very hard for parents to understand. You know, unfortunately, I think some parents do rely on it."
The Progress Reports don't rank each school according to its overall performance. If they did, schools like Bronx Science would be squarely on top. Rather, they rank each school according to a formula meant to reflect the degree of progress its students make each year, compared with schools with similar populations. Elementary and middle schools are ranked according to a formula that includes students' year-to-year improvement on standardized state math and reading tests. High schools are ranked according to how many students pass their courses and their Regents exams, as well as how many graduate on time. The Progress Reports give schools a score from 1 to 100 (or up to 105 with extra credit), a letter grade from "A" to "F", and a percentile ranking that compares them to other schools in the city.
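To make the mechanics concrete, here is a minimal sketch, in Python, of how a composite score might be turned into a letter grade and a percentile ranking. The cutoffs below are hypothetical; the DOE's actual cutoffs changed from year to year and are not detailed here.

```python
from bisect import bisect_left

def percentile_rank(score, all_scores):
    """Percentage of schools scoring strictly below this one (0-100)."""
    ranked = sorted(all_scores)
    return 100.0 * bisect_left(ranked, score) / len(ranked)

# Hypothetical letter-grade cutoffs; the DOE reset its real cutoffs annually.
GRADE_CUTOFFS = [(85, "A"), (70, "B"), (55, "C"), (40, "D")]

def letter_grade(score):
    """Map a Progress Report score (0-105, including extra credit) to A-F."""
    for cutoff, grade in GRADE_CUTOFFS:
        if score >= cutoff:
            return grade
    return "F"

# Toy peer group of ten schools.
scores = [92, 81, 66, 58, 47, 73, 88, 95, 62, 70]
print(letter_grade(66), percentile_rank(66, scores))  # prints: C 30.0
```

In the actual reports, the underlying score blends separate measures of student progress, performance and school environment, benchmarked against schools serving similar populations.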
This system is designed to reward schools that do well with the most challenging students, including those with special needs and those in the lowest third of achievement as measured by statewide standardized tests. The formula gives extra credit to schools that strongly support students who start out with poor English skills and are learning the language well enough to improve their test scores. For example, two recent top schools—the 2008 winner, Brooklyn International High School, and the 2007 winner, Manhattan Bridges—specialize in serving young recent immigrants.
"We have special incentives to reward success for the highest-need kids, because that's what's going to drive the system forward," says Phil Vaccaro, DOE's executive director of student performance.
At the same time, schools are penalized if their weakest students fail to improve academically. Some otherwise highly regarded high schools have received poor Progress Report scores because some low-performing students failed at least one course. For example, Bard Early College High School in Manhattan, a demanding school that combines high school with two years of college, received a score of 66 out of 100 last year. It was ranked poorly, in the 49th percentile. The education department's assessment formula punished the school because several students in the lowest third of the ninth grade failed at least one class.
Similarly, Midwood High School in Brooklyn, known for its science honors program, got a score of 62.7 and ranked in the 42nd percentile last year, partly because many students who ranked in the bottom third of the school's tenth grade class failed at least one course.
Among school leaders, the Progress Reports can be flashpoints of controversy. But they are also recognized as important, if flawed, measures of school success. Education officials say the measure is useful, because students who successfully complete ninth grade and each grade after that are significantly more likely to graduate from high school. They explain that it is essential to reward schools that do well with their weakest students.
Complex and Sometimes Unpredictable
While the incentive for advancing low performers seems straightforward, the formula behind the Progress Report rankings is anything but. Far from being a simple, constant instrument, the grading system is a complex process driven by increasingly complicated algorithms. Calculations and cutoff scores change every year as the DOE fine-tunes the reports and the grades themselves.
"I've been doing data analyses almost my entire life, and when I look through the complex array of calculations, it's kind of hard to keep it all in my head," says Robert Tobias, an education professor at New York University who was executive director of assessment and accountability for four New York City chancellors under the old Board of Education.
A lack of consistency among the various assessment tools contributes to the confusion. For example, the High School for Hospitality Management got an "A" for "performance" and "progress" on its city Progress Report, but a "D" on "school environment." The latter measure is derived from an annual survey of parents, teachers and students, who answer questions about things like school climate and safety.
But the greatest confusion comes from the fact that Progress Report scores, grades and percentile rankings can swing unpredictably from year to year, even when the schools themselves seem stable. This is especially true at the elementary school level. PS 8 in Brooklyn Heights, for example, earned a "C" in 2007, an "F" in 2008 and an "A" in 2009, while its ranking went from the 24th percentile to the very bottom (the 0th percentile), then up to the 60th percentile.
On the Upper West Side, PS 87 earned a "B" in 2007, a "B" in 2008 and an "A" in 2009, but its ranking fell from the 48th percentile in 2007 to the 28th percentile in 2008 and to just the 19th percentile in 2009.
Hughes notes that the Harlem Children's Zone Promise Academy charter school posted scores in the top 2 percent in 2008 and the bottom 3 percent in 2009. "The volatility is extraordinarily high," he confirms. (City education officials will change the way scores are calculated in 2010 in an attempt to constrain this volatility. See "Building a Better Yardstick.")

Growth Pains
The Progress Reports were launched in 2007, when the DOE empowered principals to make decisions over matters such as school budgets, hiring and curriculum. In exchange, principals were to be held accountable for their results. Instead of being supervised by their superintendents, they were allowed to do whatever they thought was best for their schools, as long as they got results as measured by the Progress Reports.
"Progress Reports are our lead tool for understanding how we expect schools to perform," says Deputy Chancellor John White. Each year, however, the Progress Reports have turned up some surprising results.
In the first year, 2007, there were howls of protest from parents and principals in middle-class neighborhoods whose high-performing schools got low grades. At the same time, newspaper accounts ridiculed the DOE for giving "A"s to poorly performing schools. For example, The New York Times described a Bronx middle school that received an "A" even though teachers complained that kids overturned desks and threw books out the window. Education officials defended the grades, saying they represented progress, not performance. Nonetheless, the department tinkered with the formula for the following year.
But in 2008 there were more protests when PS 8 in Brooklyn Heights received an "F". Once again, officials defended the grade, saying the school failed to make progress compared with similar schools. And once again the department tinkered with the formula, setting what it considered reasonable goals for each school and raising minimum scores by a few points. "It's not like it's a really scientific process," says Vaccaro, the official in charge of student performance systems. "But it's not arbitrary, either. We try to look at the magnitude of the gains of the previous year and move a commensurate change in the scoring. We just want to raise the bar."
When the third year's Progress Reports were released in the fall of 2009, the DOE was criticized for grade inflation: This time, 97 percent of elementary schools and 75 percent of high schools earned "A"s or "B"s. This was the direct result of unanticipated gains in state standardized test scores, which made it easier for schools to meet the prearranged targets set by the city. Some of the schools that received "A"s actually ranked as low as the 20th percentile, meaning 80 percent of schools had done better on their progress outcomes.
In response, the state vowed to make its tests more difficult for 2010, and the city vowed to grade schools on a predetermined curve, rather than set goals based on an unpredictable state test. "What people should understand is that the [2008-09 state test] grades were inflated," says Vaccaro. "That hurt the credibility of the system last year."
Closing the Bottom 10 Percent?
Last November, weeks after his re-election, Mayor Bloomberg spoke at a conference in Washington, DC, alongside the US Secretary of Education, Arne Duncan. "Secretary Duncan has challenged states to turn around their lowest-performing 5 percent of the schools. Arne, we'll see your 5 percent and we're going to double it," Bloomberg said.
"Our goal is to turn around the lowest-performing 10 percent of city schools over the next four years by closing them down and bringing in new leadership and holding everyone accountable for success," he continued.
In promising to close 10 percent of the city's schools in order to turn them around, Bloomberg was pursuing a hardnosed approach to school change—one that would require the DOE to shut down about 40 schools each year between 2010 and 2013. And while the data-driven Progress Report is the single most important factor in determining whether a school lands on the administration's list of schools considered for closure, it isn't the only one. (See "How the City Closes Schools")
Schools also get Quality Reviews, based on a one- to three-day visit by outside consultants (originally imported from Cambridge, England) or local reviewers. Officials say these reviews are used to help them interpret the data in the Progress Report. And like the Progress Reports, Quality Reviews have evolved every year. In 2010, a new swath of 60 fine-grained questions promises to add detail about teaching and learning. But the review is not necessarily powerful enough to change the department's decision to close a school. Paul Robeson High School in Brooklyn has scored reasonably well on its Quality Review but poorly on its Progress Report—and the city has vowed to phase it out and close it down.
Nor are the city's accountability measures always in alignment with those of the state, which publishes an annual assessment of each school. State officials may praise a school for making its "adequate yearly progress" goal—a measure designed to meet federal requirements under the No Child Left Behind Act—even as the DOE decides to close it down.
For some schools, the disparities among the various measures are stark. The New Day Academy, a four-year-old secondary school in Morrisania, the Bronx, received consistent "proficient" ratings in its Quality Reviews, which cited strong leadership, "clear vision and commitment" to student success, and "well-developed" advisory sessions for students and professional development for teachers. Both the middle school and the high school have met the state's adequate yearly progress goals since the school opened. But in its Progress Reports, the middle school has earned three "C"s and the high school, a "D." Attendance is low, at less than 80 percent. And the school is on the city's short list for closure.
Improving the System
This year, the Progress Report metrics have been revised again in an attempt to limit volatility in the grades for elementary and middle schools, while ensuring an even distribution of high and low grades. Officials have decided to give 25 percent of elementary and middle schools "A"s, 35 percent "B"s, 25 percent "C"s, 10 percent "D"s and 5 percent "F"s, depending on their progress.
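As a rough illustration of what a predetermined curve means in practice, here is a short sketch, assuming schools are simply sorted by score and sliced by the stated proportions; the DOE's actual procedure may differ.

```python
import random

def grade_on_curve(scores):
    """Assign letter grades by fixed shares of the distribution:
    top 25% get A, next 35% B, next 25% C, next 10% D, bottom 5% F."""
    quotas = [("A", 0.25), ("B", 0.35), ("C", 0.25), ("D", 0.10), ("F", 0.05)]
    # Indices of schools, ordered from highest score to lowest.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    grades = [None] * len(scores)
    start = 0
    for grade, share in quotas:
        count = round(share * len(scores))
        for i in order[start:start + count]:
            grades[i] = grade
        start += count
    for i in order[start:]:  # any schools left over by rounding
        grades[i] = "F"
    return grades

# Example: 20 schools yield exactly 5 A's, 7 B's, 5 C's, 2 D's and 1 F.
random.seed(0)
scores = [random.randint(40, 100) for _ in range(20)]
print(list(zip(scores, grade_on_curve(scores))))
```

Note that with the quotas fixed in advance, some schools must receive "D"s and "F"s no matter how well every school performs; that built-in guarantee is precisely the objection raised next.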
Leo Casey, vice president of the United Federation of Teachers, suggests that grading schools on such a curve is unfair. "Imagine a teacher who, on the first day of class, told his students that no matter how well they performed, 5 percent would fail and another 10 percent would eke by with 'D's.'"
Both Vaccaro and Shael Polakow-Suransky, head of accountability for the DOE, defend the new curve. They say the state has made the tests harder, but no one knows by how much. The last time the state rescaled the tests, in 2005, scores dropped across New York State.
"When the state says 'more rigorous,' what does that mean?" Vaccaro asks. "If we didn't preset the curve, we'd have to guess. It'd be a big guessing game. We're limiting the downside. We're providing downside risk insurance for the whole system not getting failing grades."
Changing the formula by which progress is measured and grades are awarded, year after year, leads to confusion, officials acknowledge. "This is a drawback of making improvements," says Polakow-Suransky. When he and a colleague were asked whether the Progress Reports would ever become a stable, easily comparable tool for parents, both officials laughed out loud. But constant tweaking is part of the DOE's culture of innovation and refinement, says Polakow-Suransky.
"As you are developing a new system, you have a choice to make: Do you try and improve it, or do you hold firm and say, 'We need to stick with what we've got?' That's the tension we are always balancing."
Polakow-Suransky says even an imperfect accountability system represents an important step forward. "When conditions are very bad, it's irresponsible to wait for perfection to fix them," he says. "We never believed in making the perfect the enemy of the good."
Nadelstern, the administration's chief schools officer, acknowledges that the current Progress Reports for elementary and middle schools rely too heavily on the results of standardized tests, which are themselves unreliable because they fluctuate significantly from year to year. Adjustments, he says, will continue to be necessary and will improve the system.
"But to some extent, the hard work has been accomplished," Nadelstern explains. "The discussion years ago used to be, 'Can we hold schools accountable?' with the consensus being, 'We can't, because we don't control the variables of poverty in America.' No one is arguing whether or not we should hold the schools accountable anymore. Now the argument is, 'What's the best way to hold schools accountable?' And we're committed to trying to figure that out along with everyone else."