Joel Klein (pictured above) at PS 5, a school that received an F last year but an A this year.

The headline about a contest is just to grab your attention. But seriously, how did the grades for our NYC public schools come about? By the way, the New York Times published the K-8 grades.

I think I get it. Joel needs to look good for his private funders. So the grading policy was drawn up more than two years ago so that whichever schools "they" (the funders) wanted to use to make him look good got an F last year and an A this year.

The same works in reverse. See the protest from the Neighborhood School below.

Or whatever.

An F School? Depends on How It’s Judged
By ROBERT GEBELOFF, NY TIMES, September 28, 2008

In New York City’s data-driven system for assessing the quality of its public schools, the New Venture Middle School in the Morrisania section of the Bronx received a D on this year’s report cards. But unravel the complex formula the city uses to grade the schools, and tweak it to weigh factors differently, and New Venture would look a lot better — or worse.

If, for example, the city had counted two years of data rather than one, New Venture would have earned a B. On the other hand, erase the peer groups the city uses, and compare New Venture’s performance instead to all middle schools citywide, and it would receive an F.

While the city’s report cards are based on far more factors — including demographics, tests, parental surveys and peer comparisons — than just about any school system in the nation, the ultimate A through F grades are determined by a series of subjective decisions about which factors to use and how to weigh them.

To examine how these decisions, which have been the subject of much debate since the system was unveiled in 2007, affect the grades, The New York Times adjusted the formula used to grade schools and came up with four alternative grading methods. When the new formulas were applied, the grades for hundreds of schools, as at New Venture, came out differently.

The first alternative grading method broadens the microscope to include data from two years instead of one. This helps blunt fluctuations based on fluke one-year spikes or drops in performance that often occur in standardized testing, particularly among smaller schools; indeed, it eliminated many of the most radical swings in grades from year to year.

A second alternative shifted the emphasis away from test score improvement and toward pure performance: What percentage of a school’s students met state standards for proficiency.

The city counts year-to-year progress as 60 percent of the overall grade, performance as 25 percent; this weighting was reversed in the second alternative to reward schools more for general excellence in scores, even if the scores did not change much over the past year.

Under this method, instead of 18 schools receiving F’s, 7 would have. And 8 that received a D and 3 that received a C under the city formula would have failed.
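To make the weighting swap concrete, here is a minimal sketch with invented component scores. It is only an illustration of the arithmetic; the actual DoE formula also normalizes each component against peer-group and citywide ranges before weighting. The 15 percent left for "environment" is simply what remains after progress (60 percent) and performance (25 percent).

# Hypothetical illustration of swapping the progress and performance weights.
# Component scores are invented; 0.15 for "environment" is the share of the
# grade left over after progress (60%) and performance (25%).

def overall(progress, performance, environment, w_prog, w_perf, w_env):
    """Weighted combination of the three report-card components (each 0-100)."""
    return w_prog * progress + w_perf * performance + w_env * environment

# A school with big one-year gains but weak absolute proficiency (made-up numbers).
progress, performance, environment = 80.0, 35.0, 60.0

city_score = overall(progress, performance, environment, 0.60, 0.25, 0.15)
alt_score = overall(progress, performance, environment, 0.25, 0.60, 0.15)

print(city_score)  # 65.75 under the city's progress-heavy weighting
print(alt_score)   # 50.0 under the reversed, performance-heavy weighting

The same hypothetical school scores more than 15 points higher under the progress-heavy weighting, which is the kind of shift that can move a school across letter-grade cutoffs in the comparison above.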

In a third grading method, The Times removed the city’s complex method of comparing schools with socioeconomic peers. The city uses a peer index because test scores are, typically, heavily driven by demographics, and administrators feel it is both more fair and more revealing to compare schools serving poor children, for example, with other schools serving poor children.

This resulted in dozens of schools being awarded A’s, even though more than half of their students failed English, for example. With the influence of peer groups taken out, 66 schools that earned a B would have jumped to the A level.

Finally, a fourth grading method combined the previous two: The formula was changed to emphasize performance more than improvement, and peer comparisons were discarded.

This pushed even more B schools to the A level, and even more significantly reshaped the pool of failing schools. Seven F schools would have scored B’s under this method.

September 17, 2008
More New York Schools Get A’s
By JENNIFER MEDINA and ROBERT GEBELOFF, NY TIMES

The number of schools receiving A’s under New York’s much-contested grading system increased significantly this year from last in what Mayor Michael R. Bloomberg said was a clear sign of success and evidence that his signature accountability program was spurring improvement at schools across the city.

Nine of the schools that got F’s for the 2006-7 school year got A’s in this year’s grading; just one of last year’s A’s plummeted to an F this time around. Critics immediately seized on the wild fluctuations as proof that the system was flawed because of its emphasis on year-to-year progress by individual students rather than multiyear gains.

Over all, nearly 80 percent of the 1,040 elementary and middle schools judged got A’s or B’s, while the number labeled failures dropped to 18 from 43. The percentage of schools getting A’s jumped to 38 percent, from 23 percent last year. About 57 percent of last year’s A schools received A’s again, while no school was deemed an F twice.

“Not a single school failed again,” Mr. Bloomberg said at a news conference at Public School 5 in Bedford-Stuyvesant, one of the F-to-A schools. “That’s exactly the reason to have grades, to show what you haven’t done yet and what you do to improve. The fact of the matter is it’s working.”

Mr. Bloomberg, who has made changing the school system a cornerstone of his administration, has repeatedly presented the letter grades as a clear measurement that parents can rely on to judge the quality of their children’s education, and as a way to show the public how much the schools are improving.

But, as with the inaugural round of grades last year, the report cards had many who know the school system well somewhat befuddled.

Again there was a discrepancy between federal and city assessments — 30 percent of the schools deemed failures under the No Child Left Behind act earned A’s from the city on Tuesday, while 16 of the city’s 18 failures are in good standing under the federal guidelines. And again, schools that had enviable reputations received less than enviable grades, like Public School 8, a respected and popular Brooklyn Heights elementary school, whose F stunned parents.

The fluctuations and jarring results led critics to reiterate concern that the grades were little more than a snapshot of test score improvements between two years.

“I would say that what they are doing is clearly misguided,” said Walt Haney, an education professor at Boston College who focuses on testing issues. “These are showing dramatic changes that can have nothing to do with what is actually happening.”

Most school accountability systems focus on proficiency and, when measuring growth, look at this year’s fourth graders compared with last year’s fourth graders. But the city’s grades are determined largely by how much progress this year’s fourth graders have made since their third-grade tests.
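The distinction matters; here is a toy contrast of the two growth measures, using invented school-wide averages on the state's proficiency scale (the city's actual calculation works student by student and is then peer-adjusted):

# Toy contrast of the two growth measures, using invented averages on the
# state's 1-to-4 proficiency scale. The city's real metric is computed
# student by student and then adjusted against peer schools.

avg_rating = {
    # (grade, year): average proficiency rating (invented numbers)
    (3, 2007): 3.10,
    (4, 2007): 2.95,
    (4, 2008): 3.20,
}

# Cohort-to-cohort growth: this year's 4th graders vs. last year's 4th graders.
cohort_growth = avg_rating[(4, 2008)] - avg_rating[(4, 2007)]

# Same-student growth, which the city's formula emphasizes: this year's
# 4th graders measured against their own 3rd-grade results from last year.
same_student_growth = avg_rating[(4, 2008)] - avg_rating[(3, 2007)]

print(round(cohort_growth, 2))        # 0.25
print(round(same_student_growth, 2))  # 0.1

The same school can look very different depending on which of the two comparisons is used, which is why the choice of growth measure is itself a consequential decision.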

So student performance at the schools that received A’s varied wildly. In more than 60 of the 394 A schools, for example, more than half the students failed to reach proficiency on the state’s reading test. And two of the schools that received A’s — Middle School 326 in Manhattan and Public School 224 in the Bronx — were added to the state’s most recent list of failing schools in the spring.

Mr. Haney said that it was likely that relatively small schools — of which there are an increasing number, because the Bloomberg administration has been breaking larger schools into smaller thematic programs — would show the most change, since any fluctuation in performance among even a few students would have a sizable impact.

Indeed, according to Education Department statistics, the average enrollment of schools that changed by one letter grade is 683, while schools that moved at least two letter grades had an average of 577 students.

The grades are being used for a variety of rewards and consequences. Principals and teachers in schools that receive A’s are eligible for bonus pay, and the schools could receive extra money, while schools that receive a D or an F two years in a row could lose their principal or be shut down entirely.

Eight schools have gotten D’s two years running. Three-quarters of the failing schools last year received an A or a B this year.

Given the report cards’ emphasis on progress — it counts for 60 percent — the overall increase in the number of A and B schools is no surprise, since major gains on state standardized tests in reading and math were posted last spring in every age group and in nearly every neighborhood.

The grades on Tuesday were for elementary and middle schools. High school grades are expected next month, after graduation rates are calculated.

The complex formula to determine the grades was slightly amended this year. For example, schools got credit for students who scored in the highest of four categories on state tests two years in a row, even if their score within that top category dropped slightly (last year, this cost a school points). Schools also receive additional credit for improving scores for special education students. This year, each report card’s three subsections were also assigned three letter grades — for progress, performance and environment — instead of just the overall grade.

The weight given to progress increased in the 2007-8 school year, to 60 percent from 55 percent the previous year, while pure performance dropped in importance to 25 percent of the grade, from 30.

As last year, all three sub-categories were measured in comparison with a set of “peer” schools deemed to be similar in demographics — race, poverty, special education and students still learning English. For middle schools, the system also accounts for how well their students performed in elementary school.

Last year, the grades were based on a curve, limiting the number of schools that could get each grade. This year, there was no curve; the change, Education Department officials said, allows the public to see improvement.

James S. Liebman, the chief accountability officer and architect of the grading system, said it was the only school-accountability system in the country where the results could not be predicted by poverty or race, since all results are adjusted based on demographic peer performance. He also said that, in analyzing the data, officials found that school size and class size did not appear to be important factors affecting progress on test scores.

“What we want with progress reports is to measure what schools add to kids, not what kids bring to the schools,” Mr. Liebman said.

Chancellor Joel I. Klein, who has spoken about the system all over the country and consulted with officials in Australia about developing something similar, said that he was confident that the grades would force schools to consider whether or not they were teaching students at all levels.

“This is not about reputation,” he said. Last year, Mr. Klein said that because the department would collect more data each year, it would consider using three years of test scores to measure improvement. Mr. Liebman said that officials did not do so this year because of the way they had changed the overall formula, but he said they would consider it in the future.

Most of the schools that soared to A from F did so because of significant one-year jumps in test scores from 2007 to 2008, in some cases after drops in 2007 from tests in 2006. Only one school, Public School 92 in the East Tremont section of the Bronx, went from an A to an F; that school had seen significant gains on test scores from 2006 to 2007, only to fall back again in 2008.

Ellen Foote, the principal of Intermediate School 289, a respected school in Lower Manhattan that got a D last year, said she was unmoved by its A this year.

“A school doesn’t move from a D to an A in one year unless there is a flaw in the measurement or the standardized test itself,” she said. “We have not done anything differently, certainly not in response to the progress report.

“I think it’s just so disrespectful of the profession, to think that I would respond to a single letter by beefing up my test prep,” she said.

Rising School Grades Fail To Stamp Out Questions
By ELIZABETH GREEN, Staff Reporter of the Sun, September 17, 2008

There are more As and Bs and fewer Fs in the second round of letter grades handed out to public schools, but the improvements are not silencing questions about how much meaning the grades hold.

The rising tide was buoyed by big improvements on state reading and math tests last year.

Test scores account for 85% of each school's report card grade, with schools getting credit for both their students' overall scores and for how much their scores rose from one year to the next. The rest of the grade is based on attendance rates and a survey of parent, student, and teacher satisfaction.

Mayor Bloomberg and Chancellor Joel Klein implemented the grades last year as a way to give parents information — and a way to encourage schools to improve.

Staff at schools that receive high grades are eligible for cash bonuses, and schools that receive D and F grades could face closure.

Although there were some high-profile cases of schools dropping in the grades handed out yesterday — such as P.S. 8 in Brooklyn Heights, which got an F, as the New York Times first reported over the weekend, and P.S. 116 in Manhattan, which dropped to a C from an A — schools making dramatic gains were more common.

An elementary school in Inwood, the Muscota New School, jumped to a B from an F; P.S. 58 in Carroll Gardens got an A, up from a D; and a middle school in Battery Park City that last year was given a D, I.S. 289, received an A.

Overall, 71% of schools that received Cs and Ds last year got As and Bs this year, school officials said.

Mr. Bloomberg said the report cards themselves deserve credit for the rising test scores and better report card grades.

"Accountability leads to better results," he said.

He and Mr. Klein pointed to the school where they held the press conference, P.S. 5 in Bedford-Stuyvesant, which rose to an A this year from an F, as an example.

The principal of the school, Lena Gates, said she made a concerted effort to improve test scores, involving not just teachers but also parents and students in her push for a better grade.

Other principals said they were not affected by the report cards, and both the president of the principals union, Ernest Logan, and the teachers union president, Randi Weingarten, put out statements yesterday raising concerns about the quality of the report card judgments.

The principal at I.S. 289, Ellen Foote, said her staff had made no changes in response to their D, which the school received two weeks after winning a Blue Ribbon award from the federal government.

Ms. Foote said that rather than think about the report card grade, she focused on a battery of internal assessments that look not just at test scores but also science reports, writing samples, and math projects.

Ms. Foote said that when she finally opened the report card and found an A, she laughed; what meaning could the letter grade have, she asked, if the system had given the same school such different grades?

"I was just rolling on the floor," Ms. Foote said.

The grades released so far are only for elementary and middle schools; grades for high schools are still to come.

Contact: Cathy Albisa cathy@nesri.org

PRESS ADVISORY, SEPTEMBER 17, 2008

The DoE Thinks Our School Deserves a D?!

Parents at the Neighborhood School (PS 363) in the East Village are stunned and dismayed by their school’s precipitous drop from a B rating last year to a D this year in the Department of Education (DoE)’s letter grade system. A close examination of the grade reveals profound flaws in the assessment system.

Reporters: Does this sound like “failing” to you?

On 34 of the 35 Quality Review criteria created by the DoE, we were rated “well developed.” Furthermore, we were considered to have made good progress in addressing issues raised in last year’s quality review. The independent raters noted:

• “Leaders and staff use data effectively to improve instruction and raise student achievement.”

• “Monitoring is rigorous and ongoing and enables the school to focus on improvements and effectiveness.”

• “The principal and staff have a detailed knowledge of each student so learning is nurturing and focused on individual needs.”

• “Students benefit from a rich and stimulating curriculum and interventions carefully aligned to their needs so they behave well and enjoy learning.”

On our New York State School Report Card, The Neighborhood School has always exceeded state-set targets for performance in every tested subject, in every grade, for all groups and subgroups of students! Students of all ethnicities, special needs and economic backgrounds thrive at the school. The school received a special award from the state for being a “High Performing/Gap Closing School.”

According to the DoE’s Progress Report, The Neighborhood School is off the charts (above the 100th percentile, which is gratifying and fascinating despite being statistically impossible) in “Academic Expectations,” “Communication,” and “Engagement,” according to the School Environment Survey responses of families and teachers. We earn these stratospheric ratings whether compared to all city schools or to our “Peer Horizon” schools.

The majority of our children score 3s and 4s on the standardized tests. Last year no child was mandated to repeat a grade because of test performance. Yet we are in the 4th percentile of schools citywide!

Why does the DoE think we’re so lousy?

In a very small school like ours (and as you know, small classes and small schools are both highly correlated with achievement), tiny changes – in demographics, test scores, attendance – can have huge raw score and percentile impacts. For instance, a change in attendance from 93.5% to 92.7% -- based partly on a busing glitch for a family of three children -- moved us down more than 20 percentile points in relation to our so-called peer horizon. For the much larger schools to which we are generally compared, a busing problem that affected three children would have far less effect on percentile attendance rankings.
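A rough back-of-envelope sketch makes the point about scale; the enrollments, the 180-day year and the number of missed days below are assumptions for illustration, not figures from the report:

# Why a few absent children move a small school's attendance rate much more
# than a large school's. Enrollments, the 180-day year, and the number of
# missed days are all assumptions for illustration.

SCHOOL_DAYS = 180
missed_days = 3 * 120   # three siblings each missing roughly 120 days

def attendance_drop(enrollment):
    """Percentage-point drop in the school-wide attendance rate."""
    possible_student_days = enrollment * SCHOOL_DAYS
    return 100.0 * missed_days / possible_student_days

print(round(attendance_drop(300), 2))    # 0.67 points at a small school
print(round(attendance_drop(1000), 2))   # 0.2 points at a much larger school

At the larger school the same three children barely register, which is exactly the asymmetry described above.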

The category with the most weight, student progress on standardized tests, actually looks at very few students. We’re talking 49 third graders, 39 fourth graders, 46 fifth graders and 34 sixth graders.

The main reason our scores plummeted so jaw-droppingly this year: Our peer index -- the group of schools the DoE compares us to -- was changed. In an effort to level the scoring field, the DoE factors in every school’s population of African-American, Latino, Special Needs and ELL (English Language Learner) students; having more such students bumps up one’s score. (Rightfully so, since these populations often lack the advantages and external resources of other groups.) This year, however, we were moved into a much whiter, wealthier, more advantaged cohort. And thereby hangs a tale.

Let us explain: The city says the Neighborhood School is 50% "extra-credit communities." But in the testing grades, the school is 58% extra-credit communities. (As the East Village gentrifies, it makes sense that the more advantaged students are in the youngest grades.) That 8% difference would equal 15 of the 168 tested kids. Those 15 kids are spread out over four grades, and we have two classes per grade, which works out to two children per class. The upshot: we’re not getting credit for two non-white kids per class, which would change our peer index tremendously. In a tiny school, two kids is a big percentage! The difference between 50% and 58% doesn’t sound huge, but it actually works out to a 16% change in one of the major metrics used to determine our peer index. (Percentage of African-American and Latino students accounts for 30% of the peer index rating.)

Equally dramatic: The number of “special needs” students we're being given credit for is significantly lower than the number we actually have in the tested grades. School-wide, we have about 30% such kids, but in the tested grades, it’s almost 35%. That works out to about 8 more kids with Individualized Education Plans (IEPs), specialized programs and services designed to help them succeed. Again, this works out to one or two kids per class. (And percentage of these students accounts for another 30% of the peer index rating.)
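A quick check of that share arithmetic, using only the figures quoted above, lands within a student or two of the counts cited, depending on how the per-grade numbers are rounded; the 30% weights are the advisory's description of the peer-index formula, taken at face value:

# Checking the peer-index share arithmetic with the figures quoted above.
# The 30% weights are the advisory's description of the peer-index formula.

tested = 49 + 39 + 46 + 34      # students in the tested grades: 168
classes = 4 * 2                 # two classes in each of four tested grades

# "Extra-credit" share: credited at 50% school-wide vs. 58% in tested grades.
credited, actual = 0.50, 0.58
uncredited_kids = (actual - credited) * tested

print(round(uncredited_kids, 1))                   # 13.4 students
print(round(uncredited_kids / classes, 1))         # 1.7, i.e. roughly 2 per class
print(round((actual - credited) / credited, 2))    # 0.16: the 16% relative change
                                                   # in a metric worth 30% of the
                                                   # peer index

# Special-needs share: credited at ~30% vs. ~35% actually in the tested grades.
print(round((0.35 - 0.30) * tested, 1))            # 8.4 students, about 1 per class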

As for our students’ actual scores: Median English Language Arts (ELA) proficiency actually went up by .01 points, and median math proficiency went up by 1.3 points! Compared to our new peer group, though, this was unimpressive. We lost almost 7 percentile points on that ELA score, and gained only 4 percentile points on the math. (Our assistant principal, Milo Novelo, winner of the 1995 Children’s Choice Award as most effective teacher, can talk journalists through the DoE’s nearly impenetrably baroque formulas.) This category also decreased in weight from last year, when it counted for 30% of our overall score; now it counts for 25%.

Our scores did go down a bit in ELA progress, the equivalent of four or five fewer children scoring a 3 or 4 on the test. That’s a lot of statistical weight on four or five pairs of small shoulders! Our non-test-obsessed school resolves to work harder, but not to mourn these kids as “failures,” just as we also choose not to do a jig about our students’ stellar math scores, which experienced a six-point bump. We understand it means that eight additional students scored 3s or 4s in math compared to last year. (Which only translated to two percentile points anyway.) As many a nine-year-old has been known to say: Big Whoop. The upshot: We’re talking about the scores of fewer than 10 children, the equivalent of two kids per class getting lower scores than they did the previous year. But any negative effects are hugely magnified when compared to our new and inaccurately and hurtfully assigned peer group.
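To see how a raw score can tick up while the percentile drops, here is a toy comparison with invented peer-group proficiency scores (not the DoE's actual peer horizons):

# Toy illustration: the same school, compared against two different peer
# groups. All peer scores below are invented for illustration.

def percentile(score, peers):
    """Share of peer schools scoring at or below this school, 0-100."""
    return 100.0 * sum(p <= score for p in peers) / len(peers)

old_peers = [2.9, 3.0, 3.1, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7]   # invented
new_peers = [3.3, 3.4, 3.5, 3.5, 3.6, 3.7, 3.7, 3.8, 3.9, 4.0]   # invented, more advantaged

score_last_year, score_this_year = 3.40, 3.41   # the raw score actually rises

print(percentile(score_last_year, old_peers))   # 70.0 against the old cohort
print(percentile(score_this_year, new_peers))   # 20.0 against the new cohort

The school's own number improves, but against a more advantaged cohort its standing collapses, which is the pattern described above.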

So what do parents want?

We parents wish to restate that we are not opposed to assessment; we welcome it (and are happy to discuss our school’s rigorous methods of doing it)! We merely object to excessive, misused standardized testing and arcane formulas that distract us from rich curriculum, cost huge amounts of money and, most importantly of all -- given this administration's fervent desire for meaningful data -- DO NOT PRODUCE RELIABLE DATA. As Professor David Koretz, who teaches educational measurement at the Harvard Graduate School of Education, points out, for years testing experts have warned that results from a single year are highly error-prone, particularly for small groups. The vertigo-inducing grade changes this year, system-wide, cannot be credibly explained by true changes in school performance. (Really, is it possible that ¾ of the city schools that got an F last year improved so much in one year that they deserved an A or B?)

We Neighborhood School parents believe in a holistic, whole-child, multidisciplinary approach. Our school, which is 32.6% white, 27.8% Latino, 22% African-American, 16% Asian-American and 1.5% Native American, with 30% of the kids qualifying for free lunch, is a model of citizenship and community. “Our curriculum encourages problem-solving, emotional and social development, and respect for all kinds of people as well as intellectual curiosity and love of learning,” says Cathy Albisa, PTA co-president. “Challenging, nuanced curricula like ours help children become lifelong problem solvers.”

The Neighborhood School missed scoring a C by .5 point. But that’s not important. What is? The fact that improper use of data hurts our kids, our school, our families. We are now faced with the possible removal of our principal, financial penalties, reorganization, and removal from the Empowerment Zone, in which parents and teachers work together to make the decisions they feel are best for children as learners… even as our students excel and thrive.

We call on the DoE to work with families and educators on assessment measures that truly reflect our values, and our school’s very real accomplishments, as measured by New York State and outside evaluators.


Cash Rewards for Schools Stir Debate
By ELIZABETH GREEN, Staff Reporter of the NY Sun, January 4, 2008

An announcement that top city schools would get cash rewards for winning "A" grades set off a fierce and personal debate yesterday over the direction Mayor Bloomberg has taken the city's schools, with city and Albany officials drawing on their children's experiences to make opposing cases.

The schools chancellor, Joel Klein, had just finished announcing that he would be sending $3.4 million in rewards to schools that scored "A" grades on the report cards the city handed out recently, when Assemblyman Mark Weprin, of Queens, stepped to the podium and condemned the report cards, the letter grades, and the Bloomberg administration's education record.

Mr. Weprin represents the Bayside neighborhood where the school at which Mr. Klein held the event, P.S. 46, the Alley Pond School, is located. The school is getting more than $14,000 in rewards for its "A" grade and a top mark on a qualitative review. Every city school that met those designations, 134 in total, is receiving $30 a child.

Mr. Weprin said the reward was welcome, but he went on to criticize the way the schools are now being run. He said the emphasis on test-taking is growing so strong that children are paying a price in terms of anxiety. "This has been a boon for pediatric psychotherapists in the city," he said.

City officials called Mr. Weprin's argument absurd, citing P.S. 46 as a prime example of the way schools can improve on their report card grades: not by focusing on tests but instead by giving a well-rounded education.

A top aide to Mr. Klein who attended yesterday's event, James Liebman, said his children, who attend public schools in Manhattan, do not mind being tested. "My kids like to be challenged," he said.

The superintendent of the school network to which P.S. 46 belongs, Judith Chin, said some schools in the city are emphasizing test-taking skills too heavily. But she praised P.S. 46 for striking a good balance between teaching a well-rounded curriculum and making sure students are ready for tests.

Visiting a fourth-grade classroom before the announcement, Mr. Klein asked children why their school received an A.

"Because," one child answered, "when the New York State ELA test is coming up, they teach us what methods to use, how to write."

Another boy added: "They do everything to help us with our ELA and all our tests."

Mr. Klein then asked, "Do we do any art in this school?"

The school's art teacher, Rita Rothenberg, said she works with most of the school's students once a week, though that particular class of fourth-graders does not visit her at the moment. And all students at P.S. 46 attend an enrichment program that includes activities such as dancing, photography, and a music class called "So You Think You Can Sing?"

The co-president of the school's Parent Teacher Association, Donna Benkert, said the program would not exist without the help of the PTA, which raises between $20,000 and $25,000 in donations each school year.

The president of the city teachers union, Randi Weingarten, said the new cash rewards unfairly neglect schools that received low grades.

"A" schools are receiving rewards on the condition that they share ideas and best practices with poorer performers. Ms. Weingarten said that is not enough. "What support and resources are being directed to schools that are falling behind?" she asked.