Western New York’s hospitals ranked average or worse in a new but controversial quality rating released Wednesday by the federal government.
The rating – ranging from one to five stars – is an overall score based on 64 hospital measures the Centers for Medicare and Medicaid Services publishes on infections, complications, deaths and timeliness of care, among other categories.
It works like a student’s grade-point average, summarizing the dozens of quality measure scores into a composite grade and allowing consumers to easily compare one hospital against another. It has also drawn criticism from the hospital industry, which views the methodology behind the rating as seriously flawed.
Major teaching hospitals and hospital systems in Buffalo fared poorly, including 2-star ratings for Kaleida Health, Erie County Medical Center and Mercy Hospital.
Of 17 hospitals rated in the region, 11 received two stars and six facilities received three stars. None received four or five stars.
Nationally, in ratings of 4,599 hospitals, only 2.2 percent – 102 facilities – received five stars. Among the others, 20.3 percent received four stars, 38.5 percent got three stars, 15.7 percent received two stars and 2.9 percent received one star.
Slightly more than 20 percent of the hospitals did not report enough information to get an overall score.
Only one hospital in New York State received five stars: the Hospital for Special Surgery in New York City. Twelve facilities received four stars, including two in Rochester, Highland and Unity. Among the rest, 35 hospitals received one star, 58 received two stars and 49 received three stars.
The overall rating reflects how well the hospitals did on the 64 measures, which are publicly available and updated regularly.
For instance, in the latest listing, Erie County Medical Center performed better than the national benchmark for catheter-associated urinary tract infections, but had a longer wait before emergency room patients left from their visits – 237 minutes, compared with the national average of 174 minutes.
Kaleida Health scored worse than the national benchmark for surgical site infections, but better than the national benchmark for central-line bloodstream infections.
Mercy Hospital, meanwhile, had a worse stroke death rate than the national rate, but its rate of readmission after a discharge was better than the national rate.
The ratings – developed in an effort to improve quality and make more information about hospital care available to the public – are available at www.medicare.gov/hospitalcompare/search.html.
“We are updating the star ratings on the Hospital Compare website to help millions of patients and their families learn about the quality of hospitals, compare facilities in their area side-by-side, and ask important questions about care quality when visiting a hospital or other health care provider,” the agency said in a statement.
Among other complaints, critics noted that teaching hospitals, which tend to handle patients with more complex conditions, received lower ratings on average. In addition, the overall scores do not take into account socioeconomic differences in patient populations.
“The plan was to originally release the ratings in April, but they were delayed for good reason. They may not be based on the best criteria,” said John Bartimole, president of the Healthcare Association of Western New York, which represents many hospitals in the area.
That delay, intended to allow further tweaks to the rating system, came after an appeal to the Centers for Medicare and Medicaid Services by dozens of members of the Senate and House.
The American Hospital Association expressed disappointment that the agency released the ratings despite continuing concern about the ranking.
“The new CMS star ratings program is confusing for patients and families trying to choose the best hospital to meet their health care needs. Health care consumers making critical decisions about their care cannot be expected to rely on a rating system that raises far more questions than answers,” Rick Pollack, president and chief executive officer of the hospital advocacy group, said in a statement.
The Association of American Medical Colleges contended the ratings are unfair because the agency used more measures to calculate scores for teaching hospitals than it used for some hospitals that treat patients with less complex conditions.
“This new system could have very serious consequences for patients who are choosing where to go for treatment, potentially driving them away from some of the best hospitals for their conditions,” Dr. Darrell Kirch, president and chief executive officer of the association, said in a statement.
The organization’s members include the University at Buffalo Jacobs School of Medicine and Biomedical Sciences, which bases its doctor-training programs at the city’s most prestigious hospitals.
The Centers for Medicare and Medicaid defended the rating system, saying it was developed with input from experts and the public, will be updated quarterly, and will incorporate new measures in the future.
It also noted that the average star rating for teaching hospitals, 2.9, was only slightly lower than that for non-teaching hospitals, 3.1, and that some teaching hospitals ranked highly. The agency also found that, under its definition of a safety net hospital, facilities treating poorer, more vulnerable patients had an average score of 2.9, similar to the 3.1 average for non-safety net hospitals.
“CMS’s analysis shows that all types of hospitals have both high performing and low performing hospitals. In other words, hospitals of all types are capable of performing well on star ratings and also have opportunities for improvement,” the agency said in a statement.
Among the reaction here, officials at Catholic Health voiced support for transparency and characterized the rating as imperfect but a tool consumers can use to make apples-to-apples comparisons of hospitals.
They also talked up the quality of their institutions, citing good patient satisfaction surveys and recent research that suggests an association between patient satisfaction and the quality and efficiency of care.
“The methodology can always be improved, but it doesn’t really matter if your focus is on all the things you need to do to improve quality,” said John Kane, vice president of quality and patient safety. “The rules are the rules. You can never make them perfect. You have to concentrate on the game. We feel we are, and you can’t lose if you do that.”
Mark Sullivan, executive vice president and chief operating officer, also noted the importance of patient satisfaction and how patients’ experiences may correlate with quality.
“If you focus on patient experience, those other measures tend to do better,” he said.
Kaleida Health consists of four hospitals, including Buffalo General Medical Center, but the government rates it as one merged institution.
“We are pleased, but certainly not satisfied. There is still room for improvement,” Dr. David Hughes, chief medical officer, said in a statement.
He said Kaleida Health in the past 24 months overhauled its quality program, making significant changes in policy, processes and personnel. He cited progress in decreasing infection rates, falls and readmission rates.
“Like many public report cards, the methodology for calculating an individual hospital score has limitations and does not fully reflect a hospital’s comprehensive efforts to improve quality and patient safety. The new CMS ratings are misleading because they don’t take into account important differences in patient populations and the complexity of conditions hospitals treat,” Hughes said.