Infection Rate Reporting Lacks Validation, for Now
A team of public health officials plans to verify that what hospitals are reporting is accurate
May 12, 2010 -- When Oregon officials release the state’s first-ever public report on hospital-acquired infections later this month, chances are it won’t represent the true number of cases. In fact, it probably won’t be even close.
A preliminary review of more than 70 medical records at four central Oregon hospitals, along with a report from Connecticut, where teams conducted a much larger validation process, suggests that the number of central line bloodstream infections reported by hospitals represented less than half of actual cases.
No one knows for sure whether Oregon hospitals have been submitting accurate reports, but advisors to the Oregon healthcare-acquired infections program are planning a rigorous audit of hospitals’ bloodstream infection reports, one of three types of infections hospitals have been reporting to the state since January 2009.
The audit, conducted by a small staff of public health officials, will include reviewing lists of blood cultures and conducting on-site medical record reviews several times a month, with a report due by September 2011.
“There are several reasons to believe that what’s coming in is not perfect,” said Dr. Paul Cieslak, Department of Human Services communicable disease program manager who serves on the advisory committee and will head the validation team. “I have some hope that by doing this we will get a better sense of consistency.”
In Connecticut, one of only a few states to conduct a similar review, public health officials reviewed 476 medical records and found 49 cases that met the definition of a bloodstream infection, but just 24 (49 percent) had been reported.
Contributing to the inconsistencies was confusion over the definition of a central line, a misunderstanding of the guidelines pertaining to lab tests and confusion about what organisms the Centers for Disease Control recognizes as pathogens, Cieslak said.
State officials found similar deficiencies in reports of serious medical errors sent to the Oregon Patient Safety Commission.
At an advisory board meeting held May 12, hospital representatives said staffers do their best to report every infection required by state law. Several reported having trouble filling positions for infection control specialists, often forcing them to train individuals themselves.
Problems also persist with the complexity of some of the reports. Surgical site infections for knee surgery, for instance, require a hospital staff member to enter 57 data elements per case even though the public report only includes three of these elements. That’s because the program requires hospitals to also submit their infection reports to the National Healthcare Safety Network, which sets the reporting parameters.
“The hurdle for us on reporting is that a lot is manual,” said Julie Koch, director of quality resources for Good Samaritan Regional Medical Center in Corvallis. “We have to pull from multiple information systems, and there’s no way to make it electronic.”
Jodi Joyce, quality and patient safety vice president for Legacy Health System, described a similar issue. Joyce said the stack of medical records required to fill out an infection report can be more than three feet high, because someone has to manually search for each data element, often having to decipher difficult handwriting.
“If we only had to submit the three data elements being used here we could report on all of the surgical site infections,” Joyce said. “But because we’re being required to submit 57 elements, that takes so much more time.”
So far, officials at the Office for Oregon Health Policy and Research (OHPR), who manage the program, say they plan to stick with the full NHSN reporting guidelines. One reason, said Dee Dee Vallier, a consumer representative on the committee, is so Oregon hospitals can effectively compare themselves to facilities in other states.
In addition, agitation is growing over how long OHPR is taking to release the much-anticipated public report. Ron Jamtgaard, the other consumer advocate on the committee, questioned the usefulness of data that would be at least five months old by the time it is released.
“Each time we try to expand the number of infections, the pushback has been fierce,” said Jamtgaard. “It’s now the middle of May and the annual report at this point is very old. If you tell me some hospital had an infection rate two years ago, it doesn’t tell us that much. Single points on a graph aren’t that useful to compare either. Public reporting is only useful if it’s timely.”