The DMCB goes through the details!
Knowing that the news media's word cannot be trusted, the DMCB steps up with a look at the details.
To summarize, both are observational studies that looked backward in time at what happened after doctors' offices started to install EHRs and after Medicare started its Hospital Compare website. It has been widely expected that EHRs would reduce duplicative and unnecessary x-ray studies and that Hospital Compare would "shame" hospitals into higher quality of care, which would translate into lower death rates. These studies suggest that may not be the case.
While observational studies are very useful at looking at patterns and associations, they're less able to establish or rule out a direct cause-and-effect relationship. Think about the "association" between the presence of white hair and heart attack, which has no cause-and-effect relationship. Compare that to a similar "association" between high blood cholesterol levels and heart attack, which does have a cause-and-effect relationship. Observational studies can't distinguish between the two. As a result, they are often counterintuitive, prone to overinterpretation, require a lot of statistical jiggering and always leave some room for doubt.
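For readers who like to see the mechanics, here is a minimal simulation (in Python, with entirely made-up numbers) of how a confounder like age can manufacture the white hair "association" without any cause-and-effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
age = rng.uniform(30, 90, n)

# In this toy model, age drives both white hair and heart attack risk;
# hair color itself has no causal effect on heart attacks.
white_hair = rng.random(n) < (age - 30) / 60
heart_attack = rng.random(n) < 0.002 * (age - 30)

print(f"Heart attack rate, white hair:    {heart_attack[white_hair].mean():.3f}")
print(f"Heart attack rate, no white hair: {heart_attack[~white_hair].mean():.3f}")
# The two rates differ sharply, yet dyeing anyone's hair would change
# nothing: the association is real, the causation is not.
```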
So, with those caveats, proceed with caution...
Both studies are in the latest issue of Health Affairs. The two studies' abstracts are available here and here.
The EHR study looked at the National Center for Health Statistics' 2008 "National Ambulatory Medical Care Survey." This database contained information on 28,741 encounters with 1,187 primary care and specialist physicians. The survey was detailed enough that the researchers were able to tell which encounters resulted in any "imaging study." That term collectively refers to an x-ray, ultrasound, MRI or CAT scan.
The survey also told the researchers if the ordering physician used an EHR with electronic access to 1) past high-dollar test results (a typed report that describes a radiologist's interpretation of what the x-ray, CAT or MRI looks like) or 2) the images themselves ("click" and the x-ray itself appears on-screen).
The authors next used regression modeling that yielded "odds ratios" to ascertain if there was any statistical association between the likelihood of physicians ordering an imaging study and access to 1) the results or 2) the images. Regression modeling allowed the authors to statistically account for, or isolate, any imbalances in patient or physician characteristics that could also tilt imaging frequency in one direction or another.
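For the statistically curious, the general shape of that analysis looks something like the sketch below. The file and column names are hypothetical stand-ins, not the authors' actual dataset or model specification; the point is simply that exponentiating a logistic regression's coefficients yields adjusted odds ratios.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical encounter-level extract: one row per visit.
df = pd.read_csv("encounters.csv")

# Logistic regression of "was an imaging study ordered?" on EHR access,
# adjusting for patient and practice characteristics.
model = smf.logit(
    "imaging_ordered ~ ehr_results_access + ehr_image_access"
    " + patient_age + C(insurance) + C(specialty)",
    data=df,
).fit()

# Exponentiated coefficients are the adjusted odds ratios.
print(np.exp(model.params))
```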
The results? The authors knew something was up when they saw the raw data: physicians with EHR access to results or the images were MORE likely to order an imaging study. 18% of encounters with access resulted in an imaging study, versus 12.9% of visits without access.
The association held up statistically when regression accounted for and statistically neutralized known patient (for example, age, gender or type of insurance) and practice (for example, specialty, private practice or income source) factors. The odds ratios for test ordering were 1.71 with access to the results and 1.78 with access to the images. In other words, the odds of ordering an imaging test were increased by roughly 70%-80% if an EHR was in the mix.
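As a back-of-the-envelope check, here's the arithmetic that turns those raw percentages into an odds ratio. This one is unadjusted, so it lands below the 1.71 and 1.78 figures, which reflect the regression's balancing of patient and practice factors:

```python
p_with, p_without = 0.180, 0.129             # raw imaging rates above
odds_with = p_with / (1 - p_with)            # ~0.220
odds_without = p_without / (1 - p_without)   # ~0.148
print(f"Unadjusted odds ratio: {odds_with / odds_without:.2f}")  # ~1.48
```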
The DMCB can't say it's surprised. One business proposition of commercially available EHRs is that their configuration enables test ordering by making it a few "clicks" away with all the necessary documentation. The authors point out that decision support at the point of care may blunt the physicians' increased tendency to use their EHRs to over-order tests. The DMCB points out that if the docs don't figure this out soon, radiology benefit management programs will continue their relentless expansion.
The second study looked at Medicare's much ballyhooed Hospital Compare. Readers may recall that this is Medicare's web-based system that publicly compares hospitals on their quality of care for the key "big three" high-frequency and high-expense conditions of heart attack, heart failure and pneumonia. The reporting system was implemented in 2005 and, if it worked as intended, it should have ultimately led to a decrease in national death rates for these three conditions.
Not exactly.
The authors examined approximately 18 million Medicare admissions and death rates for the "big three" conditions as well as stroke, intestinal bleeding and hip fracture from 2000-2008. Readmissions and patients who were enrolled in Medicare Advantage at any time were excluded from the analysis. The authors used a "time series design," "nonlinear quadratic trends" and regression modeling, with a comparison to the parallel trends in the non-Hospital Compare conditions of stroke, hemorrhage and hip fracture. In case you're interested, risk adjustment was used to account for patient severity.
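For the methods-minded, an interrupted time series of this general flavor can be sketched as follows. The column names are hypothetical and the authors' real specification (nonlinear quadratic trends, risk adjustment) was more elaborate; the idea is to test whether the reported conditions' mortality trend bent after 2005 more than the control conditions' trend did.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical yearly mortality rates, one row per condition per year;
# 'reported' flags the big-three conditions vs. stroke/bleeding/hip controls.
df = pd.read_csv("mortality_by_year.csv")
df["post"] = (df["year"] >= 2005).astype(int)         # after Hospital Compare
df["time_since"] = (df["year"] - 2005).clip(lower=0)  # years since launch

model = smf.ols(
    "death_rate ~ year + post + time_since"
    " + reported + reported:post + reported:time_since",
    data=df,
).fit()

# A significant reported:time_since coefficient would say the reported
# conditions' downward trend changed after 2005 relative to the controls'.
print(model.summary())
```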
The authors found that while there was a drop in death rates from 2000-2008, the overall trends (except for heart failure) were well established before the implementation of Hospital Compare and simply continued their downward course. The overall adjusted relative risks for heart attack, heart failure and pneumonia were 0.97, 0.92 (significant) and 1.01, respectively. When the trends were adjusted and compared to the other three non-reported diagnoses, there appeared to be a slightly increased relative risk for pneumonia at 1.07. Last but not least, there was no evidence in the analysis that patients tended to shift toward any of the higher performing hospitals.
The DMCB can't say it's surprised here either. When patients are being hospitalized, they're not about to check with Hospital Compare first, and the hospitals and their doctors were already working hard to optimize their care protocols.
But, based on the caveats described above, these studies are not conclusive. Time will tell and, alas, more studies are needed before we know whether we're getting our money's worth from EHRs and Hospital Compare.