Monday, June 21, 2010

A Potpourri of News About Marketing & Social Media, Savings from Telephonic Disease Management and Methodologic Problems in the PCMH Pilot Evaluations

Today brings a grab bag of worthy news for Disease Management Care Blog readers. Topics include social media, old-fashioned telephonic disease management and the Patient Centered Medical Home.

First off, did you know the population health/disease management company Healthways has a Facebook page devoted to its elder wellness program SilverSneakers? The Disease Management Care Blog did also, but what it didn't know is that, according to this press release, uptake lagged until a professional marketing firm began to promote it.

The DMCB insight: while social media is supposed to be a "viral," bottom-up democratic phenomenon, the professionals have moved in.

Secondly, there are two research findings on the topic of old-fashioned "telephonic" disease management. Does it or doesn't it improve quality? Does it save money?

In this early-release publication in the prestigious Journal of General Internal Medicine (known to publish papers with high methodologic rigor), Daren Anderson et al. were unable to demonstrate in a prospective randomized clinical trial that a telephonic disease management program, versus usual care, resulted in better blood glucose control. However, this was a single-site study involving a community health center serving disadvantaged patients. Patients were enrolled without regard to baseline diabetes control, the intervention group had a lower A1c at the start of the study, and there was a significant prevalence of depression in both groups. In addition, it's possible the intervention changed provider behavior for the control patients as well.

The second study, by JL Rosenzweig and MS Taitel, was released only in abstract form (abstract P3-715) from this year's Endocrine Society meeting. This research involved enrollees from a Medicare Advantage plan that focused its telephone diabetes disease management program on high-risk patients. This was also a randomized prospective study, and it showed the intervention patients....

"....decreased ... all-cause total medical costs by $984,870 per thousand members per year (PTMPY), compared to a $4,547,065 PTMPY increase in the Control Group (p≤.05). All clinical quality measures significantly improved from baseline (p≤.05) including A1C, LDL ,and microalbumin testing, retinal exams, foot exams ACE or ARB use,and aspirin use.

The DMCB insight: as the number of publications expands, the science matures and other settings and patients are explored, a mix of positive and negative studies is not unusual. However, if that Medicare Advantage study is correct, that's some serious money being saved. Looks like the taxpayers were getting their money's worth after all in this particular Medicare Advantage plan. The DMCB is looking forward to seeing more details in a peer-reviewed publication soon.

Thirdly, in this study, the authors surveyed the larger PCMH pilots and provide a useful national-level overview. For example, per member per month (PMPM) payments for eligible patients range from $0.50 to $9, plus pay-for-performance (P4P) bonuses and other payments to fund practice transformation or other care strategies, such as embedded nurse care managers or quality improvement programs.
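
To put those PMPM figures in perspective, here's a minimal sketch of the annual arithmetic for a hypothetical practice panel; the panel size and bonus amount are assumptions for illustration, not numbers from the survey:

```python
# Minimal sketch of annual PCMH care-management revenue: PMPM x panel x 12, plus any bonus.
# The survey reports a $0.50-$9 PMPM range; the panel size and P4P bonus below are made up.

def annual_pcmh_payment(eligible_patients: int, pmpm_rate: float, p4p_bonus: float = 0.0) -> float:
    """Annual payment to a practice for its eligible panel."""
    return eligible_patients * pmpm_rate * 12 + p4p_bonus

# Hypothetical 2,000-patient panel at the low and high ends of the reported range
low = annual_pcmh_payment(2_000, 0.50)            # $12,000 per year
high = annual_pcmh_payment(2_000, 9.00, 25_000)   # $241,000 per year with a P4P bonus
print(f"${low:,.0f} to ${high:,.0f} per year")
```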

But particularly worrisome was this finding:

"...the evaluation plan for nearly 60% of demonstrations had not yet been devised in detail at the time of the interviews. Even among those with more developed plans, the specification of which variables would be measured, along with surveys to be used, was uncommon."

What's more....

"The heterogeneity in program design suggests an urgent need to incorporate evaluation in all programs' designs. Less than half of the programs had well specified evaluation plans that were designed in conjunction with the pilot. In most cases, although evaluation is considered important, the evaluation designs had not been pre-specified, thus necessitating a reliance on existing data, and funding had not been secured to support a robust evaluation. Furthermore, many of the pilots do not identify adequate control groups against which to compare the intervention practices."

The DMCB insight here is that if the PCMH pilots are not careful, they could end up being condemned in a CBO report saying there's no good evidence that their intervention works. Based on the two disease management studies described above, it's safe to say that the disease management industry has learned its lesson: valid comparator groups are being used to accurately assess program impact. The industry has even developed a state-of-the-art evaluation guide (available for free, by the way) that addresses many of the methodologic issues in program evaluation. PCMH pilot managers should take note.
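
To make the "valid comparator group" point concrete, here's a minimal, hypothetical sketch of one simple approach an evaluator might use: a difference-in-differences comparison of cost trends in pilot versus comparison practices. The cost figures are invented for illustration only and are not from any of the pilots or the evaluation guide:

```python
# Minimal sketch of a difference-in-differences estimate: the change observed in the
# pilot group minus the change observed in a comparison group over the same period.
# All PMPM cost values below are hypothetical.

def difference_in_differences(pilot_pre: float, pilot_post: float,
                              comparison_pre: float, comparison_post: float) -> float:
    """Pilot-group change minus comparison-group change."""
    return (pilot_post - pilot_pre) - (comparison_post - comparison_pre)

# Hypothetical PMPM medical costs before and after a PCMH pilot
effect = difference_in_differences(pilot_pre=410.0, pilot_post=425.0,
                                   comparison_pre=405.0, comparison_post=445.0)
print(f"Estimated PMPM effect: {effect:+.0f}")  # -25: costs rose $25 less in the pilot
```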
