Thursday, September 4, 2008
The DMCB Responds to Dr. Marcia Angell's JAMA Editorial About Pharma Research & the Creation of an Institute: Open Source is Better
In the September 3 issue of JAMA, former New England Journal of Medicine (NEJM) Editor-in-Chief Marcia Angell editorialized about the pharmaceutical industry's effect on U.S. medical care, with a special emphasis on published research. After ticking off the usual litany of recent conflict-of-interest author sins, including ghost-writing, stock options and speaker fees, Dr. Angell turned the spotlight on the published research itself, rendered suspect by spin, bias and data suppression. She concluded that physicians can no longer trust peer-reviewed journals. Then she made a curious, if understandable, recommendation: the creation of an ‘institute for prescription drug trials.’
Hmm. The Disease Management Care Blog suspects this is akin to proposals for the Feds to establish a center for comparative effectiveness research. The DMCB knows many physicians are distrustful of blunt, oversized & unwieldy ‘solutions’ designed by otherwise well-meaning experts. Think FDA and you’ll understand the caution over the merits of more federally owned or sponsored institutes/centers.
The DMCB would like to offer a contrarian viewpoint on how to get from here (a broken system) to there (a higher likelihood of discerning the truth in clinical research trials).
Rather than ‘compacting’ the storage, analysis, interpretation and reporting of drug trial data into a single institute, the DMCB recommends open sourcing it. In other words, following publication, post the original data from the trials in a downloadable flat file on the internet. Let anyone access it and let anyone offer up a similar or different interpretation.
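To make ‘flat file’ concrete, here is a minimal sketch, in Python, of how any interested reader could pull down such a posted, de-identified participant-level file and take a first look. The URL, file layout and column names are the DMCB's hypothetical inventions, not anything a journal or sponsor currently offers.

import csv
import urllib.request

# Hypothetical location of a posted, de-identified participant-level file.
DATA_URL = "http://example.org/trials/XYZ-123/participant_level.csv"

with urllib.request.urlopen(DATA_URL) as response:
    text = response.read().decode("utf-8")
rows = list(csv.DictReader(text.splitlines()))

# Tally participants and serious adverse events by treatment arm
# (assumes 'arm' and 'serious_adverse_event' columns exist in the file).
by_arm = {}
for row in rows:
    n, events = by_arm.get(row["arm"], (0, 0))
    by_arm[row["arm"]] = (n + 1, events + int(row["serious_adverse_event"]))

for arm, (n, events) in sorted(by_arm.items()):
    print("%s: %d participants, %d serious adverse events" % (arm, n, events))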
Advantages include:
Far more experts, professional and amateur alike, will be able to openly and far more thoroughly vet the researchers’ original conclusions. The expertise of volunteer crowds has already been put to good use by Procter & Gamble, IBM and NASA. Think academics or amateur statisticians with analytic downtime and little other hope of getting into JAMA or the NEJM won’t jump at the chance to be the one who finds an otherwise missed, clinically and statistically significant correlation (see the sketch after this list)? Think again.
This is a far more efficient – i.e., speedier, cheaper and more reliable – approach than relying on an ‘institute.’ Given the track record of similarly contrived academic, governmental and bureaucratic central planning, the DMCB would rather rely on the increasing sophistication of today’s information markets to sort fact from spin. It’ll cost little and can start tomorrow. Make that today.
Allegations of flawed or biased conclusions, intentional or not, will be mitigated. This is a far better way for all parties, including the print journals, to promote transparency and demonstrate a commitment to getting to the truth. The DMCB suspects consumers will be more likely to trust the public domain than an FDA-clone institute. Think juries in New Jersey or Texas won’t pay attention? Well, maybe not, but it can’t hurt. As for the physician community, we’ve been accused of being saps when it comes to pharma’s trickery. Open source the data and there are many of us who will show you otherwise.
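As promised above, here is a sketch of the kind of re-analysis a volunteer statistician might run against a posted file: a simple test of whether an adverse-event difference between arms reaches statistical significance. The counts below are invented for illustration; the only real ingredient is the Fisher’s exact test from the widely available SciPy library.

from scipy.stats import fisher_exact

# Counts tallied from the downloaded file (these particular numbers are
# made up for illustration, not taken from any actual trial).
drug_events, drug_no_events = 14, 486
placebo_events, placebo_no_events = 5, 495

odds_ratio, p_value = fisher_exact([[drug_events, drug_no_events],
                                    [placebo_events, placebo_no_events]])
print("odds ratio %.2f, two-sided p = %.4f" % (odds_ratio, p_value))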
Criticisms? There are some, but none seem insurmountable to the naïve DMCB:
Research participants’ privacy will be compromised by open source posting. The DMCB believes a) patient identifiers can be reasonably masked (a de-identification sketch follows this list), b) bullet-proof privacy matters less to increasing numbers of people in this age of YouTube, Facebook and MySpace, c) it’s not as if other corners of health care offer bulletproof privacy anyway, d) the remote possibility of a privacy breach can be disclosed in the informed consent and e) for most potential research subjects, the satisfaction of helping advance scientific knowledge for the benefit of mankind will continue to outweigh the risks.
There is little evidence that open sourcing will perform better than the current system of closed peer review buttressed by an ‘institute.’ In fact, it may be worse. While open source Wikipedia has a reputation for critical lapses in accuracy, the DMCB believes a) repeated and independent analyses will eventually, if not always quickly, triangulate on the truth and b) experts – many of whom will be unexpected – will emerge as trusted third-party super-vetters. And, given their track record, can we really trust the current ‘expert class’ of editors and academics to do a better job with their information monopoly? Their preference for the status quo, laced with some institute/center ‘steroids,’ looks like an unseemly stab at job security. A quasi-governmental business model is simply out of touch with our distributed knowledge society. Last but not least, the editors’ dismay is undercut by the highly remunerative drug ads that appear in the same print journals whose editorials tut-tut about the underlying research methods.
Research sponsors will never give their intellectual property away. Maybe that kind of attitude explains why much of their market capitalization has vaporized. In contrast, IBM and P&G, after struggling with the same trade-secret issues, have witnessed an increase in the quality of their products and the price of their stock. Ironically, one solution is to ‘copyleft’ (vs. copyright) the data, which would impede competitors’ ability to take unfair advantage. While there’s a risk, the double-dog-secret closed information system has imploded too many times for too many otherwise pretty good drugs.
Print media will be undercut. These grand old dames may see circulation drop, especially if data posting and the original interpretation skip the hassle of publication altogether. The DMCB doesn’t think so, because print journals still command such a strong bullpen of reviewers.
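On the privacy point above, here is a minimal sketch of the kind of masking a sponsor could apply before posting. Every file and field name is hypothetical, and a real release would follow formal HIPAA-style de-identification rules rather than this toy example.

import csv
import hashlib

def masked(row, salt="per-trial-secret"):
    out = dict(row)
    # Replace the internal subject number with a salted one-way hash.
    out["subject_id"] = hashlib.sha256((salt + row["subject_id"]).encode()).hexdigest()[:10]
    # Drop direct identifiers outright.
    for field in ("name", "address", "date_of_birth"):
        out.pop(field, None)
    # Coarsen quasi-identifiers, e.g. exact age into a decade band.
    out["age_band"] = "%d0s" % (int(row["age"]) // 10)
    out.pop("age", None)
    return out

with open("raw_trial_export.csv") as src, open("public_release.csv", "w", newline="") as dst:
    rows = [masked(r) for r in csv.DictReader(src)]
    writer = csv.DictWriter(dst, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)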
The DMCB offers some possible wrinkles to think about:
This could be piloted. Compare open source vetting, closed-system manuscript review and a combination of the two, and see which performs better.
What about the option of posting the data with a preliminary introduction and summary of results, without any interpretation, and offering a prize for the ‘best’ analysis?