With all due respect, and at the risk of turning the dead horse into a rug, the meta-analysis published by Carnaby-Mann and Crary used excessively lax inclusion criteria, rendering the results very difficult, if not impossible, to generalize to the clinical setting. Here is one set of examples to support this comment; there are several others that the discussion group certainly does not want to hear.
A meta-analysis is only as good as its criteria for including studies, and those criteria need to be rigorous.
The use of the Physiotherapy Evidence Database (PEDro) scale to judge evidence quality, with a cutoff score of 4, allowed studies such as Freed et al. (2001) to be included in the meta-analysis (we all know which study that was and all of its problems; it is only one example from this meta-analysis's included studies). Freed et al. (2001), as we all are aware, a) violated intention to treat, with 10% of their patients disappearing from the data analysis for unexplained reasons or because they could not pay for the treatment after insurance terminated coverage (PEDro item 9); b) used judges who were not masked to patient assignment (PEDro items 6 and 7); c) selectively assigned patients to electrical stimulation "because they were referred for the study" (PEDro items 3 and 5); d) compared dissimilar groups of heterogeneous patients and excluded patients for unexplained reasons (PEDro item 4); and e) did not randomly assign patients to groups (PEDro item 2). In addition, the study was conducted by the owner of the patent for the technology under investigation, a serious conflict of interest that the PEDro does not even acknowledge as a criterion for evidence quality assessment. These serious flaws notwithstanding, Carnaby-Mann and Crary included this study, and several others of poor evidence quality, in their meta-analysis. Given the inclusion of poor quality studies, the results must be considered weak and, at best, ungeneralizable to the clinical setting.
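To make the arithmetic behind that cutoff concrete, here is a minimal sketch (in Python, with hypothetical item scores chosen purely for illustration, not the actual PEDro ratings assigned to Freed et al. (2001) or any other included study). It shows how a trial can fail randomization, allocation concealment, every blinding item, and intention-to-treat analysis and still reach the score of 4 used as the inclusion threshold, because PEDro items 2 through 11 each contribute a single point regardless of how critical the item is.

```python
# Sketch of a PEDro-style tally against a cutoff of 4.
# Item scores below are HYPOTHETICAL and illustrative only.

PEDRO_CUTOFF = 4  # inclusion threshold used in the meta-analysis

# PEDro items 2-11 each score one point when satisfied (item 1 is not scored).
hypothetical_items = {
    2: False,   # random allocation
    3: False,   # concealed allocation
    4: True,    # baseline comparability
    5: False,   # blinding of subjects
    6: False,   # blinding of therapists
    7: False,   # blinding of assessors
    8: True,    # adequate follow-up
    9: False,   # intention-to-treat analysis
    10: True,   # between-group statistical comparison
    11: True,   # point measures and variability reported
}

score = sum(hypothetical_items.values())
print(f"PEDro score: {score}/10, meets cutoff: {score >= PEDRO_CUTOFF}")
# Output: PEDro score: 4/10, meets cutoff: True
```

In other words, a cutoff of 4 can admit a trial that misses every item bearing on randomization and masking, which is exactly the kind of study this comment argues should not anchor conclusions about treatment efficacy.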
Meta-analysis is an effort to use the best literature available to answer important questions about cause and effect. This study did not accomplish that goal. The lack of good quality studies does not justify the use of poor quality studies to make important decisions about treatment effectiveness or efficacy. It is always possible for a faction to cherry-pick results from various studies to support its point of view; that is not what meta-analysis is about. That kind of selective reading of the evidence is why Vioxx was such a colossal failure.
James L. Coyle
University of Pittsburgh