Despite their recidivism, one would hope that Schering-Plough and Merck would realize that, yet again, they have not chosen an appropriate path. Albert Einstein is credited with defining insanity as:
Insanity: doing the same thing over and over again and expecting different results.
If everything is against you, bully your critics. If the critics aren't fooled, argue "data quality". I'll discuss the discreditable "data quality" defence and the statistics of data quality in clinical trials later. Below I have extracted the various data-quality excuses put forward by Schering-Plough. They are unlikely to stand up to any form of serious scrutiny.
The full WSJ article (link below) is worth a read. In the meantime, the executives involved may want to listen to King Crimson's 1969 song 'Epitaph'.
Epitaph (YouTube here or here, full lyrics here)
Knowledge is a deadly friend
When no one sets the rules.
The fate of all mankind I see
Is in the hands of fools.
Confusion will be my epitaph.
As I crawl a cracked and broken path
If we make it we can all sit back
And laugh.
But I fear tomorrow I'll be crying,
Yes I fear tomorrow I'll be crying.
TRIAL AND ERROR: Delays in Drug's Test Fuel Wider Data Debate
By RON WINSLOW and SARAH RUBENSTEIN
Extracts from The Wall Street Journal, March 24, 2008; Page A1. Link to full article
"The firms said they had merely been trying to correct irregular data."
The companies brought in a second lab to compete with the original research team in producing more accurate results.
"The companies say they didn't peek at the results or know Vytorin had failed in the study until very recently".
"What we were trying to do was to improve...the precision and the accuracy of the data so that at the end, the results would be credible," says Enrico Veltri, Schering-Plough group vice president of global clinical development
They also emphasize that early data checks turned up another problem they found even more troubling: missing or "implausible" data that the companies have previously cited as the reason for the long delay in reporting the findings. In some cases they were concerned about wide fluctuations in readings that were supposed to be precise. Other researchers say variation in readings is inevitable in most imaging studies, and that enrolling enough randomly assigned patients spreads any problems among both groups to avoid affecting the overall results.
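The last point — that randomization spreads measurement problems across both arms, leaving the between-group comparison unbiased — can be illustrated with a small simulation. All numbers below are hypothetical, chosen only for illustration; they are not the ENHANCE trial's actual values:

```python
import random
import statistics

random.seed(0)

# Hypothetical setup: the true mean change in artery-wall thickness (mm)
# is identical in both randomized arms, but every reading carries both
# patient-to-patient variation and imaging measurement noise.
n = 2000            # patients per arm (illustrative)
true_change = 0.011  # assumed true mean change, same in both arms

def simulate_arm(n, noise_sd):
    # reading = true change + individual variation + measurement noise
    return [true_change + random.gauss(0, 0.01) + random.gauss(0, noise_sd)
            for _ in range(n)]

treated = simulate_arm(n, noise_sd=0.02)
control = simulate_arm(n, noise_sd=0.02)

# Because the noisy readings fall on both arms alike, the between-arm
# difference stays centered on zero; noise widens the spread but does
# not tilt the comparison toward either arm.
diff = statistics.mean(treated) - statistics.mean(control)
print(f"between-arm difference: {diff:+.4f} mm")
```

This is precisely why "noisy data" is a weak excuse for a null result: the noise inflates the variance of the estimate, but under randomization it cannot manufacture or hide a treatment effect in expectation.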
Early in 2006, the companies' committee proposed a different approach to reading the still-blinded data, and pitted Dr. Kastelein's lab against an outside research team to see whether one would be more accurate. There was no meaningful difference. In January 2007, an independent consultant told the companies that the quality of the Enhance data was similar to what was found in other comparable trials.
But company officials still weren't satisfied. They say they kept exploring different ways to eliminate wayward readings and hone the study's precision. "It's very atypical for a trial to go through this sort of scrutiny," says Allen Taylor, chief of cardiology service at Walter Reed Army Medical Center, Washington, D.C., and an expert in imaging of neck arteries.
The companies defend the effort. "It wasn't that the study looked like it was totally inadequate," says Merck's Dr. Musliner. "The more you can reduce your variability, the greater your chances of showing the significance of smaller differences."
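Dr. Musliner's point about variability and significance is, on its own, standard statistics: for a fixed true difference and sample size, lower measurement variability means higher power. A quick sketch using the normal-approximation power formula for a two-sample test (the effect size, standard deviations, and per-arm sample size below are hypothetical, picked only to show the trend):

```python
import math

def power(delta, sd, n, alpha_z=1.96):
    """Approximate power of a two-sample z-test at two-sided alpha=0.05.

    delta : assumed true between-arm difference
    sd    : per-patient standard deviation of the readings
    n     : patients per arm
    """
    se = sd * math.sqrt(2.0 / n)          # standard error of the difference
    z = delta / se - alpha_z              # distance past the critical value
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF

# Hypothetical: a small true difference of 0.005 mm, 350 patients per arm.
# Power rises as the measurement standard deviation falls.
for sd in (0.03, 0.02, 0.01):
    print(f"sd={sd:.2f}  power={power(0.005, sd, 350):.2f}")
```

The statistical claim is true, but it cuts both ways: after all that variance-reducing scrutiny, a trial that still shows no effect has, if anything, a stronger claim to being a genuine null result.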
1 comment:
In the highly dysfunctional culture that likely exists at Merck, I think a better explanation for these seeming shenanigans is that the teams responsible for the analyses were afraid of being the bearers of bad news.
Their fear was simple: unemployment.