Friday, June 02, 2006

We don't need Medical Informatics here, Part II

In a prior post here, I reported on the lack of medical informatics experts in the pharmaceutical industry and how Gartner Group found that gap to be a problem with respect to drug adverse-event risks. I also outlined how I was "blown off" in an interview with a former FDA adverse-events official at Merck, after being laid off in the "Equinox" mass layoff of Nov. 2003 that followed several late-stage drug failures (before VIOXX was on the radar).

Now, in "Merck admits a data error on VIOXX" (New York Times, May 31, 2006), I observe that medical informatics expertise might have been of some preventive value on an issue that is likely to prove costly in the courtroom:

In an admission that could undermine one of its core defenses in Vioxx-related lawsuits, Merck said yesterday that it had erred when it reported in early 2005 that a crucial statistical test showed that Vioxx caused heart problems only after 18 months of continuous use.

That statistical analysis does not support Merck's 18-month theory about Vioxx, the company acknowledged yesterday.

But Dr. Peter S. Kim, Merck's chief scientist, said the company stood by the overall findings it reported in 2005 — including the conclusion that the drug's heart risks were not apparent if patients took it less than 18 months.

But outside scientists said yesterday that Merck's admission, when considered along with other clinical trials of the drug and studies tracking real-world Vioxx use, supports critics' longstanding claims that Vioxx caused heart problems quickly.

"There never was any evidence for the 18-month story," said Dr. Alastair J. J. Wood, a drug safety expert at Vanderbilt University.

... When it reported the Approve results in The New England Journal of Medicine early last year, Merck said that it had performed a statistical test to examine whether Vioxx's risk changed over time. That test found with almost total certainty that the drug had significantly higher risk than placebo only after the 18-month benchmark — but no extra risk before that time.

Yesterday, Merck said it had made a mistake in reporting that result last year.

In reality, the test that the company said it had used to check the results shows that there is a 7 percent chance that Vioxx has an equally high risk of causing heart attacks both before and after the 18-month benchmark is reached.

That 7 percent figure may seem like a relatively small chance of error, but scientists say it is high enough to mean that Merck has not proved its theory.
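As an aside for the technically inclined: the dispute here is essentially about a piecewise comparison of event rates before and after 18 months. Below is a minimal sketch in Python (using statsmodels, with invented event counts and person-time rather than the actual APPROVe data) of that kind of test. The interaction term asks whether the drug's excess risk truly differs between the two periods, and a p-value near 0.07 on it means the data do not rule out equally elevated risk before 18 months.

# Illustrative sketch only: a piecewise (before vs. after 18 months) rate
# comparison via Poisson regression with a person-time offset. The counts
# below are invented for illustration; they are NOT the APPROVe data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# One row per (treatment arm, follow-up period) cell.
cells = pd.DataFrame({
    "drug":         [0, 1, 0, 1],              # 0 = placebo, 1 = active drug
    "late":         [0, 0, 1, 1],              # 0 = months 0-18, 1 = beyond 18 months
    "events":       [10, 14, 8, 21],           # hypothetical cardiovascular events
    "person_years": [1800, 1800, 1200, 1150],  # hypothetical follow-up time
})
cells["drug_x_late"] = cells["drug"] * cells["late"]

# log(rate) = b0 + b1*drug + b2*late + b3*drug*late
# b3, the interaction, asks whether the drug effect differs after 18 months.
X = sm.add_constant(cells[["drug", "late", "drug_x_late"]])
fit = sm.GLM(cells["events"], X,
             family=sm.families.Poisson(),
             offset=np.log(cells["person_years"])).fit()

print(fit.summary())
print("p-value for the before/after-18-month interaction:",
      round(fit.pvalues["drug_x_late"], 3))

# If that p-value is around 0.07 (as reported for Merck's corrected test),
# the data do not reject the possibility that the drug's excess risk is the
# same before and after 18 months, so the "safe for 18 months" claim is
# not demonstrated.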


Clearly, studies like these could benefit from a broader range of skills and insights. As Gartner observed in its "2006 Industry Predictions" report for pharma, "only a small percentage of biopharmas routinely utilize personnel with medical informatics backgrounds to search for adverse events in approved drugs."

What will it take to change this?

2 comments:

Anonymous said...

There are many problems here that go far beyond the absence of Medical Informatics skills. The real problem is why Merck is involved at all as such an embedded partner in supposedly academic research.

The language of the report you quote is fascinating, and underlines how much we have lost without even noticing.

-"Merck & Co. acknowledged that it misidentified a
statistical method"

-"Merck has contended that the study shows.."

-"The company said in a statement yesterday.."

-"Merck research chief Peter Kim said
the company stood by its analysis.."

-"Merck has insisted that its 18-month argument
holds up.."

-"while outside scientists have cast doubt on the
company's interpretation..."

Why don't we think it strange that a commercial company is making any comment at all about science? On my count there were 10 University investigators on the APPROVE paper, and 2 named Merck "investigators". The question is where were the *internal* investigators, what were they thinking, did they see the data, did they have it, why did none of them think to repeat the analyses themselves prior to publication? -- and is this science in the traditional sense of that word at all?

These are puppet investigators giving non-traditional "science" a veneer of credibility (and then failing to behave as academics when it gets hot). What we need is science back, not more commercial overlay.

Aubrey Blumsohn
Sheffield

Anonymous said...

Yes, I agree with you in turn. But no matter how big that talent pool grows, there will still be a problem unless industry gets its long-needed brain transplant.

Given that University academics are expected to function as academic prostitutes (obviously not in this publication, but in general), it seems most unlikely that industry academics will fare much better. Given also that some journals, and certainly the regulatory bodies, are increasingly being lumped into that same category, it is hard to know where the solutions will originate. Perhaps it is only through eventual increasing protests from our patients (and sadly, but perhaps usefully, through lawyers) that academics within and outside industry (including informatics ones!) will be able to function with integrity. Until then, I don't believe packing yet more technical expertise into industry will be helpful.

Aubrey