A California appellate court has affirmed the exclusion of an expert witness who “(1) unreliably found causation based on [a single] study alone while disregarding other human data …; (2) analyzed animal data even though he was unqualified to do so; and (3) misapplied [several] of the nine factors of the Bradford Hill analysis.” Onglyza Product Cases involved claims that a diabetes medication caused adverse cardiovascular effects. The rulings are, however, broadly applicable to expert witnesses generally and to the recurring minefield that is Bradford Hill.

“A trial court does not abuse its discretion in excluding expert testimony on general causation when the expert’s opinion is based on a single study that provides no reasonable basis for the opinion offered.” Here, the study’s own authors said more research was needed to address causation.

“We do not hold that one randomized controlled trial is never sufficient to establish general causation, but on this record, the trial court did not abuse its discretion in finding that Dr. Goyal’s reliance on SAVOR alone to establish general causation was logically unsound, especially given Dr. Goyal’s own agreement that SAVOR’s finding needed to be replicated in order to determine causation.”

The trial court’s “decision was based on various methodological defects it found in Dr. Goyal’s application of six of the nine Bradford Hill factors,” and, “because he failed to weigh them together, it could not identify any predicate opinion on a specific factor that was not essential to his ultimate opinion. As a result, it concluded that methodological defects in any of the factors would upset the ultimate opinion on causation. This was a proper exercise of the court’s gatekeeping responsibility.”

In some instances, the court ruled that the expert was “refusing to engage with a factor of the Bradford Hill analysis on its terms” by essentially re-defining the terms to suit his opinions. Sound familiar?

For example, “consistency … is upheld when the same finding is shown in multiple studies across different populations and settings.” Yet the expert relied on only one study. He also relied on data from preclinical animal studies, though he was not qualified to interpret animal data.

Similarly, “specificity” is met “if the exposure is associated only with a single disease or type of disease.” The expert testified that specificity was nonetheless met through the single study because “the randomized controlled trial allows you to fulfill that criterion.” “[A]nother example of Dr. Goyal refusing to engage with a factor of the Bradford Hill analysis on its terms.”

“‘Biological plausibility’ refers to whether there is a plausible biological mechanism to explain a cause-and-effect relationship between exposure and disease. … The trial court noted that the strongest mechanism Dr. Goyal could identify was only ‘a proposed hypothesis.’” His opinion was therefore rejected because he did “not undertake an analysis of whether the data that exists supports or undermines his opinion that the proposed mechanisms are plausible.”

“‘Analogy’ considers whether there have been associations found between a related or similar substance to the one at issue and the disease or outcome.” The expert analogized to TZDs, a different class of diabetes medication than the one at issue (a DPP-4 inhibitor). “The trial court reasonably concluded that this opinion was not reliable because the only reason for Dr. Goyal to analogize saxagliptin to TZDs rather than to other DPP-4 inhibitors was that the former supported his ultimate conclusion on causation and the latter did not.”

Bonus for the defense: because general causation must be proven by expert evidence, and plaintiffs’ sole expert on general causation was excluded, summary judgment followed. The trial court denied plaintiffs’ request to reopen discovery so they could find another expert. The Court of Appeal affirmed that ruling as well.