Letters to the Editor

Applied Clinical Trials

April 1, 2005

What’s on your mind?

The Psychology of COX-2 Research


As a psychologist, it strikes me that the televised drug company-FDA public meeting on COX-2 drugs was full of psychological implications that probably went unrecognized. It's been too easy to cast some researchers as "bad guys" and others as "good guys," as if those labels addressed or resolved any of the COX-2 problems. The real problem is that highly educated and well-intentioned researchers are trying to make decisions under conditions of uncertainty--a psychologically difficult task. Perhaps what we saw in the televised meeting was due to:

1. How information was presented. As I recall, all the presenters offered statistics via what looked like PowerPoint. But PowerPoint is not a completely neutral presentation format. In his 2003 booklet "The Cognitive Style of PowerPoint," MIT information expert Edward Tufte (http://www.edwardtufte.com/) notes that "In particular, the popular PowerPoint templates (ready-made designs) usually weaken verbal and spatial reasoning and almost always corrupt statistical analysis." (p. 3) In a 1999 study [1] of how different data displays affected the accuracy of physician researchers' decisions to stop a hypothetical clinical trial, researchers found that such decisions were affected both by the type of data display (icons, tables, pie charts, or bar graphs) and by the framing (positive or negative) of tabular data. Since interpretation of statistical data can't be separated from the presentation style of that data, it's possible that presenting the same COX-2 data in different displays may have led to different committee recommendations.

2. Describing COX-2 risks. Researchers and media might report that heart attacks increased by 50% in COX-2 users, but not report that the 50% increase was actually from 10 placebo patients to 15 COX-2 patients. A 50% increase is frightening; a difference of five patients is not. Plus, drug risks are never presented in the context of other relative risks people face on a daily basis. For example, how does the risk of having a heart attack or dying from a COX-2 drug compare to the risk of dying in an automobile accident, being struck by lightning, or having a fatal fall in the bathroom? Describing COX-2 risk as "greater risk of x happening" is of no help to doctors or patients trying to decide whether to take a COX-2 inhibitor; they really want to know, "If I take this drug, what's my chance of having a heart attack or dying?" If 100 patients take a COX-2 inhibitor, how many will have a heart attack or die? How does that compare to 100 patients not taking COX-2 drugs? To be useful, risk information for patients must be specific, not abstract.
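The gap between relative and absolute risk can be made concrete with a little arithmetic. The sketch below uses the letter's hypothetical figures (10 placebo events vs. 15 COX-2 events) and assumes, purely for illustration, 1,000 patients per trial arm:

```python
# Relative vs. absolute risk, using the letter's hypothetical numbers:
# 10 events in the placebo arm vs. 15 in the COX-2 arm.
# The 1,000-patients-per-arm denominator is an assumption for illustration.
n_per_arm = 1000
events_placebo = 10
events_cox2 = 15

risk_placebo = events_placebo / n_per_arm   # 0.010, i.e., 1.0%
risk_cox2 = events_cox2 / n_per_arm         # 0.015, i.e., 1.5%

# The headline number: a 50% relative increase.
relative_increase = (risk_cox2 - risk_placebo) / risk_placebo

# The number patients actually need: 5 extra events per 1,000 patients.
absolute_increase = risk_cox2 - risk_placebo

print(f"Relative increase: {relative_increase:.0%}")                       # 50%
print(f"Absolute increase: {absolute_increase:.1%}")                       # 0.5%
print(f"Extra events per 1,000 patients: {absolute_increase * n_per_arm:.0f}")  # 5
```

Both numbers describe the same data; only the absolute figure answers the patient's question, "What's my chance?"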

3. Different thinking styles. Because drug company researchers, FDA regulators, and independent researchers all have their own thinking styles and frames of reference, these three groups--with three different ways of thinking (including the potentially dangerous "groupthink")--can look at the same data and arrive at different conclusions. Sometimes, people see what they want to believe. That's not malevolence or incompetence, just the way the brain works.

Plus, different decisions may be based not only on data, but on well-known strategies for how people process complex information, such as: 1) confirmation bias, in which people seek information that confirms their beliefs and ignore information that contradicts them; 2) hidden or absent data, such as the differences noted above in how risks are described and reported; or 3) sharpening and leveling, in which some data are emphasized (sharpened) while other data are de-emphasized (leveled). Psychology does a better job of explaining complex decisions than does the assumption that COX-2 panel members who were drug industry consultants voted in favor of the drugs because of financial conflicts of interest. Neither researchers nor the media should confuse correlation with cause-and-effect--which is another example of how thinking style affects decisions.

4. Clinical trial databases. Comprehensive COX-2 databases might not aid the decision-making process. Not only will consumers (and researchers) probably experience information overload if data are presented without interpretation, but trying to make sense of that much complicated, conflicting, and uncertain information may lead them to make decisions based not only on the data, but on "heuristics"--mental shortcuts that simplify complex information. Faced with complicated information, consumers (and perhaps researchers and physicians) will make decisions based less on the data than on the "availability" of what they know and remember about COX-2 drugs, "vivid" media reports about those drugs, or how well the database "represents" their mental model of COX-2 drugs. To make sense of too many statistics, or too much uncertain information, the brain will use a variety of psychological strategies to understand the data and arrive at a decision.

References
1. L.S. Elting, C.G. Martin, S.B. Cantor, E.B. Rosenthal, "Influence of Data Display Formats on Physician Investigators' Decisions to Stop Clinical Trials: Prospective Trial with Repeated Measures," BMJ, 318, 1527-1531 (1999). [http://bmj.bmjjournals.com/cgi/content/full/318/7197/1527]

Mark Hochhauser
Readability Consultant
Golden Valley, MN
MarkH38514@aol.com