Deeply conflicted
How can we insulate ourselves from conflicts of interest? The most popular solution—disclosing them—turns out not to help
If your doctor recommended a drug whose manufacturer’s consulting fees financed his summer home, would that give you pause? Would you trust a stockbroker who wanted to sell you on a risky mutual fund that gave him a commission for every sale? How about a public official touting a new energy technology made by a company she invests in?
They’re questions worth asking, because conflicts of interest like these are commonplace. In just about any profession (medicine or real estate, accounting or academia), people giving information and advice may carry agendas that bias their judgments, or find themselves in situations where duty and personal benefit clash.
Within many fields, one solution has emerged: require people to disclose any ties that might sway their judgment. Such transparency, the rationale goes, encourages those in authority to behave more ethically, and lets those relying on their guidance take the bias into consideration.
But recent research by experimental psychologists is uncovering some uncomfortable truths: Disclosure doesn’t solve problems the way we think it does, and in fact it can backfire. Coming clean about conflicts of interest, they find, can promote less ethical behavior by advisers. And though most of us assume we’d cast a skeptical eye on advice from a doctor, stockbroker, or politician with a personal stake in our decision, disclosure of conflicts may actually lead us to make worse choices.
“None of us are saying that transparency is a bad thing,” says Daylian Cain, a behavioral economist at Yale University. “But almost always, it fails to work as well as we think it does.” By assuming that disclosure is always a benefit, he and his colleagues argue, regulators may be failing to address the real problems caused by conflicts of interest. In fact, biases are rooted deep in our psychology, and can’t be dispelled with a simple confession. Policies of disclosure, far from being a panacea, may be drawing attention away from the much harder work of removing conflicts and making sure that people’s advice and their interests align.
Ideally, all of us would be unconflicted actors, working in the best interests of the people we serve. In reality, though, we all navigate a sea of competing desires, and some of these create financial or social pressures that interfere with our objectivity. In some cases, the consequences of these conflicts are severe enough that industries have established rules for managing them.
One of the most popular, and least costly, solutions is disclosure. The notion is that requiring experts to put everything on the table should give them an incentive to behave ethically and avoid tarnishing their reputation: Transparency begets honesty. But work by Cain, in collaboration with Don Moore at the University of California, Berkeley, and George Loewenstein at Carnegie Mellon University, finds that disclosure can have the opposite effect.
Cain, Loewenstein, and Moore conducted a series of experiments meant to mimic a situation in which a person in authority (such as a doctor, consultant, or real estate broker) is giving advice that influences another person’s decision. Certain study participants were required to make an estimate (evaluating the prices of houses, for instance). Meanwhile, other participants were selected to serve as experts: They were given additional information with which to advise the estimators. When these experts were put in a conflicted situation (they were paid according to how high the estimator guessed), they gave worse advice than if they were paid according to the accuracy of the estimate.
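To make the incentive structure concrete, here is a minimal sketch in Python. The payment formulas and dollar amounts are invented for illustration; the actual studies used different stakes and materials.

```python
# Illustrative model of the two payment schemes in the estimation
# experiments described above. The formulas and dollar amounts are
# assumptions for this sketch, not the studies' actual parameters.

TRUE_VALUE = 400_000  # the house's actual price (assumed)

def conflicted_pay(advice: float) -> float:
    """The adviser earns more the higher the estimator guesses."""
    return advice * 0.0001  # e.g., $40 on a $400,000 estimate

def aligned_pay(advice: float, true_value: float = TRUE_VALUE) -> float:
    """The adviser earns more the closer the guess lands to the truth."""
    return max(0.0, 50.0 - abs(advice - true_value) * 0.0005)

# Suppose the estimator simply follows the advice. A self-interested
# adviser can compare what each scheme pays for inflated advice:
for advice in (400_000, 450_000, 500_000):
    print(f"advice ${advice:,}: conflicted pays ${conflicted_pay(advice):.2f}, "
          f"aligned pays ${aligned_pay(advice):.2f}")
```

Under the conflicted scheme, every extra dollar of advice raises the adviser’s pay; under the aligned scheme, inflation is costly, so the adviser’s interest and the estimator’s coincide.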
No surprise there: People with a conflict gave biased advice to benefit themselves. But the twist came when the researchers required the experts to disclose this conflict to the people they were advising. Instead of the transparency encouraging more responsible behavior in the experts, it actually caused them to inflate their numbers even more. In other words, disclosing the conflict of interest, far from being a solution, actually made advisers act in a more self-serving way.
“We call it moral licensing,” Moore says. “After having behaved honestly and virtuously, you then feel licensed to indulge in being a little bit bad.” Other recent findings on ethical behavior, he says, show that people compensate for virtuous acts with vice, and vice versa. “People behave as if they have a moral ‘set point,’” Moore says. Indeed, it appeared that disclosing a conflict of interest gave people a green light to behave unethically, as if they were absolved from having to consider others’ interests.
What, then, about the other half of disclosure’s supposed benefits? In effect, what the experts were doing was passing the buck on managing their bias to the people they were advising. So does disclosing a conflict of interest enable the people receiving advice to take that information with the proper grain of salt? Research again suggests the answer is no.
Sunita Sah, a researcher at Duke University’s Fuqua School of Business, has conducted experiments focusing on doctor-patient interactions, in which a doctor prescribes a medication but discloses a financial interest in the company that makes the drug. As expected, most people said such a disclosure would decrease their trust in the advice. But in practice, oddly enough, people were actually more likely to comply with the advice when the doctor’s bias was disclosed. Sah says that people feel an increased pressure to take the advice to avoid insinuating that they distrust their doctor.
Sah sees people complying with biased advice as a way of helping their advisers, even in one-off interactions between strangers participating in a study. “People feel pressure to behave generously even if it’s not in their best interest,” she says. In these situations, she says, “instead of being a warning, disclosure places this burden on the very people it’s supposed to protect.”
At a recent conference on conflicts of interest at Harvard Law School, Harvard psychologist Mahzarin Banaji said that the core problem is a fundamental misunderstanding about the pervasiveness and power of bias. We assume we’re in command of our preferences and decisions, but psychology and cognitive science have shown that much of our decision-making occurs unconsciously. Banaji pointed out that we have preferences for everything from politically similar people to the letters in our own names. “There is no ‘neutral’ at the implicit or unconscious level,” she said.
This disconnect results in policies that underestimate not only the prevalence of bias, but also its burden on society. “The big missing ingredient is that people don’t understand how dangerous conflicts of interest are in the first place,” Cain says. He points out that people’s decisions are easily influenced by information they receive beforehand, even if they know the information to be incorrect, irrelevant, or biased. This phenomenon, called anchoring, has been shown time and again in psychological experiments; in classic demonstrations, even a visibly random number pulls people’s estimates of unrelated quantities toward it. Thus, experts can’t simply overlook their own personal interests, and those who get advice can’t easily discount experts’ prejudices, even if they want to.
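One stylized way to see why such discounting is hard is the standard anchoring-and-adjustment picture: a judgment ends up as a blend of the anchor and one’s own estimate, and the anchor’s weight rarely drops to zero. The sketch below uses weights invented for illustration, not figures from the studies discussed.

```python
# A stylized anchoring-and-adjustment model (an illustration, not a
# model fitted in the research discussed): the final judgment is a
# weighted blend of the anchor and one's own independent estimate.

def anchored_judgment(anchor: float, own_estimate: float, w: float = 0.4) -> float:
    """Blend the anchor with one's own estimate; w is the anchor's pull."""
    return w * anchor + (1 - w) * own_estimate

own_estimate = 400_000   # what the estimator would guess unaided (assumed)
biased_advice = 500_000  # inflated advice from a conflicted adviser (assumed)

# Even an estimator who is warned and tries to discount (lowering w,
# but rarely to zero) still gets pulled toward the biased advice:
for w in (0.6, 0.4, 0.2):
    print(f"anchor weight {w}: judgment "
          f"${anchored_judgment(biased_advice, own_estimate, w):,.0f}")
```

Only at a weight of exactly zero does the biased advice stop mattering, which is precisely what the anchoring research suggests people cannot do.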
Personal connection adds a further layer of complexity. Francesca Gino and her colleagues at Harvard Business School have found that people who are prescribed medicines by their own doctors are less likely to recognize the potential dangers of those doctors’ conflicts of interest. Although most of us recognize that conflicts of interest are a problem in the abstract, we don’t want to acknowledge them in people we know. That’s because we don’t see bias as something that affects good, intelligent people. But in fact, Gino says, “there are lots of very subtle factors that can push us to cross ethical boundaries without us realizing that these factors are having an effect.”
If disclosure is as ineffective, or even counterproductive, as these studies suggest, is there any hope for it as a tool? Some studies suggest that disclosure of conflicts of interest works better when people on the receiving end are well informed; it might, for example, work better among colleagues than between doctors and patients. Sah’s research, meanwhile, points to a number of ways disclosures can be improved. She found that people were more likely to discount biased advice from doctors if disclosures were made by a third party, if they were not made face-to-face, or if patients had a “cooling off” period to reconsider their decisions.
Even if these fixes make disclosure more effective, the true implication of these studies is that transparency is not a blanket solution to problems of corruption. “Regulators should be looking harder at eliminating conflicts,” Cain says. Unfortunately, requiring disclosure is much easier than changing the status quo. As he puts it, “I’d rather tell you I’m on the gravy train than get off it.”
Furthermore, as Moore admits, in some cases eliminating conflicts of interest may cost more than it is worth. But in circumstances where conflicts cause harm, changing the system could be worthwhile. Regulators, Moore says, need to look for ways to structure systems so that experts’ personal interests are matched with the interests of those they are helping. “Restructuring to align interests is difficult,” he says, “but when you do it, it can be beautiful.”
Cain, Daylian M., Loewenstein, George F., and Moore, Don A., “The Dirt on Coming Clean: Perverse Effects of Disclosing Conflicts of Interest” (December 1, 2003). Available at SSRN and at http://www.princeton.edu/chw/lectures-conferences/lectures/past-lectures/spring2005/conflicts.pdf
Conflicts of interest can lead experts to give biased and corrupt advice. Although disclosure is often proposed as a potential solution to these problems, we show that it can have perverse effects. First, people generally do not discount advice from biased advisors as much as they should, even when advisors' conflicts of interest are honestly disclosed. Second, disclosure can increase the bias in advice because it leads advisors to feel morally licensed and strategically encouraged to exaggerate their advice even further. This means that while disclosure may [insufficiently] warn an audience to discount an expert opinion, disclosure might also lead the expert to alter the opinion offered and alter it in such a way as to overcompensate for any discounting that might occur. As a result, disclosure may fail to solve the problems created by conflicts of interest and it may sometimes even make matters worse.
Cain, Daylian M., Loewenstein, George F., and Moore, Don A., “When Sunlight Fails to Disinfect: Understanding the Perverse Effects of Disclosing Conflicts of Interest” (July 7, 2010). Journal of Consumer Research, forthcoming. Available at SSRN and at https://apps.olin.wustl.edu/cres/research/calendar/files/LoewensteinG.pdf
Disclosure is often proposed as a remedy for conflicts of interest, but it can backfire, hurting those whom it is intended to protect. Building on our prior research, we introduce a conceptual model of disclosure’s effects on advisors and advice recipients that helps to explain when and why it backfires. Studies 1 and 2 examine psychological mechanisms (strategic exaggeration, moral licensing) by which disclosure can lead advisors to give more-biased advice. Study 3 shows that disclosure backfires when advice recipients who receive disclosure fail to sufficiently discount and thus fail to mitigate the adverse effects of disclosure on advisor bias. Study 4 identifies one remedy for inadequate discounting of biased advice: explicitly and simultaneously contrasting biased advice with unbiased advice.