What Is Preference Falsification?
Preference falsification refers to misrepresenting private beliefs and thoughts in public. It is universal and occurs in many contexts.
This behavior can stem from people's discomfort with holding a minority opinion and from the social pressure to conform in a group setting.
Classic experiments by Solomon Asch demonstrate the tendency to yield to the majority, even when the majority is wrong.
“You mean before you give a number, you think about what they want to hear?” Deborah Stone asks this question in her book Counting: How We Use Numbers to Decide What Matters (2020).
Essentially, yes, says Stone’s friend, a social scientist who is suffering from the excruciating pain of cancer. Each time her physician asks about the pain’s intensity, this woman carefully formulates her response: Does she want to appear “wimpy” or “stoic”? Does she want to obtain more pain medication now or save the higher dose for later, when the pain becomes unbearable? Should she rate the pain as less intense this week so her physician will feel she is succeeding and not give up on her?
Though Stone doesn’t use the term, she presents an example of preference falsification, a "universal phenomenon" in which we misrepresent publicly what we really think or believe or want privately. In his book Private Truths, Public Lies: The Social Consequences of Preference Falsification (1995), Timur Kuran, a professor at Duke University, writes that when faced with real or imagined social pressure, people will “deliberately project a contrived opinion.”
Preference falsification is “an individual act” that depends on the context: people mask their preferences in one setting but not in another, often depending on the rewards or punishments associated with a chosen preference. Sometimes it can occur in "very innocent situations," Kuran said in a 2020 interview, as when people falsify their opinions in order not to hurt someone, e.g., by telling a "white lie." Kuran has described preference falsification as the “tyranny of the should” (1995).
An "ostrakon" is a pottery shard used as a ballot in ancient Athens to vote to expel (for ten years) someone deemed to be a danger or political threat and from which we get our word "ostracism." Agora Museum, Athens.
Penalties for voicing a public preference, though, can be physical, economic, or social and can range from a negative remark, a disapproving gesture, or guarded criticism to unmitigated denigration, harassment, loss of reputation, imprisonment, torture, or even death. Subjects who are privately critical of an autocratic regime are more apt to need preference falsification for their survival. What makes a government democratic, though, says Kuran, is "not that it keeps people from being penalized for their public preferences," but rather that a democracy “merely restricts the menu of possible penalties.” In every government, people can be at odds with public opinion. Secret ballots enable people to feel freer to express their private beliefs and "ensure preference falsification doesn't happen" (Sunstein, The New Republic, 1995).
While secret ballots can mitigate preference falsification in politics, secrecy during the confidential peer-review process for scholarly journals may encourage it. In recent years, Allison and his colleagues (Valdez et al, F1000 Research, 2020) have pushed to make research and its review process more transparent, but this is still far from the norm. "As every academic knows, anonymous referees, unleashing jealousies, animosities, and prejudices, are notoriously quick to condemn articles they would not dare to criticize openly," says Kuran (1995).
Research subjects who overtly misrepresent what they do provide yet another example of preference falsification. Researchers such as Allison and Heymsfield have known for years (Lichtman et al, New England Journal of Medicine, 1992; Klesges et al, Journal of Consulting and Clinical Psychology, 1995; Dhurandhar et al, Journal of Nutrition, 2016) that people under-report their caloric intake when using food frequency questionnaires, food diaries, and diet recall. Compared with quantitative measures such as doubly-labeled water or urinary nitrogen levels (Ravelli and Schoeller, Frontiers in Nutrition, 2020), these self-reports are considerably inaccurate.
Although under-reporting may be due to “honest” mistakes about portion size or even to package mislabeling (Allison et al, JAMA, 1993), many people, particularly those who are privately embarrassed about their weight or how much they consume, deliberately falsify what they report publicly. Implausible data have potentially significant public health consequences: they create a “house of cards” and "destroy the entire evidential foundation” on which nutritional research is based (Dhurandhar et al, International Journal of Obesity, 2015).
A "distinguishing characteristic" of preference falsification, though, is that it brings "discomfort to the falsifier," because it is like "living a lie" at least momentarily or even more chronically but it is not necessarily all bad for a society. Sometimes, it can have a stabilizing and constraining effect (Kuran, 1995), a so-called "laundering effect," when it serves to "filter out inclinations that people consider illicit and would rather not have" (Kuran and Sunstein, Stanford Law Review, 1999). Further, preference falsification can evolve into preference adaptation such that it may be a "stepping stone" toward preference modification (Klick and Parisi, The Journal of Socio-Economics, 2008).
One of the reasons that people tend to adapt or conform to the beliefs of others is that they lack reliable information (Kuran and Sunstein, 1999): when information is "absent or ambiguous" and others have information that seems trustworthy, "the only sensible reaction" is to conform (Hodges et al, Journal of Personality and Social Psychology, 2014).
It was Solomon Asch who, back in the 1950s, conducted his now classic studies on conformity and social pressure (Asch, Psychological Monographs, 1956). He created an “artificial situation” in which a lone naive subject, placed within a group of “confederates” (stooges), was asked to match the length of a given line to one of three “exceedingly unambiguous” choices (Gleitman et al, American Psychologist, 1997). By requiring the subject to answer publicly, after the confederates had given their unanimous wrong answers, Asch "introduced a sharp disagreement between one person and the entire group…creating a minority of one” (Asch, 1956).
Asch was apparently “surprised and dismayed” at the considerable amount of “yielding,” i.e., giving “patently wrong answers,” among 35% of his subjects (Gleitman et al, 1997). Stanley Milgram, whose subsequent experiments on obedience to authority “were inspired directly” by Asch’s research, had studied with Asch (Gleitman et al, 1997).
Significantly, those who “yielded” to group pressure reacted with “puzzlement” and tried to form “explanatory hypotheses”; some even feared they were suffering from an undisclosed defect (Asch, 1956). Surprisingly, rates of yielding fell considerably when one of the confederates provided a different response, even another incorrect one (Levine, Personality and Social Psychology Review, 1999).
One black chair in a row of blue chairs. In Asch's classic experiments, there was one naive subject among a group of confederates.
Bottom line: Preference falsification is a universal and pervasive phenomenon in which we misrepresent publicly what we really think or believe or want privately, either because we fear the consequences or because we wish to gain some benefit. It is a form of lying and occurs in many different interpersonal situations: in the political arena, where voicing private sentiments publicly against a group can have dire consequences; in a research or peer-review setting, where it can contaminate data or damage colleagues; or even among friends, where it can affect relationships. Many people respond to social pressure to avoid being "a minority of one."
Note: Special thanks to Harvard law professor Cass R. Sunstein for calling attention to Kuran's concept of preference falsification in his new book, This Is Not Normal: The Politics of Everyday Expectations (2021).
Special thanks, as well, to Dr. Andrew Brown of the School of Public Health of Indiana University, Bloomington, for calling my attention to the similarity between preference falsification and social desirability bias. Social desirability bias, the tendency of research subjects to answer questions so they will be viewed more favorably by investigators (e.g., by minimizing their own undesirable behavior), is a subset of the much broader concept of preference falsification.