[Elster, Jon (1996), Review of Timur Kuran: Private Truths, Public Lies, Acta Sociologica, 39 (1): 112-115]


Review of Timur Kuran: Private Truths, Public Lies

Jon Elster

[start of page 112]

Timur Kuran: Private Truths, Public Lies: The Social Consequences of Preference Falsification. Cambridge, Mass.: Harvard University Press, 1995.

This is an important but flawed book. It is important because it provides, for the first time, a rigorous analysis of the dynamics of conformism. It is flawed because the scholarship is sloppy, many details of the argument unconvincing, and some

[end of page 112, start of page 113]

of the empirical applications dubious. Although mandatory reading for anyone interested in social dynamics, it must be taken with many grains of salt.

Contrary to what is suggested by the subtitle, it does not deal exclusively with preference falsification, i.e. the overt or public expression of preferences that one does not hold in private. It also deals extensively with preference change, i.e. the transformation of one set of privately held preferences into another. Among the many reasons why people may disguise their preferences, Kuran focuses on preference falsification motivated by the fear of the social disapproval that one would incur by expressing one's privately held view. He does not deny that the other mechanisms exist, but they are not his concern in the book. Among the many ways in which preferences can change, his theory appeals to a 'cold' or cognitive mechanism, viz. conformist preference change caused by reliance on what he calls the heuristic of social proof: 'if a great many people think in a particular way, they must know something that we do not' (p. 163). He discusses and rejects a 'hot' or motivational mechanism of conformist preference change, viz. Leon Festinger's hypothesis that people conform to reduce cognitive dissonance.

Preference falsification and preference change might (and in my opinion do) interact in two ways. (i) When other people falsify their preferences, they might cause my preferences to change. I might come to believe, falsely, that a majority holds a certain view, to which I then conform. The mechanism might, as suggested, be either (ia) hot or (ib) cold. (ii) When I falsify my preferences in response to outside pressure, I might end up changing them too. This could only be due to a hot mechanism, i.e. the tendency to reduce the dissonance caused by saying in public what I do not believe in private. As Kuran does not believe in motivated preference change, his theory has room only for (ib).

Kuran's theory of preference falsification is both more striking and more plausible than his theory of preference change. (It is also the theory that does most of the explanatory work in the book.) Roughly, it may be summarized as 'Tocqueville + Schelling', combining Tocqueville's insight into democratic conformism with Schelling's 'tipping model' of social change. I do not know how much Kuran actually owes to these two writers, but even if he owed them a lot that would not detract from the originality of his contribution. All original work has to start from somewhere.

To get an idea of Kuran's basic model, suppose that initially 60 per cent of the population believe and express a certain idea. Of the remaining 40 per cent, half will publicly embrace the idea as soon as it is expressed by more than half of the population, one-quarter will express the majority view only when it is expressed by three-quarters of the total population, and the remaining quarter will require a nine-tenths majority to go on the bandwagon. Under these assumptions, the majority will increase first from 60 per cent to 80 per cent, then from 80 per cent to 90 per cent, and then from 90 per cent to 100 per cent. Even if for exogenous reasons all members of society lose faith in the idea, they will still continue to embrace it publicly, assuming that no one is willing to appear as the sole dissenter. Moreover, each individual will believe that he or she is the only one who does not believe in the idea. Yet this 'opinion fantôme' can crumble overnight. If a small proportion a becomes willing to dissent unconditionally, and another proportion b is willing to dissent if a proportion a has dissented, and a further proportion c is willing to dissent if a proportion (a + b) has dissented, and so on, a reverse snowball effect may start up that leads to everybody expressing disbelief in the idea. Moreover, this outcome can also be realized if the loss of faith is less than universal, if some believers in the idea are induced by the general desertion to falsify their preferences by expressing disbelief.
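The bandwagon just described can be made concrete with a short simulation. The sketch below is my own illustration, not code from Kuran's book: each group is encoded as a share of the population together with the threshold of public support it needs before joining, and thresholds are read inclusively, so that the last group joins at exactly ninety per cent.

```python
def cascade(groups, start_share):
    """Run the bandwagon to a fixed point.

    groups: (share, threshold) pairs -- a group publicly embraces the
    idea once the share already expressing it reaches its threshold.
    start_share: percentage expressing the idea at the outset.
    Returns the successive expressed shares, in percentage points.
    """
    share = start_share
    history = [share]
    while True:
        new_share = start_share + sum(s for s, t in groups if share >= t)
        if new_share == share:
            return history
        share = new_share
        history.append(share)

# Elster's example: 60% express the idea from the start; 20% join at a
# bare majority, 10% at three-quarters, 10% at nine-tenths.
print(cascade([(20, 50), (10, 75), (10, 90)], 60))  # [60, 80, 90, 100]

# The reverse snowball is the same mechanism: here (my own numbers)
# 5% dissent unconditionally, 25% once 5% have, 70% once 30% have.
print(cascade([(25, 5), (70, 30)], 5))  # [5, 30, 100]
```

Note that the fixed point depends only on expressed shares, not on private beliefs, which is why the unanimous facade can survive a universal loss of faith until a handful of unconditional dissenters appears.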

Many chapters in Kuran's book are devoted to ingenious and illuminating

[end of page 113, start of page 114]

variations on this basic argument. The brief sketch offered above does not in any way do justice to his argument, and I strongly urge the reader to look up the original. In particular, I recommend a close study of the simple yet powerful diagrammatic mechanism that allows Kuran to analyze the dynamics of preference falsification. Yet, as I said, the book is not flawless. Apart from various minor infelicities, I have two main objections. First, I think he misunderstands the nature of Communist conformism; second, that he misunderstands Festinger's theory of cognitive dissonance.

Kuran apparently believes that under Communism the outward conformity of a citizen to the demands of the regime caused other citizens and regime officials to believe in his inner conformity as well. I believe he is on the wrong track. From a certain time onwards - perhaps as early as 1960, certainly by 1970 - everybody knew that nobody believed in the ideology. By that time, nobody believed in the fake enthusiasm for the (fake) fulfilment of the plan, or in the fake hatred of the class enemy. People had to go through the motions, but that was all. ('They pretend to pay us and we pretend to work.') In China, for instance, 'One attitude toward study and criticism is: well, we have to go along, even though we hate it; we know everyone is lying, but we have to go along so we don't leave a bad impression. . . . The situation is an embarrassing one: everyone is aware of the ridiculous and undignified role he plays in this charade; its seriousness is ensured by the foreman, silent and attentive, but always very much in evidence.'1 My own travels in Eastern Europe before and after 1989, as well as a brief visit to Cuba in 1994, confirm this general idea.

Why did the regime require people to go through the motions? Václav Havel and Leszek Kołakowski suggest much the same explanation: to humiliate the citizens and demoralize them so that no will to resistance was left.2 Suppose you know that unless you go to a party meeting and speak out against imperialism, you will be demoted. If you go and speak, nobody at the meeting will believe that you hate imperialism. Nor are the 'Festinger conditions' (see below) under which fake belief will induce real belief satisfied. The main effect is erosion of character caused by the knowledge of playing a 'ridiculous and undignified' role in a meaningless charade. The only effect on 'belief' is to fill up the mind with a number of cliches and stereotypes that occupy the space, as it were, that would otherwise be available to beliefs.

Quite generally, we may distinguish between vertically and horizontally induced conformism. Democratic conformism, such as that described by Tocqueville and by Kuran in his chapters on affirmative action, is horizontal. The expressed preferences of the citizens are shaped by the censure of other citizens, in ways ranging from raised eyebrows to all-out social ostracism. If you disagree with the majority view, or fail to express agreement with it, or fail to censure those who disagree with it or fail to express agreement with it, the majority will censure you. Anticipating such censure, you embrace the majority view, censure those who do not embrace it and those who do not censure those who do not embrace it. Although some may be more resistant to pressure than others, they, too, will cave in when the majority becomes large enough.

Totalitarian conformism, as described by Kuran in the chapters on Communism or in his occasional references to the Spanish Inquisition, is vertical. The expressed preferences of the citizens are shaped by the censure of the authorities. There is no reason to expect a snowball mechanism of ever-larger majorities coercing ever-smaller minorities into adopting their view, because the pressure is exercised simultaneously and with equal force upon all. Often, the authorities will find it more effective to ask citizens to inform on each other (a vertical communication) than to ask them to censure each other (a horizontal communication). And if there is horizontal pressure, it is caused by vertical

[end of page 114, start of page 115]

pressure. If some individuals speak out against the system others will indeed withdraw from them, but only because they are afraid of being seen as guilty by association.

Whereas Kuran's theory of preference change is entirely cognitive and rests on the social proof heuristic discussed earlier, Festinger argued that under certain conditions people change their beliefs to avoid the unpleasant state of cognitive dissonance between what they profess in public and what they believe in private. More specifically, he claimed that people do not adjust their real beliefs to bring them into line with their professed beliefs if the reward for professing the belief is large enough to justify the behavior to themselves. If the reward is small, however, the dissonance can only be eliminated by adopting the beliefs one is professing.

In his confrontation with Festinger, Kuran observes, correctly, that Festinger's theory implies that an Iranian who defends Islamic rule under the threat of imprisonment can engage in preference falsification without transformation, whereas someone who defends the rule because of peer pressure will undergo a change of preferences in favor of the regime. He then claims that these effects are better explained by his own theory, which assumes that the peer-pressure conformist is more vulnerable to the social proof heuristic because his belief is more shallow. Festinger's argument, however, is supposed to hold for a given strength of belief. His explanandum is the difference in behavior among individuals with the same strength of conviction. Contrary to what Kuran claims (p. 183), this is not the same phenomenon that he is trying to explain. In fact, he would have to deny that it can exist at all. In the standard kind of experiment adduced to support Festinger's theory, a large number of subjects are randomly divided into two groups. In each group, subjects are asked to produce an argument for an opinion that they may or may not actually hold themselves. In one group, they are highly rewarded (the equivalent of severe punishment), while in the other they are offered a small reward (the equivalent of peer pressure). Because of the random assignment, subjects in each group will on average have the same initial degree of adherence to the opinion in question. After producing the argument in favor of the opinion, the average adherence to it is larger in the second group. Kuran cannot offer an alternative account of this phenomenon: he has to deny that it exists. But this is a challenge that he does not take up. And given the many experiments that confirm Festinger's hypothesis, I do not see how he could successfully take it up.

1 From two reports from the People's Republic of China, cited in A. Walder, Communist Neo-Traditionalism, Berkeley and Los Angeles: University of California Press, 1986, pp. 156, 157.

2 See notably L. Kołakowski, Main Currents of Marxism, Oxford: Oxford University Press, 1978, vol. 3, pp. 83-91.

Jon Elster
Columbia University, USA

[end of page 115]

