Thursday, August 13, 2009

Conformity and dissent

A fanatic is a person who cannot change their mind and will not change the subject.
(Winston Churchill, attr.)

… they cannot change it, because they have no other subject. That is the nature of their crippled epistemology, without which they would not be fanatics.
Russell Hardin, elaborating (pdf).
Had a prolonged Aha! experience some time ago: first, from a splendid book (Why Societies Need Dissent by Cass Sunstein) and, second, from a concept used in the book but taken from an article (The Crippled Epistemology of Extremism [pdf]).

Inklings of what Sunstein covers I had already worked out for myself in general terms, but he provides a much more precise vocabulary and backing from a slew of empirical studies I had no idea existed.

Hardin’s concept of crippled epistemology is used by Sunstein to good effect. Hardin is concerned to explain fanaticism; Sunstein, the mechanisms (and dangers) of conformity.

On fanaticism
Hardin sees fanaticism as generally a group phenomenon. He starts with a theory of the acquisition of knowledge. One’s belief in the truth of X can depend on the rewards of counting X as true. Acquiring some knowledge can have considerable costs. Much of our knowledge is ‘happenstance’ or ‘byproduct’ knowledge that comes to us essentially free. Some knowledge is a ‘consumption good’ – its acquisition is pleasurable. We tend to rely greatly on others for knowledge.

Suppose you are always a big loser from normal politics. Loyalty (which is self-denying), voice (which has already failed) or exit are your options. If you choose exit, you may find a like-minded group. Over time, those with weaker commitment will tend to leave such a group, intensifying identification with particular beliefs and practices within the group. If the group becomes more isolated, both paranoid cognition (supposing the worst of those you are not in communication with) and sinister attribution (exaggerating the degree to which you are the target of others’ attention) are likely to grow. Both of these aid group loyalty while damaging the knowledge-acquisition of members. A crippled epistemology can then greatly aid group cohesion.
Prosperity and democracy undermine extremism – by increasing the stake people have in the current situation, de-legitimising coercive politics and increasing the range of information available (particularly about alternative perspectives). Illiberal politics are required to sustain the crippled epistemology of extremism.
Suppressing knowledge is the route to power, strangely even the power of an idea, albeit a crippled and crippling one.
Sunstein summarises Hardin’s characterisation of fanatics as people who rely on a small subset of information mainly derived from fellow extremists.

Hardin’s account would be improved with the addition of some economics of communication and the power of commitment to a particular identity.

If you are committed to a particular view of yourself, there are large costs involved in acquiring knowledge that undermines that view. Refusal to pay those costs is understandable, but damages your knowledge acquisition. A group of like-minded people will tend to reinforce each other in such judgements. Sticking with the like-minded is inherently congenial, it being much less costly to communicate with each other on such matters than with outsiders (communication meaning two-way exchange; monologues and diatribes have no such costs, except in so far as they cut one off from information).

So, the group provides mutual authority and recognition while aiding and abetting the shared epistemological crippling.

Clearly, this is a model that applies rather more broadly than simple fanaticism.

Group costs and individual benefits
Sunstein is interested in the remarkable human tendency to conform. Unchecked by dissent, conformity can have major negative consequences. Some empirical results he cites include:
highly contentious corporate boards tend to work better than consensual ones
investor clubs whose members are not socially bonded work better
(in both these cases, conformity lowers earnings)
the "Bay of Pigs" disaster, a classic case of bright people being consensual in stupidity
judges vote differently depending on who else is on a judicial panel with them
conforming juries head towards extreme results.
Conformity – going along with what others apparently know – provides a good rule of thumb in the absence of personal knowledge or expertise but can deny the public important information – so conformity carries group risks. In personal terms, it’s the other way around: conformity is a form of free-riding, while dissent carries personal risks. Hence the problem – society benefits from behaviour which carries significant personal risks. Dissent is not always good (Hitler was a dissenter), but societies and institutions work better if dissent can operate.

Studies show that overt self-confidence and firmness are highly persuasive and that unanimity is very powerfully persuasive. (One dissenting voice can have a very strong effect simply by breaking unanimity.) Also, out-group membership decreases information flows: dissent, or information generally, counts for a lot less if it comes from someone identified as an out-group member. (So a differing propensity to identify members of one group – e.g. the left – as out-group compared with another – e.g. the right or ‘conservatives’ – does actually matter.)

(All this being the case, it is particularly damaging for professions or milieus allegedly involved in the pursuit or dissemination of knowledge to de-legitimise alternative points of view, regularly use group denigration or punish divergent views as showing some moral aberration or lack of personal worth.)

In explaining the behaviour patterns revealed by research and observed more generally, Sunstein uses two causal tendencies. First, we rely on others for information. Second, we want to have a good reputation.

Apart from conformity, Sunstein is particularly interested in group cascades – increasing waves of common belief or behaviour – and group polarisation – intensification of belief or attitudes via mutual reinforcement. He is ecumenical in his examples – one of the things I like about the book is the way he moves back and forth from ‘left’ to ‘right’ for his examples.

The various behavioural studies he cites produce some notable results. Such as the tendency towards collective conservatism – groups will remain committed to certain judgements or decisions even when members turn over. Or that many people will assent to propositions contrary to what they apparently believe if confronted with a series of opinions that support the reversed view.

Persons of high social status or high confidence in their own views are less likely to conform. People who are frightened or confronted with a difficult judgement are more likely to conform. If there are financial rewards for getting it right, conformity decreases for easy judgements and increases for difficult ones (which is important for market behaviour). Conforming also tends to increase confidence in the conforming judgement. The number of public supporters for a dominant opinion tends to increase conformity, though a single ‘voice of sanity’ has considerable power to reduce errors. Publicly-voiced and privately-held opinion can move in different directions (often towards the majority in the former and the minority in the latter, if the minority opinions are confidently put and not isolated voices). In ambiguous situations, expert opinion is much more likely to be followed if not openly questioned. It is also surprisingly easy to induce false confessions.

Sunstein discusses patterns of legal compliance and non-compliance, including a few striking examples – such as the US Toxic Release Inventory (first reports published in 1988), which led to a 45% decline in toxic releases from 1988 to 1995 by the simple expedient of requiring companies to publish the type and level of their toxic releases.

Informed people can stop cascades. Cascades are less likely if people are rewarded for correct group decision. When conformity is rewarded, cascades and mistakes are more likely. Cascades can be informational (following what other people believe to be true) or reputational (following what other people believe to be right). Reputational rewards for conformity greatly increase the likelihood of cascades and errors.
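
To make the cascade mechanism concrete, here is a toy simulation of my own (not from Sunstein’s book): a crude counting approximation in the spirit of the classic Bikhchandani–Hirshleifer–Welch informational-cascade model, with all names and parameters invented for illustration. Each agent receives a noisy private signal about the true state, sees every earlier public choice, and follows whichever option the combined tally favours.

```python
import random

# Toy informational cascade (illustrative sketch, not Sunstein's own
# formalism): agents choose between states 0 and 1, each seeing a
# noisy private signal plus the public record of earlier choices.

def run_cascade(n_agents=20, signal_accuracy=0.7, true_state=1, seed=0):
    rng = random.Random(seed)
    choices = []  # public record of earlier agents' choices
    for _ in range(n_agents):
        # Private signal: correct with probability signal_accuracy.
        signal = true_state if rng.random() < signal_accuracy else 1 - true_state
        # Tally earlier public choices together with one's own signal.
        ones = sum(choices) + (signal == 1)
        zeros = len(choices) - sum(choices) + (signal == 0)
        if ones != zeros:
            choice = 1 if ones > zeros else 0  # follow the weight of evidence
        else:
            choice = signal  # tie: fall back on your own signal
        choices.append(choice)
    return choices

# Once the public record leans by two or more, no single private signal
# can tip the tally, so every later agent imitates - and some runs lock
# in on the wrong answer despite mostly-accurate private signals.
for seed in range(5):
    print(run_cascade(seed=seed))
```

A reputational variant – penalising visible disagreement with the running majority – would, on this toy model’s logic, only make the lock-in faster, which matches the point above that reputational rewards for conformity increase the likelihood of cascades and errors.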

(Aside: which means, of course, that a milieu which treats some opinions as a sign of virtue, and others as a sign of wickedness, is highly likely to be conformist, produce cascades and be in error.)

If conformity is rewarded, early dissenters are particularly likely to be penalised, which has a chilling effect on future dissent. Conformity and cascades reduce the procedural cost of decision-making but increase the risk of error.

Dissenters can be disclosers (people releasing privately held information into the public arena) or contrarians (a mixed blessing). It is not dissent per se but useful dissent which is of value. Senseless, hysterical, paranoid, hateful or dehumanising comment is not legitimised by being dissent. Freedom of speech is the best corrective to erroneous conformity and cascades.

Unlike many cascades, group polarisation operates through deliberation. But one can have polarisation entrepreneurs who mobilise people through polarisation (history is full of them; Milosevic in Serbia and Osama bin Laden are classic contemporary examples).

Where groups are like-minded, they tend to polarise towards a more extreme manifestation of their like-mindedness. Like-minded people have a natural tendency to dwell on shared or common information. That sets up resonances which increase confidence in common positions and encourage a shift towards greater intensity (of belief and of content). The polarisation effect is magnified if both informational and reputational cascades are set up. It is further intensified if rhetorical advantage lies towards increased extremism (as it tends to, due to its greater ‘purity’).
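
The polarisation mechanism can be given the same toy treatment. The sketch below is my own stylised model (not from the book; the skew and pull parameters are invented for illustration): a like-minded group mostly hears arguments favouring its initial lean, and each hearing pulls members’ opinions further that way.

```python
import random

# Toy group polarisation (illustrative sketch): opinions live in
# [-1, 1]; arguments voiced in a like-minded group are skewed
# towards the group's initial tilt, dragging the mean outward.

def deliberate(opinions, rounds=10, pull=0.2, skew=0.8, seed=0):
    rng = random.Random(seed)
    opinions = list(opinions)
    lean = 1.0 if sum(opinions) >= 0 else -1.0  # group's initial tilt
    for _ in range(rounds):
        for i, op in enumerate(opinions):
            # An argument favours the prevailing lean `skew` of the time.
            argument = lean if rng.random() < skew else -lean
            # Each agent moves a fraction of the way towards what it hears.
            opinions[i] = max(-1.0, min(1.0, op + pull * (argument - op)))
    return opinions

group = [0.1, 0.2, 0.3, 0.4]                # mildly like-minded to start
print(sum(group) / len(group))              # mean before deliberation: 0.25
after = deliberate(group)
print(round(sum(after) / len(after), 2))    # mean drifts towards +1
```

The point worth noting is that nothing here requires bad faith: the drift comes purely from the one-sided pool of shared arguments, which is exactly the dwelling on shared or common information described above.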

Antecedent extremism and a sense of common group membership both increase the tendency to group polarisation. The easier group exit is, the more likely polarisation becomes, as moderates will tend to leave. Opposed sub-groups tend to discourage polarisation.

Group diversity of information strongly tends to aid better decisions; diversity of values is more mixed, as it can get in the way of group decision-making. (Which only matters if the group has a high need for common decision-making.)

Which is all very interesting, but why was I so impressed? Because it gives a basis for understanding issues I have been worrying at for some years.

I don’t like the term political correctness much. It runs together two different phenomena – evangelical niceness and opinion-bigotry – and, as a term, is a little too obviously a weapon in the culture wars. I coined the term moral vanity to try and pin down a certain type of behaviour and Club Virtue to identify an opinion hegemony. But neither comes with a useful heuristic, even though Club Virtue came from thinking of the economics of clubs, given the clear attempts to exclude moral legitimacy from dissenting opinion while mutually endorsing and displaying shared status as being of the virtuous. (A club is a public good – one which provides shared benefits – from which people can be excluded.)

Add in Hardin’s notion of a crippled epistemology – which, for example, clearly bedevils many academics, such as those discussed by Haynes and Klehr in In Denial, and which I have frequently observed among academics commenting on ‘economic rationalism’, ‘globalisation’ and ‘neo-liberalism’ – and the dynamics of conformity as outlined by Sunstein, and it all becomes much clearer.
