Sunday, November 29, 2020

The Class That Cannot See Itself

The inability to see oneself is a core element of much modern moralising

We live in a time of a striking pattern. In many ways, societies are much more socially egalitarian than they have ever been. Openly class-based social superiority language is much, much rarer than it used to be.

Conversely, we live in an age of intense moral in-egalitarianism. For instance, academic commentary on Trump voters has predominantly been about voting for Trump as a sign of moral delinquency. As has much of the academic commentary on Brexit voters. Such moral elitism is far more common than any explicit class elitism.

There is, of course, very much an underlying social dimension to this moralised denigration, but it pertains to a class that cannot see itself. Specifically, the possessors of human-and-cultural capital. (Human capital being skills, habits, knowledge and other personal characteristics that affect our productivity.)

The human-and-cultural capital class do not, and in a sense cannot, preen as a class because they very determinedly do not see themselves as a class. They are a meritocratic elite, so obviously not a class. Hence the socially egalitarian language.

If one was to nominate the single biggest difference between contemporary “woke” progressivism and previous iterations of progressivism, it would be the disappearance of class as an object of social concern.

A rather telling indicator of this is that it makes sense to talk of “woke capital”, but woke workers are not a thing. Indeed, if “woke” cultural politics become sufficiently salient in electoral politics, working-class voters become much more likely to vote in dissenting, anti-woke ways.

Nevertheless, modern “woke” progressivism is very much driven by class politics. The class politics of status rather more than the class politics of economic interest. But that is not surprising. It is actually hard for any class to combine on the matter of economic interests, as the biggest competitors for one’s income are people seeking the same sort of income. It is much easier for a class to combine on the matter of class status, for that is a benefit they can, and do, share. Status as a member of a group is non-rivalrous (at least over some range) within the group, though it may be highly rivalrous with respect to other groups.

(In terms of political dynamics, income and benefits from the state that are not dependent on working for the state are, obviously, in a bit of a different category to ordinary economic interest. The more benefits the state hands out, the stronger the incentives to act collectively to increase benefits received and minimise costs paid. Though the weaker become the incentives to act cooperatively to independently create positive social returns.)

Unwillingness to see

There are two levels of self-deception going on in the class politics of “woke” progressivism. The first is the inability to see themselves as a class. The second is the inability to see the mimetic nature of their moralising. (The latter term may be a little mysterious: I will explain presently — mimetic desire is desire copied or imitated from others.)

The inability to see themselves as a class is straightforward, though it goes beyond seeing themselves as a meritocratic elite, and so not a class. “Woke” progressivism is overwhelmingly concentrated in the human-and-cultural capital class. The most “woke” industries are the ones most dominated by the possessors of human-and-cultural capital: entertainment, education, news media and online IT. The industries that constitute the cultural commanding heights of contemporary society.

Marxism provides by far the most elaborate schematic of class available to progressive thought. Human capital is not a form of capital that Marxism grapples much with, the concept not being developed in detail until decades after Marx’s death, even if it dates back to Adam Smith. The human-and-cultural capital class thus becomes effectively invisible as a class, both analytically and, given their sense of themselves as a meritocracy, in self-identity.

Conversely, they typically very much see themselves as a moral elite, a moral meritocracy: that they possess the correct, and highly moral, understanding of the world. The sense of being members of a cognitive meritocracy (and elite) converges with a sense of being members of a moral meritocracy (and elite). This is, to invoke a touch of René Girard, mimetic moralising: mimetic desire being desire copied from another. They copy moral postures from each other based on a common desire to be, and to be seen to be, members of the moral meritocracy.

Unlike other forms of mimetic desire, mimetic moralising is not inherently rivalrous. On the contrary, moral agreement creates a mutually reinforcing sense of moral status.

This common desire to have a mutually self-reinforcing sense of moral status, not merely cognitive status but also moral status, creates a powerful tendency towards conformity. Specifically, it creates prestige opinions, opinions that mark one as a good, informed person. Opinions that make one a member of the club of the properly, intelligently, moral.

Expansive stigmatisation

The immediate corollary of this is that contradictory opinions become evil, wicked, ignorant, stupid. The opinions that are possessed by those who, by having those opinions, are outside the club of the properly, intelligently moral. Those who do not have moral merit in their opinions, who are not members of the moral meritocracy.

For opinions can only create prestige if contradicting them has negative prestige. Thus, dissent from these morally prestigious opinions cannot be legitimate, because if dissent is legitimate then there is nothing special about the putative prestige opinions. Opinions that it is legitimate to disagree with do not sort the morally meritorious from those not morally meritorious. Thus, they are not boundary-setting opinions that mark membership of the club of the morally meritorious.

Mimetic moralising thus insists on the right to police legitimacy, to police what is seen as morally acceptable. It turns morality into the property of the mimetic elite, who are deemed to have the right to police the public space.

It also makes the set of prestige opinions something of a moveable feast. The key thing is to stay inside the moral club, not to protect some permanent doctrinal purity. On the contrary, being alert enough to keep up with shifts in the prestige opinions, and shifting linguistic taboos, is part of how one proves and maintains membership of the moral meritocracy. With the necessary linguistic sensitivity helping to further sort those who are morally meritorious from those who are not.

The level of linguistic attentiveness required does much to ensure that working-class folk never quite make it into the morally meritorious. Particularly if they express themselves on various taboo-laden topics.

If mimetic moralising is the highest (status) good, then reason, evidence and consistency must be subordinated to it. Indeed, those who invoke reason, evidence and consistency against any of the prestige opinions are enemies of (mimetic) righteousness, because they fail to converge with the righteous opinions and they contest the mimetic elite’s ownership of morality. They thus proclaim their failure to join the club of the morally meritorious. Worse, they threaten the very gatekeeping distinctions that create the status of being morally meritorious, that make club membership valuable.

Such stigmatisation of those outside the boundary of the morally meritorious has more power if it levers off things already widely accepted as being wrong or abhorrent. Thus, the accusation of racism! works so effectively, not because people are generally racist but precisely because they are generally not. The more racist society actually was, the less effect, the less negative resonance, the accusation would have. Conversely, the less racist society becomes, the more potential effect the accusation of racism has (with some adjustment for diminishing returns from over-use). This pattern is aided by the cognitive tendency to expand the ambit of a category or concept as the thing originally captured by the category or concept becomes rarer.

As this is a status strategy, the demand for grounds to stigmatise will be driven by the conveniences of club-gatekeeping rather than what is actually happening. Thus, the demand for racism and acts of racism as weapons of stigmatisation will (and does) tend to exceed, often quite significantly, the supply of actual racism and racist acts. There is thus a double inflation: acts that are not racist (or may not even have occurred) will be denounced, hence the startlingly high rate of hate-crime hoaxes. Meanwhile, actual acts of racism will be inflated in their significance.

There will also be an ongoing search for new grounds of stigmatisation to continue the separation of the morally meritorious from the not so. The multiplication of belief sins (all the -ist and -phobe accusations, those of cultural appropriation and so forth) is precisely what one would expect in a time of mimetic moralising as a status strategy by members of the human-and-cultural-capital class.

There is also an obvious capacity for purity spirals. And for more junior employees to leverage their moral commitment against more established staff less au fait with the linguistic and moral nuances. Or who retain lingering normative commitments outside the mimetic moralising.

Display versus signal

It is useful to understand the difference between display in general and signalling specifically. In biology and economics, signalling involves the incurring of costs: the greater the cost incurred, the stronger the signal.

The mimetic moralising outlined above involves moral beliefs being on display but it rarely involves incurring any cost in such display. On the contrary, moralising as a status-game is all about the benefits of displaying one’s membership of the morally meritorious.

Even so, keeping up with shifts in linguistic taboos and prestige opinions does take attention, so it does work as a signal. Not giving heed to the inconsistencies between, and hypocrisies within, the prestige opinions also works as a signal. Especially if it means wearing derision or critique from those pointing out such inconsistencies and hypocrisies. Thus, inconsistency and hypocrisy act as more of a feature than a bug: they provide a signal of commitment to the club of the morally meritorious, a willingness to pay the membership dues.

Moral norms are norms held unconditionally: things people believe are morally right regardless of the expectations of others. Social norms are norms based on the expectations about what others will do, and what others expect people to do, that have associated social sanctions. Descriptive norms are norms simply based on expectations about what others will do and do not have associated sanctions. (This is the framework for norms developed by philosopher Cristina Bicchieri.)

There is a perennial tendency to parade social norms as moral norms. Adhering to a moral norm, rather than a social norm, is a stronger signal of personal commitment and is more presumptively meritorious.

Parading social norms as moral norms allows use of the most complete language of normative commitment: indeed, the language of trumping normative commitment. It also permits a useful level of self-deception: I am not adopting this outlook because it is expected of me and there are costs if I do not, I am really morally committed to it. It becomes a stronger signal of moral in-group membership, of being soundly clubbable.

Mimetic moralising as a status strategy has a range of consequences. One is that it easily slides into a more general expectation of emotional and social protection. The claim that certain words being published make people feel unsafe makes more sense in this context. Being of the club of the morally meritorious easily generates a wider expectation of cognitive safety and protection, of not having to deal with dissenting ideas in one’s social (including work) milieu. The more one’s sense of identity and status is invested in a sense of being of the morally meritorious, the more that is likely to be the case. The more also one is likely to be willing to stigmatise, or otherwise sanction, those who dissent.

Another consequence is creeping organisational capture. The more people adhere to the mimetic moralising status strategy, the less willing they are likely to be to risk losing its protections and benefits by supporting dissenting voices or considering inconvenient facts or perspectives. Much of the power of mimetic moralising as a status strategy comes precisely from mutual confirmation of moral righteousness, of being of the morally meritorious. But the elevation of such mimetic moralising then hollows out any use of contrary norms, even if they are longstanding norms of the organisation or institution. Indeed, particularly with the fading away of Christianity, the process of capture is likely to be more effective the less competition there is within the normative space for contrary norms that people can acceptably invoke.

Unawareness required

The capacity for manipulation of all this by self-serving actors is very high. Nevertheless, in general, all this only works if the mimetic elite do not fully and consciously understand what they are doing. There is a real sense in which it is vital that they do not see themselves. For if they saw their attempt to control legitimacy, to possess morality as their property, to stigmatise disagreement, to protect their moralised sense of status, for what it was, it would stop working.

Prestige is a bottom-up status process, so if it is fully revealed, and accepted to have been so revealed, the mimetic prestige game would not grant moral prestige. The social dominance would become obviously raw self-interest. Their stigmatising exclusions, in all their projective exaggerations, would be transparently self-serving.

Conversely, if this mimetic moralising is the highest good, then mobbing is natural and inherent. Mobbing — that is, scapegoating stigmatisation — unites and protects the mimetic moralising, the shared sense of being a moral meritocracy. They are united in, and by, the stigmatisations that protect their sense of status, of being of the moral meritocracy.

What are the sins that they are stigmatising folk for? Typically, for failing to conform to righteous moral harmony, a society entirely without bigotry or ill-feeling. If some grand social harmony is insisted on, but does not yet exist in society, then someone must be to blame for the lack of harmony. Hence the scapegoating of those who disturb harmony is natural to the grand elevation of social harmony as the proper goal. Harmony being a much more all-pervasive and controlling ideal than mere order.

The more this wonderful harmony-to-be-created looks different from the morally-disorderly past, the more the past can be scapegoated. The past cannot answer back, after all. Unless there are scholars brave enough to stand against the mimetic moralising of their colleagues, and the stigmatisation that is likely to engender. (What was remarkable about the 1619 Project was not that so many of the morally meritorious rolled over for it, following the dynamics of mimetic moralising, but the number of historians who were willing to speak against it; in part due to more traditional leftists pushing back against identitarian progressivism. They had norms external to the mimetic moralising that they were willing to stand up for.)

Separating people from their past also separates them from norms and framings that might be invoked against the mimetic moralising.

This scapegoating of the past leads to further self-deception. People whose moral postures are utterly conventional in their social circles (as being morally conventional, and so mutually meritorious, is precisely the point), who shift their moral postures to keep up with what is conventional, who are assiduously morally conventional within their social milieu, laughably claim that they would not have adopted the moral postures that were conventional in the past.

This massive, and utterly self-serving, arrogance being the basis for their contemptuous rejection of the past which fails to live up to their lofty standards. Standards for which they make no sacrifices, beyond some (relatively minor) signalling costs. Indeed, which are all about them avoiding any sacrifice. Well, any sacrifice on their part.

Their mimetic moralising involves plenty of sacrificing of others to their sense of righteousness. Sacrificing the ability of outsiders to speak, to be heard; sacrificing reputations, jobs, careers. Lots of sacrifice imposed on others by the mimetic elite, none on themselves.

(This lack of sacrifice is why the notion of virtue signalling is somewhat problematic. There is precious little virtue in doing things that involve no sacrifice. Piety display is a better term for what goes on.)

But such sacrifice of others is how scapegoating and mimetic moralising works. Protecting the sense of status one finds congenial by loading guilt and rejection onto the objects of stigmatised sacrifice. Such an object of sacrifice is not a victim, but full of guilt, so deserves what comes to them as they are sacrificed to preserve the distinction between the morally meritorious versus the stigmatised, morally other who is thereby evil.

Thus, to return to where we started, we can see that the eclipse of class from the language of progressivism makes perfect sense. For modern progressivism is dominated by the class that cannot see itself as a class. It is dominated by mimetic moralising that cannot see its own self-serving conventionality. Nor its stigmatising scapegoating. All of which works by shared habits of self-deception and could not stand genuine self-awareness.

Welcome to the world of the class that cannot see itself attempting to achieve social dominance by policing legitimacy in the service of its mimetic-moralising status strategy. And doing so via the stigmatising scapegoating it imposes on others.

And no, for those wondering, online IT is not going to stop its stigmatising exclusions any time soon. There is far too much status at stake.

Cross-posted from Medium. Like most of my pieces on Medium, this is more of an ongoing meditation rather than a finished piece, so is subject to ongoing fiddling.

Friday, November 20, 2020

Peers and parents, past and present

Typically, the child-rearing not done by parents is done by peers.


Fatherhood, being (unlike biological paternity) a socially-constructed relationship of social identity, provision, care and attention, varies much more across (and within) cultures than does motherhood, which is far more biologically-grounded (at least until the infant is weaned). A small number of societies traditionally did not recognise the relationship of fatherhood at all, the role of male protector being taken by uncles. Even beyond such outliers, evidence from the anthropological literature is that the father’s attention is a highly variable factor in child-rearing.

The investment in attention and feeding required to raise a human child is considerable. Far more than is required to raise the young of any other species. As pregnancy and lactation tie women to the raising of infants, this leads to recurring risk-management patterns across human societies whereby activities that are compatible with breast-feeding, and other aspects of child-minding, become presumptively female roles. Those activities that are not thus compatible become presumptively male roles. The first scholarly explication of this was in a classic short analysis by anthropologist Judith K. Brown. An analysis that subsequently acquired considerable statistical and other support.

In his The Lifeways of Hunter-Gatherers: The Foraging Spectrum, archaeologist Robert L. Kelly includes a very useful summary of the anthropological evidence on child-rearing and its interaction with the acquisition of culture. (All quotes are from Lifeways, mainly pp. 108–110.)

The two dominant social sources for transmitting or acquiring culture are parents and peer groups. They generate rather different patterns. Parents, particularly mothers, are predictable and consistent providers of resources — breast milk, food, affection, attention and protection. Over time, the parents provide fewer and fewer resources until eventually the child is cut off.

It has been reasonably argued that parent-raised children learn that “resources and desirable goods are limited and hard to obtain”. They can also be expected to tend to be assertive and independent, focusing on using knowledge of terrain and technology to gain resources rather than social favours. In nomadic societies, boys tend to “de-emphasize male-male competition and focus more on manipulation of the world through technology”. Parent-raised children can be reasonably expected to show greater within-group variation in beliefs and behaviours.

Peer-rearing produces different patterns. Potentially from a young age (as early as around two years old) the child finds themselves in a group, often with older siblings, who become the focus of the child’s social interaction. A group that has less status and power differences within it than are involved in parent-child interactions. Children learn that there are “many sources of food and desirables other than their parents”. Peer-raised children learn to network (who people are, what they have, how they are disposed towards them); that resources are acquired by manipulating social relations. They learn that “resources are not scarce and can be acquired through persuasion”.

The acquisition of culture within the peer group can also be expected to generate less variation in beliefs and behaviours. (It is a long-standing pattern that any organisations seeking to impose uniformity of outlook tend to be inherently suspicious, even hostile, to family autonomy and seek to undermine the authority and independence of families.)

Shifts in the activities of parents are likely also to shift societies from having children parent-raised to having them peer-raised. In foraging communities, the shift to sedentism leads to longer foraging trips by males and more intense food acquisition and processing by females, as consumption shifts to more-effort foods. This leads to less parental attention to children, leading to more peer-raising. A mechanism that “may account for why sociocultural change seems to occur so quickly once hunter-gatherers become sedentary”.

Peer-raising typically has quite different effects on boys and girls. Girls are typically assigned as caretakers, contributing to “girls having attitudes favoring nurturance and prosocial behaviors … [and] more restricted spatial ranges”. Fathers being away for extended periods of time “is associated with boys who have poor attitudes towards females, who are aggressive and competitive towards other males, and who, when grown, give little attention to their offspring, encouraging a continuation of peer-rearing”. A pattern we can see in contemporary societies in deprived urban communities.

In sedentary societies, adolescent male peer groups have greater importance than in mobile societies, with more violent initiations and harsher punishments. “These competitive groups define a boy’s success in life more than in mobile societies where, presumably, fathers are more often present.” Street gangs are peer groups with an added income or self-protection element.

Early C20th Chicago street gang


Cross-cultural surveys have found that “when men spend a lot of time with their offspring and cooperate in child rearing, there is less cultural emphasis on competition”. If men spend more time away from children, “there is a general physical separation of male and female tasks, and competition among men is encouraged”. (This could be a bit chicken-and-egg, if presumptively male tasks are such that they have to be done elsewhere.) “Partially as a response to male behavior, peer-raised girls show expression of sexual interest and assumption of sexual activity early in life, while also showing negative attitudes towards males and a poor ability to establish long term relationships with one male”.

Shifts in adult labour patterns can be “expected to have dramatic effects on enculturation and hence on cultural change”. These patterns clearly have implications well beyond foraging societies. Much of the literature on fatherlessness, for example, reads as the difference between parent-raised and peer-raised children. A sole parent clearly has less potential to counter-balance peer effects than do two parents.

In complex societies, peer groups may vary considerably. Much of the motivation to get students into “good” schools (whether public or private) is to ensure that they have a better quality peer group. Peer group effects also complicate trying to tease out the specific effects of a school or teaching on education outcomes.

One of the counterpoints raised to the literature on fatherlessness is the claim, famously made in psychologist Judith Rich Harris’s book The Nurture Assumption, that if one controls for income/socio-economic status, the negative effects of sole-parenthood disappear. This is a classic problem of co-dependent variables. Being a sole-parent family tends to greatly affect (i.e. lower) household income and socio-economic status. Normal regression techniques are not analytically reliable in such circumstances. One has to use techniques such as comparing single-parent with two-parent families who are in the same income range. When that is done, the disadvantaging tendencies of being raised by sole-parent families become quite clear: as one would expect. (For those interested in pursuing this further, the footnoted citations in this essay provide a useful place to start.)

The anthropological evidence also lends support to identifying fatherlessness as a negative factor in raising children. With much of the effects of fatherlessness being due to the effects of peer groups lacking competition from family life, particularly for boys. This obviously has implications for socially-deprived urban neighbourhoods and other localities suffering from absent fathers. Elijah Anderson’s classic study Code of the Streets (a useful summary essay is here) is very much congruent with this anthropological literature. J. D. Vance’s Hillbilly Elegy also touches on these issues, as does the excellent study Trump’s Democrats.

The developing pattern in many Western societies of marriage (and fatherhood) being strong in the upper reaches of society but decaying in the lower reaches is yet another socially-polarising factor in such societies.

Cross-posted from Medium.

Tuesday, November 17, 2020

Left-Hand Path, Right-Hand Path: someone being wrong on the internet

Good versus evil is not a universal religious or moral framing

I have a wide range of interests, many of them historical. One of the YouTube channels I have watched regularly is Dr Jackson Crawford’s channel on old Norse culture: serious scholarship delivered congenially in (generally) bite-sized pieces.

I have also listened to several of Tom Rowsell’s offerings from his Survive the Jive channel. He is mainly interested in matters Indo-European, but he ranges more widely, and some of his videos can be charming, such as this one on Hinduism in Bali. His material seems to be accurate (I have not spotted a significant error yet) and is engagingly presented.

Which brings me to Arith Harger’s channel. I listened to his video on What is the Left Hand Path? My interest in these matters is entirely historical and intellectual, but he gets SO much wrong in this video. Rather than give a detailed critique, I will cover the same ground based on available scholarship.

The first section of the video is actually quite a good discussion of the difference between Right-Hand Path and Left-Hand Path. He is correct in arguing that it does not neatly line up with good and evil. Even though, for Star Wars fans, the Right-Hand Path seems a bit Jedi-like and the Left-Hand Path rather Sith-like. It is when Harger tries to put the matter in a larger context that he goes seriously wrong.

The original source of the Left-Hand Path/Right-Hand Path distinction is from India. Specifically, from Yogic and Tantric traditions.

There is an excellent introduction to this distinction, and its likely cultural and historical origins, in Thomas McEvilley’s classic essay The Archaeology of Yoga. This is available on Jstor here and on Scribd here. The essay is, in part, a precursor to McEvilley’s masterpiece The Shape of Ancient Thought in which he examines the history of Greek and Indian philosophy and their interactions.

McEvilley argues that the Left-Hand Path/Right-Hand Path distinction is driven by the interaction between the patrilineal (and patricentric) culture of the invading Indo-European pastoralists and that of the matrilineal (and matricentric) culture of the resident farming population. To use slightly old-fashioned scholarly language, Aryans versus Dravidians.

Whether the culture was patrilineal or matrilineal matters because, in a patrilineal culture, a child without a father lacked a crucial element of social identity. Patrilineal cultures tend to be particularly restrictive of female sexuality for that reason. Conversely, in a culture where a child gets their social identity from their mother, not having a designated father is likely to be less of an issue. Especially if uncles can readily substitute as protective male relatives.

So, mystical and occult traditions in patrilineal cultures are likely to be sex-restrictive, to be ascetic. Conversely, mystical and occult traditions in matrilineal cultures are more likely to be sex-permissive. The former naturally inclines towards the development of spirit-focused disciplines, where ascetic denial of indulgence and the body is seen as the path to self-development. The latter naturally inclines to the development of self-capacity-and-bodily-focused disciplines.

There is no direct connection in this to any good-versus-evil dichotomy. But, outside the monotheistic traditions, good-versus-evil is not the normal religious dichotomy. In religions in the animist-polytheist spectrum, the more normal distinction is order-versus-chaos. This is particularly true in agrarian societies, where a bad harvest presages disease, death and, if there is a sequence of such harvests, disaster. This concern for order can be seen in the ancient Egyptian concept of maat. How to construct and maintain social order is a central concern of Chinese philosophy, while to seek the Tao (or Dao) is to seek to be in accordance with the natural order of the universe.

Ascetic, sex-restrictive disciplines tend to be more orderly than more sex-permissive disciplines. Especially if, as is the case in various forms of Left-Hand Path, the deliberate breaking of conventions and restrictions is seen as a technique to develop one’s capacities. The highly patrilineal Indian elite would clearly tend to see the mother-right vestiges of Dravidian mysticism and occult practices as very much other. So, the Left-Hand Path would be seen as more chaotic (because it was) and therefore negatively.

Arith Harger does not seem to be aware of any of this background. Of course, admitting the Left-Hand Path/Right-Hand Path distinction is originally Indian does rather get in the way of presenting it as Old Pagan Wisdom. Somewhat similar, though less developed, patterns of a patrilineal pastoralist Indo-European overlay interacting with matrilineal farming-religion survivals can also be traced in European cultures and their pre-Christian religious and occult traditions.

Where Harger gets particularly confused is over the good-versus-evil distinction. This is not remotely an originally European idea. The pre-Christian religious traditions of Europe fairly clearly follow the normal order-versus-chaos division. Thus, in Norse mythology, various monsters of chaos threaten the order upheld by the Aesir in a pattern that recurs across mythologies.

The good-versus-evil distinction is essentially (as far as we can tell) an invention of Zarathustra (aka Zoroaster). It fits nicely in with monotheism — with a Creator God who creates both the material and the moral order. Opposition to such a God is not merely chaotic and disorderly, it is anti-moral and destructive.

Hence the original Jewish understanding of the sin of Sodom and Gomorrah was that they were cities that were systematically anti-moral, preying on the weak and vulnerable and refusing to respect others. (Which makes way more sense than the later interpretation of the key sin of the cities of the plain being unnatural sex: see Chapter Four of Michael Carden’s Sodomy: A History of a Christian Biblical Myth.)

Norman Cohn’s book Cosmos, Chaos and the World to Come is a good introduction to the hugely important shift in moral perspectives from order-versus-chaos to good-versus-evil. A shift in moral perspectives that explains much of the antipathy between the Romans (who were most definitely all about order, including in ways that seem wildly immoral, or even evil, to us) and the Jews and later the Christians. If one is going to truly embrace a pagan perspective, good-versus-evil has to go.

So, the Right-Hand Path/Left-Hand Path distinction does not map to good-versus-evil. As it originally arose in the intermixing of cultures on the animist-polytheist spectrum, that is not surprising.

The Right-Hand Path/Left-Hand Path is originally an Indian distinction. But as the patrilineal pastoralist Indo-European overlay over the culture and religious perspectives of resident matrilineal agrarians that we see in India also occurred in Europe, the distinction has a relatively easy path into European pagan traditions, even revived ones.

But surely it is better to get the history correct, rather than hopelessly confused.

Cross-posted from Medium.

Thursday, November 12, 2020

The US's turn to be hobbled by Leftism?

Ideologies come and go. It is the social dynamics that matter.

In one of his fun and informed alternative history videos, the operator of the Whatifalthist YouTube channel makes the point that Communism was good for the US, because it hobbled the two countries most able to rival it — Russia and China.

Both Communist regimes killed millions of their citizens in mass famines, depressed fertility rates (intentionally in the case of China) and adopted highly inefficient command economies. Stalin did the US the extra favour of systematically killing off the kulaks — that is, those peasants who showed any initiative.

Stalin was ruthlessly effective at deploying available resources to create heavy industry to support a war machine. (Something that Mao and the Kim family regime of North Korea copied.) Nevertheless, at no stage did the Soviet economy achieve economic growth rates as high as did the Tsarist regime. Nor did the Soviet Union have the cultural, intellectual or scientific vitality of the Tsarist regime. (Soviet technological advances were either a result of theft or massive deployment of resources.) Under the Tsars, Russia had become a major agricultural exporter. The Soviet Union had perennial problems feeding itself.

Adding to the Soviet regime’s record of tyranny and mass murder, bringing back slavery (in the Soviet labour camp system) and serfdom (from 1940 to 1956, no worker could change workplace without the workplace’s permission) made it utterly clear that, no, the Soviet Union was no sort of general moral advance over Tsarism. And, despite its Cold War rivalry with the US, the Soviet Union was a smaller-population state with a smaller economy than Russia would have been if it had never suffered Communist rule.

Both Russia and China have since become various forms of market economies. Nevertheless, the legacy of Communist rule means that both countries have fewer people, lower standards of living and less intellectual and cultural vitality than they would have if they had never been afflicted by Communism.

A case can be made that it is now the US’s turn to be hobbled by Leftism.

By Leftism I do not mean labourism. Labourism is the working class asserting itself through unions and political organisation. Communist regimes do not permit independent working class political action, unless (as in the case of Poland) they have been weakened by other pressures.

When folks attempt to define Leftism, they typically do so ideologically. But I am interested in the social dynamics, so I am not going to attempt an ideological definition.

In terms of social dynamics, Leftism is the human-and-cultural capital class seeking to achieve social dominance via social transformation politics. It is based on three propositions: (1) that the adherents have a clear understanding of social dynamics; (2) that they possess, or can discover, a clear path to the morally positive transformation of society; and (3) that this path will be achieved if sufficient power is handed to people like them.

Leftism is members of the human-and-cultural capital class having massive tickets on themselves. Which is why Leftism tends to end up dominating those institutions and organisations that themselves are dominated by the human-and-cultural-capital class. Leftism not only gives them a (flattering) sense of status and purpose, it also generates a shared status strategy.

Adherence to Leftism means adherence to a set of prestige opinions (what good, smart, informed people believe and what malicious, stupid or ignorant people don’t) ready-made to assert social dominance. The latter is based on the deemed moral necessity of preferring to appoint, or otherwise support, people with said prestige opinions that mark the “good, smart, informed” people while stigmatising, and otherwise penalising, those who demur.

Once a particular version of Leftism achieves sufficient prestige-opinion dominance, it can sweep through organisations and institutions remarkably quickly. Especially bureaucracies, as it provides the advantage of simplifying selection processes (one picks “folk like us”, so folk select in their own image and likeness); simplifying internal coordination (people have common outlooks and expectations); and generating moral projects to be getting on with.

Without being able to mobilise and coordinate people with organisational skill, Leftism could never come to dominate societies sufficiently to hobble them. The existence of such capacity does not, however, mean that Leftism will be beneficial to a society. On the contrary, the features that make it good at gaining positions of power are also what recurrently turns its dominance into a human and social disaster.

Leftism in power hobbles societies because (1) none of its three constituent propositions ever turns out to be sufficiently true to do anything other than hobble the society and (2) Leftism destroys or distorts feedback mechanisms.

The second factor operates quite straightforwardly. The belief that Leftists have such profound social understanding and moral purpose immediately discredits any feedback that seems to contradict that status and purpose. Moreover, to achieve the deemed morally urgent social transformation, a level of social power has to be achieved sufficient to enable overriding of any resistance. This means breaking up any capacity to organise to resist the Leftist moral project, or to persuade others to do so. The result is a pervasive attack on, and blocking of, feedback mechanisms. This inevitably severely represses the information flows and incentives needed for effective error detection. The ability to detect error, consider alternatives, adjust actions and so on becomes sufficiently reduced that more disastrous policies can and will be followed.

As for the key propositions of Leftism — (1) that adherents have a clear understanding of social dynamics; (2) that they possess, or can discover, a clear path to the morally positive transformation of society; and (3) that this path will be achieved if sufficient power is handed to people like them — they are simply never sufficiently true to stop a dominant Leftism from having an overall highly negative effect on society.

The first problem is that the content of Leftism is primarily driven by the status strategy, because that is its prime appeal and it is that which the selection processes will operate most strongly to serve. Selection for the most efficacious set of prestige opinions and moralised dominance strategy is not selection for truth or accuracy.

On the contrary, the complexities of reality will get in the way of the prestige-and-dominance strategy, so there will be selection for simplifying moral salience. Including casting entire sections of society as having profoundly negative moral salience, so that any idea, experience or concern coming from them will be presumptively denigrated, dismissed or otherwise discounted. This will strongly tend to block or narrow the social understanding underpinning whatever flavour of Leftism has become dominant.

In particular, each flavour of Leftism will have some factor, or small set of factors, that is deemed “to explain” the dynamics of the society to be transformed. Or, at least, the dynamics of whatever deemed moral failure requires said social transformation. As these factors have to play a set narrative role in the project of social transformation, social understanding must be narrowed so that they can play that role. Any social factors that get in the way of the narrative will be denigrated, dismissed or ignored, ensuring that the transformation project will itself be based on a partial understanding, profoundly inadequate to fulfil the role cast for it.

The second problem is that the existing society has to be cast as a profound moral and social failure so as to justify the deemed social transformation. That means the society’s actual achievements must be dismissed or belittled, which in turn means that the processes which led to those achievements will be denied, denigrated or dismissed. The more successful the society actually is, the more disastrously inaccurate that process of denial, denigration and dismissal will be. Conversely, the more fragile the underpinnings of existing social achievement, the more disastrous the effect of that denial, denigration and dismissal is likely to be.

The third problem is that the concentration of social power needed to achieve the deemed morally urgent social transformation will tend to aggravate the effect of all the above intellectual flaws and failings, precisely because of the concentration of social power operating on their basis. Moreover, a new selection process will be set up whereby manipulative personalities will be drawn to what is, in effect, the only game in town, given the massive concentration of social power involved. Indeed, there will be something of a selection process in favour of Dark Triad personalities (narcissistic, Machiavellian, psychopathic) as they will be able to operate in, and manipulate, the power dynamics more effectively. In part because they will have fewer scruples, though the sense of moral urgency in the social transformation project will inherently tend to override scruples.

The entire pattern can be understood without any reference to some particular ideological claim. It is the social dynamics that are crucial. The particular doctrines involved mainly have an effect in aspects of how the social dynamics play out, not the fundamental social dynamics themselves.

We can see the patterns of Leftism well underway in the contemporary US: most obviously in California’s various dysfunctions but more generally in those metropolises dominated by progressive politics and in institutions dominated by the human-and-cultural capital class (such as media, Big Tech and education). As there is no sign that contemporary Leftism is losing its fervour or institutional hold — on the contrary, both seem to be increasing — the social dynamics of Leftism in the US have a considerable way to go yet. How badly they end up hobbling the US will entirely depend, as it did in Russia and China, on how dominant Leftism becomes.

Cross-posted from Medium.

Tuesday, November 10, 2020

The Oppression Two-Step: creating mountains of bullshit out of molehills of truth

Systematic misuse of the concept of oppression disorders moral and analytical judgement.
Afghan women in different decades

I was reading a book, a good book, on the application of ordinary language philosophy to literary theory (yes, I am that nerdy) and I came across the following passage:

I have (of course) nothing against the fundamental project of intersectionality theory, which I’ll preliminarily define as the attempt to understand the experience of complex forms of oppression, the identities formed under such conditions, and the power structures that produce them. (P.91)

What follows is not a shot at the author, who is a wonderfully clear writer and comes across as a sensible and humane person. What cries out for critique is the normalising of what are, at bottom, ridiculous and inflated claims about contemporary Western societies.

Let us start with forms of oppression. It takes well-developed historical blindness, or historical ignorance, to characterise the ordinary experience of folk in contemporary developed (“Western”) societies as oppression.

What is imposed on a labour camp inmate, brought vividly to literary life in Aleksandr Solzhenitsyn’s One Day in the Life of Ivan Denisovich, is oppression. Living in a Nazi death camp is oppression. A slave chained in the bowels of an Atlantic passage slave ship experienced oppression. A slave chained in a caravan marching across the Sahara experienced oppression. If that slave ended up in hot sand as part of the standard post-castration procedure, his oppression had been intensified. They are all cases of oppression, of intense oppression, because they all entail the profound and systematic destruction of human well-being; usually to serve the well-being of others.

The issue of the treatment of female African-American workers that legal academic Kimberlé Crenshaw describes in the original essay that launched the academic career of intersectionality is a labour dispute. To call being underpaid, even systematically underpaid, oppression is to hugely cheapen the term. To call the experience of Western women in the 1950s, let alone the experience of contemporary Western women, oppression is to hugely cheapen the term. Constraint, even exploitation, is not oppression unless it destroys well-being (rather than insufficiently generating or unfairly distributing it). Unfairly distributing well-being is not, in itself, oppression.

Is there unfairness and inequity within Western societies? Yes. Are there pockets of oppression? Yes. Are they remotely characteristic of such societies? No.

There has been a systematic, wilful, creation of an intellectual edifice built on oppression, and related concepts, to generate a sense of moral urgency, moral catastrophe and moral self-importance about the least oppressive societies in human history. That in itself might merely be regrettable. What makes it far more than merely regrettable is that it is tied to, and helps create, an expanding intellectual structure built on a serious, systematic, blindness to genuine social achievement, to the profound and pervasive achievement that makes these societies the least oppressive in human history. A blindness that has become a required marker of moral and intellectual seriousness when in fact it is a marker of wilful self-deception and systematic mischaracterisation of social and historical realities. A blindness that can only be expected to leave social destruction in its wake.

If you define achievement and success as vice, and then act upon that characterisation, you will generate a great deal of failure. Even more deeply, the above mindset encourages a certain sort of destructive arrogance that thinks protections and freedoms that evolved for good reasons can be dispensed with by sufficiently moral people engaged in the sufficiently morally urgent task of opposing oppression.

The oppression two-step — declare oppression, thereby also proclaiming look at me, I am opposing oppression! — generates identity, purpose and status in one package. But it is, at bottom, built on distortions and self-deception, on creating mountains of bullshit out of molehills of truth.

Critical theory

In 1922, a famous seminar was held in Germany that kicked off the Frankfurt School. The Great War had killed millions. Postwar revolutions had come and gone. The Bolsheviks had triumphed in Russia, but so had the Fascists in Italy and traditionalist authoritarians elsewhere. The question those at the seminar grappled with was: why was the working class not being revolutionary in the way Marx and Marxism postulated?

Critical theory, and the cultural turn in Marxism that the Frankfurt School embodied, was the result. For those with eyes to see, it was already obvious that the Bolshevik regime was murderous and tyrannical. (As Rosa Luxemburg had predicted it would be.) But what became the standard defence — Leninists were doing Marxism wrong — was already being mounted.

One can listen to any number of YouTube lectures and presentations, often informative and intellectually serious ones, going through this history. A common refrain from those sympathetic to the Frankfurt School is to say something like “even though inequality was getting worse, the workers were not being revolutionary”.

Tuesday, November 3, 2020

Paying the Palestinians not to make peace


Israel-Palestine is the third rail of online commentary. It brings out all the crazies. Nevertheless, it is worth noting an overlooked structural feature that explains so much.

That structural feature is quite simple: Western and Muslim states pay the Palestinians hundreds of millions of dollars a year not to make peace with Israel.

They do this through UNRWA: the United Nations Relief and Works Agency for Palestine Refugees in the Near East, founded in 1949. UNRWA has an annual budget of about US$800m. Most of this money goes as payments to 5.6 million registered Palestinian refugees. Vanishingly few of these people ever lived within the 1948 borders of Israel. How could they? A person would have to be at least 71 to have been a refugee from the 1947–9 war that created the pre-1967 borders of Israel.

Why are there at least 5.6 million registered Palestinian refugees? Because Palestinians are the world’s only multi-generational hereditary refugees. If you are the patrilineal descendant, or the adopted child, of someone who lived within the 1948 borders of Israel for two years or more and left as a result of the 1947–9 war, then you are a Palestinian refugee.

Why two years? Because decades of Jewish influx to Palestine, dating back to the late Ottoman era, had stimulated economic activity within Palestine. That commercial energy drew people in from the rest of the Middle East.

The creation, the ethnogenesis, of Palestinian national identity is Jew-centric in multiple ways. It is an identity that would not exist without the Jews and the influx of Jews. Not only has the Palestinian identity been formed in opposition to Zionism, many Palestinians are descended from people whose association with Palestine is a result of the Jewish influx.

More generally, Jew-centrism pervades discussion of Israel-Palestine, from various forms of Jew-hatred, to antipathy to any form of Eurocentric nationalism, to post-colonialist critique. The Palestinians are cast in the role of foils to these Jew-centric, Israel-centric, patterns of analysis, which have come to dominate Western media and academic commentary on Israel-Palestine. This encourages not considering the patterns of Palestinian politics, or the incentive structures they face, outside of the actions and policies of Israel. The concern of this piece is precisely those other incentives: to avoid the normal Jew-centrism, the normal Israel-centrism.

For instance, it is very hard for a Palestinian to become a citizen of most Arab countries. It is much easier for them to become citizens of a Western country. Why? Because the Arab countries prefer to keep Palestinians as stateless sticks to beat Israel with.

In the course of the C20th, there were many mass movements of people across boundaries. (For instance, the 1944–50 expulsions of Germans and the 1923 Greek-Turkish population swaps.) Palestinians are the only ones whom their ethnic and civilisational confreres systematically refused to take in as citizens. It was much more important to keep them as stateless sticks to beat the Zionist entity (Israel) with. A state that, for decades, no Arab state was willing to recognise.

The Christian-led state of Lebanon was acceptable to the Arab world in a way that the Jewish state of Israel was not because Christians were an acceptable power-people and Jews were not. Obviously, the endurance and success of Israel has undermined that relegation. Moreover, Christian-led Lebanon was inside the ambit of Arab nationalism; an ideology that many Christian intellectuals had helped develop, as it provided an identity which included them, when Muslim identities did not. As is normal with nationalism, Jews were not included within Arab nationalism, no matter how long they had lived in the region.

This was particularly intensely so for Palestinian nationalism. Palestinian nationhood has developed as an identity based around the existence of Israel as a crime against them. An identity based around the nakba, the disaster. Such a disaster-narrative is not remotely a positive or productive basis for an identity. (Especially given that no Arab state has ever fought to establish a Palestinian state.)

Israeli diplomat Abba Eban famously said that the Arabs never miss an opportunity to miss an opportunity. More and more, this seems to be a specifically Palestinian pattern. Unfortunately, rejectionism always seems to win out within Palestinian politics. It won out in the 1930s, when Mufti Husseini’s rejectionism ended up undermining more conciliatory voices. It won out again with the post First Intifada return of Yassir Arafat, who dismantled the civil society networks that had developed the First Intifada, the only successful Palestinian campaign, in favour of the patronage structures that his power was based on. Arafat then went on to block any peace treaty with Israel. Nowadays, Hamas seems to have more popular support among Palestinians than Fatah and its allies.

Not that the Palestinians are politically unified. Israel, Egypt and the Palestinian Authority have agreed on blockading Gaza since the Hamas takeover of Gaza in 2007.

Palestinian media and popular attitudes tend to particularly intensely display the Jew-hatred which has become intense across the Arab and Islamic world since the 1930s. This shift, which extended to pogroms and did much to create the post-1948 mass exodus of Jews from Muslim countries, largely replaced the dismissive sense of superiority that had been the more common Muslim outlook, based on the legal and social dominance of Muslims.

Those 850,000 or so Jewish refugees from Muslim countries became citizens of Israel, or of the Western countries they fled to. They have largely disappeared from narratives about the conflict between Israel and the Arab world, as neither those fleeing Jews, nor their descendants, were frozen in status as refugees. Similarly, the mass emigration of Christians from the Middle East is also often overlooked, as they have also been accepted as citizens in the countries they legally entered.

If the conventional definition of a refugee were applied, then the number of Palestinian refugees would be a diminishing pool, and the Arab countries’ blocking of them, and their descendants, from becoming citizens would become blatant. By creating this special definition of a refugee, Palestinians born in an Arab country were excised from citizenship of that country.

Hence, the world’s only multi-generational hereditary refugees.

Lebanon is particularly hostile to Israel diplomatically because it has a large Palestinian refugee population that the competing Lebanese elites do not want to incorporate into Lebanon as citizens. A danger that would be a natural consequence of peace with Israel, as the Palestinians would then stop being refugees.

Sunday, November 1, 2020

Truth, knowledge and self-deception


One of the fundamental, persistent, claims of wisdom traditions is that to be wise one must seek to not deceive oneself. That unsparing knowledge of oneself is a key element in wisdom.

We live in societies of ever-expanding evolutionary novelty. Navigating that expanding novelty requires a better understanding of our evolved selves, and increasingly so. Contagious self-deception is the opposite of what we need.

One of the adaptations from French theory that has fed into contemporary ideas about progress and social justice makes it much easier to deceive oneself. This is the characterisation of discourse as entirely self-referential; that text, language, discourse only refer to themselves. The consequence of this characterisation is that truth no longer operates as the marker of having apprehended reality. There is, in this construal of reality, no unified truth to act as an arbiter of reality.

The concept of truth is inherent in language. ‘Cat refers to a cat’ is a truth claim. What is the word for …? What is the name of …? What does that mean …? These all involve truth claims. Truth is odd, however, because it refers to a certain form of success in using words.

Words are generally defined by sorting definitions — the definition allows you to sort things into what is, or is not, referred to by the word. There may well be fuzzy cases, but even a fuzzy case is sorted into being a boundary case.

Typically, there is a set of characteristics such that if a thing has all of them, it is in the set; if it has none of them it is outside the set; and the fewer of the characteristics a thing has, the closer to being outside the set it is. With the boundary being set by the use of the word, as it evolves over time.

Truth does not work the same way as words normally do. First, it refers to a certain sort of successful use of statements and not individual words or things in the world. Second, precisely because truth refers to a certain sort of success in word use, no definition of truth will allow you to sort (merely by use of the definition) any statement into true or not true. Truth does not sort in the way words that are directly about things do.

The same also applies to knowledge, as it is about the successful apprehension of reality. No definition of knowledge will, by itself, tell you whether you have successfully apprehended reality. Whether what you have, in a particular case, is knowledge. Neither truth nor knowledge entail sorting definitions in the conventional sense.

This is not a problem of reference. Saying something is false does not mean it has failed to refer. On the contrary, it is because the reference is grasped that a judgement about a statement’s truth can be attempted. Saying a statement is nonsense often implies it has failed to refer (though it can also be used to say it has failed to refer to how the world is).

The denial that language has reference beyond itself is an adaptation from Jacques Derrida. (A useful quick guide to Derrida on language is here.) As with the other French theorists, Derrida is a pre-Darwinian thinker. That is, he is post-Darwin-the-icon-of-science but not Darwinian in his thinking; he does not embrace the evolutionary lens. (Ludwig Wittgenstein’s view of language, that language is defined by use, fits much more naturally with evolutionary thinking.)

Language is a technology; a technology that assists us to act in the world across a wide range of aims or goals. Words are tools for communication, but tools that work because they enable us to act in the world more effectively — otherwise the capacity for language could not have evolved. (One reason that Noam Chomsky does not have much regard for the French theorists intellectually is that he is very much a Darwinian thinker.)

Structuring truth

Humans have certain physical characteristics, nutritional needs, cognitive structures etc. The world has various structures that operate in various structured ways. So human languages tend to develop somewhat similar structures (to the extent there are patterns in, for example, vowel shifts over time) while still evidencing considerable linguistic variety.