Hunting Predators: #MeToo and the Strange Psychology of Mob Justice

One year on, and our culture doesn’t seem to have reached a consensus on the impact of the #MeToo movement. For some, it has clearly gone too far, while, for others, it has yet to accomplish the social transformations necessary to ensure women’s safety and wellbeing. Writing in the Guardian, Sonia Sodha argues that too many women apply a double standard when they fail to call out the men they love (fathers, partners, brothers, friends) for their predatory behavior: “So long as we judge the predators we like differently, we’re all, to some extent, complicit.”

My aim in this essay is not to assess the claims of the #MeToo movement in detail. Men can and do, under certain conditions (and more often than women), commit acts of sexual violence. At the same time, the lack of nuance in current discussions of masculinity, and our failure to acknowledge the ways in which women can also be violent, warrant more discussion. I explore these questions in my article on toxic masculinity (see also the peer-reviewed, longer version).

In this piece, however, I wish to situate the #MeToo movement within the broader human obsession with predators, call out culture and mob justice. Before examining the consequences of beliefs which, at their most extreme, paint fully one half of the human species as predators, we should ask how we got here to begin with. Access to new channels of communication, which allow us to denounce acts previously shrouded in silence, has certainly played a role in this process. But so have intuitive errors and common risks resulting from our evolved psychology.

Let’s examine four common human tendencies: the risk of inferring predation where there is none; the risk of treating other groups as personified predators; the tendency to pick on the powerless, not the powerful, in our quest for justice; and the pull of crowd psychology in mobilizing hatred and violence.

Predator or Prey?

In The Coddling of the American Mind, their book-length essay on the dangers of safetyism, Jonathan Haidt and Greg Lukianoff present us with a fascinating evolutionary puzzle. When playing tag, why don’t children like to be it?

All mammals like to play. Among large-brained animals in particular, unstructured play serves the crucial developmental function of consolidating skills and strategies that confer important fitness advantages like, say, learning to take measured risks, solving complex coordination problems, or putting features of the environment to novel, creative use. Kids, cubs and pups in social species all like to play tag, chase each other and pretend-fight.

Humans are no exception, but, unlike wolf cubs, who generally prefer to do the chasing, human cubs (like rat pups) prefer to be chased. Why? Because unlike predatory wolves, who have been at the top of the food chain for much longer, we humans have been prey for most of our evolutionary history. Natural selection thus gave our species the intrinsic drive to detect, run from, elude (and later eliminate) predators.

But here’s the caveat: humans are now at the top of the food chain. Paleoanthropologists generally agree that the transition to behavioral modernity, when our ancestors began leading the kinds of complex, symbolic, technologically enhanced lives that roughly resemble those we live today, occurred 100,000–200,000 years ago (100–200 kya). By 100,000 years ago, humans likely possessed the same kinds of cognitive capacities for cultural learning that characterize our species today: i.e. they would have been able to figure out how to handle modern technologies like smartphones if others around them had been proficient in their use. Humans successfully transitioned (or so it would seem) from prey to predators in record evolutionary time.

By around 50,000 years ago (50 kya), our ancestors had become so good at hunting large game that they had begun to massively deplete megafauna, prompting the need to diversify their foraging strategies and build better tools, which led to more depletion. This natural experiment was replicated ad nauseam: the arrival of early humans was followed almost immediately by the loss of megafauna in each corner of the globe we colonized. Thus, large mammals began disappearing in Africa and Europe over 50 kya and in Australia around 45 kya. Similar events began to occur in Japan ∼30 kya; North America ∼15 kya; South America ∼13 kya; the Caribbean ∼6 kya; the Pacific islands ∼3 kya; Madagascar ∼2 kya; and New Zealand ∼700 ya. By 12,000 years ago, plant domestication had begun. By 9,000 years ago, humans were living in the first cities, and the last of the ice age mammals were all but extinct.

How Predatory Is Our Hunter Psychology?

If we’re such good hunters, you’d expect our species to enjoy the advantages of a predator psychology, and to be biased toward the detection of prey. So why don’t our children, like wolf cubs, like to be it? Surely, our intrinsic dispositions have evolved to support biases towards talents that benefit us. Think of our fascination with fire, or the ease with which we can turn anything into a useful tool. But, again, there are caveats. Haidt and Lukianoff mention guns and cigarettes: two historically novel hazards we are spectacularly bad at reasoning about. The link between pulled triggers, bullets and dead bodies, after all, seems causally straightforward. Or take the obvious danger of moving cars, which kill over 1.2 million people each year. As any parent can attest, it takes immense effort, faith, and a mind-bogglingly long time to teach children how to cross a street safely. Wouldn’t the kind of basic intuitive understanding of ballistics and the properties of solids that even infants innately possess be enough to figure out car danger on our own? It seems, rather, that the threat of cars (like the one posed by guns and cigarettes) is too evolutionarily novel to be registered as an intuitive aversion. We reserve such aversions for rats, spiders—or even entirely imaginary predators such as ghosts, bogeymen and monsters under our beds.

Crowdsourcing Decisions

When learning to cross a busy street, children typically outsource their gaze and movement to the adult or older child walking in front of them, while remaining completely blind to everything else (including moving cars!) in their surroundings. Try this with adults at a busy crosswalk, at your own moral peril. All it takes is one daring jaywalker for a crowd of headphone-clad pedestrians to follow, their eyes fixed on the maverick in front of them, but almost never on passing cars. These examples are telling. They help explain why we are such good hunters and how we are, fundamentally, pack creatures. Rather than predatory cognitive dispositions, it is our instinct for threat detection, along with our drive for cultural learning, tool use, cooperation, material culture and for outsourcing knowledge, skills and decisions to others that make us so smart and powerful.

Threat Detection and Purity-Seeking in Crowd Psychology

In many ways, our evolved psychology is inefficient at updating our intuitions and beliefs to reflect novel features of our self-designed environments: the task is a little like trying to crunch Big Data with a 1980s Atari processor. Human minds are designed for social use. They are optimally adapted to outsource information and behavior to other people at minimal energy costs. These same social minds remain slow and inefficient, however, at reasoning about the complex social processes in which they are embedded. Thus, we tend to treat social groups using the tools our minds know best: by assuming they have minds like ours and friendly or hostile intentions towards us. These biases are at play when we routinely assume that such abstractions as the university, the media, the government, or the French ‘want’ to control us. Note how easily we ascribe mental states and predator–prey psychology to such processes. When our threat-detection modalities become hyperactive, these assumptions translate into delusions and psychoses, in which people typically perceive a world inhabited by maleficent agents with dark intentions towards them. Conspiracy theories often go viral precisely for this reason: because they are fully intuitive.

The stories, myths, archetypes and narratives we devise to make sense of the world reflect these evolved intuitions: they are full of dark forces, usually personified as ogres, demons, monsters, witches and predators of all kinds. These predator-detecting narratives also piggyback on the evolutionarily older drive towards disgust, which motivates the avoidance of pollution and infection. Thus, good social forces are typically personified as clean and pure. At the level of myth and ritual, the clean is usually elevated as sacred (and must therefore not be desecrated or questioned), while the bad and polluted becomes taboo (and must therefore not be touched or defended). This old poison detection mechanism is routinely applied to intergroup conflict and prejudice, in which members of the perceived enemy group are described as predatory or toxic. It also influences folk beliefs about maleficent control and pollution (e.g. the witches poison our wells, the media poison our minds, they abduct our daughters, they control the finances, they want to destroy us, etc.)

Once beliefs about what (and who) is sacred and taboo are fully installed in our cultural systems, any violation of the ritual codes that mark the good from the bad will typically trigger automatic (often violent) reactions, which completely bypass reflective thinking. Consider, for example, how most modern Americans would react to the desecration of their flag (sacred), or an argument defending the sexual identity of pedophiles (taboo). Once the collective gaze is turned toward something, it becomes cognitively and socially difficult to look the other way (like the child looking at her parent, but not at the road, when crossing a busy street).

Student union poster for a social justice event. ‘Sexual violence’ is represented as a dark force personified as a predator. Source: Concordia Student Union

Natural-Born Paranoids: The Negativity Bias and False Positives

Our old prey psychology is poorly adapted to modern environments. Surely, however, these mechanisms must still confer fitness advantages. Pessimism and paranoia are indeed advantageous from an evolutionary standpoint. Our minds and cultural beliefs alike tend to be biased toward the negative—ideas and events containing information about threats and dangers will be more noticeable and easier to remember. Culturally, this means there will always be a greater consensus about what is considered taboo than about what is deemed sacred. The flag example is telling. For some readers, the notion that a flag is only valuable as a relatively arbitrary symbolic convention will seem obvious. That pedophiles are not the incarnation of evil may be a more difficult notion to entertain for most of you.

Exercising vigilance also carries adaptive value. Rather than assume safety and risk disappointment or death, we are better off working from the hypothesis that something or someone poses a threat, until proven otherwise. This translates into a cognitive bias toward false positives: even if we are wrong part of the time about what constitutes a threat, we are better off being vigilant. Cultural beliefs will always converge toward a negativity bias, while cultural practices (such as call out culture) will be biased toward false positives.

Surely then, individual cognition and public consensus are sometimes—perhaps most often—correct when identifying threats. But consider again how, until very recently, the intuitive disgust we reserve for pedophiles was assigned to such groups as Jews, blacks, gypsies, untouchables, the poor, women outside the home, Kurds, the Irish, Italians, redheads and even albinos and blacksmiths. That such beliefs are still alive and widespread in many parts of the world—meaning that, as automatic false positives, they are fully intuitive and go unquestioned by otherwise normal people—should give us pause.

What happens, then, when intuitions increasingly converge toward the idea that fully one half of the human species—namely males—are predators? Before we can answer this question, we must examine the most counterintuitive aspect of this puzzle. Social justice movements that target oppressors usually pick on losers, not winners—in other words, they are typically directed towards the powerless. We also need to understand why humans can be so bad at noticing—or so unconcerned by—injustice.

Humans Care About Fairness, Not Equality  

The need to achieve equality of outcomes and opportunities for all is an important modern value. In addition to being historically novel, however, the idea of equality remains highly confusing and counterintuitive for most of us. Psychological science has shown that what humans universally care about is fairness, as defined by relatively arbitrary social conventions. In any given context, people can intuit the rules and social norms (explicit and implicit) that dictate how one ought to behave, and how one should be treated. People tend to get upset and register unfairness when these rules are broken. Experiments in developmental psychology have shown that children can quickly learn arbitrary conventions and express outrage at those who don’t play along fairly, and tend to express more anger when children they associate with their in-group violate those rules. This rule of thumb (our tendency to hold people from our closest in-groups to higher standards, and our related tendency to lash out at them more than at strangers) explains, for example, why feminists like Sonia Sodha feel the need to call out their brothers, fathers, sons and husbands. It also explains why, contrary to another myth, the people we dehumanize the most are not distant strangers, but those closest to us. When changing social conditions bring people into competition for prestige, societies fragment and polarize, as people work together to eliminate perceived threats to their group identities.

System Justification

Group tensions usually begin from within, until new classes of pariahs eventually come to be perceived as a predatory out-group. This tends to happen under predictable historical conditions. When recognizable social rules are played well, what may be registered as unfair at one historical moment can go unnoticed for a very long time in another.

When everyone plays along, a psychological bias for system justification will usually prompt people not to question things around them. In addition, the human thirst for status and prestige further solidifies people’s admiration of those above them in the social hierarchies, even when it is impossible to join their ranks. This may be why systems like slavery and feudalism, which seem very unjust by modern standards, could endure for centuries without a hint of social unrest. From a cultural evolution perspective, our notion of equality simply reflects the normative terms of our current social game, which is defined by the idea that we all ought to have the same opportunities and do the same things. It is in this context that, for example, differences in social roles are likely to be registered as unfair and trigger outrage.

In the Search For Justice, We Usually Pick On Losers

To understand this strange rule of thumb, we should consider an observation made by Alexis de Tocqueville, the nineteenth-century French diplomat and political scientist famous for his studies of democracy in America and the changing social factors leading to the French Revolution. De Tocqueville points out that, at the time of the revolution in 1789, the living conditions of the poor were actually much better than those of past centuries (when they hadn’t seemed aware of their oppression), while the legal power of the aristocracy had already significantly diminished. Many factors—from the economic rise of the bourgeois commercial class, to the rapid spread of ideas stemming from Enlightenment philosophy made possible by the printing press—had contributed to the first slow, then rapid decline of the royalty. By the 1780s, the legal privileges and political influence of the nobility had largely vanished, while previously unimaginable dreams of social mobility had slowly been installed in the popular imagination through the example of the bourgeoisie.

One common interpretation of this rising hatred toward a class of royals in decline points to the nobility’s persisting wealth in the absence of a legal framework to provide a justification for its existence. People, recall, are fond of simple stories that justify the order of things. Add to that novel cultural expectations about what constitutes fairness, and conditions become ripe for pogrom-style mobilization of the masses against the perceived enemy (if a class of petty shop owners could achieve status and wealth, why, after all, couldn’t everyone else?).

In addition, rising expectations of social mobility, inspired by the example of the bourgeoisie, also brought the masses into perceived competition with the elite. As another evolutionary rule, we tend to compete the most with people we perceive to be our near equals. It is usually those we perceive to be just above (those whose status is within reach) or just below (those who may catch up to us) who are registered as a threat. Unlike the admiration we reserve for elites, or the care or pity we reserve for our inferiors, we tend to be most vigilant at noticing (or constructing) our near equals’ flaws. In this context of novel competitive pressure, former superiors lose their aura of prestige and become the objects of mass bullying. Here’s another rule: humans love prestige, and they hate losers. That’s why we compete, first and foremost, for the symbolic resources of social status: admiration, pride and a justification for accessing other resources.

Beyond the erosion of a recognizable (and hence justifiable) social role for the aristocracy, it is the erosion of the prestige formerly associated with that class that made the final blow inevitable. A once glamorous group of people, in other words, had become ridiculous. The human thirst for prestige and excellence (expressed in our search for role models who excel at playing by the rules of our social games) makes us efficient and cruel at picking on losers: those uncool individuals who are obviously unskilled at playing the game.

At the time of the French Revolution, the rise of comedic political theater, depicting members of the noble class through grotesque puppets, is a testament to this human drive to bully losers. For an archetypical character to be deemed intrinsically funny, it must embody a grotesque violation of the social codes that govern folk ideas of goodness, prestige and virtue.

One signals one’s social virtue by calling out, pointing at and ridiculing the taboo person.

The Seth Rogen Effect: How Men Became the New Losers

The example of the French Revolution can help shed light on the changing historical conditions that gave rise to the #MeToo movement. Over the past century, living conditions and opportunities for women have significantly improved. In Western democracies, men’s legal privileges have been entirely eradicated, while positive discrimination practices (in the form of specialized scholarships, hiring policies, the training of educators, gender studies programs, the moral obligation to embrace feminism, etc.) benefitting girls and women have become the norm. In popular culture, the virtues once assigned to traditional markers of masculinity like strength, endurance, dignity, protection and selflessness have slowly eroded, giving rise to largely absent, or at best confusing, models of culturally admirable social roles that men can embody. In her book Manning Up (2011), Kay Hymowitz plots the rise of a new archetype of man the loser in American TV and movies of the 1990s and 2000s: a trope she argues has come to offer one of the most prominent role models for Gen X and millennial (now Gen Z) boys. Homer Simpson, for example, embodies the stereotype of the goofy, impulsive, unsophisticated, accident-prone idiot, incapable of functioning without the wisdom of his wife Marge. In contrast with her brother Bart, who is equally impulsive and troublesome, Lisa is the picture of genius, talent and virtue. By the late 1990s, a new archetype of the man-child, can-never-get-it-right goofball was fully installed in our culture, and was being broadcast in films and series starring such actors as Ben Stiller, Adam Sandler, Seth Rogen, John C. Reilly and Will Ferrell. We might term this the Seth Rogen effect. In this modern myth, the masculine loser archetype usually comes to show or develop redeeming qualities, but typically needs the wisdom of a woman to help sort him out.

This myth, to be sure, is an old and transcultural story. Take the Lienü Zhuan, for example: history’s oldest surviving manual concerned with the education of women. Compiled toward the end of the Han dynasty (at the beginning of the first century CE), the book tells of the importance of women’s wise interventions when sons, fathers, husbands, or rulers strayed from the path of virtue. That men are on average less empathetic (less skilled at understanding other people’s needs and intentions), more impulsive and more self-destructive than women has been amply documented by psychological science. From an evolutionary perspective, socialization (often seen as the culprit of sex differences in feminist discourse) enables us to cultivate the roles that best harness these complementarities, allowing us to raise well-adjusted children. This is why agreeing on the kinds of roles that work best with our evolved dispositions is of crucial importance, and why most pre-feminist cultures usually cultivated gender-specific modes of pride and prestige associated with the specific strengths of each sex.

Changing social conditions in which a class endowed with prestige comes to lose its status tend to occur in the context of new competitive pressures. The massive entry of women into the workforce has brought them into competition with men in a niche that they (men) formerly occupied almost exclusively. This was bound to cause some damage. Evolutionarily, seeking status, social networks, and resources outside the domestic unit through something that gradually came to resemble work and politics has been a strategy that allowed men to attract mates and signal their potential as good caregivers. That women evolved to be uniquely sensitive and attracted to men’s social status (as a proxy of their capacity to invest in their families) is one of psychology’s most well documented effects. When competitive pressures harnessed these complementarities, men competed among themselves for status and women. That’s why, in evolutionary anthropologist Sarah Hrdy’s words, we may think of men as “one long breeding experiment run by women”!

Consider now a context in which men are increasingly devalued as providers, actively discouraged from seeking status, ridiculed and pathologized for being masculine—all while facing fierce competition from women candidates, who are better trained and more highly valued than they are, and for jobs whose hiring committees openly discriminate against men. That most people fail to register this as a recipe for disaster is a testament to the blindness of our mob psychology, which too readily makes us pick on losers.

The point here is not that humans would be better off if women stayed at home. Women have clearly made extraordinary contributions to science, medicine, the arts, literature, politics—and just about every meaningful sphere of human activity. Men also continue to occupy certain key positions, particularly in politics, some branches of science and at the managerial level in corporations. That this is perceived as an injustice (while men’s nearly exclusive representation in backbreaking blue-collar jobs and workplace-related deaths isn’t) rather than a reflection of averages in the sexually dimorphic distribution of interests is an unfortunate testament to the confusing goals of our culture. A different kind of feminist perspective might celebrate the underrepresentation of women as CEOs, when, on average, such cut-throat roles clearly demand psychopathic levels of callousness, disregard for the needs of others and time commitments that are entirely incompatible with a quality social and family life.

The Many Sides of #MeToo: False Positives Apply to All Sides of a Debate

We have seen that our evolved predator-obsessed crowd psychology makes us uniquely prone to mass hatred and violence toward disgraced losers when cultural bearings shift too quickly. Rather than conclude that our species is doomed to mass bullying and stupidity, however, we should remember that not all acts of threat-detection are false positives. Undoing the feudal system certainly improved the lives of millions, as did the human and women’s rights movements.

Deciphering the kinds of facts that are being reported on by #MeToo proponents is a complex endeavor. Four hypotheses seem equally plausible: 1) acts of sexual violence have always been around, and victims were previously silenced, but these acts can now be reported and dealt with; 2) cultural and individual expectations as to what is considered abusive have shifted, resulting in a culture of fragility, hypersensitivity and hypervigilance; 3) some men exhibit more out-of-control behavior now that they lack clear models on how to be good men; 4) #MeToo has turned into an irrational witch-hunt targeted at a class of people which is the object of discrimination.

Ockham’s razor demands that the simplest, most parsimonious explanation be employed to make sense of any matter being investigated. But, while this works well in the lab, social science—which examines billions of interrelated events and rarely produces any consensus as to how people interpret their individual experiences—is exceedingly complex, and cannot be subject to such a simple rule. The truth about #MeToo likely entails, to varying degrees, all four of the perspectives offered above, and countless others.

The rules of thumb I have described in this article describe general patterns of cultural change in which disgraced social groups come to be treated—and sometimes even eliminated—as predators. But people interpret such events in differing ways. On average, my students find this rule of thumb toolkit intellectually and morally empowering—as long as they can apply it exclusively to the stupidity of the other group. When it comes to #MeToo or other moral projects that are important to them, many of my students feel compelled to tell me that the rules don’t apply—because, as they see it, predatory, racist, and sexist behavior is the norm in our current culture.

My students agree when I remind them that most acts of mass violence, like genocide or mass incarceration, are not motivated by cruelty, but by a desire to do something the perpetrators perceive as morally right, within their cultural framework and historical moment. Together, we can also agree that wanting to do the right thing does not always lead to violence. Finally, my students often remind me that the prey psychology rule of thumb applies equally to the anti-#MeToo camp. Indeed, from the perspective of men who feel attacked by #MeToo, assuming the existence of a unified feminist predator that actively seeks to destroy them seems intuitive. However, it is a grave logical error that can also—as the incel movement has shown—lead to mass violence. As a student once put it, “it’s not like we women are all walking around with #MeToo signs telling each other how much we hate men!”

Check Your False Positives 

The assumption that all men are predators, or that all women hate men, is simply wrong. In epidemiological terms, it assumes a prevalence rate of 100% for the incidence of a pathology. Pathology by definition points to an exception, a glitch in the normal functioning of a system. To help us become more vigilant about the risk of false positives in determining whether something poses a threat, we can apply another rule of thumb: a counterintuitive, but highly logical rule. If the base rate is lower than the false positive rate, our intuitions are likely to be wrong. It takes mental effort, but very simple maths, to wrap our heads around this rule. Paul Bloom provides a good example: suppose the test for a disease (say, a rare kind of encephalitis) has a 5% false positive rate, and you test positive—should you be worried? Most people find a 95% accuracy rate chillingly precise. But the relevant information here is the base rate. If the rare disease has a prevalence rate of 1 in 1000, then, out of 1000 people, only one is statistically likely to carry the disease, while about 50 will test positive—meaning that a positive result indicates actual disease only about 2% of the time!
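The arithmetic behind Bloom’s example can be sketched in a few lines of Python. The numbers (1-in-1000 prevalence, 5% false positive rate) come from the example above; the assumption that the test catches every true case (perfect sensitivity) is a simplification of mine, not something the example specifies:

```python
def positive_predictive_value(prevalence, false_positive_rate, sensitivity=1.0):
    """Probability of actually having the disease, given a positive test."""
    # Out of the whole population: true positives are sick people the test catches,
    # false positives are healthy people the test flags anyway.
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Bloom's encephalitis example: 1-in-1000 base rate, 5% false positive rate.
ppv = positive_predictive_value(prevalence=0.001, false_positive_rate=0.05)
print(f"P(disease | positive test) = {ppv:.1%}")  # about 2%
```

The intuition-breaking part is visible in the two numerators: one true positive per thousand people is swamped by roughly fifty false alarms, so a “95% accurate” test is wrong about nineteen times out of twenty when it says you are sick.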

Determining the relevant base rates for any problem is always a difficult task, for which there is no rule of thumb. The rule, rather, should be to keep on questioning our intuitions. And we should remember that most other people are as confused by this question as we are.

The Rules of Responding to Witch Hunts

Rather than seek comfort in parochial identities, which confirm our fears (men who think women hate them; women who fear that men are all predators) we should take our cue from a final rule of thumb, drawn from crowd psychology. Research on witch-hunt style campaigns to eliminate polluted elements from society has shown that most people are never completely gullible. As the accusations, trials and mass punishments spread, most people never quite believe the simplistic charges laid against the perceived enemy. But, depending on how the punitive rules are enforced, they are simply too scared to speak in defense of the wrongly accused.

When the enforcers of the witch hunt come from a small class with control over militarized bodies that routinely harass the masses (as in Pinochet’s Chile, for example), people tend to voice their dissent among themselves and protect one another. When systems of enforcement include informants and sincere ideologues from every rank of private life (as in Soviet-ruled East Germany, for example), fear becomes the rule, as even our closest friends and relatives could potentially denounce us. These are the systems of violent moral policing that Hannah Arendt called terror. In a system of terror, Arendt reminds us, not even the executioners are sheltered from fear.

Conclusion: It’s Not Easy to Cross the Street On Your Own

I am reminded of Hannah Arendt’s definition of terror each time feminist journalists urge women to police their brothers, husbands and sons, or each time lost young men declare that women are their enemies.

More than a critique of the #MeToo movement as an isolated event, this essay is an invitation to moderation for all those who feel convinced that they are on the right side of history and have identified the enemy. As you feel the magic pull of justice, in the righteous fight against predators, remember that resisting mass bullying and violence takes a lot of impulse control, reflection and courage: it’s just as hard as being a four-year-old trying to cross a busy highway on her own.
