Plato’s Socratic dialogues—in which Socrates questions and argues with a variety of interlocutors who claim to know the nature of such things as justice and virtue—are known for their exhortation to follow soundly crafted arguments wherever they lead. Parties to a disagreement should try to agree on certain starting points and then take seriously whatever conclusions they logically lead to, rejecting them only when they arise from faulty premises or reasoning. According to Socrates, the disputants should have a shared desire to discover the truth rather than to win the argument. This Socratic ideal is an indispensable ingredient of a culture of reason—a culture to which people need to be introduced early in life and which is woefully lacking in many current disputes and even academic disciplines.
This lack is revealed when parties to a disagreement notice that the argument they are following is leading somewhere terrible, somewhere we should not even consider going. What if it leads us to condone slavery or torture? Indeed, what if it casts doubt on permissions that most liberal-minded people take for granted, such as the moral permissibility of abortion? In those cases, it might seem, the last thing we should do is follow the argument. We should follow something else: gut feeling, law, social consensus, religious teaching, or whatever else can save us from having bad ideas.
Two recent and disturbing attempts to discredit or even suppress scholarly research highlight how unpopular disinterested enquiry can be, when its outcomes threaten sacred beliefs.
The first concerns an article by Brown University Assistant Professor Lisa Littman on gender dysphoria, published in the journal PLOS ONE. As Jeffrey S. Flier explains, Littman argues that more research is needed on the growing phenomenon of teenage girls who, encouraged by social media communities, report late-onset gender dysphoria. Brown University replaced its promotional reference to Littman’s article with a warning that her work might harm members of the transgender community. The journal in which the article appeared promised to undertake additional expert review. However, neither Brown nor PLOS ONE provided any evidence of academic misconduct, which would normally be the grounds for such a review.
The second case concerns the research of Theodore Hill and Sergei Tabachnikov on sex differences in the variability of certain cognitive and personality traits. The paper was accepted for publication in the Mathematical Intelligencer, but Tabachnikov—a professor at Penn State—came under pressure from the Women in Mathematics chapter of the Penn State Mathematics Department to consider the danger the paper would pose to the cause of gender equality, and he was urged by some colleagues to withdraw his name from it. Soon afterwards, the Mathematical Intelligencer rescinded its acceptance of the paper, citing fear of its likely reception. The paper was then accepted by the New York Journal of Mathematics but—astonishingly—disappeared from the volume soon after publication.
These recent cases will disturb those who innocently believe in sincere, truth-focused free enquiry. When the conclusion of such enquiry seems unacceptable, the Socratic method prescribes a dialectical process of question and answer, to determine whether it is as bad as it appears. This involves the close examination of premises and arguments. Automatic suppression of a conclusion is alien to this method.
These cases are chilling examples of what can happen in highly educated circles. In the wider public arena, things are more blatantly nasty. Although many commentators do their best to be Socratic, a significant proportion of debates, especially on social media, are a maelstrom of anger, incoherence, irrelevance, willful ignorance, distortion, exaggeration, false dichotomies and uncharitable attributions of nefarious beliefs and motives.
Fast Ideological Reflexes
To understand what is happening here, I suggest we borrow a central idea from Daniel Kahneman’s celebrated book Thinking, Fast and Slow. Kahneman argues that we have two systems of thought: a fast system (System 1), which evolved to deal with everyday situations that call for immediate action, and a slow system (System 2), for more complex problems. System 2 is exemplified by careful, painstaking reasoning, which is open to counterintuitive conclusions. We all need to use both systems, and we can leap to false judgments if we instinctively use fast thinking when we really need the slow variety.
For example, suppose you see a public health poster warning that if you are a non-smoker who lives with a smoker (i.e. if all your ingestion of tobacco smoke is secondhand) the risk you face of developing a specific serious disease is increased by 25%. Your System 1 sets off alarm bells—you have a one in four chance of getting this disease! But if your System 2 is functioning properly, it will tell you that you need to know what the 25% is 25% of—i.e. you need to know the original risk. If the original risk is one in a thousand, then adding 25% of that risk to the original gives you an absolute risk of 1.25 in a thousand, which does not seem like a good reason to panic.
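To lay the arithmetic out explicitly, here is a sketch of the calculation using the hypothetical figures above (a baseline risk of one in a thousand and a 25% relative increase):

\[
\text{absolute risk} = \text{baseline risk} \times (1 + \text{relative increase}) = \frac{1}{1000} \times 1.25 = \frac{1.25}{1000} = 0.125\%
\]

The 25% is a relative increase: it tells you nothing about your absolute risk until you know the baseline it is a percentage of.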
The fast System 1 is good at detecting immediate threats to our physical safety, and has presumably done so for thousands of generations, thus facilitating survival and reproduction. For example, most of us are good at detecting physical threats from strangers. We notice—perhaps subconsciously—something about the stranger’s gait, facial structure or expression that warns us to stay away. Unfortunately, this intuitive ability to detect danger sometimes produces false positives: we may treat someone as a threat when he’s actually just a bit weird. But, from the point of view of safety, it is better to tend towards false positives than false negatives. Failure to recognize that someone really is a threat can mean that you are not long for this world.
I suspect that, in a similar way, certain features of an ideological opponent—the language he uses, the people he quotes, the examples he chooses, his sex or social background—often trigger the threat! (or enemy!) response, together with an intellectual fight-or-flight reflex. It doesn’t matter how good we are at System 2 thinking when we choose to use it: when we encounter such a person, certain triggers tell our System 1 that there is no need to switch to System 2 at all. If this is correct, it helps us to see why high intelligence can so easily co-exist with deep irrationality and intolerance.
For example, debates about politics and religion can engage highly sophisticated reasoners, but those reasoners often start from assumptions that they never reconsider. Hence, when someone challenges these assumptions (or seems to), they immediately detect a threat to their values, their in-group or their very identity. Such people are often very good at identifying questionable assumptions in the arguments of those they disagree with. But they are emotionally averse to questioning their own assumptions and hence do not apply System 2 to them.
Obviously, there are times when sincere people do succeed in curbing the impulses of System 1. Even then, during a real engagement between two intelligent, truth-seeking individuals who have a strong, emotionally based disagreement about something of practical importance, the argument, however polite it is at the outset and however sincerely the parties try to stick to the point, often quickly becomes convoluted and nitpicking, and eventually requires exhausting intellectual effort to conduct or follow.
Suppose someone makes a contentious claim. Another person objects, asking for clarification of an abstract point, requesting that a key notion be disambiguated or pointing out that the claim does not follow from the first speaker’s premises. The first speaker responds that the objector has misunderstood her premises, or has wrongly attributed a complex fallacy to the way she arrived at them. The objector then says that he has not misunderstood the premises, but concedes that it was fair to think he might have done, and proceeds to elaborate three ways in which such a misunderstanding might have arisen (and subdivides the third way into three more apparently similar but actually crucially different interpretations). You, the listener, are probably confused and bored by this point. This is the stuff of specialist journals, in which highly analytical and truth-focused individuals indulge their penchant for ultra-careful System 2 thinking, and which are read by about seven enthusiasts.
If all goes well and the disputants are disciplined, some conclusion might be agreed upon. But eventually, even among highly focused people, emotions become hard to suppress, and the debate may turn sarcastic and snarky. Finely tuned arguments can quickly degenerate into displays of cognitive superiority. We should never underestimate exceptionally intelligent people’s capacity for sheer rage. Their emotional control can quickly evaporate, even as their mental focus persists. The result can be seemingly intellectual but actually emotionally rooted feuds that last for many years.
A Culture of Reason
The sad lesson of this is that there is no quick fix for the increasingly acrimonious and polarized state of public debate, especially on such inflammatory topics as gender or racial politics. The current situation is not entirely bad, though. There are careful thinkers who are advancing the debate through Socratic reasoning, whether by challenging received ideas or by upholding them. Such people are gaining followers, and their detractors’ poorly evidenced ad hominem attacks only serve to highlight their effectiveness. A good example is Christina Hoff Sommers, who attracts a good deal of prejudice and vitriol (see, for instance, this tendentious RationalWiki page), but little convincing refutation. If there is a remedy for the current fury, it will come from reasonable people playing a long game. The temptation to counter polemics with polemics, and ad hominem attacks with retaliatory ad hominem attacks, should be resisted. We need a long-term strategy to uphold a culture of reason, starting in schools and universities: a strategy consisting not in shrill assertions of the value of reason but in demonstrations of it.
The success of science came about in just this way: science, when conducted properly, is self-correcting and is eventually seen to be so. The same is true of reason, which includes the ability to spot factual inaccuracies, partial truths, logical blunders and informal errors such as ad hominem vitriol and the substitution of anger for argument. Fortunately, many people eventually see through these things, and false or grossly exaggerated claims lose their attraction.
The culture of reason also has an ethical dimension. Truth matters, whether we welcome it or not. For everyday purposes, facts are real—whatever metaphysical theory (if any) we adopt about the nature of their reality. People often reject factual claims not on the basis of evidence, but because the claims supposedly betray a dangerous hidden agenda, or because their truth is ideologically unpalatable. On the social/political Left, this continues to plague the debate about the blank slate theory of human nature, which many people endorse because they think it is racist or sexist to reject it (a stance powerfully challenged in Steven Pinker’s The Blank Slate). On the Right, influential people continue to deny the reality of man-made climate change, the social consequences of extreme wealth inequality and even the theory of evolution by natural selection.
It is tempting to respond to these things with sarcastic one-liners and insults. But, although there is a modest place for mockery, it is no substitute for patience, factuality and a willingness to regard antagonists as capable of reason, rather than as stupid or wicked. If anything has emerged from our post-millennium culture wars, it is that religion, ethics and politics have a powerfully tribal dimension. This is probably inevitable, since tribalism serves useful purposes as well as bad ones. But we must hope that the slow, often boring thinking processes of System 2 can aid the spread of understanding. Academic research is becoming increasingly agenda-led, rather than genuinely truth-seeking, and many intellectual mediocrities have been advancing their careers by pursuing those agendas. The remedy is to teach young people how to think, rather than what to think, and to allow them slowly to see for themselves that this approach pays off.