Desire to know why, and how, curiosity; such as is in no living creature but man: so that man is distinguished, not only by his reason, but also by this singular passion from other animals; in whom the appetite of food, and other pleasures of sense, by predominance, take away the care of knowing causes; which is a lust of the mind, that by a perseverance of delight in the continual and indefatigable generation of knowledge, exceedeth the short vehemence of any carnal pleasure.

—Thomas Hobbes, Leviathan
In his 1972 book, Albert Einstein: Creator and Rebel, mathematician and physicist Banesh Hoffmann relates that Einstein attributed his success not to any special talent or to his intellect, but to curiosity. One of the greatest physicists, and greatest minds, ever to live considered himself a testament to the benefits of being curious.
Einstein’s praise of curiosity was unusual at the time, as it is only quite recently that psychologists have tried to map out the science behind our curious minds—although philosophers have tackled the subject for centuries. This is unsurprising. As astrophysicist Mario Livio points out in an interview about his book on curiosity:
We all know about the Middle Ages, the medieval times when curiosity was almost taken out of existence. It was mostly the church that wanted to convey to the masses the feeling that everything worth knowing is already known. They built walls around all types of knowledge and really oppressed curiosity in this way.
From before the Middle Ages until the early seventeenth century, curiosity was primarily seen as an “intellectual vice,” as Peter Harrison puts it. It was generally characterized, Harrison explains, as a “form of intemperance” and dismissed as “useless” by certain philosophers, while the Church fathers associated the trait with Adam and Eve’s sin in eating from the forbidden tree of the knowledge of good and evil. Curiosity was, “along with pride and disobedience, thus implicated in the first sin and in the subsequent fall of the whole human race.”
This dramatic interpretation of this common human faculty led to its widespread condemnation. It was to take centuries for attitudes to change.
Once they did, however, scientists—especially psychologists—were free to become curious about curiosity. The father of American psychology and pragmatism, William James, refers to curiosity as “the impulse toward better cognition” in his 1899 book, Talks to Teachers on Psychology:
Novelties in the way of sensible objects, especially if their sensational quality is bright, vivid, startling, invariably arrest the attention of the young and hold it until the desire to know more about the object is assuaged. In its higher, more intellectual form, the impulse toward completer knowledge takes the character of scientific or philosophic curiosity. In both its sensational and its intellectual form the instinct is more vivacious during childhood and youth than in after life.
There are many forms of curiosity, but I will focus here on the intellectual kind, which James also called “theoretic curiosity.” Academia, research, science and all adjacent branches of knowledge are built upon and driven by epistemic curiosity: the desire to acquire knowledge, a concept first proposed by Daniel Berlyne in the mid-twentieth century.
George Loewenstein built on James’ notion of incomplete knowledge by developing information-gap theory. Like James, Loewenstein describes epistemic curiosity as a “cognitively induced deprivation that arises from the perception of a gap in knowledge and understanding.” Loewenstein claims that curiosity is an intrinsic method of seeking information, by which he means that some people are naturally driven to fill their information gaps.
Lately, our epistemic curiosity is under threat within academia and, increasingly, beyond it. Many journals are unwilling to subject themselves to the intense academic scrutiny that characterizes the scientific method. One story of curiosity quashed can be found in Quillette’s recent collection Panics and Persecutions: 20 Tales of Excommunication in the Digital Age.
In “A Mathematics Paper Goes Down the Memory Hole,” Theodore P. Hill relates that, in 2016, he and a partner proposed a mathematical hypothesis concerning the applicability of the greater male variability hypothesis (GMVH) to human intelligence. The GMVH essentially asserts that, as Stuart Reges puts it, “although men and women may have the same average ability in many areas, men tend to have a higher variance, leading to more outliers at the extremes (the tails of the distribution).” Applying this to human intelligence would mean that most women have close to average intelligence, while more men cluster at the extremes (the average intelligence of both sexes remains the same). These distributions do not, of course, allow us to infer anyone’s individual intelligence (e.g. her IQ): we’re talking about average differences here.
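The statistical claim here is easy to misread, so a small simulation may help: it is a hypothetical sketch, not Hill’s model, using made-up standard deviations (14 vs. 16) on an IQ-like scale purely for illustration. Two normal distributions share the same mean, yet the one with slightly higher variance is noticeably over-represented in both tails.

```python
import random

random.seed(0)
N = 100_000
MEAN = 100                 # identical average for both groups (IQ-like scale)
SD_LOW, SD_HIGH = 14, 16   # hypothetical spreads: one group slightly more variable

low_var = [random.gauss(MEAN, SD_LOW) for _ in range(N)]
high_var = [random.gauss(MEAN, SD_HIGH) for _ in range(N)]

def outlier_share(sample, lo=70, hi=130):
    """Fraction of the sample outside [lo, hi], i.e. in either tail."""
    return sum(x < lo or x > hi for x in sample) / len(sample)

# The sample means are nearly identical...
print(sum(low_var) / N, sum(high_var) / N)
# ...but the higher-variance group supplies more outliers at both extremes.
print(outlier_share(low_var), outlier_share(high_var))
```

A small difference in spread produces a disproportionate difference at the extremes, which is why the hypothesis concerns the tails rather than the averages.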
Although Hill’s paper initially enjoyed a positive reception, his argument soon attracted fierce resistance, ranging from normal criticism by fellow mathematicians to the astonishing attempt to make his paper completely disappear. These were the unfortunate results of his epistemic curiosity. Hill had an information gap that had to be satisfied: “Darwin had also raised the question of why males in many species might have evolved to be more variable than females, and when I learned that the answer to his question remained elusive, I set out to look for a scientific explanation.”
By censoring or rejecting controversial arguments like Hill’s, we essentially suggest that epistemic curiosity has no place in academia—an absurd and dangerous precedent likely to damage academia far more in the long run than any paper on greater male variability could ever do. Since epistemic curiosity is such a fundamental value within academia, major restrictions of this kind will be a gut punch to practically any research field. Hill’s story therefore paints a very bleak picture of academia.
Many of the current victims of their own curiosity within academia and elsewhere won’t receive a platform and don’t have the financial backing to withstand cancellation or punishment. Partly thanks to the exposure it received after Hill’s account was published in Quillette, his paper finally appeared many years after it was written. Not everyone has been so lucky. As the signatories of the 2020 Harper’s Letter put it, “The restriction of debate, whether by a repressive government or an intolerant society, invariably hurts those who lack power and makes everyone less capable of democratic participation.”
Hardly anyone would want to return to a pre-scientific age in which curiosity was seen as an intellectual vice and a religious sin. Curiosity is a fundamental pillar of the academic community and motivates research in most if not all fields of study. Inhibiting its use might give rise to epistemic anxiety: a bottled-up desire on the part of some to acquire or distribute certain kinds of knowledge, within an external environment (academia) that keeps imposing further limitations on the fulfilment of this desire.
As in the pre-scientific age, a modern-day religion is now exerting arbitrary restrictions on our information-seeking faculties. However, epistemic anxiety will have a greater effect on today’s curious individuals and on academia in general—not only because there are far more academics than there were four centuries ago, but because contemporary society views curiosity more favourably than past societies did. Present-day academics therefore have much more to lose when they are asked to limit their curiosity.
Rather than treating curiosity as a disorder to be suppressed by restricting its practice, as the Church fathers of the Middle Ages attempted to do, we have a duty in our current academic climate to let it flourish. Curiosity allows academia to make progress, and it is in everyone’s best interest to encourage it to continue to do so.