Aesthetic Denialism and the Rise of Acadumbia: Yale, Bloom, English and How It All Went South

The Awakening of a Passion

In the early 1990s, when I was in high school, I fell in love with literature and decided I wanted more. I attended a middle-class suburban New Jersey public school, and the English curriculum was good enough, though it had its share of lapses. At home, spurred on by a father who generally had better taste than the educational bureaucrats, I set out upon my own idiosyncratic jaunt through the canon. But what mattered more than anything else, more than any particular great book I read, was the enthusiasm of others: my father’s, my teachers’. Kids learn to love by imitating the loves of others in what the great literary theorist René Girard called “mediated desire.” Just as children bereft of secure attachment will often prove unable to exhibit attachment in their adult relationships, in order to develop into adults passionate about literature, culture, ideas, science—or, really, anything whatsoever—children need to see adults modeling such passions.

We often speak of a cycle of poverty, but there is another pernicious cycle: the cycle of uninterest. It kills the spirit as surely as the cycle of poverty kills the body. Adults without interests raise kids without interests. Those kids grow up to take on the deadening default interests foisted upon them by mass culture: the abode of trash TV, celebrity gossip, cable news, superhero franchise flicks, vulgar pop and hip hop and anything and everything remotely associated with the depraved beast known as Nicki Minaj. We see everywhere around us those who reside in this morass. They sit in trains, scanning their social media feeds or clicking away frantically at the colored pellets materializing in the array of rat mazes that do double duty as one of the many varieties of electric bars keeping them condemned to the glowing rodent cages they hold in their own hands. Their daily commute is but a part of a permanent state of transit during which they do nothing at all and think neither nothing nor something but the shadow thoughts in between.

It is possible for a child raised with or for nothing to accumulate a fortune, whether material or spiritual. But this takes extraordinary luck, an extraordinary soul or both. The Matthew effect sociologist Robert K. Merton described holds true in the realm of the passions just as it does in the economic realm Thomas Piketty has charted: both inherited advantage and inherited disadvantage compound and accumulate. A childhood of neglect and consignment to the cycle of uninterest is hard to overcome. And, more often than not, the passions awakened in childhood and adolescence must be dutifully nurtured into at least early adulthood if they are to persist lifelong.

Yale and Harold Bloom

Inspired by parents and teachers, I rode the passion I had borrowed and made my own all the way to Yale University, which, at that time, had what was widely considered America’s foremost English department. The sheer number of wonderful professors with whom I had the pleasure of studying exceeded all my expectations, but the unquestioned high points were the three classes I took—two on Shakespeare and one on twentieth-century poetry—and the senior thesis on Wallace Stevens I completed, with Harold Bloom, our most prominent and singular living literary theorist. Whatever his other accomplishments, what mattered to me as a wide-eyed undergraduate was only this: Bloom was a phenomenal teacher.

Not all of his students would agree with that assessment, I am sure. His seminars were, for all practical purposes, lectures. Occasionally, he would break into his own train of thought to ask a question, posed initially, it seemed, to himself, but then, as though he had momentarily awakened to the fact that there were students in the room, redirected outward in our general direction. Surprised or concerned to find a hand raised in response, he would allow the student a few halting words, doing nothing to conceal the look of bafflement, strain or anguish on his face, then shake his head in dismay—“I usually do not like to tell people they are absolutely wrong, but my dear … [head shaking]” is one memorable reaction I recall—or, in the case of a worthier contribution, interrupt politely, picking up on a word, phrase or thought and running away with it in a direction of his own.

Interactions between him and us were enjoyable largely on account of the substantial squirm factor: the unsettling medley of commiseration and schadenfreude one felt watching the spectacle of as-yet-incompletely-consolidated egos deflated and bruised time and again. Grades were a unique experience, too. There was no syllabus, no dutifully composed course outline, no written description of what it was we would have to do to consummate the journey in a manner recognizable to university administrators. We would be told each week what to read for next time. Then, midway through the semester, we would be met with something like the following: “I suppose I should speak about this dreadful business of the paper. Fifteen pages is probably too short, fifty is probably too long. It is due on [such and such a date].” And that was that. The last week of class, we would find the papers, graded, sitting in a stack in the middle of the seminar table. As you flipped through to find yours, you saw grades and comments right there on the cover page, and in marked contrast to the kind of grade inflation prevalent in the Ivy League, grades ran the gamut. I recall to this day how my heart skipped a beat when I saw a paper—not mine—assigned a D– with a single line of comments: “Poorly written. Poorly argued. This grade is a gift.”

With so much to rub a sensitive soul the wrong way—by today’s lights such a class would surely be deemed one uninterrupted, semester-long micro-aggression—Bloom had his share of detractors. I was not among them. Why not? What was there to like about a professor whose seminar was a two-hour-long monologue, whose interpersonal skills ran roughshod over every social nicety, who seemed entirely indifferent to the logistics of what it meant to conduct a college course and who might give you a near-failing grade with hardly any explanation? The answer should be clear by now: Bloom approached his subject with so much commitment and passion, so much intensity and deep seriousness, that, sitting there in that rarefied atmosphere, you had no choice but to catch the itch. You felt as though this subject matter you were studying, the greatest works of our literary tradition, was the most important, enthralling and exalted business on earth. And—when the mystical two hours in that room and the weeks and months of class and even many years and decades had passed—that feeling lingered, receding in the light of mundane days and bleary nights, but never straying from just beyond the threshold, a spirit world ever ready for the summoning. 

Aesthetic Denialism and the Radicalization of the Humanities

This fact is not, or should not be, controversial: we all believe some works of art are better than others. Over time, the aesthetic preferences of many individuals coalesce, with certain voices carrying greater weight due to their expertise in the art form at issue. A canon comes into focus. The aesthetic truths informing its composition are not timeless or objective or incontrovertible, and, at the margins, disagreements are possible. In what sense, then, can aesthetic truths be deemed true at all? They are true intersubjectively, that is, relative to the commonly shared capacities and interests of an able, relevant audience. Maintain that SpongeBob SquarePants is better than Shakespeare, and you will find yourself in the same position as someone who insists lemons are sweet.

To be sure, merit has never been the sole criterion manning the canonical watchtower. Sometimes, as Richard Ohmann has argued in The Shaping of a Canon: US Fiction 1960–1975, or as H. J. Jackson has argued with respect to Romantic poetry in her book, Those Who Write for Immortality, economic and social factors can help make reputations. Indeed, the very reason we acknowledge the force of such critiques is that we are capable of distinguishing such extraneous factors and agreeing that they should be considered illegitimate.

Here, then, is another common sense truth, a proposition so obvious that it is bizarre that I should even need to set it forth: the race, gender, religion, sexuality or physical ability or disability of its creator is not a legitimate component of an artwork’s quality. In fact, these are the very kinds of irrational social considerations that have sometimes regrettably distorted the picture of what has—and, more importantly, has not—been deemed canonical but which we, whenever we become cognizant of such errors, should try to discount.

But, in recent times, a strange inversion has taken hold of our thinking on the subject of canonicity. Instead of viewing such superficial aspects of the author’s identity as illegitimate variables the influence of which we should resist as far as possible, we have opened the floodgates in order to admit authors to the canon precisely on the basis of such considerations. Entire courses and majors have grown up around these superficial identitarian affiliations, and works of art—mistaken, perhaps, for democratic legislative bodies—have been lavished with praise because of their success in representing—in the sense of political rather than aesthetic representation—the experience of this or that subgroup. Moreover, the same people who advocate for this identity genre literature also make a habit of assailing the canon for failures of representation, as though the quality of works could be gauged through a demographic survey. This form of philosophically unsound willful self-blinding to the hard truths of aesthetic superiority—which I would call aesthetic denialism—has become as pandemic in many segments of academia and the left as climate change denialism is on the right and has done great harm to the reputation and status of the humanities.

Lending an air of gravitas to this aesthetic denialism, there has been a proliferation of various branches of continental theory—largely post-structuralist and Marxist—that espouse a generally critical attitude towards existing hierarchies—aesthetic hierarchies included—seeing in these either the reification of indefensible and arbitrary distinctions (post-structuralism) or the pernicious reflections of power (Marxism). To adapt the argument advanced by John Guillory in his Pierre Bourdieu-inspired landmark work, Cultural Capital, during the decades between the downfall of the hereditary aristocracy and the emergence of our modern-day techno-financial elite, university education, particularly classical humanities education, came to serve as a dividing line between an educated aristocracy and common rubes and plebs. In the era of high modernism and prestigious print journalism, university departments conferring such knowledge and its attendant degrees enjoyed substantial cultural cachet and, to dispense such cachet, needed to agree upon a more or less unitary body of learning—the canon—as a boundary between education and inadequacy. But, as the old literary aristocracy gave way to a new moneyed elite, which elbowed its way into the ranks of the upper crust through highly compensated tech and finance industry jobs and needed to know how to read and write nothing more sophisticated than an office memo or PowerPoint presentation, the high school composition curriculum became more than adequate to its needs. Traditional high culture and university humanities were rendered supererogatory, becoming the devalued province of effete and useless intellectuals. Stripped of their most obvious practical function—their role in what Marxist theorist Louis Althusser would have called the ideological state apparatus, the machinery serving to reproduce existing power relations—the humanities and the humanities professoriate drifted off unmoored into the great unknown. A unitary canon was no longer indispensable because the humanities themselves were no longer indispensable.

This led to two related developments. First, with the humanities no longer closely tethered to prevailing power structures, the stability and traditionalism that a close link to power demands fell away, and prominent humanities scholars with radically anti-establishment views were free to crawl out of the woodwork. Second, the new attitude of disrespect—and, increasingly, open scorn—that the techno-financial elite and much of the rest of society came to exhibit towards humanities academics led to a natural tit-for-tat. If you are disrespected, you are likely to seethe and lash out at your tormenters. You may, in fact, adopt your own posture of hauteur and disrespect, as if those people were hopelessly beneath you and will never understand you, since they are either too committed to the power relations in which they are embedded or else simpletons, blind to those power relations.

In this fertile ground for resentment, the attitude of critique took root, building on philosophical currents which first surfaced in the late nineteenth century and began to assume a simulacrum of their present-day form during the countercultural era of the 1960s. Bloom regularly fulminated against what he aptly termed these “schools of resentment,” which I have discussed in more detail elsewhere. The approach of these peddlers of critique—sometimes known as the hermeneutics of suspicion—was to question all established norms and power structures, including the hegemonic structures that had allegedly informed the composition of the canon. Thus, instead of looking up toward the works they studied, these new anti-humanist humanists looked at them askance and endeavored to expose and unravel their inner tensions and contradictions and the hierarchies that had produced those works and entrenched them as objects of veneration. If the old humanities had once offered access to the upper echelons of society, what the new anti-humanities marketed was an attitude of superiority towards the society that had scorned them.

As new generations of students reared under the tutelage of these scholars entered the workforce and academia, the cultural capital enjoyed by the posture of critique predictably increased. Canonical lists were blown open, infiltrated by works that were aesthetically second-rate, but politically favored. A bevy of majors and departments in all sorts of identitarian oppression studies crystallized. Critique went corporate. Diversity became an industry: spawning seminars, consultants, initiatives and company retreats. At a time when our society had never been more tolerant, open and inclusive, media organizations, now staffed by graduates of these radicalized humanities departments, began to make a living trafficking in identitarian hysteria about racism, sexism, homophobia and transphobia.

The numbers confirm this story. They show a pronounced leftward shift in humanities departments since 1990, around the same time that the tech sector, which contributed markedly to the marginalization of the humanities, began its economic takeover. To quote Sam Abrams of Heterodox Academy, which tracks such data:

Between 1995 and 2010, members of the academy went from leaning left to being almost entirely on the left. Moderates declined by nearly a quarter and conservatives decreased by nearly a third … Professors were more liberal than the country in 1990, but only by about 11 percentage points. By 2013, the gap had tripled; it is now more than 30 points.

A 2005 paper by Stanley Rothman et al. uses faculty survey results from 183 universities to investigate an anecdotal hypothesis advanced by conservative critics that, in the 1990s, sixties radicals had embedded themselves in academia in sufficient numbers to swing the balance sharply leftward. The authors report on earlier work, which found that “the proportion of faculty who identified themselves as liberal or left [had] declined from 45% in 1969 to 39% in 1984,” belying any notion that the counterculture movements of the 1960s had been, in themselves, sufficient to engineer the leftward lurch. But the data from the 90s, following the marginalization and alienation of the humanities, told a very different story. While, in 1984, 39% of faculty were left/liberal and 34% were right/conservative, by 1999, those numbers had undergone a seismic shift: from 39% to 72% left/liberal and from 34% to 15% right/conservative. Moreover, the most pronounced shift to the left was in the humanities, with 81% of faculty identifying as left/liberal. English literature, which was 88% liberal and 3% conservative, led the way, with the social sciences (75% left-liberal) not far behind. The vocal, agenda-setting humanities and social sciences had managed to pull faculty in the rest of the university leftward as well, but the hard sciences had a far lower 51% of faculty identifying as liberal, while business faculty—remaining, for obvious reasons, most closely connected to existing capitalist hierarchies—showed a still less pronounced shift, with only 49% of faculty leaning left.

Today, the situation is still more dire. A comprehensive National Association of Scholars report from April 2018 headed by Mitchell Langbert of Brooklyn College, which tracked the political registrations of 8,688 tenure-track professors at top liberal arts colleges, found that “78.2 percent of the academic departments in [his] sample have either zero Republicans, or so few as to make no difference.” At the leftward end of the spectrum were the newly emerged ideological fields, such as gender studies and Africana studies, in which there was not “a single Republican with an exclusive appointment.” The traditional humanities followed. Trailing far behind, once again, were science and engineering, and then, at the very rear, the professional fields of business, accounting and the like. 

Yale and the English Major Today

To quote George W. Pierson, Yale’s first official historian, in the 1920s English “ruled heaven and earth,” with more than half of Yale undergrads majoring in it. When I started at Yale in 1993, English had declined from its 1920s (or even 1970s) heyday, but was still a vibrant, respectable discipline. But, as a 2017 article by Finnegan Schick for The New Criterion recounts, “between 1991 and 2012, the number of Yale students majoring in English fell from 162 to just 62.” Between 2006–7 and 2016–17, English majors awarded fell by a third. Nor did the students simply shift over into other humanities subjects. The total number of humanities degrees fell from 601 to 390. Political science degrees likewise fell from 171 to 123. Meanwhile, the number of economics degrees held steady. And, of course, applied sciences, which roll out a four-year-long welcome mat to the professional world, experienced significant growth, with computer science up from 10 to 51, chemistry from 16 to 27 and applied mathematics from 4 to 25.

These happenings at Yale are part of a much-noted larger trend. An article by Benjamin Schmidt in The Atlantic from August 2018 summarizes some of the latest doom-and-gloom data about the ongoing crisis of the humanities, with Department of Education numbers that show that the number of English majors at American universities has fallen by nearly half since the late 1990s alone, and elite universities have registered a drop-off of some 70% in the number of humanities majors. While some of the drop-off may be due to an (inaccurate) perception of lower future earnings, a big part of the picture is that students are less interested in the humanities of 2019 because the humanities of 2019 are, themselves, less interesting.

A July 2018 report by the ADE Ad Hoc Committee on the English Major connects the decline of the English major with “a national decline in leisure reading” and “the reshaping of reading practices by electronic media.” It notes, however, that the English major has itself been changing, and now reflects “the increased importance of issues of race, ethnicity, gender and identity; the focus on the historical contextualization of literary works; the rising attention to global Anglophone literature; [and] the upsurge of new media.” It also notes that, while Shakespeare was required reading at only four of the twenty-two institutions examined, almost half of those universities required an entire “course in diversity.” Traditional requirements to study the history of literature have been “contracted or liberalized to accommodate other elements in an expanding curriculum.” As the report’s Appendix B makes clear, the result has been a nearly 30% drop in English majors between 2010 and 2016, even as the total number of bachelor’s degree completions nearly doubled over the same period. This latest reduction, the report states, is part of a larger downward spiral: “English has seen a long decline since 1993.”

Just two years ago, in 2016, Yale students, already smelling blood, sent a strident dispatch to the English department (no longer publicly available from its original online source, but quoted in a variety of publications), demanding that it dismantle the mandatory two-semester introductory “Major English Poets” sequence:

Students who continue on after taking the introductory sequence are ill-prepared to take higher-level courses relating to race, gender, sexuality, ethnicity, nationality, ability, or even to engage with critical theory or secondary scholarship. We ask that Major English Poets be abolished, and that the pre-1800/1900 requirements be refocused to deliberately include literatures relating to gender, race, sexuality, ableism, and ethnicity.

Instead of dismissing these fringe activists’ fundamental failure to understand the aesthetic character of literature, Yale’s educators did what they, of late, have done time and again: they capitulated, introducing a new alternative course entitled “Readings in Comparative World English Literature.” A paragraph from Finnegan Schick’s 2017 piece in The New Criterion furnishes a general sense of the current state of the English major at Yale:

A decade ago, English students at Yale could find Shakespeare on twelve different course syllabi … Now, the bard appears on fewer than six. Other authors have undergone similar reductions: Austen and Spenser, who appeared in three separate courses in … 2006–7 … will each be featured only once in the next year. Not to mention D. H. Lawrence, who has disappeared from the English department altogether.

The Fall of Passion and the Rise of Acadumbia

It is impossible to teach students to love a discipline you resent. It is impossible to teach students to love the likes of Chaucer, Shakespeare, Milton, Wordsworth and Whitman when your most readily accessible thought about such canonical luminaries is that they are dead white males. And it is impossible to teach students to love English literature at its finest without teaching them to love the likes of Chaucer, Shakespeare, Milton, Wordsworth and Whitman. A new canon that is a transparent product of aesthetic denialism cannot substitute for the real thing. Regardless of whether their race, gender or sexuality happens to be politically favored, second-rate authors will remain second rate, and first-rate students will see through the charade. If your approach to the study of English literature is to attack those very figures who achieved the heights of English literature, you will not attract students eager to be infected with your passion for English literature; instead, you will attract the much smaller, more undesirable and less talented contingent of angry, brittle, strident, resentful students eager to be infected with your passion for attacking—as racist, sexist, homophobic, transphobic and Islamophobic—English literature, high culture and the West. And that is precisely what has occurred.

College students, who are often especially in touch with the zeitgeist, can smell a rat. English and the humanities are not attracting our best and brightest. They are turning off and shutting out conservatives, moderates, traditional liberals and even plain old non-conformists like me. They are turning off people who are actually interested in devoting their lives to the sublime and difficult details of the eternal humanities, rather than in ephemeral, mundane power relations, the study of which inspires contempt for the text. The quality of academic scholarship in the humanities and the closely related social sciences has, as a result, suffered from a predictable epidemic of stale ideas, unchallenged orthodoxies, confirmation bias, groupthink and plain old uninteresting (and sometimes ludicrous) research into passing fancies, all rattling around in a hollow echo chamber. The recent spate of well-publicized hoaxes, popularly known as Sokal Squared, in which James Lindsay, Helen Pluckrose and Peter Boghossian submitted papers that were politically fashionable but obviously outlandish (e.g. a paper on the need to create a new category of “fat bodybuilding” to challenge “fat-exclusionary (sports) cultures”) to identity-mongering journals and had many of them accepted, is a testament to that. As Langbert’s study of political bias in academia puts it, “political homogeneity is problematic because it biases research and teaching and reduces academic credibility.” In other words, academia has become acadumbia.

For saying this, I will undoubtedly be labeled racist, sexist and the rest. I do not care. Being called racist or sexist by these people and their sundry offspring in the media and infotainment industries is like being labeled an enemy of the people in a totalitarian state. Their epithets have been stripped of any meaning they once had. They are not charges to be taken literally, or even seriously. They are no more than these people doing the only thing they have been taught to do as the bread and butter of their acadumbic profession.

Perhaps the most apt metaphor here is of an immune reaction gone haywire until it has turned into a fully fledged autoimmune condition. Like our body when repeatedly exposed to the same toxic stimuli, humanities and social science professionals, repeatedly primed to react to the few genuine instances of oppression in their milieu, can no longer discriminate between these cases and everything else. Like the autoimmune-afflicted body, unable to stop summoning up white blood cells and haplessly attacking and destroying itself from within, these acadumbic scholars and their increasing number of acolytes in the workaday world have conjured up a massive autoimmune reaction, which has swallowed up not only entire academic disciplines but the very foundations on which the West was built. America, Europe and much of the first world are in the throes of that reaction, with elites—to the great bafflement of the rest—seemingly hard at work undermining themselves and our society at every turn. Tearing down, as hallmarks of every variety of oppression, the greatest achievements and monuments marking our intellectual heritage, they are destroying the very foundation on which our cultural glories and hard-won political liberties stand. Throwing off delicate compromises and tenuous balances, they are un-paving our roads and exposing the dusty tracks leading back to barbarism, which, in the end, will consume them as well.  

My Decision

In 1996–97, towards the conclusion of my Yale experience, I faced a choice: would I pursue a PhD in English or find myself something more practical to do in the world? I knew which way my heart was inclined, but my head wasn’t buying in. Yes, I had loved my experience at Yale, and its English department had been full of great professors, but I also saw which way the wind was blowing. The younger professors and rising stars in the department were of a different complexion than their elders—still interesting and gifted, of course, but noticeably less so. They seemed more small-minded, with axes to grind. And they were interested in things—pedestrian extra-literary considerations of race, gender, sexuality, colonialism and the rest—that left me feeling alienated. Did I want these and the still lesser ones who would follow to be my future colleagues?

In his notorious 1992 speech on the floor of the Republican National Convention, Pat Buchanan correctly identified the reality through which we were living as the inception of a culture war. The demons of political correctness had already begun heating the coals under our feet. In class, Bloom regularly observed the decline of genuine literary culture in the academy and beyond. Lambasting both the philistine conservative moral majority and the philistine lemmings of the left—as he sometimes called them—who complained about the elitist character of what were, naturally, some of the most elite cultural artifacts our civilization had produced, he clearly saw the coming collapse.

I was on my way to graduating from Yale summa cum laude, Phi Beta Kappa, and had been awarded a prestigious prize for excellence in the humanities. I was skilled at the delicate art of reading and interpreting literary texts—it was something I did naturally, something I loved to do. I should have been a perfect candidate for a PhD. I should have been exactly the kind of student academia would have wanted to attract.

At some point towards the end of my time in his classes, Bloom asked me what I was planning to do after Yale. It was, I knew, rare for him to direct a personal query to one of us, or to take a personal interest in an individual student. I felt nearly nauseous at the answer I had to give: “Law school,” I said. His look in response betrayed his melancholy. “It’s a worthy profession,” he intoned quietly, clearly not knowing—uncharacteristically for Bloom—what else to say. Then, regaining his footing after a brief pause, he added, “I suppose it’s for the best given the age in which we are living.” I knew exactly what he meant: aesthetic denialism was afoot, and there was no future here in the acadumbic humanities—or, at least, not a future someone passionate about the old, once-timeless, academic humanities could welcome.


11 comments

  2. Something happened in the 70s that sent the humanities into a death spiral. I think the rot started in Fine Arts, then permeated the rest of the Humanities. Fine arts switched from teaching realistic skills and expert scholarship to vague notions of ‘creativity’ and snobbery. The lecturers and professors seemed to be the last men standing: people who weren’t very bright, or who spent their days worried that they might be caught out for not really knowing anything. I spent my undergrad BFA just making it all up; we had no practical lessons on anything, and it showed. The students’ outputs were invariably naïve or infused with weak pretentiousness. When this occurs, the students have to second-guess what’s required, so ‘identity’ becomes the leitmotif of what they produce, only because there isn’t anything else required. Couple this with the fact that it’s difficult to make a career in the arts business, and it’s a marketing race to the bottom, and it shows in local arts events and galleries.

    Each artist isn’t valued for the work they produce but for the lame identity story attached to the catalogue (which is usually a word salad of nonsense).

    And while a Fine Arts student will invariably say, ‘well, that might have been your experience, but it isn’t mine or my school’s,’ the reality in Australia is that all schools are compared to one another via a process of government accreditation and regulation. All art schools are the same (here in Australia, at least), the same way a law school education is the same whether it’s in North Queensland or inner-city Melbourne.

    The rot is all encompassing and the dumbness of academia spreads like a dull happy cancer.

  3. I’m skeptical of your analysis of the cause of the decline. Institutional take-over is the better explanation. But you’ve pointed to what I think is the far bigger problem in the academy than the take-over itself: the decline in the intellectual depth, breadth, and interests of academics. As Salzman points out below, more and more courses have been reduced to the same shallow set of ideological doctrines. It doesn’t matter whether it’s history or English lit, it’s all the same lecture.

  4. Highly recommend Jerome McGann’s text ‘Are the Humanities Inconsequent?’ (Prickly Paradigm Press, 2009; “Paradigm 36”). E.g., Academiana: “It’s a sheep eat sheep world”; “We advance funeral by funeral” (p. 18).

  5. One of the causes, as the author sees it, of the current parlous state of the humanities in academia is the scorn shown to the humanities by the technical elite, which created in response a posture of hauteur and disrespect among humanities scholars.

    My perception is that many in the humanities have always regarded themselves as superior to those who work in technical disciplines, and take pride in their own ignorance and lack of capability in technical and mathematical fields. That pride in ignorance has never been reciprocated as a pride in ignorance of the humanities among the technical.

    The scorn for humanities academics is based on the pronouncements of academics who seem detached from the real world, confidently making pronouncements and moral judgements which to many people seem deeply hypocritical and based on fantasy.

    The scorn is an effect not a cause.

  6. That introduction sure sounded like a screed from an old man complaining about “kids these days”. I would probably qualify as one of those people you complained about, in that I spend the majority of my daily commute on my phone…reading Areo articles. Maybe I should stop reading “trash” like this and move on to some “real” literature.

  7. I saw a similar process in a different field, and from an earlier vantage point. Having done a Ph.D. in cultural anthropology at the University of Chicago, in the premier department at that time, I took a professorial position at McGill University, the top Canadian university. I taught there for fifty years, transitioning to Emeritus last year. I watched the field that I loved as it transformed from an attempt to understand human life and culture scientifically into an identitarian “social justice” field; from a field that tried to understand impartially to one in which only advocacy counted; from an attempt to gain an objective understanding based on evidence to one in which subjectivity reigned. Anthropologists invented cultural relativism, which over the years morphed into ethical relativism, and eventually epistemological relativism. Now “everyone has their own truth.” I would not recommend that anyone go into anthropology today, or into any of the social sciences, humanities, education, or social work, all of which have gone down the same dark path.

  8. I have many interests. Anyone who knows me would describe me as an extremely interested person. Among those interests is some literature (I recently read a fantastic book, Akhenaten: Dweller in Truth, written by the first Nobel laureate of the Arabic language. Prior to that was the Norm Macdonald autobiography, which was hilarious). But I’ll admit literature is one of my lesser interests.

    I’m keenly interested in art, particularly Renaissance and Baroque (the portrait galleries that bore most people stiff are my biggest thrill at art museums). I draw and paint myself: nudes, beaches, alleyways, nonrepresentational explorations of shape and color, and many, many portraits of my non-dominant hand (a model whose schedule is always open, and who works for free). Among my favorite artists of all time I would list Caravaggio and the pinup artist Coop.

    I’ve always been interested in music: mostly rock of all eras, soul from the ’60s and ’70s, heavy metal, industrial, punk, some country, some hip-hop, a fair amount of pop from my youth, and an awful lot of what was called pop by former generations but is now characterized differently. Much of it is as delightfully crass as it is stirringly deep; my favorites are often the records that manage to do both in equal measure.

    I am a dyed-in-the-wool film buff. My collection includes the works of Sidney Lumet (Network), Robert Altman (The Long Goodbye), John Huston (The Maltese Falcon, Beat the Devil), Sergio Leone (Once Upon A Time In America), Stanley Kubrick (The Killing, The Shining), and Akira Kurosawa (Rashomon, Yojimbo and Sanjuro). It also includes Kingpin, The Jerk, Heavy Metal, Army of Darkness, and the complete Marvel Cinematic Universe so far, from Iron Man through Ant-Man and the Wasp (I also have both Deadpools, the first two Superman films, Burton’s Batman, the first X-Men, and Spider-Man 2). And I possess not a stitch of any of it for reasons of kitsch or irony. I’ll allow that I only own The Incredible Hulk, Doctor Strange, and the first two Thor films for completist reasons, but other than those I don’t buy movies I don’t like.

    Point is, I don’t see hard lines between high culture and low culture. I’ve seen too much low culture that’s astonishingly smart, and too much high culture that’s insipidly stupid, to label either avenue as either bereft of merit or its sole abode. I tend to regard people who claim to be film buffs but are unwilling to acknowledge merit within the mainstream Hollywood product as poseurs, with no deeper appreciation of film than the person who only sees movies that top the box office. I believe a true seeker of beauty in art and culture is one who appreciates both the independent and the commercial, the insurgent and the established, the tightly structured and the deconstructionist, the rapier wit and the slapstick. There’s nothing wrong with being selective and discerning in any of that, but to rule out wide swaths of culture as trash is to be eventually proven wrong, as the cream of that supposed trash rises to the surface and comes to be viewed as high art.

    Film noir used to be written off as trashy detective pictures. Then the French named it and gave us permission to see the art within the genre. Myself, I don’t wait for permission. See you there! I’ll be all settled in; maybe I can show you around when you get here.

    1. Agreed. I think the word “pretentious” is currently overused but, in my opinion, someone who claims that superhero movies are garbage as a rule definitely qualifies.

    2. I have several thousand movies on Blu-ray including the entire Marvel Cinematic universe and every Francois Truffaut film so far released in the format.

      I have all the Lego movies and every film directed by Andrei Tarkovsky. At least two of those Tarkovsky films are as good as Lego Batman.

      I watch films depending on my mood, not to impress my friends.

