If you teach journalism, as I have for several decades at a trio of US universities, you will hear your most engaged students gush one or both of the following canonical tributes to their calling’s supposed raison d’être: journalism is about speaking truth to power; journalism is about giving a voice to the voiceless.
They’ll enshrine the words in all caps on whiteboards in the offices of student newspapers and TV stations, and intone them with a special evangelical fervor during classroom take-downs of this or that politician or corporate potentate. I knew one student who grew so exultant at discovering either of the lines in a textbook or other assigned reading that he’d come up after class and bring his find to my attention, as joyful at each iteration as the junior archaeologist at some momentous dig who unearths a new remnant of fibula or sternum and rushes it over for the head guy to see.
My student-journalists also invoke the first phrase when applauding each other, if, say, a member of the clan has published some pungent exposé of the college administration in the school newspaper. As they pass each other in the halls, they’ll fist-bump and someone will declaim, far louder than necessary, speaking truth to power, bro! like a not-so-secret verbal handshake between collaborators in some special samizdat.
However, as noble as both ideals may sound, and as heartening as it is for a professor to see his students display genuine enthusiasm about anything, one cannot rejoice in the mindset that motivates their self-congratulatory behavior. For thanks to the postmodern mission creep that characterizes (and compromises) today’s media, both truth-speaking and the conferring of voice really represent the shameless abdication of journalism’s commitment to fairness and political neutrality.
The words truth and power, as well as the concepts they signify, have undergone a profound connotative evolution in recent decades, especially amid the past decade’s frenzy of social justice activism. Journalism ranks second only to academia in the corrosive influence of postmodernism; academic journalism programs, sitting at the intersection of the two, are thus uniquely affected by such rethinking. And because daily media showcase America’s zeitgeist, journalism’s givens, left unchallenged, tend to become society’s givens—all the more so when journalism is the political monolith that it is today (with the exception of the much-maligned Fox, which enjoys less credibility in journalism schools than Bigfoot). To argue that one can remain immune to journalism’s nonstop messaging is to argue that one can remain immune to advertising’s messaging; the world’s top brands tell a different story.
Prior to the 1960s, except in freshman philosophy class, truth was understood to describe an objective state of being—the factual, provable, repeatable—what we all knew for sure or had every reason to believe we knew. The postulating of such truth was left to experts or to a sizable consensus of reasonable people. If, in a given situation, we lacked sufficient information to ascertain truth, we conceded as much; we understood that we did not know what we did not know, and we left such points open—we did not add overlays of supposition and posit them as truth. What’s more, we generally perceived the impropriety of invoking the word when making moral or qualitative judgments; those were understood to be matters of opinion. The ethos was best summed up by financier/philanthropist Bernard Baruch in 1950: “Every man has a right to his own opinion, but no man has a right to be wrong in his facts.”
Admittedly, power, as both word and concept, was never quite as neutral as truth in its connotations. Americans of the so-called Greatest Generation, who lived through World War II, had a love-hate relationship with politicians and titans of industry, holding the Rockefellers, for example, in both esteem and contempt. Surely, however, for many more Americans then than now, power and status were regarded as aspirational longings stitched into the fabric of the American Dream. In any case, the use of the term itself, stripped of a specific context, was relatively neutral. Power was the condition of wielding authority or influence.
American journalism throughout its history has had an interesting marriage of convenience with such words, as well as with objectivity and neutrality in the larger sense. Contrary to what many neophyte journalism students believe, the institution did not start out as pure as the driven snow and then suddenly go to hell with the October 1996 advent of Fox News. The earliest American press was anything but objective, functioning in service to the colonies’ revolutionary aims and then later unambiguously cleaving to one side or the other during the Civil War. But, by Reconstruction, journalism had begun to recognize a growing demand for neutrality and to codify professional tenets in its service. As Columbia Journalism Review’s managing editor Brent Cunningham writes, “The press began to embrace objectivity in the middle of the nineteenth century, as society turned away from religion and toward science and empiricism to explain the world.” The journalistic quest for truth was reimagined as more of an epistemological activity, wherein journalists would serve as their readers’ eyes and ears, furnishing them with an objective sense of the world around them.
Granted, some newspaper magnates continued to encourage abuses for their own commercial purposes—but they did so cynically, knowing they were transgressing in order to goose newsstand sales. There was no serious pretense of the integrity claimed by today’s overstepping journalists. One thinks of the infamous yellow journalism of the New York circulation feud between Hearst and Pulitzer, whose excesses historians cite as a prime factor in our involvement in the Spanish-American War. That is a very different matter from betraying journalism in the name of journalism—that is, embracing with an avowedly virtuous heart practices that are anathema to the rules of impartial and ethical fact-finding.
Newspaper reporting—as formalized at Missouri and the other embryonic J-schools of the early twentieth century—was supposed to be clinical, quasi-scientific in its methodology. The point was to paint a vivid word picture without cropping or photoshopping, as it were. Budding academically trained journalists were encouraged to peel back the layers of the observable in order to reveal the deepest core of fact, thereby arriving at a knowledge as untainted as possible by human interpretation. Ideally, if they did their jobs well, all reporters who had access to the same set of facts would’ve written the same story: Readers who wandered onto the scene would’ve reached the same conclusions as the journalist about what had factually happened. When real-world journalists fell short of this standard of reportage, editors passionately reiterated such detachment as the goal. Punishments were meted out to reporters who routinely transgressed.
At one time, journalists would even report on stories of some social sensitivity without revealing partisanship. A reporter might point out, for example, that seventeen people were living in makeshift shacks under a bridge on Canal Street—but the story was not covered in such a way as to imply that anyone was responsible for caring for those seventeen people. Interestingly, at the height of the Greatest Generation, a story written with such sympathetic overtones likely would have elicited a hostile reception from the public, inasmuch as the seventeen squatters would’ve been regarded as bums and ne’er-do-wells. Addiction, too, was then considered a contemptible character flaw, rather than an unfortunate disease. By no means am I implying that journalists should take such a jaundiced view of homelessness or its first cousin, addiction. I’m merely noting that there were once starkly different ways of conceptualizing stories that we now take for granted as human tragedies.
This insistence on journalistic neutrality became all the more critical with the advent of TV news, in which the prejudicial potential of imagery greatly complicated matters. Early TV journalists took great pains to keep upper lips duly stiff and inflections opaque. Listening to Walter Cronkite’s broadcasts, one had no sense of how he felt about the material he presented. The one chink in Cronkite’s armor of professional detachment occurred at 2:38 pm eastern time on November 22, 1963, when he removed his glasses to tell America in a cracking voice that President John F. Kennedy had succumbed to his wounds in Dallas. Cronkite’s uncharacteristic surge of humanity became a watershed moment in the pantheon of media history, the exception that proved the rule. For most of journalism’s early academic and professional history, teachers and serious-minded editors and news directors tirelessly counseled journalists to make every effort to keep themselves and their sentiments out of stories.
Until, that is, the 1960s ascendancy of the so-called new journalism, which openly granted writers license to process facts through their subjective filters. Suddenly, the journalist himself was the unimpeachable arbiter of what the story should signify to the reader and what constituent parts merited emphasis. This new journalism was provocative, edgy, salable: The golden age of American magazines was built on such sexy, interpretive reporting, which allowed the reporter to impose his own moral givens on a story. Moreover, because journalism has always attracted those with a Utopia complex—who seek to change the world for the greater good of mankind—those moral givens tended to be humanistic in orientation. Journalists took a stand for the underdog, for the masses.
At about the same time, the modern human potential movement took root as part of a broader cultural revolution against orthodoxy. The movement’s early visionaries recognized that facts could be disempowering. America was no Lake Wobegon, where everyone was above average: talent, looks and money were not apportioned equally. So self-help’s pioneers assiduously set about redefining people’s relationship with the external world. Thus was launched a fifty-year crusade to provide psychic mechanisms that enabled people to avoid inconvenient truths. The tactical goals may have varied from program to program or guru to guru, but the overarching strategic message was always the same: Reality is what’s most comfortable for you to see.
Take as an example the self-esteem juggernaut in America’s schools, which sought to inoculate children against their demonstrable failures. Teachers were advised to shun red ink, as it was stigmatizing for kids to have to face the fact that a wrong answer was indeed a wrong answer. Districts embraced pass/fail grading, which lumped everyone into broad categories, thereby minimizing individual differences. Recesses famously began featuring games without winners and losers.
Soon, a complementary campaign evolved for adults: the so-called empowerment movement of Anthony Robbins and copycat gurus, who urged an altered state of consciousness in which commonly understood standards and barriers were of minimal import or even ceased to exist. Among other things, they advocated a program of repetitive narcissistic affirmations (I am the most attractive man; I am the most capable; and so on). Now we were all winners. We could all stand on life’s gold medal podium at once. Regrettably, as psychologists Roy Baumeister and Jean Twenge have pointed out, this narcissistic rethinking took a darker turn when expectations went unmet, causing feelings of thwarted entitlement and rage.
By the early 2000s, Deepak Chopra, Eckhart Tolle and other New Age shamans were taking solipsistic philosophy mainstream. With an assist from their de facto publicist, Oprah Winfrey, they promoted a bizarrely democratized take on truth that negated the very existence of objective reality. Unshackled from a core body of consensus knowledge, all of life was now subjective, perceptual. In 2006, the worldwide phenomenon The Secret posited that due to the interconnectedness of all matter, you could, like some latter-day Uri Geller, bend the obedient universe to your whims if you eliminated all self-doubt and were sufficiently single-minded in your vision of your deserved fate. Sound pretty wacky? Well, The Secret sold over 30 million copies; derivative seminars and workshops sprang up everywhere. Main Street America walked around spouting the associated buzzwords.
This phenomenon dovetailed neatly with the postmodern deconstructionism then emerging in academic and social justice circles, which preached a social reality anchored in lived experience; as such, this reality was neither answerable to, nor could be refuted by, objective data. Everything was eye-of-the-beholder. Data themselves were maligned as racist or sexist or otherwise tethered to the insidious forces of the status quo. No longer was equal opportunity enough; the movement demanded equivalent outcomes. And if you felt you’d been denied your desired outcome or otherwise mistreated, an injustice had ipso facto occurred. There ensued Black Lives Matter and Time’s Up, movements predicated on the argument that society must accept anecdotal evidence as a kind of extrajudicial truth, even if such evidence fell well short of customary legal standards.
From a journalistic standpoint, America may have received the most striking object lesson in this ethos during the hearings for then-Supreme Court nominee Brett Kavanaugh, which pitted Kavanaugh against Christine Blasey Ford, who claimed that he had attempted to rape her in the early 1980s. For several weeks, the hearings were front and center in the American consciousness, as reporters and pundits paying tribute to the #MeToo movement credulously reminded viewers again and again of her truth: an attempted rape had occurred. Relegated to a supporting role was such staple journalistic nomenclature as accusation, allegation, contention. The authorized phrase, again and again, was her truth.
But there was not, in fact, a her truth here (or a his truth, for that matter), and the point isn’t merely semantic. Something either did or did not happen between Kavanaugh and Blasey Ford on that long-ago summer evening. One of these two seemingly upstanding American citizens was invested in, and outspoken about, a falsehood. This went all but unsaid in mass media, especially in discussions of Blasey Ford.
Power’s fall from grace has been perhaps less dramatic than truth’s because, as noted, there has always been some natural suspicion of the powerful, both in and out of journalism. As recently as the late 1990s, however, big-name self-help authors were churning out series after series of best-selling paeans to excellence, success, wealth and power, and the patented tools that supposedly could yield such outcomes (no doubt by helping you make the proper overture to the obliging universe). Power was thus portrayed, with a few obvious exceptions, as the organic consequence of a meritocratic process that rewarded excellence and/or achievement. In this framework, power was attainable by all who were willing to pay their dues—to “awaken the giant within,” as Tony Robbins put it in his marquee tome.
But, with the ascendancy of postmodern thought, power acquired pejorative subtexts, which were increasingly codified in media discourse. Other words, such as oppressive and patriarchy, became tacit corollaries to the formal definition of power. Today, in academic settings and much of the media, power is framed as less a byproduct of talent or mettle than the result of a kind of spoils system that rewards privilege. No longer is power attainable by all who tap into that inner giant. Rather, it is deliberately, systematically withheld from most. Even Sheryl Sandberg, the iconic Silicon Valley executive and feminist author whose slogan, lean in, became a rallying cry for professional women, has of late recanted, implying that there are social forces at work that can foil even the most empowered female.
Today the personal empowerment of just a decade ago seems archaic, quaint. Indeed, one hears the word power and also hears, softly, the accompanying adjective unjust.
In contemporary mainstream media, then, the terms truth and power are conjoined in a jaundiced twin metaphysic wherein the powerful—by nature oppressive and corrupt—are to blame for disenfranchising everyone else. It isn’t about merit; it’s about malfeasance. The great mass of citizens has been denied its due, and journalists must help rectify this injustice in order to make America right with the universe.
My young journalists see it as their eventual mandate to help undo the disequilibriums of the free market, to catalyze via journalistic means the egalitarian nation that has failed to evolve on its own. In even the most cursory reading of such aims it is hard to miss the Marxist timbre: the notion that in an ideal society all people should have approximately equal power and amplitude of voice. Thus we have in play here a different kind of redistributionism: not income redistribution per se but impact redistribution. Affirmative action for power.
So it is that my students, in their Utopian rapture, betray not a trace of the agnosticism with which journalists are supposed to regard life in their role as conduits of the noteworthy. Never does it cross their minds that both of the august mission statements they so adore, at least as they interpret them, are actually partisan agendas and as such are antithetical to the goals of honest journalism. The truth that they presume to speak is rooted in a social-justice-inflected vision of the way things ought to be. Of course, the way things ought to be entails a distinct moral calculus. And because moral positions tend to be expressed in political positions, a journalistic outlook that has an in-house moral sensibility also will have an in-house political sensibility.
It does not occur to my students—or to many working journalists—that the reporter who presumes to posit moral truth is less a journalist than a clergyman.
Nor does it occur to my students that if your daily routine consists of championing the interests of the disenfranchised by giving them a voice, you are less a journalist than an Obama-style community organizer.
In closing, then, to students who seek to speak truth to power, I offer a simpler prescription. Just speak truth—that is, unvarnished facts—to everyone. Then leave it to your audience to raise their own voices as they see fit.