And they exhort journalists to stop splitting the difference between the two parties and to start telling the truth about which one is really driving the polarization and refusing to compromise.
Political ideologues don’t merely fail to compromise, however; they also concoct their own reality. Mann’s and Ornstein’s recommendations, accordingly, can be extended to the realm of reporting on fact itself—e.g., to the media watchdogs who strive for evenhandedness as they fact-check the statements of politicians on both sides of the aisle.
After all, we live at a time when Republican “Big Lies”—ranging from the denial of global warming, to claims about “death panels” (PolitiFact’s 2009 “Lie of the Year”) and a “government takeover of healthcare” (the 2010 “Lie of the Year”), to the assertion that President Obama goes around “apologizing” for America—are everywhere. In this context, is it not appropriate that the arbiters of political reality also take a stand?
In fact, there is reason to think that, at least in a subtle way, they already have—because the facts have forced them to.
As the influence of political fact-checking has grown, the temptation to crunch the numbers on fact-checker performance has proven irresistible. Early into the fray was the Smart Politics blog at the University of Minnesota’s Humphrey School of Public Affairs, which analyzed the Pulitzer Prize–winning PolitiFact’s work during the period from January 2010 through January 2011, surveying more than 500 stories. Sure enough, it found that while the site fact-checked roughly as many statements by current or former Democratic elected officials as current or former Republican officeholders during this period (179 versus 191, respectively), Republicans were overwhelmingly more likely to draw a “false” or even “pants on fire” rating (the worst of all). Out of the ninety-eight politicians’ statements that received these dismal ratings, seventy-four were made by Republicans—or 76 percent. Sarah Palin and Michele Bachmann fared worst, with eight and seven PolitiFact slams, respectively.
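The 76 percent figure above follows directly from the counts Smart Politics reported; here is a quick arithmetic check, using only the numbers quoted in the text:

```python
# Sanity-check the Smart Politics percentage quoted above.
# Counts are taken directly from the article's text.
false_total = 98   # politicians' statements rated "false" or "pants on fire"
false_gop = 74     # of those, the number made by Republicans

share = false_gop / false_total
print(f"GOP share of worst ratings: {share:.0%}")  # → 76%
```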
The Smart Politics blog went on to suggest, based on these numbers, that PolitiFact is biased against the right—precisely the type of knee-jerk centrism that Mann and Ornstein have called into question. After all, there is another possibility: the left just might be right more often (or the right, wrong more often), and the fact-checkers simply too competent not to reflect this—at least over long periods.
So which interpretation is correct? While certainly not definitive, a study undertaken for my book The Republican Brain and updated for this article—with dedicated data-gathering and statistical analysis from an assistant, Aviva Meyer—lends additional credence to the latter possibility.
We examined the work of the Washington Post’s “Fact-Checker” column, launched by Michael Dobbs in 2007 and appearing through late 2008, then restarted in 2011 under Glenn Kessler (and now co-written by Josh Hicks). The Post bestows “Pinocchios” for false or misleading claims, with the number depending on the egregiousness of the error. Thus, getting “four Pinocchios” from the Post is comparable to a “pants on fire” from PolitiFact.
Our analysis found that from the inception of the “Fact-Checker” column in September 2007, through December 2011 (after it resumed under Kessler), Republicans were given ratings on the Pinocchio scale 178 times and Democrats 137 times, for 315 ratings overall. (We did not include liberal, conservative, or neutral groups and individuals in this analysis, and we also omitted the Post’s late 2011 “biography” items on the GOP presidential candidates, as these would have greatly inflated the GOP total and had no Democratic parallel.) Already, then, Republicans were flagged for significantly more misstatements by the Post. Indeed, we found that Republicans received 436 Pinocchios, while Democrats received just 291. This means about 60 percent of all Pinocchios for explicitly political falsehoods went to Republicans and about 40 percent to Democrats during this time.
One might argue, however, that this is misleading: if Republicans were rated more times overall, then of course they got more Pinocchios. But it turns out that the average Republican rating (2.45 Pinocchios) was worse than the average Democratic rating (2.12 Pinocchios). Not only were Republicans given more Pinocchios by the Post, then, but whenever they got rated, they tended to do worse than Democrats.
Arguably the biggest insight, however, comes from looking at the individual Pinocchio categories. Republicans got nearly three times as many “four Pinocchio” ratings as Democrats (thirty-three versus twelve), according to our analysis. They were also overrepresented in the “three Pinocchio” category (forty-two versus thirty-one) and in the “two Pinocchio” category (seventy-six versus fifty-five), the most frequently used rating.
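The shares and averages reported above can be recomputed directly from the stated counts; the following sketch does so, assuming only the figures given in the text (rounding to two decimal places is our convention, not the Post’s):

```python
# Recompute the Post "Fact-Checker" figures from the counts stated in the text.
ratings = {"R": 178, "D": 137}      # number of times each party was rated
pinocchios = {"R": 436, "D": 291}   # total Pinocchios each party received

total = pinocchios["R"] + pinocchios["D"]   # 727 Pinocchios overall
gop_share = pinocchios["R"] / total         # roughly 60 percent

# Average severity per rating: Republicans fare worse per appearance.
averages = {party: pinocchios[party] / ratings[party] for party in ratings}

print(f"GOP share of all Pinocchios: {gop_share:.0%}")        # → 60%
print(f"Average rating: R={averages['R']:.2f}, "
      f"D={averages['D']:.2f}")                               # → R=2.45, D=2.12
```

The averages (2.45 versus 2.12) match those cited in the following paragraph, confirming that the gap holds even after adjusting for how often each party was rated.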
But, interestingly, this trend did not hold up in the “one Pinocchio” category, in which Democrats predominated (forty versus twenty-six). In other words, although the Post flagged Democrats for a lot of minor sins, the more egregious falsehoods were clustered among Republicans. (In checking one of President Obama’s statements, Kessler acknowledged it was such a minor infraction that it might deserve a “half-Pinocchio,” if there were such a thing.)
It is important to note that Kessler analyzed his own work from 2011 and similarly found that Republicans, on average, got more Pinocchios than Democrats. His averages were somewhat different from ours, apparently because of differences in how we categorized a few items. Invited to comment, Kessler had this to say: “I pay little attention to whether I am rating Democrats or Republicans, believing the numbers average out over time. My own experience, after three decades of covering Washington politicians, is that both sides will spin the facts if they think it will give them a political advantage. I sensed the GOP ratings were higher, and more frequent, because of the Republican primary season—i.e., Republicans attacking each other—but perhaps this means I need to pay closer attention to the Democrats.”
As Kessler’s words suggest, interpreting these data—or, for that matter, the aforementioned data on PolitiFact—as evidence of a liberal bias among the fact-checkers discussed here would be shortsighted and simplistic. All indications are that both outlets try (or even bend over backward) to be as balanced as possible.
A potentially simpler explanation for these results, then, is that the fact-checkers are simply doing their job—and Republicans today just happen to be more egregiously wrong. Democrats, meanwhile, are certainly not innocent when it comes to making misleading statements, but their pants are not on fire.
Why does this matter? The reason I undertook this analysis was to examine, in the real world, my book’s thesis—closely related to Mann’s and Ornstein’s—that Republicans, overall, process information differently from Democrats. Mounting evidence from psychology, for instance, suggests that one’s politics are driven partly by one’s personality, and Democrats and liberals are simply more open to new information and experiences as well as more tolerant of ambiguity and uncertainty. Moreover, this difference has been exacerbated by a well-documented turn toward psychological authoritarianism in the Republican Party over the past four decades. Increasingly, the GOP has become the party of those who are more rigid, less given to compromise, and more inclined to see the world in black and white.
In this context, it is perhaps inevitable that allegedly neutral arbiters of political fact would come to show a somewhat “unbalanced” body of work. I don’t expect the fact-checkers to stop trying to be bipartisan—or to stop calling out Democrats when they deserve it. Still, the real message of their work may best be captured in a line by Stephen Colbert: “Reality has a well-known liberal bias.”
Link to the original article on The Nation