Yesterday, at the National Press Club in Washington, D.C., the heads of the major media "fact checking" organizations convened for a panel discussion. The panel, moderated by Brooks Jackson of FactCheck.org, included PolitiFact editor Bill Adair, Washington Post fact checker Glenn Kessler, and the Associated Press's Jim Drinkard. The title of the panel was "Deception Alert: Fact Checkers Forecast Deceptions in 2012 Presidential Debates." Aside from the rather bizarre notion of fact checking events that have yet to take place, you can probably gather from the title that the discussion leaned toward the self-congratulatory, and away from the measured and reflective. Here's a sample of the kind of brazen deception we can expect in the debates, courtesy of FactCheck.org's Jackson*:
Quite likely we will hear Mitt Romney say that gasoline prices have doubled under Obama, which is an example of one of those things that is, yes, literally true if you look at where gas prices were when he took office. They had plummeted due to the worldwide near depression, but the fact is that they have not quite gotten as high for Obama as they were for weeks under Bush in the summer of 2008.
So Mitt Romney is making a claim that is "literally true," but it doesn't seem that way if you measure Obama's track record on gas prices against Bush, who, by the way, is not running against him for president. Further, Jackson is happy to pile on debatable macroeconomic context that is favorable to Obama on gas prices, but offers no context supporting the underlying critique behind Romney's "literally true" claim, which is that Barack "energy prices will necessarily skyrocket" Obama has done a number of things that could be said to have significantly raised the price of gas. For instance, why isn't Jackson pointing out that Obama has a pretty lengthy track record of impeding offshore drilling? What about holding up the Keystone XL pipeline? Couldn't that be said to raise the price of gas? And that's just the tip of the iceberg; there are many extensive and legitimate critiques of Obama's energy policies and their contribution to higher gas and other energy prices that could be mentioned here.
Obviously, there's a difference between fact checking and saying someone is being deceptive because, well, here's some highly selective context. But media fact checkers remain obdurately unwilling to distinguish between noting what is "literally true" and passing off a one-sided argument as "fact checking." There's a distinct lack of self-awareness here.
And this lack of self-awareness could not have been more evident when the fact checkers were confronted with evidence of their partisan bias. At the panel discussion, Time's Michael Scherer asked whether the fact checkers thought one presidential campaign was being more deceptive than the other, and why. Everyone on the panel respectfully declined to single out a campaign, and most were at pains to reassure the audience that they believe all politicians lie and that they don't play favorites. With that in mind, I asked the panel a question:
Hemingway: I want to pick on Bill [Adair of PolitiFact] a little bit just because the data is more readily available for him than it is with the other fact checkers. You talked about that website and you said you’re not sure what the indices tell you. [NOTE: Adair had previously referenced a liberal website http://whosmorefullofsh-t.com that quantifies PolitiFact rulings] Well, I’m curious about that because they’ve done a few studies because PolitiFact rulings are fairly easy to quantify. The University of Minnesota’s Humphrey School of Public Affairs study last year found that PolitiFact rated Republicans false at a rate of three to one over Democrats, and more recently a George Mason University study from I think June through September found that PolitiFact rated Republicans false at a rate of two to one. Now, it’s also true that PolitiFact targets Republicans overwhelmingly for evaluation, regardless of how they come down on the rulings. So that to me, say, when Romney pollster Neil Newhouse suggests fact checkers have a partisan agenda that they bring to the table is what he was suggesting. I think that’s why he was saying he wasn’t going to let the [Romney] campaign be dictated by the fact checkers not because he didn’t think he was going to go whole hog on an [dishonest welfare reform] ad that was effective. Don’t Republicans have a pretty legitimate grievance there if they’re being unfairly singled out you know overwhelmingly by the number of times that they’re being targeted by fact checkers compared to Democrats?
Adair: Well, I’m not sure I agree that they’ve been unfairly singled out, didn’t the [George Mason] studies show that we had checked roughly the same number of Democrats as Republicans or something? I don’t know, I saw the press release. You know I don’t find the “hey, you gave my team...” [complaints persuasive] because I hear it from both sides. I was at a party over the summer and a guy came up to me and said, “Hey, I think that, I really think Politifact Virginia has been unfair, they’ve been very biased against Tim Kaine,” the Democrat. And then like a week or so later the Virginia Republican party came out and said Politifact Virginia is unfairly targeting the Republicans. And you know, I think the nature of what we do is disruptive to the status quo. I think we are easier to analyze because of our unique structure, but I don’t find the numerical count analysis to be particularly persuasive. Now what I’d like to talk about are if you have substantive questions about something we’ve done we’re happy to talk about it. We make a mistake, we correct it, but I think, you also have to reflect this is journalism, we’re not social scientists, we are not randomly selecting things.
OK, a mea culpa here. I was speaking extemporaneously, and Adair is right that my numbers on this point were off; the George Mason study actually found that PolitiFact checked the statements of Democrats slightly more often than those of Republicans. "PolitiFact checked the assertions of Democrats slightly more often than those of Republicans (54% vs. 46% of all statements)," according to the study. But that makes what PolitiFact is doing even worse! If you're fact checking a roughly equal number of statements by each party, and you find one party lies twice as often, wouldn't that be revealing? To be more specific, here's what the George Mason study, which tabulated PolitiFact rulings between June 1 and September 11, concluded:
However, PolitiFact rated Democratic statements as “mostly true” or “entirely true” about twice as often as Republican statements -- 42% true ratings for Democrats vs. 20% for Republicans.
Conversely, statements by Republicans were rated as entirely false about twice as often as Democratic statements – 29% false ratings for GOP statements vs. 15% false ratings for Democrats. (This includes categories labeled “false” and “pants on fire.”)
And yet Adair simply refused to answer the question of why his organization overwhelmingly finds Republican statements false. It's not just the George Mason study, either. The University of Minnesota study, which analyzed "more than 500 PolitiFact stories from January 2010 through January 2011," finds "that current and former Republican officeholders have been assigned substantially harsher grades by the news organization than their Democratic counterparts. In total, 74 of the 98 statements by political figures judged 'false' or 'pants on fire' over the last 13 months were given to Republicans, or 76 percent, compared to just 22 statements for Democrats (22 percent)."
So why does Adair have no explanation for this? Confronted with actual data, a.k.a. "facts," is he really dismissing the critique that PolitiFact disproportionately rules against Republicans because Democrats come up to him at parties and complain? It sure looks that way.
If he's confident in his organization's evaluations, why not just say, "We strive to be fair, but the results seem to show Republicans tell falsehoods more often"? I suspect the reason he doesn't stand behind the aggregate results of the PolitiFact rulings is that he knows it looks bad. It would cause people to start taking a closer look at the specifics of PolitiFact rulings to see whether this remarkable result is warranted.
Which brings me to my final point regarding Adair. He says, "if you have substantive questions about something we’ve done we’re happy to talk about it." That's not my sense of it at all. I've written several articles citing chapter and verse on why PolitiFact rulings are either tendentious or flat-out erroneous, and PolitiFact has been downright unresponsive beyond dismissive references to "partisans" at THE WEEKLY STANDARD.
For instance, Adair began his presentation yesterday with a video of Jon Stewart confronting Herman Cain with PolitiFact's charge that Mitt Romney's welfare reform ads are deceptive; after confronting Cain, Stewart actually does a victory dance to drive home the point that he's caught Cain in a lie.
Yet this is an issue that PolitiFact has not only gotten wrong, but has been thunderously wrong about. Indeed, I just wrote four thousand words unpacking the Obama administration's recent waivers to the work requirements in welfare reform, the relevant statistical analysis, and the anti-welfare-reform statements of the top welfare policy makers in the Department of Health and Human Services. It turns out that the Romney campaign is making a pretty credible argument, and PolitiFact is most certainly not. What's more, PolitiFact's rulings on the matter are wrong to the point that PolitiFact is now in direct conflict with a ruling on one of the key issues made by Adair's fellow panelist Glenn Kessler. (Compare here and here.) PolitiFact has now given two "Pants on Fire!" rulings to the Romney campaign, and a related "true" rating to Bill Clinton, based on dubious sources and little understanding of welfare policy. I haven't seen PolitiFact respond to me, the Washington Post, or Heritage Foundation policy expert Robert Rector, the "intellectual godfather of welfare reform" who wrote the law's original work requirements. Rector has repeatedly and quite publicly called out fact checkers on the issue.
But wait! We're just getting warmed up! All of the panelists responded to me. Glenn Kessler was perfectly reasonable throughout the whole panel, even mentioning that welfare reform was more complicated than other fact checkers on the panel were making it out to be. Kessler wisely steered clear of the PolitiFact wreckage, only proffering that he gets a roughly equal amount of criticism from all sides of the political spectrum. He did, however, make this unintentionally illuminating observation:
The only thing I can say more broadly is that I do think that, and I don’t know if this is your experience, that Democrats tend to be more angry about and more upset about some of the things that I write, I guess that’s because they kind of believe the myth of the liberal media.
Huh. If Democrats believe the "myth" of the liberal media and Republicans have been on record for years as believing the media are liberal, at what point does this cease to be a myth?
For his part, Drinkard began the inevitable circling of the wagons by accusing me of overlooking relevant facts:
Another thing I would say is that nobody mentioned this but we’re in a year which has been dominated by a primary season where only one party had a primary. And they had 21 or 22 debates. That is going to produce a certain number of fact checks, and they’re all going to be about Republicans.
Well yeah, but the studies I cited were from last year in January and then from June through September, so, I mean, the primary wouldn’t have been in effect.
Given the timing, neither study would really have been affected by the primary. And the George Mason study on PolitiFact found another result that is revealing and goes a long way toward ruling out the presidential primary as an explanation:
The same pattern holds for statements made directly by the presidential candidates and their campaigns. A majority of the Obama campaign’s statements (55%) were rated as true or mostly true, compared to one out of four statements (26%) by the Romney campaign.
Again, why can't any of these professional fact checkers offer up a plausible excuse for why PolitiFact finds Republicans tell falsehoods at two to three times the rate of Democrats? I think it's fair to say that FactCheck.org's Brooks Jackson seemed mildly annoyed with my impertinence on this point, and he eventually came around to the only conclusion that is both face-saving and obvious. Unfortunately, even Jackson seemed to have difficulty believing this conclusion himself:
Jackson: I think the Weekly Standard would agree that when you look at civil rights for example the goal should be equality of opportunity, not equality of result. So I think you have, if you are seeing, and I don’t know that the figures you gave me are accurate, if you are seeing criticisms of one side or another, it might reflect the fact that there is a Republican primary going on, or it might reflect the fact that one party at that particular time is failing the same standards, the same journalistic standards, more than the other. You can infer if there are more criticisms of one side than another, and I think this varies all the time, this is the actual fact of it, but that [doesn't] mean there is any bias on our part. That’s just a false logic.
Hemingway: At a rate of three to one? I mean, you believe...
Jackson: I don’t know that it’s three to one, and I don’t know…
Hemingway: ...well I’m citing the Minnesota School of [Public] Affairs.
Jackson: ...and I don’t know if some of those are complaints about Republicans criticizing other Republicans. It’s certainly not three to one on our side.
So then, Jackson raises the possibility that the disproportionate "false" rulings for Republicans accurately reflect that they lie more than Democrats. But pressed on this, he suddenly feels the need to clarify that his organization, unlike PolitiFact, isn't fact checking Republicans as false over Democrats at indefensible rates. This is not exactly a vote of confidence in PolitiFact.
I guess the results here are as predictable as they are ironic. Try to talk to fact checkers about facts involving their own work, and it's like trying to nail Jell-O to a wall. The panel wrapped up shortly after my question, and a gentleman whose name I wish I'd caught came up to me and said, "Did you notice they answered your question the way that they complained political campaigns responded to them?" Oddly enough, I did.
I also noticed that last week Gallup reported "U.S. Distrust in Media Hits New High." But don't worry: Adair informed everyone at the panel yesterday morning that this is going to be the year everyone notes that fact checking took hold in the public consciousness. I wouldn't want to suggest a correlation between these two things, lest I again be accused of "false logic." But after yesterday's performance at the press club, perhaps you'll agree that more "fact checking" may not be what we need to restore faith in the fourth estate.
*Correction: I originally identified the Associated Press's Drinkard as the source of this quote.