Once Again, PolitiFact Struggles to Explain Data Showing They Treat GOP Unfairly

1:42 PM, May 31, 2013 • By MARK HEMINGWAY

The Center for Media and Public Affairs (CMPA) at George Mason University is out with a new study on media fact checkers, and unsurprisingly, their results suggest that PolitiFact has it out for Republicans. Dylan Byers at Politico summarized CMPA's findings:

The fact-checking organization PolitiFact has found Republicans to be less trustworthy than Democrats, according to a new study.

Fifty-two percent of Republican claims reviewed by the Tampa Bay Times fact-checking operation were rated "mostly false," "false" or "pants on fire," versus just 24 percent of Democratic statements, according to George Mason University's Center for Media and Public Affairs. By the same token, 54 percent of Democratic statements were rated as "mostly true" or "true," compared to just 18 percent of Republican statements.

The CMPA looked at 100 statements -- 46 by Democrats, 54 by Republicans -- that were fact-checked by PolitiFact between January 20 and May 22.

PolitiFact head Bill Adair was compelled to respond to the new report and sent this statement to Politico:

PolitiFact rates the factual accuracy of specific claims; we do not seek to measure which party tells more falsehoods.

The authors of this press release seem to have counted up a small number of our Truth-O-Meter ratings over a few months, and then drew their own conclusions.

We've rated more than 7,000 statements since we started in 2007. We are journalists, not social scientists. We select statements to fact-check based on our news judgment -- whether a statement is timely, provocative, whether it's been repeated and whether readers would wonder if it is true.

Adair's statement is lawyerly, bordering on dishonest. CMPA did not "draw their own conclusions"; they simply tallied all of PolitiFact's ratings during a specific time period to get a representative sample, and presented the relevant data. (You can read CMPA's full release on the study here. It's very straightforward.) Ironically, the big problem with PolitiFact is that it claims to make pseudoscientific judgments about the "facts" and frequently ends up drawing its own erroneous conclusions. It's both telling and unsurprising that, when confronted with raw data about their organization from an academic study, they dispute it as slanted.

This is also not the first academic study that concluded PolitiFact might be putting their thumb on the scale when it comes to selecting and evaluating political statements. Last year, during the height of campaign season, the CMPA tallied up PolitiFact ratings. That study also showed PolitiFact tends to be much harder on Republicans:

The study examined 98 election-related statements by the presidential candidates, their surrogates, and campaign ads fact-checked by PolitiFact.com from June 1 to September 11. Major findings:

PolitiFact checked the assertions of Democrats slightly more often than those of Republicans (54% vs. 46% of all statements).

However, PolitiFact rated Democratic statements as "mostly true" or "entirely true" about twice as often as Republican statements -- 42% true ratings for Democrats vs. 20% for Republicans.

Conversely, statements by Republicans were rated as entirely false about twice as often as Democratic statements -- 29% false ratings for GOP statements vs. 15% false ratings for Democrats. (This includes categories labeled "false" and "pants on fire.")

Further, the University of Minnesota School of Public Affairs looked at more than 500 PolitiFact stories from January 2010 through January 2011. Their conclusion:

Current and former Republican officeholders have been assigned substantially harsher grades by the news organization than their Democratic counterparts. In total, 74 of the 98 statements by political figures judged 'false' or 'pants on fire' over the last 13 months were given to Republicans, or 76 percent, compared to just 22 statements for Democrats (22 percent).

So contrary to Adair's ungracious insinuation, CMPA's most recent results aren't a selective anomaly. The data really do seem to suggest that PolitiFact doesn't like Republicans. (On that point, CMPA also tallied up Washington Post fact checker Glenn Kessler's rulings and found him to be much more even-handed than PolitiFact when it came to evaluating Republicans and Democrats.)

In fact, at a panel discussion on media fact checkers in Washington last year, I asked Bill Adair point-blank why multiple studies keep showing PolitiFact is unduly hard on Republicans. Here was his response:

You know I don't find the "hey, you gave my team..." [complaints persuasive] because I hear it from both sides. I was at a party over the summer and a guy came up to me and said, "Hey, I think that, I really think PolitiFact Virginia has been unfair, they've been very biased against Tim Kaine," the Democrat. And then like a week or so later the Virginia Republican party came out and said PolitiFact Virginia is unfairly targeting the Republicans. And you know, I think the nature of what we do is disruptive to the status quo. I think we are easier to analyze because of our unique structure, but I don't find the numerical count analysis to be particularly persuasive.

In other words, don't pay any attention to the actual data on PolitiFact, because Bill Adair is content to survey what he hears at cocktail parties. This is a self-serving and inadequate response, to put it mildly. And to make matters worse, it's not just that PolitiFact's rulings appear tremendously biased when looked at in aggregate. As I noted last year, "I've written several articles citing chapter and verse of why PolitiFact rulings are either tendentious or flat out erroneous, and PolitiFact is downright unresponsive beyond dismissive references to 'partisans' at THE WEEKLY STANDARD." At the aforementioned panel discussion last year, Bill Adair was crowing about his organization's coverage of the welfare reform debate during the presidential campaign. PolitiFact's coverage of the issue amounted to a repeated and sustained attack on the credibility of the Romney campaign. As I laid out in detail last fall, PolitiFact got the facts badly, badly wrong. (And it wasn't just me who highlighted this; compare the stark difference between the coverage of Washington Post fact checker Glenn Kessler, who basically got it right, and PolitiFact's gross misunderstanding of how the law works.)

I suspect that deep down Adair knows how unconvincing he is, but he's hoping he can kick up enough dirt to distract people from the fact that he's incapable of mounting a real defense of PolitiFact's credibility. Sure enough, the liberal website Salon headlined its write-up of CMPA's results "Study: Republicans are 'the less credible party,'" as though PolitiFact's editorial judgment were sacred. On the opposite end of the spectrum, Kurt Eichenwald of Vanity Fair calls CMPA's study "flawed" and "statistically silly." Unlike the ridiculous Salon piece, Eichenwald's reasons for doubt seem to be rooted in both a healthy skepticism of PolitiFact and the limits of social science to draw objective conclusions about subjective political statements. However, Eichenwald seems to misunderstand what the study is about. He assumes that the study is meant to reveal which political party is more truthful:

PolitiFact's editor Bill Adair says that the comments are selected based on the group's news judgment. That's fine for examining the issues of the day, but it hardly lends itself to statistical analysis. If someone's subjective opinion determines the data set, the statistics are flawed from the get-go.

Then another level of subjectivity is employed: PolitiFact’s judgment on truthfulness. There has been plenty of criticism—from both the left and the right—of PolitiFact’s judgments. While that might be political sour grapes, it means that the group’s determinations are not objectively accepted fact. So now you have two subjective elements—the choice of statements to review and the determination of their accuracy.

Eichenwald's heart is in the right place, but he's looking through the wrong end of the telescope: this is media analysis, not political science. CMPA's study is really meant to tell us about PolitiFact's methodology, not the veracity of politicians. (Though his confusion is somewhat understandable, considering that outlets such as Salon are misrepresenting the study to grind an ideological axe.) As for Eichenwald's statistical concerns, they might be considerably alleviated by knowing that this is not the first study of its kind to produce the same result, and that tallying up PolitiFact's judgments is an ongoing project of the CMPA.

The bottom line is this: PolitiFact consistently calls Republicans liars at two or three times the rate of Democrats, and its individual judgments are regularly erroneous in ways that make it hard not to suspect the organization has a serious problem with political bias. Luckily for PolitiFact, fact-checking the fact checkers is often an exceedingly complicated thing to do. And it doesn't help that PolitiFact and media partisans dismiss and misrepresent data that would prompt a responsible organization to engage in badly needed self-examination.
