Once Again, PolitiFact Struggles to Explain Data Showing They Treat GOP Unfairly
1:42 PM, May 31, 2013 • By MARK HEMINGWAY
The Center for Media and Public Affairs (CMPA) at George Mason University is out with a new study on media fact checkers, and unsurprisingly, their results suggest that PolitiFact has it out for Republicans. Dylan Byers at Politico summarized CMPA's findings:
PolitiFact head Bill Adair was compelled to respond to the new report and sent this statement to Politico:
Adair's statement is lawyerly and bordering on dishonest. CMPA did not draw their own conclusions—they simply tallied up all of PolitiFact's ratings during a specific time period to get a representative sample. All CMPA did was present relevant data; they most certainly did not "draw their own conclusions." (You can read the full release on the study by CMPA here. It's very straightforward.) Ironically, the big problem with PolitiFact is that they claim to make pseudoscientific judgments about the "facts" and frequently end up drawing their own erroneous conclusions. It's both telling and unsurprising that they'd be confronted with raw data about their organization from an academic study and dispute it as being slanted.
This is also not the first academic study that concluded PolitiFact might be putting their thumb on the scale when it comes to selecting and evaluating political statements. Last year, during the height of campaign season, the CMPA tallied up PolitiFact ratings. That study also showed PolitiFact tends to be much harder on Republicans:
Further, the University of Minnesota School of Public Affairs looked at more than 500 PolitiFact stories from January 2010 through January 2011. Their conclusion:
So contrary to Adair's ungracious insinuation, CMPA's most recent results aren't a selective anomaly. The data really do seem to suggest that PolitiFact doesn't like Republicans. (On that point, CMPA also tallied up Washington Post fact checker Glenn Kessler's rulings and found him to be much more even-handed than PolitiFact when it came to evaluating Republicans and Democrats.)
In fact, at a panel discussion on media fact checkers in Washington last year I asked Bill Adair point blank why multiple studies keep showing PolitiFact is unduly hard on Republicans. Here was his response:
In other words, don't pay any attention to the actual data on PolitiFact, because Bill Adair is content to survey what he hears at cocktail parties. This is a self-serving and inadequate response, to put it mildly. And to make matters worse, it's not just that PolitiFact's rulings appear tremendously biased when looked at in aggregate. As I noted last year, "I've written several articles citing chapter and verse of why PolitiFact rulings are either tendentious or flat out erroneous, and PolitiFact is downright unresponsive beyond dismissive references to 'partisans' at THE WEEKLY STANDARD." At the aforementioned panel discussion last year, Bill Adair was crowing about his organization's coverage of the welfare reform debate during last year's presidential campaign. PolitiFact's coverage of the issue amounted to a repeated and sustained attack on the credibility of the Romney campaign. As I laid out in detail last fall, PolitiFact got the facts badly, badly wrong. (And it wasn't just me who highlighted this—compare the stark difference in coverage of the issue between the Washington Post fact checker Glenn Kessler, who basically got it right, and PolitiFact's gross misunderstanding of how the law works.)
I suspect that deep down Adair knows how unconvincing he is, but that he's just hoping he can kick up enough dirt to distract people from the fact that he's incapable of mounting a real defense of PolitiFact's credibility. Sure enough, the liberal website Salon headlined its write-up of CMPA's results "Study: Republicans are 'the less credible party,'" as though PolitiFact's editorial judgment is sacred. On the opposite end of the spectrum, Kurt Eichenwald of Vanity Fair calls CMPA's study "flawed" and "statistically silly." Unlike the ridiculous Salon piece, Eichenwald's objections seem to be rooted in both a healthy skepticism of PolitiFact and the limits of social science to draw objective conclusions about subjective political statements. However, Eichenwald seems to misunderstand what the study is about. He assumes that the study is meant to reveal which political party is more truthful:
Eichenwald's heart is in the right place, but he's looking at it through the wrong end of the telescope—this is media analysis, not political science. CMPA's study is really meant to tell us about PolitiFact's methodology, not the veracity of politicians. (Though his confusion is somewhat understandable considering outlets such as Salon are misrepresenting the study to grind an ideological axe.) As for Eichenwald's statistical concerns, they might be considerably alleviated by knowing that this is not the first study of its kind to produce the same result, and that tallying up PolitiFact's judgments is an ongoing project of the CMPA.
The bottom line is this: PolitiFact consistently calls Republicans liars at two or three times the rate of Democrats, and its individual judgments are regularly erroneous in ways that make it hard not to suspect the organization has a serious problem with political bias. Luckily for PolitiFact, fact checking the fact checkers is often an exceedingly complicated thing to do. And it doesn't help that PolitiFact and media partisans dismiss and misrepresent data that would prompt a responsible organization to engage in a badly needed self-examination.