Getting Rear-Ended by the Law
Red-light cameras actually cause an increase in rear-end accidents. The pro-camera forces know this and are trying to keep you from seeing the data. Part 4 in a series.
11:00 PM, Apr 3, 2002 • By MATT LABASH
The capstone of Retting's work, however, is a pair of reports known as "The Oxnard studies." Monitoring the effects of red-light cameras in Oxnard, California, in 1997, Retting compared camera and non-camera sites. He concluded that the number of red-light-running incidents was reduced at nine camera sites by anywhere from 22 to 62 percent--a huge shot in the arm to camera boosters. The only hitch was that, during the same period, his three non-camera sites performed even better, with decreases in violations averaging 10 percent greater than at the camera sites.
For many researchers, this might seem problematic. But not for Retting, who theorized that the "statistically insignificant" difference between the sites was due to "spillover effect"--that is, the red-light cameras caused reductions at non-camera sites. Score one for automated enforcement! The fact that the non-camera intersections outperformed the camera intersections for what might have been any variety of reasons (public education, police presence at other intersections, etc.) didn't alter Retting's conclusion. He declared victory and left town, saying that further study of violations in Oxnard would be pointless, since publicity over the state's more than doubling the fine for running a red light--from $104 to $270--would influence the results.
In April 2001, Retting introduced the second of his Oxnard studies, this time dealing with crash effects at red-light camera intersections. As could be expected, Retting concluded that red-light cameras "reduce the risk of motor vehicle crashes, particularly injury crashes." In fact, he extrapolated, even though cameras were used on only 2 percent of the approaches to the city's intersections, there were crash reductions citywide. (More spillover effect!)
But one doesn't have to review the report all that closely to uncover significant problems. First, Retting admits that the crash data he studied "did not contain sufficient detail to identify crashes that were specifically [caused by] red light running." Some might consider that a fatal shortcoming in a study that purports to examine red-light-running crashes. Next, he discloses that he didn't study crashes at the 11 red-light-camera intersections, but rather at all intersections, since "prior research documents" a large "spillover effect." (The prior research, of course, being his.)
Most interesting, Retting picked three control cities miles away from Oxnard that were in no danger of getting splattered by "spillover effect." While a table in Retting's report shows crashes at all signalized intersections in Oxnard decreasing 5.4 percent, two of his non-camera-enforced control cities also saw crashes decline, with camera-free Santa Barbara decreasing by 10.2 percent. How does Retting explain this? He doesn't. Perhaps most duplicitously, he claims that during the time of the study, "no other comprehensive traffic safety programs" were implemented in Oxnard that could account for the reductions. Unless you count California more than doubling its penalty for running red lights (which gave Retting sufficient cause to discontinue his first study).
But the bad news for Retting doesn't end there. Curious about some of Retting's crash conclusions, the National Motorists Association's Jim Kadison secured accident data for the red-light-camera intersections Retting used in his latest Oxnard report. Retting had estimated that the use of red-light cameras had resulted in a tiny 3 percent increase in rear-enders at all signalized intersections. But after expanding the definition of an intersection to include 100 feet into the approaches, where rear-end accidents would logically occur, Kadison found that during the time of Retting's study, rear-end crashes at red-light camera intersections increased from 18 (before installation) to 156, for a total rear-end accident increase of 767 percent.
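For readers who want to check Kadison's figure, the percentage increase follows from simple arithmetic on the before-and-after crash counts reported above (the variable names here are just illustrative):

```python
# Rear-end crashes at Oxnard's red-light-camera intersections during the
# study period, using Kadison's expanded definition of an intersection
# (extending 100 feet into the approaches).
crashes_before = 18   # before camera installation
crashes_after = 156   # during Retting's study period

# Standard percentage-change formula: (new - old) / old * 100
percent_increase = (crashes_after - crashes_before) / crashes_before * 100
print(round(percent_increase))  # 767
```

Rounded to the nearest whole number, that is the 767 percent increase Kadison reported.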
When I called Retting to needle him about the inconsistencies in his studies, he grew peevish. "The studies speak for themselves. . . . You can look at it any way you like, I have nothing to apologize for." Somehow, he seemed to discount the criticism, since I was not at his "professional level" and had no grasp of logistic regression models. "If you don't have the ability to appreciate the logistic regression model," he condescended, "it's really a waste of time." Perhaps so. But I can appreciate Greg Mauz's assessment of Retting's reports: "Swiss cheese doesn't have as many holes."