On a cold February night in 2009, a turboprop commuter plane out of Newark was only a few miles from Buffalo when the “stick shaker” suddenly triggered. The crew had lowered the landing gear and extended the flaps, the plane had slowed to 135 knots, and it threatened to enter an aerodynamic stall. (A stall is not when the engines stop working, but when the wings can no longer generate enough lift.)
The automatic pilot disengaged, as it should have, and the pilot seized his shuddering control yoke, dragging it back to raise the nose and gain altitude—a fatal mistake. The plane needed speed, not height. Dipping the nose might have sacrificed a few hundred feet but supplied the velocity to recover. The pilot’s reaction only slowed the plane further: The automatic stall-avoidance system activated and tried to push the yoke forward, but the pilot fought it and guaranteed the stall. It took only a few seconds for the plane to roll right and left, pitch up and then down for good, and plummet into a suburban home, killing 50 people.
When the National Transportation Safety Board (NTSB) investigated the crash, it identified another cause besides the pilot’s immediate response to the trouble. In the minutes leading up to it, the pilot and copilot showed “a significant breakdown of their monitoring responsibilities” and missed “explicit cues” of impending danger. At the critical moment the pilot reacted incorrectly, but in the preceding 30 seconds both pilot and copilot had failed to watch the flight instruments showing airspeed and pitch attitude. The NTSB simulation of the flight’s final moments—which you can download at the Wikipedia page about the crash—indicates that, as the plane rapidly lost speed on final approach, they simply didn’t notice. Not until the stick shaker activated and the automatic pilot disengaged did the crew grasp the problem, and the sudden alert startled the pilot. To prevent future disasters like this one, the NTSB recommended tighter monitoring procedures during flights and more training for pilots in monitoring skills.
Nicholas Carr takes that 2009 crash as a prime exhibit of a perilous trend. In previous books—The Big Switch (2008) and The Shallows (2010)—he examined digital innovations often hailed and hyped in the tech world and sounded a contrary judgment. (The title of his widely read 2008 Atlantic essay, “Is Google Making Us Stupid?” indicates his angle.)
The Glass Cage applies a similar skepticism to a broader development. To Carr, a deeper cause was to blame for the slack supervision in the cockpit that night, and it applies more widely than we realize. It is simple: The crew relied too much on the automatic pilot. They relaxed their awareness because the automatic pilot handled so many things for them so well, as it does on every flight. Computers in aviation have become so advanced, in fact, that pilots typically control a plane only briefly, for a minute or two on takeoff and landing. Computers maintain speed, stability, and direction, scan for nearby aircraft, adjust cabin pressure, and alter flight paths. Pilots don’t turn a plane; they tell a computer to do it. Yet for all the gains in efficiency and cost (cockpits no longer need navigators and radio operators), automation carries a paradoxical risk: The better it works, the more the human operator’s effort slackens and the more his skills decline.
This is the warning of The Glass Cage. Carr emphasizes the airline industry because, while pilots seem like an extreme case, they “have been out in front of a wave that is now engulfing us.” The cockpit reveals dramatically a deterioration that subtly affects us all, more or less: When you use a calculator too much, arithmetical skills slip; a GPS dulls your sense of direction; computer-aided design software mars the hand’s ability to draw; digital cameras loosen the “discipline of perception.” If we don’t exercise motor skills, they atrophy. Moreover, our concentration changes. As computers increasingly shoulder our labor, we naturally slide into automation complacency (“when a computer lulls us into a false sense of security”) and automation bias (“when people give undue weight to the information coming through their monitors”).
It’s a common complaint, as mathematics teachers who object to calculators and English teachers who deplore spell-check well know, and Carr’s summations come off as strong and precise, but familiar:
As we grow more reliant on applications and algorithms, we become less capable of acting without their aid—we experience skill tunneling as well as attentional tunneling. That makes the software more indispensable still. Automation breeds automation.