The nuts and bolts and cables of the Internet.
Dec 17, 2012, Vol. 18, No. 14 • By JAMES BOLOGNA
Recently, Google unveiled a new feature on its website: the ability to tour, via “street view,” its Lenoir, North Carolina, data center, one of the company’s many heavily guarded campuses. Google is attempting, at least in part, to lift the veil of secrecy for which it has been much maligned and to show the world one of the physical strongholds where our personal data are stored. Might we trust the behemoth more if we can catch a glimpse of it from the inside?
Google Data Center
The Internet is largely thought of as a nebulous cloud of information, floating around us everywhere but existing nowhere: light, ephemeral, omnipresent. But our general concept of the web couldn’t be further from the truth. As Andrew Blum explains here, the Internet is very much a solid structure, grounded in the need for electricity, undersea crossings, and transcontinental fiber-optic cables. The Internet hangs on poles outside our homes, it slithers underneath our office buildings, and it gets converted into wireless signals by cell towers disguised as pine trees.
When things go wrong—natural disasters such as Hurricane Sandy, the fallen tree in Ohio that triggered the 2003 East Coast power outage, or, as Blum writes, “a seventy-five-year-old grandmother in the country of Georgia slicing through a buried fiber-optic cable with a shovel, knocking Armenia offline”—the supposedly invincible Internet stops working. Not quite an information superhighway, the Internet is more like a network of airports (which are vulnerable to weather conditions), where bits of information are shuttled to and from hubs all over the world.
Fortunately, instead of a mind-numbing recitation of technical statistics (although he certainly can speak that language), Blum opts for a compelling tale that could be considered travel literature. Starting in his own neighborhood, he reaches out to network engineers and far-flung experts in his journey to visit the physical place that is “the Internet.” He drives along the Jersey Shore in an attempt to locate where the underwater cable from Europe rises out of the Atlantic. He visits Portugal to see where Europe and Africa really connect. And he enters the subterranean world of New York’s utility workers, men in hardhats who lay mile upon mile of optical fiber under the metropolis each night.
One of his stops is in Ashburn, Virginia, in the suburbs of Washington, D.C., about three miles from Dulles airport, where the massive tubes of the Internet literally come out of the ground. This network warehouse farm—like others Blum visits in Los Angeles, New York, Oregon, London, and Frankfurt—is where the routing (and “peering”) of global Internet traffic takes place. As he explains, it is in facilities such as these that the zeros and ones of our Netflix streams, cat-meme emails, and Honey Boo Boo tweets are redirected either to the networks feeding our computers or to giant data centers (such as Google’s in Lenoir) containing the servers that house websites.
In these giant routing warehouses, such as the ones in Ashburn, with no windows, few doors, heavy security (airlocks, bulletproof glass, biometric scanners), and lots of air conditioning, sit row upon row of racks filled with the same sort of routers you might rent from Comcast or Verizon, but on steroids. These warehouses, strategically located all over the planet, are places where companies lease space for their routers and plug their networks into other networks (a network of networks!): Verizon’s routers sit next to Amazon’s routers and Comcast’s routers and Netflix’s routers, and so on. In a sea of yellow and blue wires, a handful of network engineers, dwarfed by the colossal heat-throwing machines, continually string new networks together. Cables upon cables converge into one final box with blinking lights, the last box before the mother lode of tubes exits the building and enters the soil.
That tube leads to thousands of other tubes just like it, transporting all our web traffic to other parts of the world. But where, exactly, are our pictures, videos, e-books, and emails stored? When Blum attempts to find out—by “touring” one of Google’s data centers from the outside—he’s guided around nondescript Silicon Valley buildings encircled by layers of barbed-wire fencing, trailed by handlers unwilling to answer even basic questions.
But the street view of the Lenoir campus is a baby step toward Internet transparency. It matters because more and more of our digital lives are moving into these web-based data centers—pictures on Facebook, files in Google Drive, music on Apple’s iCloud, books and entire libraries in electronic formats—and these campuses, some totaling more square footage than six U.S. Capitol buildings, grow larger and more energy-hungry each year. (Two percent of global power consumption is already attributed to data centers, and that figure is growing by 12 percent annually.)