What Macy’s Wrought
Of computers and the convergence of minds.
Jun 9, 2014, Vol. 19, No. 37 • By JOSHUA GELERNTER
In 1892, Louis Bamberger bought the stock of a bankrupt dry goods store and used it to open a store of his own in Newark, New Jersey. By 1928, it was one of the largest and most profitable businesses in the country: Bamberger’s department store had expanded from a rented storefront to a million square feet and 3,500 employees. For customers, it boasted a toll-free telephone number and a no-questions-asked, money-back guarantee; for employees, it offered job security and an on-site lending library. The eight-story flagship had its own radio station, and in 1924 it launched the parade that would become the Macy’s Thanksgiving Day Parade—an association sealed in 1929, when Bamberger decided to retire and sell his store to Macy’s.
Institute for Advanced Study, Princeton (Time & Life Pictures / Getty Images)
Bamberger was childless; so was his sister and business partner, Caroline. They decided to give a million dollars of the Macy’s sale profit to their longest-serving employees and use the rest to start a school of higher learning. For their school, the Bambergers had two requirements: It had to benefit the state of New Jersey, which had been good to them, and it had to be a refuge for Jewish students being turned away from the many institutions with Jewish quotas.
A New Jersey-based medical school seemed like just the ticket. The mathematician Oswald Veblen and education reformer Abraham Flexner caught wind of the idea and thought they had a better one: not a medical school but a school for advanced study in every field. They pitched their plan to the Bambergers, who were suitably impressed. In 1930, the Institute for Advanced Study was founded.
As the institute laid its cornerstone in Princeton, the Nazis were taking over in Germany—a catastrophe that worked out well for the institute: “The Nazis launched their purge of German universities in April 1933, and the exodus of mathematicians from Europe . . . began just as the Institute for Advanced Study opened its doors,” writes George Dyson. The institute quickly stocked up on the biggest names and best minds in European scholarship: Their first hire was Albert Einstein; their second was John von Neumann.
Everyone knows who Einstein was, but von Neumann might have been the greatest mind of the 20th century. He was born in Budapest to a secular Jewish family. By adolescence, he was fluent in five languages and had started working independently on “the deepest problems of abstract mathematics.” Said the physicist and Nobel laureate Eugene Wigner: “Whenever I talked with von Neumann, I always had the impression that only he was fully awake.” The mathematician Herman Goldstine once said that von Neumann’s lectures made complex problems so perfectly clear that students didn’t need to take notes. When von Neumann obtained his doctorate in 1926, his oral examination featured a single question: “Pray, who is the candidate’s tailor?” Von Neumann was also a snappy dresser.
Doctorate in hand, von Neumann spent seven years traveling back and forth between Hungary and Germany, busily revolutionizing mathematics. When the Nazis began firing Jewish professors in 1933, he crossed the Atlantic and started revolutionizing mathematics over here—although he never forgot where he came from, once remarking that he felt “the opposite of a nostalgia for Europe,” which, he explained, was “an infernal pesthole.” According to his wife, “His loathing for the Nazis was essentially boundless. They came and destroyed [a] perfect intellectual setting. In quick order they dispersed the concentration of minds and substituted concentration camps.”
As soon as he became an American citizen, in 1937, von Neumann applied for a commission in the Army; but, already in his 30s, he was rejected as too old. Instead, he was recruited by the Army, Navy, and Marine Corps to apply science to weaponry. Some of his work for the Navy is still classified. When the atomic bomb was being developed under a heavy veil of secrecy at Los Alamos, von Neumann had innumerable irons in the fire, working out problems of nuclear fission and inventing the digital computer.
Those irons, hammered into shape at the Institute for Advanced Study, are the subject of Turing’s Cathedral. Ostensibly, this is a book about the invention of the computer; but it’s really a love letter, of sorts, to the men America saved from Hitler, who, in return, made America the world’s scientific superpower. Von Neumann stars as the man whose design for a digital computer underlies every computer in the world today. He’s joined by an impressive cast of supporting characters—Einstein, Wigner, Edward Teller, Kurt Gödel, Stanislaw Ulam, Wolfgang Pauli—all refugees from Nazi Europe, all based at the Institute for Advanced Study, and all accompanied by fascinating back stories.
The one person not featured here, at least not heavily, is the title character, Alan Turing. It was Turing who laid out a precise definition of a digital computer and what it would be able to do: His ideas created the field of computer science. He brought his ideas to the institute, where he worked briefly with von Neumann before the war pulled him home to England. Turing made a decisive contribution to the war effort by leading the fight to crack the Enigma code. Without Turing, World War II might well have turned out differently.
Without question, Alan Turing was one of the great men of the 20th century, a tragic genius who committed suicide not long after being chemically castrated as legal punishment for homosexuality. Arguments rage about who deserves credit as the true inventor of the digital computer: Turing, who laid the theoretical groundwork, or von Neumann, who laid the practical groundwork. The argument rages a little more fiercely when Turing’s status as a gay martyr and von Neumann’s as an anti-Communist bomb maker are factored in. George Dyson doesn’t take sides, but Turing gets only one chapter. The computer may have been Turing’s cathedral, but von Neumann figured out how to put it together—and then built it. That’s the story here, interspersed with a lot of pithy anecdotes.
Indeed, it’s the side stories that dominate, touching on subjects from the Lenni Lenape Indians of prehistoric Princeton to von Neumann’s brief and abortive career as a skier. They’re included because they’re entertaining, and because Dyson clearly savors background detail. He writes well, and the tangents are fun to read. But the actual inventing-the-computer material presents a problem: the author alternates between compelling narrative and technical passages that are too long and too dry. If you understand the inner workings of a computer, the technical parts will be interesting. If you do not, they’ll be incomprehensible.
That’s a flaw, and you’ve been warned. But don’t let it put you off the rest, where the good outweighs the confusing.
Joshua Gelernter is a writer in Connecticut.