January 23, 2008: Features
Alumni who changed America, and the world.
“An astonishing ratio of mind to mass”
By George F. Will *68
I am writing this wee tribute to the greatest Princetonian on a morning that began, as most of my mornings do, with a predawn walk accompanied by my dog. His name is Madison. I am wearing my favorite necktie. It is blue, with silver profiles of James Madison. Later this morning I shall work on a book I am writing. It is to be titled The Madisonian Persuasion. I am not one who needs to be persuaded that Madison merits being ranked as Princeton’s greatest gift to the nation.
Before I turned to journalism — or before I sank to journalism, as my father, a professor of philosophy, put it — I was, briefly, a professor of political philosophy. I cheerfully accepted that I never would be nearly as original and consequential as the philosophic Madison had been. Then I became a newspaper columnist, a role in which I always have known that I could never be nearly as original and consequential as was Madison, America’s foremost columnist.
The Federalist Papers, of which Madison wrote the two most important, were, of course, columns written to advance the ratification of the Constitution — in whose drafting Madison was the most subtle participant. If a student of American thought fully unpacks the premises and implications of Federalist 10 and 51, that student comprehends not only this nation’s political regime but also the Madisonian revolution in democratic theory.
Before Madison, almost all political philosophers who thought about democracy thought that if — a huge “if,” for most of them — democracy were to be feasible, it would be so only in a small, face-to-face society, such as Pericles’ Athens or Rousseau’s Geneva. This was supposedly true because the bane of democracies was thought to be self-interested factions, and only a small society could be sufficiently homogeneous to avoid ruinous factions.
But America in the second half of the 18th century, although small compared with what it would become, was in size already a far cry from a Greek polis. Besides, Americans had spacious aspirations. A small nation? They were having none of that. At a time when 80 percent of them lived on a thin sliver of the eastern fringe of the continent, within 20 miles of Atlantic tidewater, what did they call their political assembly? The Continental Congress. They knew, more or less, where they were going: California.
Madison understood the need for philosophic underpinnings for an “extensive republic,” a phrase that seemed oxymoronic to others. He can be said to have had a political catechism, which went approximately like this:
What is the worst outcome of politics? Tyranny.
To what form of tyranny are democracies susceptible? The tyranny of a single, durable majority.
How can this threat be minimized? By a saving multiplicity of factions, so that majorities will be unstable and transitory.
Hence in Federalist 10 he wrote that “the first object of government” is “the protection of different and unequal faculties of acquiring property.” From these differences arise different factions in their freedom-preserving multiplicity.
Having said in Federalist 10 that “neither moral nor religious motives can be relied on as an adequate control” of factions, Madison turned, in Federalist 51, to the institutional controls established in the Constitution — “this policy of supplying, by opposite and rival interests, the defect of better motives”:
“Ambition must be made to counteract ambition. ... It may be a reflection on human nature, that such devices should be necessary to control the abuses of government. But what is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. ... You must first enable the government to control the governed; and in the next place, oblige it to control itself.” In a masterpiece of understatement, Madison said, “Enlightened statesmen will not always be at the helm.” No kidding. And Madison did not mince words regarding those about whom no one nowadays dares to say a discouraging word — “the people.”
“There is,” he said, “a degree of depravity in mankind which requires a certain degree of circumspection and distrust.” Let the record show that once we had a president who had spoken of the voters’ depravity. Those were the days. Madison did qualify his astringent judgment about the people by acknowledging that there “are other qualities in human nature which justify a certain portion of esteem and confidence.” But notice the carefully measured concession to the public’s sensibilities: “a certain portion,” indeed.
In 1976, the nation’s bicentennial, the presidency was won by someone who, pandering in the modern manner, promised to deliver government “as good as the people themselves.” One can imagine Madison muttering, “Good grief!”
In Washington, the seat of the government that he did so much to design, there is no monument to Madison comparable to the glistening marble temples honoring Jefferson and Lincoln. There is, however, a splendid Madison Building, which is part of the Library of Congress. It was said of the 5-foot-4 Madison that he contained an astonishing ratio of mind to mass. So it is altogether right that Madison, physically the smallest of the Founders, is honored in his nation’s capital by a repository of learning.
My next dog, if female, will be named Dolley.
Columnist George F. Will *68 won the Pulitzer Prize for commentary in 1977. In addition to writing his regular column for Newsweek and a column that appears in 450 newspapers, Will is a regular contributor on ABC’s This Week on Sunday mornings.
By Teresa Riordan
As a child in England, Alan Mathison Turing was cheeky and tantrum-prone. Often ink-smudged and reeking of chemical experiments, he was not so much disobedient as perpetually bewildered that he was expected to conform to society’s expectations.
Turing, born in 1912, would turn out to be one of the most brilliant minds of the century — conceiving, as he did, of the modern computer decades before it came into existence. But his independent character was a source of endless exasperation for his teachers. He was impatient, his work was sloppy, and since he would expend effort only on those subjects that interested him, he nearly failed English and Latin. “He is the kind of boy who is bound to be rather a problem in any kind of school,” sighed his boarding-school headmaster in a progress report.
As his brother John put it in Andrew Hodges’ masterful biography, Alan Turing: The Enigma: “You could take a safe bet that if you ventured on some self-evident proposition, as, for example, that the earth was round, Alan would produce a great deal of incontrovertible evidence to prove that it was almost certainly flat, ovular, or much the same shape as a Siamese cat which had been boiled for 15 minutes at a temperature of 1,000 degrees Centigrade.”
Turing went on to study mathematics at King’s College, Cambridge, and, after a long-distance run one afternoon, he lay down in a meadow to ponder. By the time he got up to run back, he had worked out a new definition of computability. Without consulting his mentor, or anyone else for that matter, he sat down and wrote “On Computable Numbers,” which conceived of abstract “machines” that function much like modern computers. The year was 1936. At the time, of course, actual computers did not exist. Turing was 23.
It turned out that someone older and more respectful of precedent — Princeton professor Alonzo Church ’24 *27 — recently had come up with a definition of computability that was mathematically equivalent to Turing’s. But Turing’s approach, more concretely definitive than Church’s, was the more original and enduring.
Worried that Turing’s propensity to work in isolation might stunt his intellectual growth, his adviser arranged for him to study at Princeton with Church. Arriving on campus, Turing did not cut much of a profile. Few attended his first talk at the Mathematics Club in Fine Hall. When his seminal paper was finally published the next month, few fathomed its revolutionary importance; only two people asked for reprints.
Turing started to build a working model of his imaginary computer at Princeton, using the physics department machine shop to fashion some relays for switches. The fruits of this effort are unclear. He was far better with abstract machines than with real ones. When a friend sold Turing his 1931 V8 Ford and taught him to drive, Turing accidentally threw the car into reverse, nearly drowning both of them in Lake Carnegie.
The great mathematician John von Neumann was one of the few at the time who appreciated Turing’s originality. Von Neumann was a worldly sophisticate whose Princeton household was a glittering social hive. Turing was a loner, unapologetically homosexual, maddeningly oblivious to convention.
Von Neumann offered Turing a position at the Institute for Advanced Study to continue his mathematical research. But Turing was homesick. Doctorate in hand after two years of study, he returned to Britain in 1938.
As World War II began, Turing joined the British cipher group at Bletchley Park. He rode his bicycle to work wearing his gas mask (it alleviated his hay fever), drank from a tea mug that he had chained to the radiator, and ultimately played a key role in cracking the Nazis’ seemingly impenetrable Enigma code.
After the war, Turing obsessed over the idea of an artificial brain, envisioning a time when machines would simulate human thought. He went on to invent the now well-known Turing test, which holds that a computer can be said to be intelligent if it can fool a questioner into thinking that it is human.
“His life was full of paradox, not least that he, of all people original and socially nonconforming, should be the foremost advocate of the view that the mind was purely mechanical,” wrote Turing’s biographer Hodges, in an essay reflecting on his legacy.
A brief, injudicious affair with a 19-year-old landed Turing in jail on charges of gross indecency; he was forced to take estrogen treatments to “cure” him of his homosexuality. Two years later, in June 1954, he was found dead, a half-eaten apple at his bedside and cyanide nearby. Was it a suicide? An accident? An assassination by government agents who feared a homosexual’s past access to state secrets? The cause of Turing’s death — like the origin of his genius — remains an enigma.
Teresa Riordan, a senior writer for the School of Engineering and Applied Science, has written extensively on innovation and is a former patents columnist for The New York Times.
After serving as Princeton’s 13th president, America’s 28th president achieved a host of domestic reforms, including the creation of the Federal Reserve System, the income tax, and the Federal Trade Commission, which ushered in a new era of government regulation. He led the United States through World War I; his Fourteen Points remain an outstanding liberal expression of international relations, though he failed to gain U.S. entry into the League of Nations.
The author of “A Theory of Justice” and other works, Rawls has been called one of the most important political philosophers of the 20th century and the most influential proponent of liberalism since John Stuart Mill. In an essay written shortly after his death in 2002 (see www.princeton.edu/paw/archive_new/PAW02-03/08-0129/features3.html), former Princeton provost Amy Gutmann, a student of Rawls’ at Harvard, described how Rawls devoted much of his life to studying one urgent question: “What, he asked, does justice require of individuals and institutions, and how can we realize it?” His work was an influential force in discussions of civil rights, educational opportunity, and other social issues.
The only person ever to have won two Nobel Prizes in physics, he co-invented the transistor, which transformed the electronics industry, and developed, with two colleagues, the first successful explanation of superconductivity, which has helped make possible MRIs, CAT scans, and mobile phones, among many other things.
American diplomat and ambassador to the Soviet Union, he was largely responsible for U.S. policy during the Cold War. Kennan framed the “containment” policy intended to check Soviet expansion in Europe and Asia. From that policy stemmed the Truman Doctrine, which marked the beginning of American intervention to prevent countries from being drawn into the Soviet sphere, and the Marshall Plan, which helped rebuild Europe after World War II.
Physician and statesman, Rush signed the Declaration of Independence, was an early opponent of slavery and capital punishment, helped prepare the Lewis and Clark expedition, and is considered America’s first psychiatrist for his groundbreaking research into mental illness and addiction. Rush believed, for example, that mental illness was a disease of the mind, and not a “possession of demons.”
Author of The Great Gatsby and one of the greatest American writers of the 20th century, Fitzgerald defined the Jazz Age for generations of readers and embodied many of its ideals and excesses.
An economist and educator, Shultz served as secretary of labor, treasury, and state in three Republican administrations, oversaw American foreign policy at the end of the Cold War, and advocated pre-emptive attacks on terrorist organizations.
Secretary of state for most of President Eisenhower’s tenure during some of the darkest days of the Cold War, Dulles was a forceful advocate in support of NATO and against Soviet expansion, which led him to advocate the nuclear doctrine of massive retaliation and to recommend the coup that restored the shah of Iran to his throne.
Becker won the Nobel Prize in economics for “having extended the domain of economic theory to aspects of human behavior which previously had been dealt with — if at all — by other social-science disciplines such as sociology, demography, and criminology.” His work has been influential in setting policy relating to labor, education, crime, discrimination, and other issues.
By Todd S. Purdum ’82
He began his songwriting career in the Triangle Club with a “How to Succeed”-like send-up of big business at an advertising agency called S.E.L. & L., and he ended it as one of the most successful creative forces in the history of noncommercial television. In between, Jeffrey Arnold Moss ’63, the founding head writer of Sesame Street, racked up 14 Emmys and an Academy Award nomination, and helped teach hundreds of millions of children worldwide a good bit about what it means to be human.
It was Moss who made one of Jim Henson’s scruffy blue Muppets, then known only as Boggle Eyes, into that indelible chocolate-chip-aholic, Cookie Monster, and who wrote Oscar the Grouch’s grumpy anthem: “Oh, I love trash! Anything dirty or dingy or dusty. Anything ragged or rotten or rusty. Yes, I love trash.” It was he whose humble hymn to a bathtub toy, “Rubber Duckie,” became a million-selling single that wound up in the top 20 on the Billboard pop music charts in 1970–71. If Henson gave the Muppets their form, and Frank Oz provided so many of their voices, Moss helped give them their souls.
So it should come as no surprise that a panel dominated by parents of post-baby boomers saw Moss, who died of colon cancer at 56 in 1998, as a must for its list of Princeton immortals. Precisely because he wrote the words that cultural touchstones still speak and sing, the panel saw his work as more lasting — and thus more influential — than even that of a better-known and perhaps even more-beloved icon, Jimmy Stewart ’32, who narrowly missed making the final list.
By the end of the panel’s discussion, it also was clear that, fond as the group was of Moss, he was something of a proxy for a whole pond of Princetonians in the performing arts — from Stewart, the director Joshua Logan ’31, and actor José Ferrer ’33 to Moss’ contemporary on the staff of Captain Kangaroo, Clark Gesner ’60, whose You’re a Good Man, Charlie Brown ranks as the most-often-produced musical comedy in the world, owing to its countless high school and amateur incarnations. (In his new biography of Charles Schulz, Schulz and Peanuts, David Michaelis ’79 estimates that there have been more than 40,000 productions of the show over the years.)
“I would say Jeff would be, in his quiet, enthusiastic way, so thrilled,” Moss’ widow Annie Boylan tells me by phone from her home in the Hudson Valley of upstate New York. “Obviously, Princeton was huge in Jeff’s life. He loved his experience there. He came out of New York City, where he was No. 1 in his class” at the Browning School, “and then, he said, he finally got into a school where he wasn’t No. 1 anymore. He had to cope with that the first two years, and then he found the Triangle Club.”
Moss came by his creativity naturally. His father, Arnold Moss, was a respected Shakespearean actor, longtime college drama teacher, and prominent creator of New York Times crossword puzzles, who played the Ziegfeldian impresario in the original Broadway production of Stephen Sondheim’s Follies. Moss wrote his Princeton thesis on Shakespeare, and went on to write numerous books of poetry and stories in verse, including Bob and Jack: A Boy and His Yak.
“What was most interesting about Jeff is that he wrote ‘Rubber Duckie’ and all those wonderful songs before he was married and had a child,” his longtime literary agent, Esther Newberg, recalls. “That he had the sensibility to write all those romantic and lovely children’s songs. He was as precise in writing those songs as he was in reading his contracts and royalty statements. There are lots of writers who are hugely famous who don’t know anything about the details. He always took the time. Some of his songs are deceptively simple. But he worked so hard.”
Upon graduation, Moss was offered two jobs at CBS: as a production assistant for CBS News, or for Captain Kangaroo. He chose the latter, he once said, because, “I’ve seen the news.” In 1969, along with other Kangaroo alumni, Moss wound up helping to start Sesame Street, with the intention of appealing to both adults and children. Over the years, a parade of famous figures from Johnny Cash to Julie Andrews performed Moss’ material, including Ralph Nader ’55, who insisted on changing the lyrics of “The People in Your Neighborhood” to replace a colloquial “that” with a grammatical “whom.”
“Jeff was sort of a genius, a creative genius,” says Joan Ganz Cooney, the creator of the Children’s Television Workshop and founding mother of Sesame Street. “Jeff was a sort of child himself, and I don’t mean he had a way with children. I mean he, himself, was a kind of a child. He could be very difficult, very prickly. He could write anything, he could imagine anything. He himself was a very sober guy, not funny at all ... I loved him.”
Moss’ widow says, “I think there’s a small chance that maybe he didn’t completely get the credit” he deserved in the public mind, “like the speechwriter doesn’t get the credit. What would all those puppets be without Jeff?” She adds:
“A lot of people remember Henson, and Henson, of course, was brilliant. But Jeff put the words and the personality in the puppets. He formed most of the personalities of these people.”
Moss began writing children’s poetry when the actress Marlo Thomas asked him to contribute to a book she was putting together, Free to Be a Family. He wrote a poem called “The Entertainer” about a little girl who feels imposed on when her parents ask her to perform at parties. Comparatively late in life, Moss married and fathered a son of his own, Alexander. “He was diagnosed when Alex was 3, and he told me he was going to get this kid to the age of reason,” Boylan recalls. “He had 14 operations. Three months after Alex turned 7, Jeff died.”
Cooney still remembers how Moss, in a wheelchair with a portable oxygen tank, came to the Children’s Television Workshop offices to say goodbye to his fellow writers and old friends. “People got to stand up and say how they felt about him, and he did the same,” she says. “It was the most generous thing I have ever seen a dying person do.”
Boylan tells me that, not long before his death, Moss attended a memorial service in the Princeton Chapel, at which all the names of Triangle members who had died since the club’s founding were read solemnly. Alone at a piano, he performed a composition he had written for the occasion called “The Song Goes On.”
“The words were, ‘After the singer has died, the song goes on,’” she says. “I think that’s quietly the point here.”
Todd S. Purdum ’82, former White House correspondent for The New York Times, is national editor at Vanity Fair magazine.
The only woman to crack our panel’s top 25, Kopp conceived of Teach for America as her senior thesis; it is now one of the leading public-service projects in the country and has sent more than 14,000 recent college graduates to teach in poor schools throughout the United States.
A Nobel laureate in physics for expanding the theory of quantum electrodynamics, the theory of the interaction between light and matter, he changed how scientists understand the nature of waves and particles. He also did important research into particle theory and the superfluidity of liquid helium, assisted in the creation of the atomic bomb, and invented “Feynman diagrams,” a tool to calculate and conceptualize the behavior of interacting subatomic particles.
As chairman of the Federal Reserve Board from 1979 to 1987, Volcker pursued policies that ended the inflation crisis of the 1970s by tightening the growth of the money supply rather than targeting interest rates. His remedy worked — inflation fell from 13.6 percent in 1980 to 3.2 percent by 1983 — but it came with a high cost: a significant recession, with unemployment levels that had not been seen since the Great Depression.
Deputy attorney general under John Kennedy and attorney general under Lyndon Johnson, Katzenbach was active in several groundbreaking Justice Department civil rights initiatives, including the desegregation of the University of Mississippi in 1962, the desegregation of the University of Alabama in 1963, and the passage of the 1964 Civil Rights Act.
He founded a company that has published dozens of influential writers, including Ernest Hemingway, F. Scott Fitzgerald ’17, Thomas Wolfe, and Kurt Vonnegut; his descendants, notably son Charles Scribner 1875, helped the company flourish and began publication of some of the first mass-circulation magazines.
Son of oilman John D. Rockefeller, he was one of the early venture capitalists but made his reputation as a philanthropist and conservationist, funding the expansion of Grand Teton National Park and supporting dozens of institutions including the Memorial Sloan-Kettering Cancer Center, the Museum of Modern Art in New York, and Princeton.
By Deborah Fausch *99
In 2005 the Vanna Venturi House, completed by Robert Venturi ’47 *50 four decades earlier, was one of 12 masterworks of Modern American architecture honored in commemorative stamps by the U.S. Postal Service. In striking contrast to the other buildings honored, the small residence is like a child’s drawing of a house — a pitched roof, a chimney, windows cut into a flat front façade, a large opening for the door. This archetype of home represented a revolution in architectural thinking. Whereas the other Modern buildings are “about” the nature of their own construction and the activities that occur inside them, the Vanna Venturi House is “about” what people think of when they hear the word “house.” Venturi’s concern with symbolizing the everyday life within and around buildings, and reconnecting a building to its own history, constitutes the core of the contextual and symbolic revolution engendered by Venturi and his partner and wife, Denise Scott Brown.
Venturi’s Princeton education taught him to appreciate both Modern architects like Ludwig Mies van der Rohe and Frank Lloyd Wright and the architecture of the past. He called his 1966 book, Complexity and Contradiction in Architecture, a “gentle manifesto” on the pleasures and profits of studying the entire historical and cultural context of architecture, from ancient to contemporary, from high art to everyday environments. Irreverent toward the simple and abstract rules of Modern buildings — recasting Mies’ famous aphorism “Less is more” into “Less is a bore” — it teaches that people understand buildings not as isolated and self-referential works of art, but in context, as parts of environmental milieus and historical traditions. For example, Venturi and Scott Brown’s Sainsbury Wing of the National Gallery in London, a sensitive, slightly whimsical addition to a neo-classical British landmark, plays off the Corinthian columns and cornices of its neighbor in a kind of jazz counterpoint to classical correctness while at the same time completing the northern edge of Trafalgar Square and framing the cheerful jostle of pedestrians and vehicles. Today this idea is so basic to architectural thinking that even the most abstractly sculptural of buildings show an awareness of their history and surroundings.
The book Learning from Las Vegas, which Venturi wrote with Scott Brown and Steven Izenour in 1972, proposes that architects can learn something valuable from the “lowbrow” commercial building styles of suburban American cities and the ordinary places in cities where people love to be. Their work brings these observations into the designs of buildings as well as urban districts. In their public and institutional buildings, such as the Lewis Thomas Laboratory on the Princeton campus, the reticent Modern glass box has been replaced by a decorated and expressive brick exterior skin. Here the corridors are “main streets” that intersect the stairs and elevators at nodes opened up for casual congregation — lounges for people to gather and converse, where new theories will be invented. More controversial than his 1966 book, Learning from Las Vegas predicted the impact of American commerce on the built environment all over the globe and suggested that, rather than resisting this influence, architects guide it. Contemporary architects now struggling to make sense of new Asian megacities are indebted to Venturi and Scott Brown’s understanding that “ugly and ordinary” everyday urban environments can provide inspiration for new forms of building.
In recognition of his and Scott Brown’s impact on architecture, Venturi was awarded the prestigious Pritzker Prize in 1991. Today the firm’s work continues to suggest with gentle persistence that architecture, by reflecting its everyday and historical context, can have a positive effect on contemporary culture.
Deborah Fausch *99 teaches architectural design at the University of Illinois at Chicago School of Architecture.
The “Venturi era” on campus began in 1983; many believe that it rescued Princeton from a reputation of timidity in its modern buildings. These campus buildings were designed by Venturi:
Schultz Laboratory (1993) and Fisher-Bendheim Hall (1991)
Time magazine’s Person of the Year in 1999 and one of the great dot-com pioneers, he founded Amazon.com, which changed the way Americans shop for everything from books and toys to groceries and home appliances.
Art historian and first director of the Museum of Modern Art in New York, he helped bring contemporary artists to public attention and expanded the museum’s collections to include works in little-explored fields such as film and photography.
Known as the “poet of the Revolution,” Freneau later wrote works that are understood as precursors to the transcendentalist movement, inspiring Henry David Thoreau and Ralph Waldo Emerson. He also edited one of the first national newspapers, supporting Thomas Jefferson and opposing Alexander Hamilton.
Founder of the Vanguard Group, one of the largest mutual-fund companies in the world; named by Time magazine as one of the 100 most influential people in the world in 2004; his investment strategy focuses on the superiority of index funds over traditionally managed mutual funds.
By Evan Thomas
In the Twenty-Year Record of the Class of 1905, Norman Thomas wrote, “My path has led me away from the road traveled by many old friends. This I regret, but nothing else.” Thomas had agitated against the First World War, helped found the American Civil Liberties Union, and, in 1924, joined the Socialist Party and soon replaced Eugene Debs as the Socialists’ standard-bearer. From 1917 to 1924, Thomas was banned from speaking on the Princeton campus by President John Grier Hibben 1882, an ardent interventionist. But in 1932, when Thomas was the Socialist candidate for president, Hibben had a change of heart, and Princeton gave Thomas an honorary degree. Even that ceremony was unconventional: Though Thomas was sitting on the platform, Dean Augustus Trowbridge forgot to mention him until someone tugged at the dean’s sleeve as he sat down, the degree presentations concluded. Thomas’ degree had become stuck to another honorary degree citation (for Supreme Court Justice Benjamin Cardozo) by a paper clip. “A capitalist paper clip,” Thomas later joked.
Thomas, who was my grandfather, had a sense of humor. He railed against human folly, but he understood it and forgave it. When my father, Evan Thomas ’42, dropped out of Princeton in the fall of 1941 to go to war driving an ambulance with the American Field Service attached to the British Eighth Army, my grandfather wrote him: “If the Lord must be disappointed in us men and our ways, so must be the Devil in the face of such courage, love, and companionship as plain people show.”
Thomas ran for president of the United States six times (1928–1948), but never won more than 900,000 votes. It amused him to recall that Franklin Roosevelt once invited him into the Oval Office and told him, “You know, Norman, I think I’m a better politician than you are.” My grandfather would grin when he told that story and say, “I thought that was a damned obvious thing to say.” Thomas’ 1932 platform — minimum-wage laws, low-cost housing for the poor, a five-day workweek, unemployment insurance, health insurance for the aged, civil rights for blacks, and old-age pensions — doesn’t look so radical now. He was an early and strong anti-communist (Leon Trotsky said, “Norman Thomas called himself a Socialist as the result of a misunderstanding”), and he disapproved of the militancy of radicals in the late 1960s. He disliked seeing young people burning the American flag. “Wash the flag,” he said. “Don’t burn it.”
My grandfather — “Big Dad,” he was called by his family — was in some ways conventionally upper-middle class. In the evenings, he liked to go swimming at the Cold Spring Harbor Beach Club. As a boy, I recall seeing him lying on his back, paddling along as he debated the world’s fate with another beach club member, John Foster Dulles 1908.
Thomas was a brave man. He stood up against petty tyrants. In March 1935, notes William Manchester in The Glory and the Dream, Thomas went to Mississippi to speak out for black sharecroppers and to castigate the racist rule of Gov. Theodore (“The Man”) Bilbo. A drunken mob beat him bloody and threw him across the county line. “We don’t need no goddamn Yankee bastard to tell us what to do with our niggers,” someone said. Three years later, my grandfather spoke out against Mayor Frank (“I Am the Law”) Hague in Jersey City. Police slugged him and escorted him out of town; he came right back. There is a famous Life magazine photo of Thomas, who was an ordained Presbyterian minister, turning his cheek as an egg splatters against his head. A policeman on a horse rears in the background.
Thomas loved Princeton. The son of a Presbyterian minister from Marion, Ohio, he was introduced to the world of ideas at Princeton and graduated first in his class. He faithfully came back to Old Nassau for P-rades. At Reunions, he once dryly remarked, “I prayed to God to make me a sophomore again for just one night. And he did.” Though Thomas wanted to die crusading for justice, and he toured the world into his 80s, the stroke that left him bedridden occurred as he was listening to the 1967 Princeton-Harvard football game; he was stricken when Princeton held on fourth-and-one to preserve an 18–14 upset victory against an undefeated Harvard team.
In Café Vivian at the Frist Campus Center, there is a photo of Thomas at his 60th reunion in his Class of 1905 beer jacket. When Thomas returned to Princeton for that reunion, he was half-blind and hobbled by arthritis. He made his way to the Princeton-Yale baseball game on the arm of his classmate Ray Fosdick. A cheer from the crowd went up as they entered the stands. “What’s going on, Ray?” Thomas asked. “I can’t see. Why are they cheering?” Fosdick turned to his friend and said, “Why, Norm, don’t you understand? They’re cheering for you.”
Evan Thomas, editor-at-large at Newsweek, is in his first year of a five-year campus appointment as the first Ferris Visiting Professor of Journalism in Residence.
Nader, the father of the consumer-protection movement, and Rumsfeld, the only two-time defense secretary and leading strategist of the Iraq war, share an odd and unlikely connection: Nader’s third-party presidential candidacy in 2000, many believe, led to the election of George W. Bush, who brought Rumsfeld back to the Pentagon.