The first week of October was Nobel Prize Week, when the Nobel committees in Stockholm and Oslo handed out awards in three sciences, in literature, and for contributions to world peace. This year, the physics prize went to MIT’s Rainer Weiss and Caltech’s Kip Thorne and Barry Barish for their work developing and leading the Laser Interferometer Gravitational-Wave Observatory, or LIGO. Yet although LIGO has continued to generate headlines in the weeks since its triumph, the prize is more an indicator of where science as a social institution was in the late 20th century than of where it may be going under the politics of the early 21st.
For one thing, awarding a prize to a trio based at American universities does not reflect the enormous international collaboration that makes astronomy possible. (Hundreds of people, from 20 countries, helped author one of LIGO’s most important papers, and yet an archaic rule of the Nobel Prize prevents more than three people from sharing it.) For another, even though LIGO’s researchers announced their prize-winning findings in 2016, LIGO itself embodies—and is in some ways a relic of—the particular institutions of Cold War-era Big Science.
LIGO has been in the works for 40 years, a remarkable survivor of post-Cold War scientific austerity. It is the most expensive endeavor ever undertaken by the National Science Foundation, and possibly also the largest—at least measured in a straight line. An interferometer, as the name suggests, measures tiny changes in length by using the distinctive interference pattern created when laser beams, sent down perpendicular arms, are recombined. The longer the arms, the more sensitive the instrument and the tinier the changes it can measure. To measure the impossibly faint displacements caused by gravitational waves, the echoes of distant black holes colliding billions of years ago, LIGO’s arms have to be four kilometers long.
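A back-of-the-envelope sketch shows why arm length matters so much. A gravitational wave produces a fractional strain, and an interferometer converts that strain into an absolute change in arm length, so longer arms yield a larger, more measurable signal. (This is illustrative arithmetic, not LIGO’s engineering; the function name is mine, and the 10⁻²¹ strain is the widely reported magnitude of LIGO’s first detection.)

```python
def arm_displacement(strain: float, arm_length_m: float) -> float:
    """Absolute arm-length change (meters) for a given fractional strain."""
    return strain * arm_length_m

# Gravitational-wave strains reaching Earth are on the order of 1e-21.
h = 1e-21

print(arm_displacement(h, 4_000))  # LIGO's 4 km arms: ~4e-18 m
print(arm_displacement(h, 40))     # a 40 m prototype: a 100x smaller signal
```

Even with four-kilometer arms, the displacement is a few billionths of a billionth of a meter, far smaller than an atomic nucleus, which is why the instrument had to be both enormous and exquisitely isolated.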
There aren’t many places that can hold as big a scientific object as LIGO, and here arises another connection to the Cold War military-scientific-industrial complex. One LIGO detector is in the middle of a Louisiana pine forest, but the other is on the Hanford Reservation in Washington State, the federal facility built to produce plutonium for the Nagasaki bomb and subsequent nuclear weapons.
It’s a commonplace that the Internet began as a Defense Department project, but many icons of apparently “pure” scientific research developed from and depended on military infrastructures. Vannevar Bush—the brilliant electrical engineer, former MIT dean, and co-founder of Raytheon—laid out a rationale for this symbiosis over 70 years ago. Even before the Second World War was over, he and other leaders of the scientific establishment dreamt of ways to keep the spigot of military funding flowing during the peace. In a July 1945 report entitled Science, the Endless Frontier, he made his case: “Scientific progress on a broad front results from the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown.” Only scientists knew what was best for science, the argument went, but such “freedom of inquiry” would trickle benefits down to the entire nation and human race. The highly visible success of the Manhattan Project, of radar, of penicillin, and of other scientific projects that had been equally theoretical just a decade earlier lent weight to these bold claims.
The historian David Noble argued that what Bush and his colleagues were after was federal money with little democratic oversight or control. More populist and democratically minded members of the Senate and of the Roosevelt and Truman administrations, as well as organizations of smaller universities, forestalled Bush’s efforts to establish a completely independent research foundation that floated on federal dollars. But the compromise that resulted from these debates was the research infrastructure we now take for granted, structured around federal grants to university laboratories from executive departments or through separate agencies like the National Science Foundation and National Institutes of Health. Research universities, especially elite ones like MIT and Caltech, flourished under Cold War contracts from the Departments of Defense and Energy. Yet whatever antidemocratic impulses may have been involved in the NSF’s beginning, long-term and expensive projects like LIGO were only possible because of the relative independence of the Foundation. And while using gravitational waves to observe black-hole collisions may not materially improve life on Earth, the LIGO project spawned hundreds of PhDs in engineering, computing, and other fields that will.
What’s especially remarkable about LIGO, as MIT historian of science David Kaiser pointed out in a recent New York Times op-ed, is that it was green-lit in the early 1990s, the very moment that Cold War funding for large science projects was drying up. The period’s most famous scientific casualty was the Superconducting Super Collider, the giant Texas particle accelerator, whose defenders Congress forced into a death-match with advocates of the International Space Station. Supposedly only one project could justify its funding now that the U.S. no longer needed to lord scientific achievements over its global rivals, and in the end the ISS survived while the SSC died. This outcome had less to do with compelling arguments about the knowledge that might result from a manned orbital presence—notwithstanding Ronald Reagan’s laughable claims about “lifesaving medicines which could be manufactured only in space”—and much more to do with ISS’s ability to distribute pork to multiple Congressional districts. Likewise, Kaiser writes that LIGO’s survival for four decades, amid government shutdowns and austerity, “required political mastery as much as physics know-how.”
LIGO, in other words, was the survivor of a particular political economy of science that emerged in the postwar decades. It depended on the money and infrastructure of the Cold War’s global rivalry, as well as a cadre of administrators who were insulated from short-term politics. That political economy has lasted until today, though somewhat diminished since the 1990s. But now, as grievance and revanchism consume everything in Washington, those forces are coming for science too.
The clearest evidence of this can be found in the Trump administration’s stated priorities for science and technology. The budgets for basic science agencies across the federal government are being decimated, if not wiped out. The administration’s proposed fiscal year 2018 budget for NASA, for instance, cuts five Earth science missions as well as educational outreach.
What resources remain are being diverted to fill President Trump’s fantasies of the most military-inspired element of the Cold War science complex, and the most obviously spectacular: manned missions to the Moon. Forty-five years after the last Earthling landed on the Moon, President Trump and Vice-President Pence have announced their desire for a return mission. Such announcements have been de rigueur in the 21st century—both George W. Bush and Barack Obama made halfhearted gestures—but Trump and Pence, whom the President tapped to revive the 1950s-relic National Space Council, seem more ambitious. Even NASA’s legendary Johnson Space Center in Houston is preparing for what one writer called Trump’s “pivot to the Moon.”
For a president driven by dominance, it must gall that Russia, India, China, and Europe are all talking about their own lunar arrivals in the next decade or two. Likewise, it’s no wonder that an administration obsessed with appearances, with masculine American mythology, and with its zero-sum geopolitical worldview would try to ride Moon rockets back to glory. But whatever glory the military-industrial-science complex possessed lay in its ability to transmute the energies of thousands of LIGO scientists into collective and peaceful achievements, not in sending a few men and a flag to the Moon aboard a giant missile.