ISC HPC Blog
How Not to Do Science
If some proposed legislation in the US becomes the law of the land, supercomputers funded by the National Science Foundation (NSF) might have a rather different set of workloads in the not-too-distant future. In fact, there might be less need for these supercomputers altogether.
In the death spiral that seems to define US politics these days, it's not very surprising that there is a bill in the works to constrict federally-funded scientific research. Congressman Lamar Smith (R-Texas), the new chairman of the Committee on Science, Space, and Technology, is spearheading legislation that would replace peer review at the NSF with a new set of criteria. For good measure, the bill would also set in motion a process that could force these requirements to be adopted by other federal agencies involved in R&D.
The ironically-titled High Quality Research Act defines restrictions on the types of research that can be funded. Specifically, an NSF-sponsored project would have to adhere to the following guidelines:
1. is in the interests of the United States to advance the national health, prosperity, or welfare, and to secure the national defense by promoting the progress of science;
2. is the finest quality, is ground breaking, and answers questions or solves problems that are of utmost importance to society at large; and
3. is not duplicative of other research projects being funded by the Foundation or other Federal science agencies.
While, superficially, that wording may appear benign, it would fundamentally change the nature of federally-funded scientific research. The first two requirements, in particular, imply that only research with real-world applications should even be considered.
Currently, NSF guidelines focus on the intellectual merit of a project and its impact on the scientific community and society as a whole. That allows for all types of basic science research -- that which is investigative in nature, but has no immediate practical application. The defining characteristic of basic science research is that it may or may not lead to practical uses.
Often such work languishes for years until an enabling technology comes along to give it life. But when that happens, the results can be spectacular. Think of digital computers, the Internet, lasers, and genome sequencing, all of which are underpinned by basic science, and all of which received federal funding in their early stages.
The silliest part of the proposed legislation is that it mandates that the research be "ground breaking," an attribute that is impossible to predict. It's like saying unless the research will win a Nobel Prize, it's not worth doing. Such wording reflects a fundamental misunderstanding of how science works.
The third requirement regarding duplicate research is less harmful, since there are already NSF guidelines in place to address that. On the other hand, sometimes doubling up on R&D can be useful, since researchers use different approaches that can yield different outcomes.
Although the proposed legislation is being criticized for singling out social science research, it wouldn't stop there. For example, NSF grants for things like creating models of the early universe, sequencing the genomes of endangered species, or studying ice-age climatology would have little chance of getting funding under the proposed criteria.
To the lay public, the loss of that kind of research doesn't seem especially alarming. But if you told them that simulating the universe could lead to breakthroughs in space travel, that sequencing non-human genomes could provide the foundation for cancer cures, and that studying past climates could show how to reverse global warming, people might realize that it's worth spreading R&D bets around.
Beyond the new restrictions is the equally disturbing idea put forth in an accompanying letter Smith sent to the NSF. It suggests that the agency hand over NSF scientists' peer-review discussions of grants to the Congressional committee members, further politicizing the selection of research grants. That letter prompted pushback from fellow committee member Eddie Johnson (D-Texas), who wrote back that such a policy would send "a chilling message to the entire scientific community." Johnson went on to explain how politicizing NSF grant research would undermine the agency's mission.
At this point, nobody is threatening the funding of supercomputers by the NSF. Presumably there would still be plenty of "practical" science research projects (engineering R&D, disease genomics, weather studies, etc.) to warrant continued investments in HPC. But such politicization could foster an increasingly hostile attitude toward science, and spur an exodus of researchers to nations with more science-friendly governments.
Frankly, I don't expect this bill to make it very far. There are still plenty of science-savvy representatives in Congress to quash this sort of thing. It's just disheartening to realize that there are people in the federal government who have little sense of what science is or how it works and are willing to reverse decades of R&D policy without regard for the consequences.
Michael can also be followed at https://twitter.com/HPC_Feldman.
Michael Feldman is a Senior Analyst at Intersect360 Research and a 35-year veteran of the computer industry. In his role at Intersect360 Research, Michael focuses on the company's 360-degree view market intelligence subscription service as well as its client-specific research and consulting services.
Prior to joining Intersect360 Research, Michael was managing editor of HPCWire, where he gained recognition as one of the foremost voices in the HPC industry. He also previously served as a programmer and analyst for Computer Sciences Corporation, a software engineer at Aonix Corporation, and as a developer of software tools for the US Department of Defense and other government agencies. Michael holds a B.S. in Microbiology from the University of Maryland and a B.S. in Computer Science from Chapman University.