The Futures of Science
Jane F. Koretz
In any discussion of the future, it would be easy enough for me, as a biologist, to deal with a myriad of important issues that have significant global and ethical dimensions. The continuing revolution in contemporary biological sciences, and especially the increasing accessibility and efficacy of recombinant DNA technology, has the potential for a major impact on every aspect of our lives. It is highly likely, for example, that, as understanding of the interrelationships of the ecosphere is improved, bio-genetic strategies to reverse the damage we as a species have done to this world can be developed and made to work. Beyond frost-free strawberries and hormone-stimulated milk production may come biologically-mediated toxic waste cleanup, bioremediation of damaged landscapes, biologically-based mechanisms to restore the ozone layer and to compensate for the loss of the oxygen-releasing capabilities of the tropical rain forests, and, a la Jurassic Park, reversal of selected extinctions via reclamation of the relevant genomes. New strains of agricultural products, perhaps bringing their own nitrogen-fixing systems along with them for soil remediation and nutrition, could be engineered to contain essential human nutrients as well as enhanced resistance to temperature extremes and pests. New and more productive plant variants to feed the ever-increasing numbers of humans adequately, if not appetizingly, are closer to reality than might be believed. It may soon be possible to clone and grow selected animal tissues–rather than the animals themselves–to supply a steady high-protein dietary supplement; this will also eliminate the need for slaughterhouses, allow the fragile ecosystems of the plains and steppes to recover from overgrazing, and free up farm lands for the growth of products that serve to feed humans directly. 
Eventually, gene-engineered hydroponic strains may allow the majority of agricultural production to take place in otherwise hostile environments here and elsewhere (e.g., in orbit; on other planets or moons).
At the most direct level, advances in gene therapy hold out the hope in the near future of developing strategies for dealing directly with the consequences of genetic diseases such as sickle-cell anemia and some of the muscular dystrophies, as well as diseases resulting from genetic predispositions, such as breast and certain other cancers and auto-immune disorders. Further off, but coming from the same set of scientific advances, will be the capacity to alter an individual’s genetic complement ab initio or, at worst, in utero, so that these tendencies do not continue to be passed from parent to child. We should, in the future, have the capability of cloning replacement parts for ourselves (building on the methods recently used to clone a sheep), obviating the need for organ donors and the consequent ethical and moral problems such donations raise, and it should also be possible to design and develop biological “machines” to serve such functions as clearing out our arteries, preventing (or enhancing) conception, and performing microsurgery on microcancers and other altered cell clusters. Ultimately, of course, recombinant DNA technology may lead to the alteration of intrinsic genetic information–not simply or only to correct an error, but to improve on what is already there (“designer” genes and “designer” babies). With the Human Genome Project currently in high gear mapping our genetic complement and an increasingly large number of physical and personality characteristics being linked, at least tangentially, to the genes, the sorts of changes we may be capable of making to ourselves–including our life spans and personalities–know no bounds.
In a broader sense, the biological revolution is accompanied by equally revolutionary developments in the other sciences, particularly physics, and these developments have led to a host of technological breakthroughs in materials science, leading in turn to microminiaturization of circuitry–the basis of computer and other electronic advances–and the creation of novel materials and processes. Future developments are limited only by the human imagination, but might be expected to include nanotechnology (highly miniaturized “machines” capable of sophisticated functions, including reproduction), breakthroughs in artificial intelligence, and technology for implementation of some of the biological and biomedical advances briefly mentioned above.
One of the first essays I ever wrote for the Humanist Institute Colloquium series dealt with the potential ethical and moral problems associated with advances in the biological and biomedical sciences [Koretz, J.F. New Questions from the New Biology. In Humanism Today, 4:116-126 (1988)]. My contention then was that new technologies would lead to situations where new ethical questions would be raised and have to be addressed, if not unequivocally answered. One example that I used was in relation to “designer” genes–the issues surrounding elective replacement of certain traits and the larger issue of what being human is and will, in the future, be. Less than a decade further on, we are much closer to having to confront such issues than I had imagined at that time; not surprisingly, however, we are no closer to formulating–let alone developing an ethical strategy for dealing with–the dilemmas that will soon face us, than when I wrote that article. We are in fact facing a much larger complex of issues than seemed possible then, precisely because of the steadily accelerating advances of biotechnology and the complementary non-biological technological advances that make them more quickly accessible. The rapid growth of bioethics as a profession (cf. Sunday New York Times Magazine, December 15, 1996) is an inadequate response to the increasing complexities of decision-making in the face of medical advances; while valuable, it tends to be reactive rather than pro-active and limited in its scope of applicability. At the same time that bioethicists are grappling with what are often simply variations of classic ethical questions (e.g., allocation of scarce resources applied to transplant organs and certain new medical procedures; public safety vs. 
privacy rights in relation to HIV infection or non-compliance in the treatment of multiple drug resistant strains of tuberculosis), new problems are emerging for which there are no “classical” analogies or solutions that can be adapted or adopted. The implications of the cloning of sheep and monkeys provide a case in point.
But the medical issues that arise are only the tip of the iceberg. Because of our anthropocentrism, we do not adequately recognize that the same technologies that can change our species can change the entire ecosystem for the worse as well as the better. Public policies relating to the regulation or release of gene-engineered species into the environment are being decided by the corporations who develop them and by the patent offices which allow natural gene sequences to be classified as intellectual property. Again we lack an adequate overview that allows us to see individual cases, no matter how benign, as examples of a new and larger set of questions that need to be addressed pro-actively. In fact, there is only one venue that has consistently, frankly, and straightforwardly dealt with some of the consequences of technological advances and that is SF–science, or speculative, fiction. Whether utopia, dystopia, or imaginative extrapolation of current trends into a recognizable near future, the positive and negative consequences of new and imagined technologies are considered there in relation both to us and to our world(s). I need only mention Jurassic Park or The Stand to trigger a response to issues relating to recombinant DNA technology or the uncheckable spread of dangerous, human-engineered disease respectively, and these two books are not even grappling with the consequences of highly speculative technology.
This essay is not about advances in biotechnology, however; that would be entirely too easy for me to write (especially since I have already written and published a version of it!). Rather, I want to deal with some of the intellectual consequences, now and in the future, of the technologies that come out of basic science research. As the foundation to this effort, it is essential first to distinguish between science (or basic science) and technology (applications of basic science), though it must be noted that there is a continuum of the two rather than a clear division between them. Basic science is concerned primarily with elucidation of the fundamental principles and relationships that govern the cosmos and processes within the cosmos. Technology, in contrast, makes use of these principles and relationships to develop devices or processes that serve a particular or desired function. Thus, to a first approximation, the engineering disciplines make use of discoveries in physics and chemistry and are the applied arm of these sciences, just as medicine, animal husbandry, and agriculture make use of discoveries in the biological sciences. The coupling between science and technology is generally referred to as “research and development” or, more recently, as “technology transfer.”
An example going from one end of the spectrum to the other may be of value. In the nineteenth century, the question of the wave vs. particulate nature of light was unresolved, since experiments providing results consistent with either conclusion could be performed. The development of quantum mechanics in the early twentieth century enabled both aspects of light to be explained in a single theory, and that theory–like any good theory–not only explained what was known but also made predictions that could be experimentally tested. One way to test these predictions was to use light that was organized so that (a) it was monochromatic and (b) its waves were aligned so that all the light was in phase; this is known as coherent light, and the device that created beams of coherent light was called a laser. At the time of its creation, it was no more than a jury-rigged mechanism for testing some of the implications of quantum mechanical theory for photons, a toy that cleverly exploited some quantum mechanical principles and could be used to test others. I was a little girl at the time, but remember the dismissive jokes about lasers’ and masers’ utility. Forty years later, the laser is the key element in bar code scanning devices and CD players, is used for critically important ocular and other surgeries, and provides the high levels of precision needed in a wide variety of measurement, fabrication, and quality-control procedures. And, it should be remembered, a variation of laser technology was the basis of SDI, Reagan’s “Star Wars” defense system.
Thus, the technological applications that can be derived from even a minor corollary to a scientific principle are numerous and varied, and they can span the range from improving quality of life to threatening it. In the case of the laser, the time lag between invention and application, at least outside of scientific experimentation, was about twenty years. In our increasingly competitive and market-driven economy, however, the time lag involved in technology transfer is steadily shrinking. This is most clearly evident in the personal computer industry, where continuing advances in speed and amount of data storage, transfer, and computational operations per second make top-of-the-line systems essentially obsolete within a year or two of initial availability.
Scientific progress, in terms of the development of new theories of the universe and its processes, has shown a bumpy and uneven character over time. Periods of new insight are followed by periods of varying lengths for consolidation and exploration of the implications of these insights before further insights appear. In physics, for example, Newton’s laws of motion revolutionized mechanics, but were not themselves displaced until Einstein’s development of the theory of relativity, whereupon they became simply an approximation valid at velocities much lower than the speed of light; similarly, classic ideas about light, electricity, magnetism, the fundamental nature of matter, and the nature of the universe itself were transformed by the development of quantum mechanics and related disciplines. In biology, Darwin’s theory of evolution gave meaning and sense to the anatomical relationships between species that had been recognized empirically through Linnaeus’s system of classification, but its mechanism of action was relatively unclear until the laws of heredity had been discovered by Mendel and rediscovered by subsequent investigators. The periodic table of Mendeleev revolutionized the science of chemistry by providing a framework for the understanding of the relationships between the elements, and eventually allowed correlation between classical chemistry, chemical physics, and quantum chemistry. The development of the calculus by Newton and Leibniz, working independently, revolutionized both pure and applied mathematics, as well as providing a framework for the further development of the physical sciences; this work has never been superseded, but rather augmented, by continuing advances in abstract mathematics, Boolean algebra, topology, non-Euclidean geometry, and the like.
It was not until the nineteenth century, however, that advances in scientific understanding that were translated or transformed into technology began to make a significant impact on society as a whole. (It could be argued that the invention of the printing press was the single most significant technological innovation of this past millennium, but it did not incorporate any novel scientific principles). I am thinking in particular about nineteenth-century England, with the development of steam engines, the building of railroads and coal-powered naval fleets, and the harnessing of natural power sources, such as water flow in rivers, for the weaving of cloth and other mechanical manufacturing methods. These technologies fundamentally altered the fabric of a heretofore agrarian system by displacing hundreds of thousands of individuals from the land, transformed the economic base through the development of a new form of middle class, changed the balance of power between city and countryside, and altered the political and economic relationship between England and other countries.
The pace of change due to technological advances has accelerated and spread throughout the world since that time. In considering the twentieth century, we have changed from the horse and buggy to the car and plane, from community to mobility, from candles and wood to electricity and nuclear fission, from home-made to manufactured, from book to multimedia, from piano to Moog synthesizer…. Some of these changes did not make a significant impact until after World War II, and some are still so new that the full magnitude of their impact remains to be determined. In general, however, it can be maintained that technological innovation has been proceeding at an ever-increasing rate that shows no signs at present of slowing. It can also be maintained that, as in nineteenth-century England, the transformation of science into technology has also transformed and continues to transform the intrinsic nature of our society in a wide variety of ways.
But even science itself has had a transformative effect–on the way we think of ourselves and our relationship to others and to the universe. The Copernican universe, where the Earth and other planets revolve around the sun, rather than all the celestial bodies revolving around the Earth, fundamentally altered our sense of place within the cosmos and our relation to a deity. The discovery of the Newtonian laws of motion, when applied to the solar system, supported this world view; in addition, it laid the foundation for the concept of the universe as a mechanical device, controlled by mechanical laws of cause and effect and set in motion by the ultimate mechanician. The Darwinian theory of evolution laid waste to the Biblical creation tales, and shook the very foundations of the religious establishment so strongly that the aftershocks are still being felt today. (Quantum mechanics would have the same transformative impact on our sense of ourselves and the universe as these other sciences [see, e.g., Koretz, J.F. The Philosophical Implications of the Quantum Universe. In Humanism Today, 4:13-23 (1988)], but, happily or unhappily, it is very difficult for most to grasp; as a result, its effects on us are felt only in terms of the technologies derived from its principles).
There are a number of different ways in which one can respond to ideas or concepts that fundamentally alter one’s conception of one’s place in the universe–one can ignore them, embrace them, destroy them, or try to encompass them. Historically, the response has been primarily to ignore or embrace them–thus C. P. Snow could coin the phrase “the two cultures” and have it instantly enter the language as a term denoting the gap between the liberal arts and the sciences. While basic science had been an intrinsic part of philosophy–“natural philosophy”–for centuries dating back to Aristotle, the increasingly wide schism between science and religion, combined with a more empirical approach to scientific study and research, led to a separation that has been maintained and cultivated both in the academy and in the larger environs of society.
Education, whether public or private, whether primary or secondary or advanced, nourishes this separation both explicitly and implicitly in a variety of ways. At the university level, the humanities/social sciences and sciences/mathematics/technology are isolated organizationally as well as intellectually from each other, and there are social pressures at every level that contribute to a continuing schism as well. Since most post-secondary degrees in the United States require a distribution in the nature of the courses taken (as opposed to some other countries, where specialization begins much earlier in the educational process), everyone is theoretically exposed to the interests and basic principles of disciplines far away from their own. We all have to have the tools to communicate with each other (at least in principle), and special introductory courses designed for the non-major are part of the curricular offerings of almost every college or university department. Very often, however, these special courses end up doing the opposite of what they should accomplish; rather than communicating the underlying principles of the discipline at a level where they can be understood on their own terms, they emphasize its arcane nature and its intellectual inaccessibility to all except the select. In other words, many academics have a vested interest in maintaining the “cultural” gap between the humanities and the sciences, and will do whatever is necessary to negate attempts to bridge the differences.
But the separation which is institutionalized at the college/university level–even to the extent of specialized colleges and universities such as St. John’s on the one hand and CalTech on the other– is merely a reflection of trends which begin and are maintained at much lower levels. In a sense, the high school and even middle school educational experience is a more rudimentary version of the post-secondary situation; the beginnings of “the two cultures” reside in the nursery and primary schools, where those who in general are the least sophisticated in, and least comfortable with, the sciences and mathematics are given the major responsibility for introducing new minds to these concepts. An acquaintance of mine–a fifth-grade teacher who has occasionally invited me to her class to talk with her students about science–has noted that, by the time she gets them in her class, the students already have very strong opinions about science and mathematics, often quite negative if not downright hostile. Some of this, she says, comes from their interaction with teachers in the lower grades; some of this, unfortunately, also comes from the parents and home environments which could otherwise have acted as a countervailing influence on the influence of the teachers. For some children, there is nothing that could ever cause them to lose their sense of wonder at the marvels of this world, but for most, the imperative to “choose sides” comes before they can even fully understand the terms of the battle.
Specialization–“streaming,” if you will–at a comparatively early age, combined with organizational and intellectual separation of the disciplines throughout the course of the educational process, provides the mechanism by which the “cultural” gap has been cultivated and preserved. One “culture” is embraced, while the other is ignored and/or undervalued. More recently, however, open warfare has been declared on one side by the other, or so it seems to some of us who, while standing on one side of the chasm, nevertheless are interested in what is happening on the other side and would like to build bridges between the two. This war is operating on two fronts–a ground attack from much of society against the bewildering rapidity of technological progress and cultural change, and a blitzkrieg by some in academe against the very nature of science itself–and one of the saddest aspects of this entire situation is that a good proportion of the science/technology “culture” does not even know this is going on because of the “cultural” gap.
In the nineteenth century, the Luddites tried to reverse the development of new technologies without success. In the late twentieth century, the neo-Luddites have become more sophisticated in their methods, trying to co-opt the language and mechanisms of technology to destroy both technology itself and the science which is its foundation. Thus we are confronted with Creation Science (an oxymoron if ever there was one) and other religious and pseudo-religious movements that seek to relegate science to the status of an alternative mythology, with pressure groups that seek laws to limit access to new technological forms of learning and knowledge (e.g., the Internet, access to which has been completely banned in Iraq and regulation of which is being demanded both within the United States and internationally), and, at a more fundamental level, with a variety of different organizations ranging from Operation Rescue to the Michigan Militia which seek to reverse the social empowerment of previously oppressed groups, empowerment that had been made possible by the cultural changes wrought by technology. At an even more basic level, the increased costs of higher education and vocational training, caused in large part by significant reductions in Federal and State funding during the 80s, have made access to and acquisition of new job skills difficult to accomplish, especially for the poor. And with typical human illogic, it is the technology (and the science underlying it) which is blamed, not the politicians who created a financial crisis in higher education, nor the legislators who are raising tuition to state colleges and universities at an unprecedented rate, nor the industrialists who have a vested interest in a cheap, poorly educated, narrowly skilled, labor force. 
The fear of change, the inability to encompass change at the same rate as it occurs, and the fact that change in our society is inextricably bound to technological advances that are not well understood by a majority of those upon whom it has an impact–all of these factors lead to hostility to technology, hostility to the science and mathematics that underlie technological development, and an educational environment–at home and in the schools–which is hostile to these disciplines.
At the academic level, the hostility to technological change is directed specifically at the foundations of science itself–that is, against both science and reason. The latter, which is not restricted solely to the sciences, mathematics, or technology, was in fact the initial victim of post-Modern thought (another oxymoron!), which emphasizes relativism and the equivalence of personal experience/mythologizing at the expense of a critical approach. Science is thus seen not as objective or an expression of reason but as subjective and another form of narrative that, because of the claims it makes about knowledge of the world, has carved out a privileged position in the academy to which it has no more right than any other area. The rather ragtag assortment of people with a variety of political agendas who operate under the general rubric of “science and technology studies” are not, therefore, studying science or technology, but rather attempting with some measure of success to alter the intellectual environment of the universities by distorting the very purposes and processes of inquiry, and reducing inquiry per se to the level of relativistic personal experience.
In trying to describe this movement (I cannot use the adjectives “intellectual” or “critical”), I am fettered by my distaste for these people’s apparent irrationality, their proud lack of acquaintance with the subjects and disciplines they are trying to destroy, and perhaps even their jealousy of the (limited) certitudes that science and mathematics can provide about the nature of the world. I am also in fear of the short- and long-term effects of this anti-intellectual approach on the future of both “cultures”–the sciences, the humanities, and higher education as a whole. But I am also, at another level, amused in a bitter, ironic fashion. I have a mental image of these people sitting in their ergonomic chairs at their personal computers writing their anti-science and anti-technology essays or sending e-mail to each other while listening to New Age music on their CD players, taking a break every now and then to heat up a low-cholesterol snack in the microwave or eat a piece of out-of-season fruit that was transported in from halfway around the world. There are many narratives about the nature of the world, but only a select few where the implications of the story can lead to concrete changes within that world….
My own introduction to what I would call academic neo-Luddism came a little over a decade ago, at about the time when Evelyn Fox Keller’s biography of Barbara McClintock–A Feeling for the Organism–was published. Fox Keller’s contention, which ran as a major theme throughout the book, was that McClintock was bringing a special insight, a feminine connectedness with nature if you will, to her studies of maize genetics, etc., etc., etc. McClintock in return repudiated the biography and the very idea that she had a special perspective on her work because she was a woman, or even that her work was furthered by her feelings rather than her intellect. I thought at the time that Fox Keller was a crazed feminist; when I later learned that she had a Ph.D. in physics (and had written bitterly on the “cowboy culture” of high-energy physics, for example), it seemed to confirm my assessment and to make me wonder what her graduate school experiences had done to lead her to make a career out of writing nasty essays about science and scientists.
But Fox Keller and her writings, it later turned out, were symptoms of a more broadly based movement/point of view about which I then knew nothing. When, a number of years ago, a group of female scientists in the Albany, New York area arranged to meet to discuss the possibility of setting up a local chapter of the Association for Women in Science (AWIS), I went to that organizational meeting with great enthusiasm. After the organizational aspects were addressed, there followed, to my horror, an approving discussion of how they, as women, had a special affinity and insight for the research areas in which they were involved. I took the opposite side, of course, but to little or no avail. The problem, at least on this microscale, was that the anti-science, anti-reason message was well hidden within feminist rhetoric–the concept that women were not only equal/equivalent to men in their capacities but, in certain, special, extra-rational ways, superior. Subversion from within…
Commentators far better informed than I about the extent and nature of this continuing attack on reason and science have been responding. I highly recommend the recent Conference Proceedings from the New York Academy of Sciences, entitled The Flight from Science and Reason, which provides a much more complete assessment of the pervasive impact of this anti-rational movement in a variety of disciplines, as well as the beginnings of a set of strategies to deal with it. A variety of articles in Academe, the magazine of the American Association of University Professors, also address the issue, although, since the majority are anti-rationalist, it is more instructive as a window into the minds of the “opposition” than as an outline for response. Finally (and I laughed myself to the floor when I found out about this), there was a recent “fraud” perpetrated, whereby a mathematical physicist named Alan Sokal wrote a parody essay, filled with deliberately misused jargon and scientific errors, for a cultural studies journal and then, after acceptance and publication, published a second essay in Lingua Franca excoriating the editors of the first journal for their ignorance of the subject matter they professed to be studying. Needless to say, this set off a firestorm of rationalizations, accusations of academic dishonesty against Sokal for doing this, and various other brushfires that continue up to the present; what has NOT come out of this deliberate “joke” has been a serious self-assessment on the part of the science studies people.
Are you feeling paranoid yet?
The traditional response to the larger social problems–prejudice, inequities, or whatever–has always been education. Many of us truly believe at a fundamental level that the truth shall indeed set us free, and that there is nothing that cannot ultimately be dealt with by an educated populace. In fact, for many of us, education is the path to human perfectibility, and rational thought is the mechanism by which education is absorbed and utilized.
What, then, does one do when the educational process is itself being subverted?
I raise the question with great concern about the future, but I have no answers to suggest. The gap between the two cultures is so deeply entrenched in our society and our educational system that the rational solution for bridging the gap–ensuring that all factions of our society are literate in both communications skills and scientific concepts–seems hopelessly naive and impossible to bring about. And yet, if these anti-intellectual trends continue, an increasing majority of our society will be even more disenfranchised from their appropriate roles in helping to shape the future of their society, and the technology which underlies it, than they are now. Scientific and technological progress will seem more and more Frankensteinian, a threat rather than an assurance about the nature of the world in which they will live.
I firmly believe that the technology which arises from advances in basic science must be harnessed to the needs and concerns of the society which fosters it, but this implies two important points: (1) there must be some consensual vision from our society, however blurry, of the desired shape of the future; and (2) there must be an increased responsiveness on the part of scientists and technologists to this set of future hopes. The point about a common vision has always been problematic except for certain limited and fairly selfish goals, and will become even more problematic in the future if these anti-intellectual and anti-science trends in society and the academy continue unchecked. As a society, we will continue to want–or even demand–solutions to problems of disease, genetic defects, and aging; we will continue to be increasingly concerned about the environment and scarce resources, desiring clean and environmentally friendly alternative sources of energy as well as remediation of the damage we have already caused; we will continue to desire machines and devices that make our lives easier and more pleasant; and we will continue to want anything that may increase the goodness of our quality of life, however we define it. At the same time, as a society, we are becoming increasingly hostile to those who have the capacity to bring these dreams into reality, increasingly proud of our ignorance of science and “how things work,” and increasingly discouraging to those who have the desire to train in scientific and/or technological disciplines. The contradictory directions of these impulses are obvious, and will become increasingly dissonant if these trends are allowed to continue.
In relation to point two, the responsiveness of the scientific/technological community to the needs and wants of society is, and always has been, externally enforced to a large degree; “doing” science and developing technology are expensive endeavors that require outside funding. Specific aims are thus controlled in large part by the short- and long-term goals of non-scientific bodies, whether they be the government, private foundations, or industry. In my own experience, there is, as well, an increasing sensitivity on the part of my colleagues to the concerns of those outside of their disciplines, and I strongly hope that this awareness continues to grow. It is essential to us all that dialogue and mutual feedback occur across the cultural gap, and that this process be fostered and expanded in both directions. I emphasize the word “both” because the onus for dialogue must be equally shared, the shaping of social and societal goals must depend on a mutual understanding of what can and cannot be accomplished in the short and long terms, and the responsibility for the application of discoveries, whether for good or ill, must be accepted by everyone.
In the final analysis, ANY contribution to the pool of human intellectual accomplishment, whether it be in the sciences or the humanities, can be used, abused, or misused. This is because, as a species, we have both the capacity to dream and the capability to destroy dreams. Science, and the technologies that arise from science, can, like the laser, be utilized in a multiplicity of ways that range from good to evil, with all the shades in between also represented. But it should never be forgotten that the greatest human misery, the justification for the most heinous treatment of others, and the largest number of deaths have historically come not from the misuse of science, but from differing and sometimes mutually contradictory concepts of religion and interpretations of religious writings. To paraphrase a National Rifle Association slogan: guns don’t kill people, ideas kill people. To try to make science and technology–and, in a larger sense, reason itself–the scapegoat for all the ills and fears to which modern society is prone is to deny the most important and immortal part of our very selves: our ability to use our minds to think, to reason, to imagine, to dream. To destroy, distort, or otherwise handicap that potential is thus a potent way to destroy, distort, or otherwise handicap our future.
I worry about the future of science, but, even more, I worry about the future.
The best summary of the “science wars,” at least from a science point of view, can be found in the published proceedings of a New York Academy of Sciences conference (May 31–June 2, 1995)–The Flight From Science and Reason (Volume 775, 1996), edited by Paul R. Gross, Norman Levitt, and Martin W. Lewis. Two of the editors–Norman Levitt and Paul R. Gross–also collaborated on a more recent article in Academe (November–December, 1996) entitled “Academic Anti-Science.” It includes an excellent summary of the issues involved, a clever recap of Alan Sokal’s intellectual “prank,” and a useful set of end notes leading to further readings pro and con. Quite recently, Gross and Levitt’s book Higher Superstition: The Academic Left and Its Quarrels with Science has become available in trade paperback format.
As for other writings cited in the body of the text, they are possibly interesting but not germane to the central issue considered here. The one exception is A Feeling for the Organism, and I strongly urge that, if you feel compelled to read it, you borrow it from your local library.
© 1998 by the North American Committee for Humanism (NACH). All rights reserved, including the right to reproduce this book, or portions thereof, in any form, including electronic media, except for the inclusion of brief quotations in a review.