
Environmentalism and Technology

Wait a minute, you might say: surely it’s environmentalism against technology, for isn’t technology a fundamental source of environmental problems? Too many people consuming too much, enabled by technology?

This has been the position of deep greens. In fact, some trace all environmental problems to the beginning of agriculture, arguing that it was the shift from hunter-gatherer to farming that created what they consider the human cancer consuming the globe. Even moderate greens can be anti-tech, reflecting both skepticism about capitalism and the countercultural ideology that characterizes most environmental discourse.

Consider, for example, something as mainstream as the precautionary principle, which holds that no new technology should be introduced until it can be demonstrated to have no harmful environmental impacts. Taken at face value, this embeds within it a strong preference for “privileging the present” -- that is, for banning or limiting technological evolution -- because the potential implications of all but the most trivial technological innovations cannot be known in advance.

Positioning environmentalism against technology, however, has its problems. For one, it misunderstands the nature of complex cultural systems. These inevitably evolve, generally towards greater complexity; consider, for example, how much more complex international governance, information networks, or financial structures are now than just a few years ago.

And technologies are evolving rapidly as well, particularly in the three areas that promise to impact environmental systems the most: biotechnology, nanotechnology, and information technology. The first will, over time, give us design capabilities over life; the second will let us manipulate matter at the molecular level; the third will change how we perceive and understand the world within which the first two are accomplished.

Moreover, developing such capabilities will give the cultures that do so significant competitive advantages over those that opt for stability rather than technological evolution. There are historical examples of this process -- for example, China, from roughly the 11th to the 14th centuries. At that time, China was the world’s most technically advanced society, but for a number of reasons its elite chose stability over the social and cultural disruption that development and diffusion of technologies (such as gunpowder and firearms) might have caused. Northern Europe, by contrast, followed a more chaotic path, including the Enlightenment and the Industrial Revolution, which favored technological evolution. The result: Eurocentric, not Chinese, culture forms the basis of today’s globalization.

Applying this lesson to current conditions raises the question of whether deep-green opposition to certain technological advances, especially genetically modified organisms, could halt technological evolution. Some societies -- Europe, in particular -- may choose stasis over evolution. But biotech is such a powerful advance in human capabilities that other societies -- especially developing countries with immediate needs that biotech can address -- are not likely to forgo its benefits. And to the extent their cultures become more competitive by adopting it, they may come to dominate global culture.

So is the answer, then, simply to give up and let technology evolve as it will? Not at all. In fact, the essential problem with an ideological opposition to technology is that it prevents precisely the kind of dialog between the environmentalist and technological discourses required to create a rational and ethical anthropogenic earth. For technologies are not unproblematic, and their evolutionary paths are not preordained; rather, they are products of complex and little-understood social, cultural, economic, and systems dynamics. It is important that they be questioned and understood.

The challenge is thus not unthinking opposition, or maintenance of ideological purity, or even meaningless repetition of ambiguous phrases such as “precautionary principle.” It is far more demanding. It is to learn to perceive and understand technology as a human practice and experience, and to help guide that experience in ways that are environmentally appropriate.


Brad Allenby is VP, Environment, Health and Safety at AT&T; an adjunct professor at Columbia University School of International and Public Affairs, Princeton Theological Seminary, and the University of Virginia Engineering School; and Batten Fellow at the University of Virginia Darden School of Business. The opinions expressed are those of the author, and not necessarily of any institution with which he is associated.
