7/09/2007

Problems with Technology

In preparation for a paper on Critical Theory and the critique of "Instrumental Rationality", I've been reading a number of articles in the philosophy of technology. As might be suspected from a discipline that has virtually eschewed any technological innovation--aside from the word processor, I can scarcely think of any concrete examples--most of them are critical, with some even celebrating their "Luddism".

Just today, I came upon a review of yet another book against genetic engineering, this one by philosopher Michael Sandel, whom I think I used to like. Called "The Case Against Perfection", it's a typical response to a certain class of problems that I'll explicate below. At least his critique seems somewhat novel, according to the reviewer, who calls it "half right".

Reading these various pieces has helped to clear up, in my own mind, what my general advocacy of technology more specifically entails, and where I am in agreement with its critics. To that end, I've been experimenting with a simple typology of problems, which I present in a draft form now.

I see two basic categories of problems, which I call "Problems of Destruction" and "Problems of Transformation". With respect to the former, I am on the same page with the most rabid of Luddites; it is the latter that I am less inclined to think of as problematic. Of course, these categories are not likely to be exhaustive--I can think of some examples which seem to fit into neither--nor mutually exclusive; they are simply schematic.

I) Problems of Destruction, as might be inferred, are those that deal with issues of survival. These include various threats to individuals, species, the environment, civilization, even life itself.

Certain fields like the recently emergent "synthetic biology"--a name I only came across days ago, but which already seems ubiquitous on the net--along with other sectors of biotechnology and nanotechnology have the potential to unleash devastation on a par with a nuclear holocaust (but without all that messy radiation). The engineering of a super-virus not found in nature, or something like the infamous "gray goo" scenario (in which self-replicating nanobots are let loose and convert the entire biosphere into copies of themselves), would be examples of this.
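To see why the gray goo scenario alarms people, it helps to run the numbers on self-replication. Here's a quick back-of-the-envelope sketch in Python; every figure in it (biosphere mass, nanobot mass, doubling time) is a round, hypothetical assumption of mine, not a number from any study:

    import math

    # All numbers below are illustrative assumptions, not measured data.
    biosphere_mass_kg = 2e15    # rough order of magnitude for Earth's total biomass
    replicator_mass_kg = 1e-15  # a hypothetical femtogram-scale nanobot
    doubling_time_hours = 1.0   # a hypothetical replication period

    # Starting from a single replicator, the population doubles each generation,
    # so consuming the biosphere takes log2(biosphere / replicator) doublings.
    doublings = math.log2(biosphere_mass_kg / replicator_mass_kg)
    print(f"doublings needed: {doublings:.0f}")  # ~101
    print(f"time to consume: {doublings * doubling_time_hours / 24:.1f} days")  # ~4 days

The point is simply that exponential replication makes the timescale almost indifferent to the assumptions; even a doubling time of a full day yields collapse in months, not millennia.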

(Artificial intelligence and robotics pose different kinds of threats, such as the replacement of human beings, which some consider instances of the other class of problems. Insofar as an artificially intelligent civilization might mitigate the risk of other kinds of destruction, I find myself potentially sympathetic. This is an area I'll have to return to.)

Less destructive examples include pollution and other industrial processes that contribute to global warming, as well as more localized contamination. The development of biofuels--undoubtedly the stupidest way of trying to resolve the energy crisis, and so, unsurprisingly, the one championed by our president--poses significant dangers that have largely gone unrecognized. According to a recent report, human beings already consume a quarter of nature's productive capacity, a figure that would only worsen if fossil fuels were simply exchanged for biofuels.
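A crude order-of-magnitude sketch shows why. The numbers below are round assumptions of mine, not figures from the report, but they convey the shape of the problem:

    # All figures are round, illustrative assumptions, not from the cited report.
    npp_energy_ej_per_year = 2000.0    # rough global terrestrial net primary production
    current_human_share = 0.25         # the "quarter" figure mentioned above
    fossil_energy_ej_per_year = 400.0  # rough global fossil fuel consumption
    conversion_efficiency = 0.5        # hypothetical biomass-to-fuel conversion loss

    # Replacing fossil fuels with biofuels means harvesting enough additional
    # plant matter to cover fossil demand after conversion losses.
    extra_share = (fossil_energy_ej_per_year / conversion_efficiency) / npp_energy_ej_per_year
    total_share = current_human_share + extra_share
    print(f"additional share of NPP appropriated: {extra_share:.0%}")  # 40%
    print(f"total human share of NPP: {total_share:.0%}")              # 65%

On assumptions like these, a wholesale switch would push the human share of nature's productive capacity from a quarter to well over half.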

Threats to the survival of human beings and other life would be taken seriously by all but the most misanthropic. Efforts like the green energy movement and oversight of the most dangerous areas of research would be in our best interest. Those who resist these measures are simply not examining long-term consequences. Unfortunately, some major corporate players fall into this camp, a consequence of our brand of capitalism, which is incapable of looking ahead more than a couple of years, focused as it is on this quarter's earnings and whatnot.

II) Problems of Transformation are more problematic in their problematicity. (As ugly as that last sentence is, it conveys what I want to say tersely.) Sandel's diatribe against genetic engineering is but one of a broad range of examples. In essence, what I'm calling "transformations" involve significant alterations to established ways of life, some more profound than others.

Since at least the industrial revolution, we have undergone a number of significant transformations. We live very differently than did our ancestors. The more conservative elements of society are likely to lament this as a loss, but most people are happy to call this "progress". I use the more neutral term "transformation" to avoid overt bias, even though I tend more often than not to fall into the latter camp. (Also, it's foolish to view a change as progressive simply in virtue of the fact that it is novel.)

The largest areas of concern today seem to be the potentially radical transformations to human beings that come from "enhancement technologies" such as genetics and cybernetics. Many critics contend that the blurring of boundaries occasioned by such interventions threatens our "humanity", "dignity", "meaning", or whatever other romantic buzzword one cares to use. At the very least, I grant them that not all "enhancements" will necessarily be improvements.

The more sophisticated critics realize that humans will (by and large) adapt to changes and take them for a new "normal"--and see this as part of the problem. However, it is difficult for such critiques to find purchase; either they rest on some dubious metaphysical ground, or they rely on the equally dubious strategy of taking certain characteristics of human beings--like the way that certain things disgust or frighten us--as essential. Quite frankly, I think "postmodern" intellectuals have no basis for criticizing transhumanism except their individual prejudices, which are only valid to those who share them.

Other examples of this would include the widening of the gap between technological elites and those without access. I'm inclined to think that such partitions are more a function of capitalism than a necessary consequence of technology. In fact, I see no way of finding positive alternatives to capitalism without significant technological change. Proponents of technology like to see this gap as more of a "lag"; the poor eventually do get access, as can be seen, e.g., in the spread of cell phones in the developing world.

I think that this Transformation category probably requires some greater specification, since it covers such a broad range of issues. What's important to note is the way in which such transformations tend to affect not merely our material circumstances, but also our beliefs, attitudes, and structures of meaning. The latter is what is most scary to people, but as an anti-essentialist, I am unconcerned.

Social critics will always find ways to complain of deficits of meaning in a society; conservatives will always long for the good old days that never were; but most people will adapt. I see clinging to old ways of life as a consequence of a couple of factors. Often, it's just greater fear of unknown evils than of known ones. When it's opposition to ostensible improvements, it's a way for people to feel better about the unnecessary suffering that they had to endure ("suffering is just a part of life!", "pain is what gives human existence meaning", "our imperfections constitute our humanness", and other such drivel).

Most likely, the intelligent inhabitants of earth a century from now (if there are any) would not appear "human" to most people today. As for me, I see no reason to cling to an evolutionary accident. What matters are things like rich experience, intelligence, reason, happiness, meaningfulness, benevolence, and so forth. Whether or not such beings think of themselves as "human", perhaps as some nostalgic sentiment, is to me entirely inconsequential. (I think an extension of the category of speciesism would be appropriate here.)

1 comment:

Anonymous said...

Hi, I posted here some time ago, lost the link, and just refound it. Sorry for the absence.

As someone who is involved in the philosophy of technology, I'd say that you've hit the Destructive and Transformative camps fairly well. That said, which camp a particular issue or idea falls into isn't always clear-cut - the digitization of communication networks, for example, allows for the acceleration of capital migrations and their subsequent effects. In addition to this 'destructive' effect, it is also a transformative problem - digitization offers the hope of more comprehensive global communication, though the 'lag' effect for spreading technology will likely prevent a global communications network unless it is seen to be profitable (and really, how profitable is it, from a business perspective, for citizens of Ethiopia to be able to talk to Americans or Canadians?).