6/04/2006

"Technological Fundamentalism"

Take a look at this article, entitled "The Four Fundamentalisms," which purports to examine four ideologies, four non-thinking tendencies that pose a grave threat to the very existence of humanity.

The author, journalism professor Robert Jensen, extends the traditional meaning of "fundamentalism" beyond pertaining only to religion:

I want to use it ["fundamentalism"] in a more general fashion to describe any intellectual/political/theological position that asserts an absolute certainty in the truth and/or righteousness of a belief system. Such fundamentalism leads to an inclination to want to marginalize, or in some cases eliminate, alternative ways to understand and organize the world. After all, what's the point of engaging in honest dialogue with those who believe in heretical systems that are so clearly wrong or even evil? In this sense, fundamentalism is an extreme form of hubris, a delusional overconfidence not only in one's beliefs but in the ability of humans to know much of anything definitively. In the way I use the term, fundamentalism isn't unique to religious people but is instead a feature of a certain approach to the world, rooted in the mistaking of very limited knowledge for wisdom.

As an alternative he proposes a kind of epistemological humility, "an ignorance-based worldview". The opposition he paints here is similar to one I've adopted from Wm. James, between monism and pluralism. The monist is the individual who believes there is only one right way (his way, of course) in terms of belief, practice, self-identification, etc., while the pluralist seeks to, as I like to put it, "let the many flourish".

In both of these models, the idea is to acknowledge our limitations and be more accepting of individuals who arrive at different conclusions than we do (within reason). It's a matter of being open to new possibilities, seeing beyond a narrow perspective, exercising empathy towards those who differ from us.

Of these four threatening fundamentalisms, I'm with Jensen on the first three: religious fundamentalism, nationalism, and market fundamentalism. Fundamentalists generally suffer from an imagination deficit. Why should God be constrained to the limits imposed by a handful of ancient authors? Similarly, nationalists fail to see the arbitrariness of national boundaries (local allegiance, or loyalty to your community, is another thing entirely) and apologists for capitalism refuse to envision alternatives to a world economic order that leaves so many destitute and suffering.

We might debate whether these have possible positive manifestations, how severe the problems they pose really are, and so forth. But I'd rather turn to his fourth category, which has me a bit on edge: technological fundamentalism.

Most concisely defined, technological fundamentalism is the assumption that the increasing use of increasingly more sophisticated high-energy, advanced technology is always a good thing and that any problems caused by the unintended consequences of such technology eventually can be remedied by more technology. Those who question such declarations are often said to be "anti-technology," which is a meaningless insult. All human beings use technology of some kind, whether it's stone tools or computers. An anti-fundamentalist position is not that all technology is bad, but that the introduction of new technology should be evaluated on the basis of its effects -- predictable and unpredictable -- on human communities and the non-human world, with an understanding of the limits of our knowledge.

He goes on to criticize, in particular, nuclear and biotechnology, suggesting that we are dealing with forces far more powerful than we can control.

Now here, I'll grant that he makes a good argument. Much new technology doesn't actually increase happiness in the world, and it certainly doesn't offer people meaning. Labor-saving devices are quickly adjusted to, and individuals just call for more and more: faster, smaller, easier to use, more powerful, and so on.

But Jensen fails to consider the possibilities that could arise if we conducted our research toward different ends. Instead of developing new toys, we might try to better understand what makes humans happy, what gives them meaning, what makes their lives worth living. This kind of research has been done, and we're learning a lot.

It may well be foolish to think that the problems technology causes can be solved by more technology. It's like the mouse in your house: you buy a cat to get rid of it, but then the cat won't leave, so you have to get a dog, and on and on until you get to, I don't know, Voltron, whom you can never get rid of (h/t Poorman).

But should the solution be just to get used to the mouse eating your cheese? I really enjoy cheese, and maybe it gets worse when the cat starts drinking all my milk, or the dog starts eating my steaks, or Voltron blows up my refrigerator, and maybe that'll happen, but then maybe Voltron will just leave quietly after that? Surely crossing our fingers and hoping for the best as we do the exact same thing as before will work, won't it?

In all seriousness, I'm aware of these warnings about hubris and playing God and all that. But even though I consider myself a Green (I'm registered to vote as one), I've never bought into this whole anti-biotechnology business. Like, what's so bad about GMOs? Wouldn't it be a shame if we could feed more people? Oh, but we're "playing God"! We're creating "Frankenfood"! Isn't that scary? Let's just leave everything well enough alone.

The potential for good is just too great for me to turn my back on biotech. Of course we want to ask what ends we're pursuing, and I'm all about working toward sustainability. I just don't see that as incompatible with this research.

The fact of the matter is that biotech can not only help satisfy the world's basic needs but also directly increase human happiness. I'm not talking exclusively about antidepressants here, although they're nothing to sneeze at. But consider technologies like the one I wrote about not so long ago, which allows people to monitor their brain activity and develop new strategies for controlling pain. Or think about Haidt's work, and the research showing the positive value of things like habitual meditation and strong social networks in cultivating human happiness.

This should not be about turning our backs on technology because we're scared of what it can do. This should be about using technology more intelligently. Perhaps we should lengthen the period between lab result and consumer product, but by no means should we hold back the basic research.

Knowledge is power, and I see nothing wrong with remaking a cold and uncaring universe in our own image, in the way we think it should be. We should be careful not to destroy ourselves or the other inhabitants of this planet, but let us make sure that our caution is in the right places for the right reasons.
