I've begun reading Ray Kurzweil's most recent book The Singularity Is Near. I'm tempted to call the man a visionary; his view of the world is profound and startling. It's got me thinking about some issues I've seen raised elsewhere, and I've come around to some political conclusions that are, to say the least, not mainstream.
In contemporary politics, a two-dimensional model is sometimes proposed for plotting political views. Wikipedia has a nice discussion of the diversity of views on what constitutes the political spectrum. Many have seen the inadequacy of the left-right model and proposed alternatives. One of these, known as the "political compass" model (discussed in detail here), is fairly representative of two-axis accounts, so I will briefly discuss it.
The two axes are the social and the economic. On the economic side of things, you find a range of views from hardcore libertarianism and support of laissez-faire capitalism, on the right end, to various forms of communism and socialism, on the left. The range of views on economics is of course more complex than this, but it offers a rough guideline by focusing on the level of interference in an economy by the state or some other collective.
On the social axis, views range from authoritarianism and the worship of tradition, on the right (actually, the top of the compass), to what they call libertarianism, with its emphasis on personal freedoms, on the left (the bottom). (It's probably worth distinguishing between economic and social libertarianism, as well as the political group that calls itself libertarian, whose members are often both.) The relevant characteristic here is a measure of permissiveness: how much is left up to individuals to choose for themselves.
I trust these two axes are fairly straightforward. However, a third dimension is emerging in 21st-century politics, and it will become extremely prominent very soon. My suggestion is that it warrants its own axis, since it does not map well onto either of the other two. I think the easiest way to distinguish the ends is to call them technoconservative and technoprogressive, though you might also describe the range as running from Neo-Luddite to Transhumanist. On either end, you find unlikely allies.
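To make the proposal concrete, here is a minimal sketch in Python (purely my own illustration: the -10 to +10 scale, the sign conventions, and the names are assumptions, not anything taken from the compass itself) of how a position might be plotted on three axes rather than two:

```python
# Illustrative only: a political position as a point on three axes --
# economic (left/right), social (libertarian/authoritarian), and the
# proposed techno axis (technoconservative/technoprogressive).
# The -10..+10 range mirrors the two-axis "political compass" convention;
# the third axis is this post's addition.
from dataclasses import dataclass

@dataclass
class PoliticalPosition:
    economic: float  # -10 = collectivist left ... +10 = laissez-faire right
    social: float    # -10 = libertarian ... +10 = authoritarian
    techno: float    # -10 = Neo-Luddite ... +10 = Transhumanist

    def labels(self):
        return (
            "economic left" if self.economic < 0 else "economic right",
            "socially libertarian" if self.social < 0 else "socially authoritarian",
            "technoconservative" if self.techno < 0 else "technoprogressive",
        )

# Two positions can agree on the first two axes and still sit at opposite
# poles of the third -- which is the argument for a separate axis.
a = PoliticalPosition(economic=-4, social=-6, techno=8)
b = PoliticalPosition(economic=-4, social=-6, techno=-8)
print(a.labels())  # ('economic left', 'socially libertarian', 'technoprogressive')
print(b.labels())  # ('economic left', 'socially libertarian', 'technoconservative')
```

The toy example makes the point of the paragraph above: the first two coordinates can be identical while the third differs completely, so the technology question cannot simply be folded into the existing axes.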
On the technoconservative side, you find environmental activists, religious fundamentalists, leftist academics (much to my chagrin; it makes me wish Brave New World had never been written), and groups like the Amish. On the technoprogressive side, you have avowed transhumanists, technophiles of various sorts, avid consumers of gadgets and doodads, and even the occasional social conservative who sees technology as a means of "curing" homosexuality or as a great way to enforce law and order.
I view our culture as far more technoconservative than technoprogressive, in spite of all the money we put into scientific and technological research. Yes, there is a sort of contradiction here, given our worship of progress and our insatiable consumer culture, but by and large Americans are wary of new technologies, particularly biotechnologies like genetic engineering. Contrast us with nations like Japan, India, or Thailand, and you find far less support here for, say, germline genetic augmentation, which many westerners are inclined to dismiss as scary "eugenics".
Carl Elliott has documented our ambivalent relationship with human enhancement in Better Than Well, a text which I use in my intro philosophy class and one I highly recommend. What you find, Elliott suggests, is a tension in thinking about human identity between notions of authenticity and being true to yourself, and those of self-improvement and being all that you can be. And, in truth, this issue of technology is largely one of identity, although in a different sense than the trendy topic found in academia today.
The fact of the matter is that we will very soon be dealing with beings of our own creation who match and then exceed human capabilities. The extent to which this happens depends in part on how long the Luddites are able to hold off what I see as inevitable (inevitable, that is, so long as we do not destroy ourselves first, which is actually quite likely).
But if the technoprogressives win out, we will see robots, genetically enhanced humans, and humans who voluntarily integrate themselves with machines (cyborgs, as they are often called), but also computerized intelligences without bodies, properly speaking, intelligences realized in new forms like molecular computing that call into question the living/non-living distinction, and all the hybrids of these various types. Imagine the world of the "X-Men" combined with that of "Ghost in the Shell". Add in nanotechnology and things get even wackier--and all of it happening at once.
Very soon, people will have no clear idea of what counts as human or who counts as a person deserving of legal and moral rights. In this era, the technoconservatives (or at least the social conservatives among their number) will probably restrict "humanity" to the unenhanced and non-cybernetic. This group, which I like to think of as the Neo-Amish, may constitute a substantial portion of the population at first. But ultimately, they will be unable to compete with their enhanced brethren (and good riddance!).
This picture of the world scares many people. It runs counter to our understanding of the equality of persons; when you talk about improving humans, you are talking about making some better than others. It essentially destroys the traditional human world. I think that it is more than likely--I'd say even 90% certain--that human beings, as we are constituted now, will be extinct by 2100. Either we will have destroyed our species, or we will have moved beyond it.
This is, as I see it, the last century of humanity. Because I far prefer augmenting humanity to eliminating it, I strongly support unimpeded technological development in the GNR (genetics, nanotechnology, robotics) fields. Now, some people oppose that development out of the same concern, since they see it as increasing the likelihood of human extinction, whether by some super-virus, a robot rebellion, or the destruction of life by omnivorous nanobots (the "gray goo" scenario). And to an extent, this is true.
However, I see a far greater threat in the idiots who control our nuclear weapons. One of the problems with unenhanced humanity is our aggressive and tribal impulses, which incline us to hate those different from us and to go to great lengths to destroy them. The sooner the enhanced and the machine intelligences take over the world, as far as I am concerned, the better. In short, for those of you who know the film, I see "I, Robot" as having an unhappy ending.
Now, this is a radical view politically. It means parting ways with many democratic ideals. But the more I read about the exponential growth of technologies, the more inevitable it seems. Yes, we may ban a lot of biotech, but the change is coming faster than we can respond to it. Ultimately, I think it is the machines that will triumph, because people will view advances in computing and robotics as relatively innocuous, right up until robots and other machine intelligences become commonplace.
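To give a sense of the arithmetic behind that claim, here is a toy sketch in Python (my own illustration, with made-up numbers: the two-year doubling period is a Kurzweil-style assumption, and the fixed-step "response" growth is there only for contrast):

```python
# Toy arithmetic, not a forecast. Assume a technological capability that
# doubles every two years, while the capacity to deliberate, legislate,
# and respond grows only by a fixed step per period.
capability = 1.0   # arbitrary starting unit
response = 1.0     # same arbitrary unit
for year in range(0, 21, 2):
    print(f"year {year:2d}: capability {capability:6.0f}x   response {response:4.1f}x")
    capability *= 2    # doubles each two-year step
    response += 1.0    # fixed increment each step
```

After twenty years the capability line sits near 1000x while the response line sits near 10x, which is the sense in which bans and regulations lag the very thing they are meant to govern.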
Before long, recognizing their superiority, the machines and their cybernetic allies will sweep the unenhanced out of power--hopefully nonviolently--and usher in a new era of peace and cooperation. My guess is that this would happen around mid-century. It behooves those of us with any sense to join them.
All this sentimental piffle about the flaws and sufferings that make human beings special should be recognized for the garbage that it is. Nietzsche articulates it perfectly: the one thing we cannot accept is the notion that our suffering is meaningless. Well, folks, like it or not, it mostly is. It's the product of blind chance, of complex natural processes that lack foresight, of the various idiocies that human beings foist upon themselves. A world with as little suffering as possible would be a far better world, and this will (I hope) be the world of the future, in which intelligence wins out over other natural forces.
Most of you who read this, though I know few do, will likely regard me as insane, or at least deluded. I say time will tell. Predicting the future is a difficult business, and I could well be wrong. However, one thing I am quite confident in is my normative stance: human beings are ultimately unfit to govern themselves. You can arrange societies in only so many ways, some preferable to others, but like a mosaic made out of dog shit, the substance with which you work puts serious limitations on how beautiful the finished product can be.
In the past, this view was simply a means for one group to gain power over another: we are "the well born", "the best men", the ones who speak directly to the gods, and so on. But in the future, the pretense will become a reality. Let there be democracy amongst the enhanced, but most human beings have no idea what is good for them. Certainly, the people are not equipped to recognize who among them would serve as the best leaders--George W. Bush is but one obvious example.
We should be honest with ourselves. Frankly, I see myself as lacking in these respects--certainly I lack the intelligence and wisdom to see how the world ought to be governed--but, really, so do all human beings. For too long, we have been at the whims of fortune; now that we have the possibility of becoming masters of our fate, it is incumbent upon us to do so.
I find myself with a new purpose, as an academic and a philosopher, which is to sell this vision of the world, to encourage people to put effort into making our radically different future as utopian as possible. More and more are waking up to the importance of the issue, but far too many react to it with revulsion. I hope to change that. In any case, this is really the defining issue of our time. Recognize it.
2 comments:
Dom
You come to many of the same conclusions here that I come to in Citizen Cyborg (Westview Press, 2004). Check it out - I think you'll enjoy it.
------------------------
James Hughes Ph.D.
Executive Director, Institute for Ethics and Emerging Technologies
http://ieet.org
Editor, Journal of Evolution and Technology
http://jetpress.org
Public Policy Studies, Trinity College
http://internet2.trincoll.edu/facProfiles/Default.aspx?fid=1004332
Williams 229B, Trinity College
300 Summit St., Hartford CT 06106
(office) 860-297-2376
director@ieet.org
The third axis exists already, but it's not defined by an attitude toward technology. Those attitudes will compartmentalize nicely within the rather specious "political compass."
I call the "political compass" specious because it is limited and it fetters a responder by the very text of the questions it asks to divine one's "politics". The questions have serious confirmation bias problems. They seek to put one into one of the 4 quadrants. They do not seek considered, honest answers. For if they did, the authors of the questions and the Political Compass would find their model severely unrealistic and quite totalitarian, even dictatorial in approach.
As to your discussion of the supposedly "enlightened" view of Asian nations toward tampering with the human genome, I submit that your paucity of scientific background is harming you here. As a philosopher you must be aware of the fact that we have not divined just exactly what comprises the "engine of life" -- that which causes life to originate and sustain itself.
We know that humans need air, water and food, and shelter from extreme conditions as well as predators.
We do not know what compels a sperm and egg to join.
We do not know what causes the joined sperm and egg to begin its developmental process.
And yet you and others feel comfortable mucking about with genetic coding, even though we DO NOT KNOW what will happen?
We're not talking about inert chemicals being combined to create, for example, the surprising explosion when elemental sodium encounters water.
We're talking about potential horrific mutation, and the power to do it.
I suggest you study mutagens and teratogenicity.