So I gave my paper on "robocracy" last night. I think it went quite well. I got some great questions from my peers, who actually took it seriously, and in light of their criticisms, I think I'm going to abandon the idea.
(The basic premise: Plato's Republic w/ superintelligent machines serving as the guardians, i.e., rule by [hopefully benevolent] AI experts.)
I think I'm going to try to take a more "descriptive" approach to these issues. It's difficult enough to lay out what the future is going to look like. I want to wait and see what forms artificial life and intelligence take before I advocate making them part of the political structure.
Generally speaking, I think I'm going to put some of this AI stuff on the back burner, and focus more on the issue of human enhancement. I am more or less convinced now that ordinary, unmodified humans will go extinct this century. But human extinction is not necessarily a bad thing, especially if we are replaced by something better.
For a nice intro to the subject, I highly recommend this lecture by Michael Bess. He lays out some of the expectations and a lot of the relevant moral concerns. Even if you have no background at all on human enhancement, it should make sense (it was written for a popular [educated] audience).
If there is one thing I lament about losing confidence in the AI-ocracy schema, it's that now I come very close to being perceived as a "prophet of doom". I frankly don't see any way humanity can be saved. Our creations are already out of our control, so how could we possibly hope to prevent weakly godlike intelligences from destroying our species? Our best bet is probably to enhance ourselves, but that may result in a different kind of loss of humanity.
And I'm just not sure whether that's good or bad. So much of what humanity takes itself to be is simply the product of ignorance, chance, and wishful thinking. It's mere pretense, flight-of-fancy, romantic sentimentalism. I'm not even sure what I value about our species, except perhaps the ability to recognize this (that our reason has some power over our affects) and our creativity and innovation.
But these things are the province of a slim minority. On this logic, it makes no sense for me to be anything other than an elitist. I'm content to put the rest of humanity on a par with the bulk of mammalian life. That's not to say they don't warrant moral concern--we should not be cruel to animals--but they certainly don't deserve any reverence or respect. I suppose I shan't miss them when they're gone.