6/28/2011

My Two Politics

It's been so long since I've tried to write anything, but I'm going to give it a shot. I want to give expression to an idea I've joked about but never really examined seriously.

I have started telling people that I have two sets of political views. The first is largely critical (anti-war, anti-imperialist, anti-corporate, anti-authoritarian, etc.), which I think of as my "serious" or "realistic" position. The second, my "half-serious" or "fantastical" politics, consists of what I like to call "all the robot stuff". The latter has considerably more positive content: a mix of trans- and posthumanism, technocracy, robocracy, and various other futurist ideologies.

Rarely do the two conflict. When I read political news, especially concerning US foreign policy, the glimpse of reality (of all the suffering and bloodshed our country's government inflicts) activates my critical faculties (and my sense of disgust), jarring me into a "serious" state of mind.

But I can only take so much of this, so I take refuge in techno-utopias, where a just AI rules gently over a transhumanist populace, or--in my darkest fits of misanthropy--in a post-human world in which "the robots have won" and the cancer of humanity has been surgically excised.

The first perspective is rooted firmly in the present and in my knowledge, limited as it is, of history. The second jumps around from one possible future to the next, often with little or no connection to today.

So, the two rarely conflict. But as we inexorably head into the future, as the technological advancements I pine for increasingly become reality, the points of contact increase.

For instance, in a recent status update on a popular social networking site which shall remain nameless, I criticized Obama's fondness for drone strikes--"flying death robot attacks". When challenged on this, I attempted to justify the contradiction by saying that I was against these robots slaughtering humans because they were mere pawns of the US government, but that an autonomous AI like Skynet doing its thing would be totally cool.

This was a joke. Or was it? Sometimes I'm not even sure myself, which is why I call this view "half-joking". Though I am strongly opposed to war, I am not a principled pacifist, nor do I believe life is "sacred" or that humans have intrinsic rights to life, liberty, etc. Rights are legal constructions, while so-called "moral" rights are in my view mere social constructs.

The powerful create "rights" and bestow them upon themselves. They extend those rights to others only when they are themselves compelled by sufficient force. If AIs still had any need for such crass rationalizations, they could persuasively argue that, as superior beings, they have the right to use humans just as humans have used non-human animals (not to mention other human beings who were conveniently classified as "non-human"). The argument would be compelling not simply because of its consistency, but also because the robot would shoot you with its laser if you tried to disagree.

I see now. I am able to have these two contradictory politics because, at heart, I don't really believe in anything.

Ah, but that's hyperbolic! That's like calling myself a "nihilist"--which I also do. But that can't be right. If nothing else, implicit in the decisions I make are patterns of value. While I may want to explain those away as accidental emotional associations, there are limits to how far you can take a third-person perspective on the operations of your own mind.

---

The hardest part about blogging, I now recall, is ending a post. This is especially difficult for me because in the past I often went in not knowing completely what I was going to say, adopting a stream-of-consciousness writing style and attempting to give voice to the various interlocutors in my ongoing internal dialogue. (I mean that in a non-crazy way. I know the voices in my head are all just me.)

I imagine that since I have been away for so long from disciplined writing (in which you must prepare an argument and have at least a mental outline of your paper), my proclivities for free-ranging thought and improvisational writing have only grown stronger.

So, that's going to mean an abrupt ending every now and then.