Over at the Huffington Post, R.J. Eskow blogged earlier this week about enhancement technologies and transhumanism, commenting specifically on my piece about Fenton and Habermas from last weekend.
Eskow's discussion - entitled "Homo Futurus: How Radically Should We Remake Ourselves - Or Our Children?" - is quite sympathetic, and Eskow makes clear his general inclination to side with individual liberty against social restrictions, though he also worries that this is not a simple matter, once children are involved. It's one thing to support individual liberty against the state, but what about when it is the liberty of parents versus the interests of children? He expresses the fear that so-called liberal eugenics could become libertarian eugenics.
I actually agree with the direction of this, though Eskow's discussion seems to place me in the "libertarian" category, which would be rather misleading. The mistake is perhaps understandable, because this category does, indeed, seem to be where a lot of transhumanist discourse used to belong, and much of it doubtless still does. For example, when I was subscribed to the Extropians mailing list a few years ago, many of the participants were forthright libertarians of the Ayn Randian variety, and saw no role for the state in restricting their proposals to alter themselves or their children, if the technology became available. This was, it seemed, a belief that we should have total freedom to pursue our personal transhumanist aspirations, no matter what the consequences for others or for society as a whole.
To try to avoid a new misunderstanding, I never saw claims that extreme made by the leadership of the Extropians group (I'm not, for example, attributing them to Max More), but they were, and probably still are, quite common at the grass-roots level.
Now to my own case. Whatever credentials I have as a transhumanist are actually rather blurred, though I've been willing to accept the label when it's applied to me. In some contexts, I've even been willing to apply it to myself as shorthand, or to express solidarity with my transhumanist friends - whose general attitudes to technology do strike me as more rational than those of their opponents. But there are always caveats, whether or not they are always stated.
I don't actually buy wholesale into some body of theory called "transhumanism"; I'm choosier than that, and I'm unconvinced of the merit of many of the specific proposals that emerge from more clearly self-identified transhumanists. Anyway, even if I do count as a transhumanist, this can only be because the words "transhumanism" and "transhumanist" now signify a much wider range of positions than the Randian techno-libertarianism that I associate with Extropian circles. Indeed, some of the leading figures in the World Transhumanist Association are left-wing political activists, as well as being interested in technologies that could alter human capacities.
I replied to the Huffington Post piece with a few comments to try to clear up any misconceptions about where I stand on all of this. What I say below is an edited/expanded version of what I wrote over there.
For a start, I don't think the choice is necessarily between liberal eugenics and authoritarian eugenics (which Eskow seems to have understood me as saying), where the latter refers to the government-mandated genetic programs of the past. While the distinction is an important one to make, there are many positions in between, and even liberal eugenics could take more or less "libertarian" forms.
Also, I'm really not fond of the word "libertarian" without qualifications attached to it. The attitudes that I take to some issues are libertarian in a popular sense, i.e. they put a high value on individual liberty. But I tend to think of myself as an Enlightenment or Millian liberal, rather than as a libertarian. "Libertarianism" sounds too much like the position of Rand, say, or Robert Nozick - positions that I repudiate. I don't belong to their "taxation is theft" camp. In practice, my economic positions tend to be centrist and pragmatic, but in principle I am prepared to support very large redistributions of wealth if needed to meet the kind of social democratic objectives that I favour. (Mill himself did not argue for a minimal state, though he did argue for freedom of speech, in particular, and against the use of the criminal law in the absence of fairly direct harms to others.)
The regulation of enhancement technologies that I'd want at the end of the day might not be all that libertarian, even in a popular sense. In the context of genetic engineering of children (the technology that Eskow focuses on), I do worry about micro-managing how kids turn out. In another context, I support Richard Dawkins in deploring religious indoctrination of young children, and I am no believer in absolute parental rights to control how kids turn out - irrespective of the interests of society at large or of the kids themselves. I also worry about impacts on democratic equality. Finally, I worry about issues of safety and efficacy.
However, I think it's important to criticise opponents of genetic technologies when their arguments overreach or when they start to favour overly broad restrictions. I especially think it's important to keep the heat on arguments that valorise "the natural". Attempting to improve on our evolved nature would be difficult, and should be approached with caution; we could go badly wrong here. But it is best, I think, if we refrain from arguments that appeal to the supposed inviolability of nature, or of a specific human nature. In any event, when poor arguments are put along those lines I'll continue to criticise them.