I'll be reviewing this book formally elsewhere, but it's worth also drawing attention to it here. Agar is an unlikely recruit to the bioconservative ranks, having previously written a book that offers a cautious defence of human enhancement technologies. On this occasion, however, he's produced a lucid critique of what he calls "radical enhancement". As I'll be pointing out in my review, he may not have changed position, exactly, as he's defined radical enhancement fairly narrowly, as involving "improving significant human attributes and abilities to levels that greatly exceed what is currently possible for human beings."
We've also published a lengthy review of this book, by Jamie Bronstein, over at The Journal of Evolution and Technology.
The arguments that seem dearest to Agar's heart, and will, I think, merit most discussion, involve the claim that a posthuman life would be impoverished by human standards (not by the standards of the posthumans themselves, who may be perfectly happy with their lives, or by some kind of objective standard that is inescapably binding on all rational beings). This raises issues about whether such human standards exist and, if so, whether they are the ones we should apply.
Agar wants us to adopt a "species-relativism" about values, which does not necessarily mean that we should be speciesist. Our species-relative values may turn out to involve concern for the interests of non-human animals. However, they may also involve a desire to preserve various kinds of relationships, approaches to life, social institutions, and so on, that would (arguably) have no appeal to posthumans, with their indefinitely long lives and/or vastly augmented, ever-developing intellects. To preserve these things and lead lives that we ourselves see as valuable, we will need to stay human.
I don't necessarily agree with all, or any, of this approach, and will say more about why in my review, but Humanity's End is certainly an important contribution to current debates about enhancement, emerging technologies, and transhumanist philosophies.
I find many of your posts very intriguing or enlightening, but I have a hard time understanding transhumanism, or rather your obsession with it. If we change, we change. If we don't, we don't. If a prosthetic eye gets invented, those rich enough will buy it, and the poor won't be able to afford it, as it always was. So, relax.
What use is there in cheer-leading inventions that are going to come up anyway, or in excitedly anticipating some that most likely never will (because they are impractical or impossible)?
And the much more pressing issue at the moment seems to be that we are in the process of shooting ourselves in the foot so spectacularly that a failure to develop "transhumanist philosophies" will be the least of our future problems. Unless we solve certain issues of overpopulation, climate change, soil degradation / salinization / erosion, groundwater depletion, habitat fragmentation, deforestation, biodiversity loss, waste, increasing social inequality, and dependence on all manner of non-renewable resources from oil to rare earths, the question of whether we would have been intellectually prepared to deal with mind uploading or brain implants, had our society not already disintegrated into a hellhole of mass emigration, hunger riots, war, and economic, political and societal collapse around the time of their invention, will remain entirely academic.
Why use a loaded word like "obsession"? We all have our interests, and we are entitled to think about them and blog about them without being accused of being, in effect, obsessed. Considering that I wrote a PhD thesis on the regulation of enhancement technologies, you can be confident that I will continue to blog about related subjects.
I don't consider these issues to be trivial. In fact, I think they often provide important test cases for the principles that apply to the development of public policy in a liberal democracy (which was a theme of the PhD thesis). I'm horrified at many of the developments that we saw in this area in the light of the 1997 announcement of Dolly. And I'm not impressed when someone tells me not to be interested in X because there are all these other things, W, Y, and Z to be interested in. If the subject of enhancement, etc., doesn't interest you, just skip those posts. Likewise if I blog about Thomas Pynchon, or cats, or the X-Men franchise, or any other of my pet (no pun intended re the "cats" one) interests.
Sheesh! It's not as if this is a one-trick-pony blog. I write about a variety of things that interest me and, when it comes to the serious topics, things that I think I know enough about to make a contribution to public discussion. If there are things that you want covered that are not covered here it means either I'm not interested or, more likely, that I think there are lots of other people who are much more expert than I am. Those people's blogs aren't hard to find. I'm sure you could find many environmental science blogs, for example.
Meanwhile, I'm not going to pretend to be someone I'm not. This blog is me. Take it or leave it.
Sorry if that annoyed you. Of course I can take it or leave it, but this was a serious question. I do not understand the whole idea of having to have a transhumanist movement, or having to defend transhumanist ideas, any more than I understand the need for a movement to support the continued rotation of the planet, but many seemingly intelligent, reasonable and educated people do seem to feel that it is important, and so I may ask some of them what it is good for.
'a failure to develop "transhumanist philosophies" will be the least of our future problems'
It never fails to amaze me how often the argument from priority gets raised in the context of enhancement technologies, rather than in response to resources being expended in other non-planet-saving directions. Is it a ridiculous indulgence to spend money on the arts, for example, while millions die of malaria? Arguably ... yet rarely argued.
As to your argument from inevitability: even if I were to accept the fatalistic claim that certain techno-innovations either will or won't happen, regardless of our efforts, when and how they come about are surely variables worth considering. If radical life-extension is delayed by a couple of generations, that's potentially a few million dead who might otherwise have been alive, sharing extra time with their loved ones.
I'm posting some reflections on this book at the moment. They're written as I complete the various chapters, and without the benefit of having read the whole thing. Somebody might be interested:
The Species-Relativist Argument: An Introduction
The Species-Relativist Argument: The Boundary Line
I'm going to have more up in a couple of days.