I've accepted an invitation to join the Scientific Advisory Board for the Lifeboat Foundation. This puts me in good company with Gregory Benford, David Brin, Aubrey de Grey, Ray Kurzweil, James Hughes, Robert J. Sawyer, Natasha Vita-More, my distinguished colleague at Monash University, J.J.C. ("Jack") Smart, and a bunch of other people so astonishingly eminent as to make me feel humbled by the offer (the list includes at least a couple of Nobel Prize winners).
The Foundation's website describes it in this way:
The Lifeboat Foundation is a nonprofit, nongovernmental organization, dedicated to ensuring that humanity adopts the powerful technologies of genetics, nanotechnology, and robotics safely as we move towards the Singularity. This humanitarian organization is pursuing all possible options, including relinquishment when feasible (we are against the U.S. government posting the recipe for the 1918 flu virus on the internet), and helping accelerate the development of defensive technologies including anti-biological virus technology, active nanotechnological shields, and self-sustaining space colonies in case the other defensive strategies fail.
I'm hedging my bets about the likelihood and imminence of the so-called technological singularity - when technological progress is supposed to soar straight upwards like a wall across the future. Yes, that's the sceptic in me coming out again. But I can relate to most of that statement.
Perhaps it's a bit ironic that I'm making this announcement just after blogging that I'm not afraid of the future, but this is clearly not an anti-technology group, and it seems to be going out of its way to include a range of positions. Even as we welcome the prospect of a strange future and massive technological advancement, it's good to have a high-powered think tank at work considering what the genuine dangers might be. If I can help in any small way, that's more than cool.