Dr. Anders Sandberg of the Future of Humanity Institute at Oxford just did one of the best Reddit AMAs I’ve ever read, a brilliant back-and-forth with readers on existential risks, Transhumanism, economics, space travel, future technologies, etc. He speaks wisely of trying to predict the next global crisis: “It will likely not be anything we can point to before, since there are contingency plans. It will be something obvious in retrospect.”
The whole piece is recommended, and some exchanges are embedded below.
_________________________
Question:
Will we start creating new species of animals (and plants, fungi, and microbes) any time soon?
What about fertilizing the oceans? Will we turn vast areas of ocean into monoculture like a corn field or a wood-pulp plantation?
When will substantial numbers of people live anywhere other than Earth? Where will it be?
What will we do about climate change?
Dr. Anders Sandberg:
I think we are already making new species, although releasing them into nature is frowned upon.
Ocean fertilization might be a way of binding carbon and getting good “ocean agriculture”, but the ecological price might be pretty big. Just consider how land monocultures squeeze out biodiversity. But if we needed to (say, to feed a population of a trillion), we could.
I think we need to really lower the cost to orbit (beanstalks, anyone?) for mass emigration. Otherwise I expect the first real space colonists to be more uploads and robots than biological humans.
I think we will muddle through climate: technological innovation will make us greener, but not before a lot of change happens – change which people will also get used to.
_________________________
Question:
What augmentations, if any, do you plan on getting?
Dr. Anders Sandberg:
I have long wanted to get a magnetic implant to sense magnetic fields, but since I want to be able to get close to MRI machines I have held off.
I think the first augmentations will be health related or sensory enhancement gene therapy – I would love to see ultraviolet and infrared. But life extension is likely the key area, which might involve gene therapy and implanting modified stem cells.
Further down the line I want to have implants in my hypothalamus so I can access my body’s “preferences menu” and change things like weight setpoint or manage pain. I am a bit scared of implants in the motivation system to help me manage my behavior, but it might be useful. And of course, a good neural link to my exoself of computers and gadgets would be useful – especially if it could allow me to run software supported simulations in my mental workspace.
In the long run I hope to just make my body as flexible and modifiable as possible, although no doubt it would tend to normally be set to something like “idealized standard self”.
It is hard to tell which augmentations will arrive when. But I think going for general purpose goods – health, intelligence, the ability to control oneself – is a good heuristic for what to aim for.
_________________________
Question:
What major crisis can we expect in the next few years? What will the world be like by 2025?
Dr. Anders Sandberg:
I am more of a long-term guy, so it might be better to ask the people behind the World Economic Forum’s global risks report (where I am on the advisory board): http://www.weforum.org/reports/global-risks-report-2015
One group of risks is economic trouble – a safe bet before 2025, since such troubles happen every few years, but most are not major crises. Expect some asset bubbles or deflation in a major economy, energy price shocks, failure of a major financial mechanism or institution, fiscal crises, and/or some critical infrastructure failures.
Similarly there will be at least some extreme weather or natural disaster events that cause a nasty surprise (think Katrina or the Tohoku earthquake) – such things happen all the time, but the amount of valuable or critical stuff in the world is going up, and we are affected more and more systemically (think hard drive prices after the Thai floods – all the companies were located on the same flood plain). I would be more surprised by any major biodiversity loss or ecosystem collapse, but the oceans are certainly not looking good. Even with the scariest climate scenarios things in 2025 are not that different from now.
What to look out for is interstate conflict with global consequences. We have never seen a “real” cyber war: maybe it is overhyped, maybe we underestimate the consequences (think something like the DARPA cyber challenge as persistent, adapting malware everywhere). Big conflicts are unfortunately not impossible, and we still have lots of nukes in the world. WMD proliferation looks worryingly doable.
If I were to make a scenario for a major crisis it would be something like a systemic global issue like the oil price causing widespread trouble in some unstable regions (think of past oil-food interactions triggering unrest leading to the Arab Spring, or Russia being under pressure now due to cheap oil), which spills over into some actual conflict that has long-range effects getting out of hand (say the release of nasty bio- or cyberweapons). But it will likely not be anything we can point to before, since there are contingency plans. It will be something obvious in retrospect.
And then we will dust ourselves off, swear to never let that happen again, and half forget it.
_________________________
Question:
As I understand it, regarding existential risk and our survival as a species, most if not all discussion has to happen under the umbrella of ‘if we don’t kill ourselves off first.’ Surely, as a man who thinks so far ahead, you must have some hope that catastrophic self-inflicted harm won’t spell the end of our race, or at least that it won’t set us back irrevocably far technologically. In your estimation, what are the immediate self-inflicted harms we face, and will we have the capacity to face them when their destructive effects manifest? Will the climate change to the point of poisoning our planet, will uncontrolled pollution destroy our global ecology in some other way, will nuclear blasts destroy all but the cockroaches and bacteria on the planet? It seems to me that we needn’t think too far ahead to see one of these scenarios come to pass if we don’t mount a globally concerted effort to intervene.
Dr. Anders Sandberg:
I think climate change, like ecological depletion or poisoning, is unlikely to spell radical disaster (still, there is enough of a tail to the climate change distribution to care about the extreme cases). But these problems can make the world much worse to live in, and cause strains in the global social fabric that make other risks more likely.
Nuclear war is still a risk with us. And nuclear winters are potential giga-killers; we just don’t know whether they are very likely or not, because of model uncertainty. I think the probability is way higher than most people think (because of both Bayesian estimation and observer selection effects).
I think bioengineered pandemics are also a potential stumbling block. There may not be many omnicidal maniacs, but the gain-of-function experiments show that well-meaning researchers can make potentially lethal pathogens, and the recent distribution of anthrax by the US military shows that amazingly stupid mistakes do happen with alarming regularity.
See also: https://theconversation.com/the-five-biggest-threats-to-human-existence-27053
_________________________
Question:
I have trouble imagining how our current economic structure could cope with tens of millions of driver/taxi/delivery jobs going.
The economic domino effect: inability to pay debts/mortgages, loss of the secondary jobs they were supporting, a fall in demand for goods, etc.
It seems like the world has never really got back to “normal” (whatever that is anymore in the 21st century) after the 2008 financial crisis – and never will.
I’m an optimist by nature; I’m sure we will segue and transition into something we probably haven’t even imagined yet.
But it’s very hard to imagine our current hands-off, laissez-faire style of economy functioning in the 2020s in the face of so much unemployment.
Dr. Anders Sandberg:
Back in the 19th century it would have seemed absurd that the economy could absorb all those farmers. But historical examples may be misleading: the structure of the economy changes.
In many ways laissez-faire economics works perfectly fine in the super-unemployed scenario: we just form an internal economy, less effective than the official one sailing off into the stratosphere, and repeat the process (the problem might be if property rights make it impossible to freely set up a side economy). But clearly there is a lot of human capital wasted in this scenario.
Some people almost reflexively suggest a basic income guarantee (BIG) as the remedy to an increasingly automated economy. I think we need to think much more creatively about other solutions; the BIG is just one possibility (and might not even be feasible in many nations).
_________________________
Question:
What is the most defining characteristic of transhumanism as an idea in the 10s compared with the 00s?
Dr. Anders Sandberg:
Back when I started in the 90s we were all early-Wired style tech enthusiasts. The future was coming, and it was all full of cyber! Very optimistic, very much based on the idea that if we could just organise better and convince society that transhumanism was a good idea, then we would win.
By the 00s we had learned that just having organisations does not mean your ideas get taken seriously. Although they were actually taken seriously to a far greater extent: the criticism from Fukuyama and others actually forced a very healthy debate about the ethics and feasibility of transhumanism. Also, the optimism had become tempered post-dotcom, post-9/11: progress is happening, but much more unevenly and slowly than we may have hoped for. It was by this point that the existential risk and AI safety strands came into their own.
Transhumanism in the 10s? Right now I think the cool thing is the posttranshumanist movements like the rationalists and the effective altruists: in many ways full of transhumanist ideas, yet not beholden to always proclaiming their transhumanism. We have also become part of institutions, and there are people that grew up with transhumanism who are now senior enough to fund things, make startups or become philanthropists.
_________________________
Question:
Which do you think is more important for the future of humanity: the exploration of outer space (planets, stars, galaxies, etc.), or the exploration of inner space (consciousness, intelligence, self, etc.)?
Dr. Anders Sandberg:
Both, but in different ways. Exploration of outer space is necessary for long term survival. Exploration of inner space is what may improve us.
Question:
What step would you take first? Would you first discover “everything” or as much as possible about inner space, or outer space?
Dr. Anders Sandberg:
I suspect safety first: getting off-planet is a good start. But one approach does not preclude working on the other at the same time.