Commenting on philosopher Nick Bostrom’s new book, Superintelligence, Elon Musk compared superintelligent A.I. to nuclear weapons in terms of the danger it poses to humanity. From Adario Strange at Mashable:
“Nevertheless, the comparison of A.I. to nuclear weapons, a threat that has cast a worrying shadow over much of the last 70 years in terms of humanity’s longevity possibly being cut short by a nuclear war, immediately raises a couple of questions.
The first, and most likely from many quarters, will be to question Musk’s future-casting. Some may use Musk’s A.I. concerns — which remain fantastical to many — as proof that his predictions regarding electric cars and commercial space travel are the visions of someone who has seen too many science fiction films. ‘If Musk really thinks robots might destroy humanity, maybe we need to dismiss his long view thoughts on other technologies.’ Those essays are likely already being written.
The other, and perhaps more troubling, is to consider that Musk’s comparison of A.I. to nukes is apt. What if Musk, empowered by rare insight from his exclusive perch guiding the very real future of space travel and automobiles, really has an accurate line on the future of A.I.?
Later, doubling down on his initial tweet, Musk wrote, ‘Hope we’re not just the biological boot loader for digital superintelligence. Unfortunately, that is increasingly probable.’”