There is a fascinating premise underpinning Steven Levy’s Backchannel interview with Jerry Kaplan, provocatively titled “Can You Rape a Robot?”: AI won’t need to become conscious for us to treat it as such, for the new machines to require a very evolved sense of morality. Kaplan, the author of Humans Need Not Apply, believes that autonomous machines will be granted agency if they can merely mirror our behaviors. A sufficiently advanced simulacrum will be enough. The author thinks AI can vastly improve the world, but only if we’re careful to make morality part of the programming.
Well by the end of your book, you’re pretty much saying we will have robot overlords — call them “mechanical minders.”
It is plausible that certain things can [happen]… the consequences are very real. Allowing robots to own assets has severe consequences and I stand by that and I will back it up. Do I have the thing about your daughter marrying a robot in there?
That’s a different book. [Kaplan has a sequel ready.] I’m out in the far future here, but it’s plausible that people will have a different attitude about these things because it’s very difficult not to have an emotional reaction to them. As they become more a part of our lives, people may very well start to inappropriately imbue them with certain points of view.