Gideon Rose


In conversation with Foreign Affairs’ Gideon Rose, Helen Greiner, cofounder of iRobot and founder of CyPhy Works, addresses, among other things, the twin threats of technological unemployment and terrorism. An excerpt:

Question:

Old science fiction used to be filled with flying cars, jetpacks, and things like that. Will those eventually take advantage of the space you’re talking about, or will it be just drones?

Helen Greiner:

I believe that the technologies will come. We already have a lot of technologies in ground robots to sense and avoid things. I want to bring some of those technologies to the flying robot space. I think we can disambiguate the airspace; I see no reason why drones can’t share the airspace with man.

Question:

How will people be safe in a world in which drone technology has proliferated and drones become incredibly easy to purchase and operate?

Helen Greiner:

A terrorist could buy a drone today and start planning an attack with it, and I think the only way we’re actually going to catch that is with human intelligence. Terrorists aren’t going to get drones from a company building them for commercial reasons; they’re going to go to the hobby store and buy the ones that are already freely available, if they want to pack them with explosives. I think it’s a challenge. But you can do the same with a car, and you don’t say, ‘Well, we shouldn’t sell cars because you can use them in a suicide attack.’ All we have to do is figure out who’s going to be doing it and try to stop it.

Question:

You like the idea of a world full of robots, and a lot of people would agree if they helped them do things. But does that world full of robots have as many jobs for ordinary humans?

Helen Greiner:

Robots have been in place in factories for decades now, and jobs have changed, but there are still people in factories. Maybe there are fewer, but we’re able to produce more. If robots happen to make things more efficient, you want to be the place that has them. You can’t stop technology; the world’s going to continue to move forward. I would love to see [technological productivity] change the social contract and how people think of a full workweek—as four days of work or, later on, even three days—because there could be more quality time that people spend with their families.

Question:

Did the Roomba, by making people comfortable with the idea of robots in their homes, have a cultural significance beyond the economics?

Helen Greiner:

I believe it did.


From a Foreign Affairs interview that Gideon Rose conducted with roboticist Sebastian Thrun, a passage about the subject’s triumph in a 2005 driverless-car competition in the Mojave Desert:

Question:

Why did your project end up working so well?

Sebastian Thrun:

Many of the people who participated in the race had a strong hardware focus, so a lot of teams ended up building their own robots. Our calculus was that this was not about the strength of the robot or the design of the chassis. Humans could drive those trails perfectly; it was not complicated off-road terrain. It was really just desert trails. So we decided it was purely a matter of artificial intelligence. All we had to do was put a computer inside the car, give it the appropriate eyes and ears, and make it smart.

In trying to make it smart, we found that driving is really governed not by two or three rules but by tens of thousands of rules. There are so many different contingencies. We had a day when birds were sitting on the road and flew up as our vehicle approached. And we learned that to a robot eye, a bird looks exactly the same as a rock. So we had to make the machine smart enough to distinguish birds from rocks.

In the end, we started relying on what we call machine learning, or big data. That is, instead of trying to program all these rules by hand, we taught our robot the same way we would teach a human driver. We would go into the desert, and I would drive, and the robot would watch me and try to emulate the behaviors involved. Or we would let the robot drive, and it would make a mistake, and we would go back to the data and explain to the robot why this was a mistake and give the robot a chance to adjust.

Question:

So you developed a robot that could learn?

Sebastian Thrun:

Yes. Our robot was learning. It was learning before the race, and it was learning in the race.
