So, Saturday was the Urban Challenge final. (Nice BBC writeup with some video clips here.) Like I said, this is huge.
Some background is important for appreciating this.
The first Grand Challenge was in 2004. The course was 150 miles of static desert terrain. The best result that year was CMU's, whose vehicle made it a bit over 7 miles before driving into a berm and catching fire.
The second was in 2005. The course was pretty much the same as before. 5 teams completed the entire thing.
This year was the first Urban Challenge. As the name implies, the course was in an urban setting. All the vehicles were on it at once, working to complete several minigoals of the form 'get from A to B'. There were also another 30+ vehicles being driven around the course by humans. Not only did the autonomous vehicles have to avoid crashing into anything, they also lost time for violating the California driving code. They had 6 hours to complete their goals, for a total of about 60 miles of driving.
6 teams finished in its first year. This is huge. Over 3-4 years we've gone from vehicles that could manage to follow GPS waypoints more or less blindly to vehicles that can follow traffic laws and merge with dynamic traffic. DARPA started funding the challenges because the military has a goal of making 1/3 of ground forces autonomous by 2015. When I first heard that, I thought it was a joke. Over the last year, I've been starting to think that we're about 10 years out from widespread use of autonomous vehicles. 2015 is starting to look very doable.
At the risk of being a bit discursive, what does this mean? I'm going to look at a couple of possible results.
First, get ready for another annoying round of kneejerk reactionism to new technology. Pretty soon now, we as a society are going to have to address letting autonomous vehicles drive legally. This will probably start with freeways, then be extended to everywhere. Expect much squawking. I predict people will vehemently decry switching over until insurance companies start to give discounts to people who use automatic driving systems. Then we'll move into an even more annoying classist phase of reactionism, much like we saw with cellphones. This will also be crushed by insurance costs, until driving manually will be the equivalent of owning a tricked out racecar today. Hopefully that final phase will include a switch to a much more comprehensive test for a driver's license, something along the lines of a pilot's license today. Cars are weapons, and I'll feel a lot safer when we really treat them as such.
Second, there are large labor implications for this switch. If someone in charge is smart, they'll start the retraining programs now. There is no way Wal-Mart isn't switching to autonomous trucks the instant it's legal, and they'll be doing everything possible to make it legal ASAP.
Finally, while the military aspects are really just an initial motivator, I suspect they will switch over before anyone else. It's an easy win, particularly in Iraq. Hell, as long as there aren't any photogenic football players driving them, everyone (in control) is happier when insurgents blow up convoys. That's just another cost-plus government contract. And if the programming isn't quite ready yet, well, any locals who get themselves run over in Iraq were probably bad guys anyway. :P
Which makes me wonder -- as we switch our forces over to autonomous robots, what are the foreign policy implications? Wars will become purely an investment of capital, without any of the pesky political implications on the domestic front. So we'll be more willing to dive into them. (Is it even a military action under the constitution if there aren't any people involved? Where is the line between providing material help to a puppet dictatorship and providing them with an army of autonomous stormtroopers?) Worse, what do insurgents do when the occupying force is soulless robots? Angry, powerless people are always going to turn to violence as a means of expression, but will blowing up autonomous vehicles really satisfy them? Terrorism doesn't work against machines. I'm afraid they'll turn to attacks on neutral civilians all the more, making the local human cost all the more painful. (And, incidentally, making successful results from involvement in foreign conflicts even less likely, as this pushes the situation towards civil war that much faster.)
So, whew, we can look forward to a lot of stuff over the next few years. Hell of a good time to be getting into the general field. :)