Google's Self-Driving Lexus Is Safer Than You Are
#1
(Can you take it to the track and press "WIN"?)
Data gathered from Google’s self-driving Prius and Lexus cars shows that they are safer and smoother when steering themselves than when a human takes the wheel, according to the leader of Google’s autonomous-car project.
Chris Urmson made those claims today at a robotics conference in Santa Clara, California. He presented results from two studies of data from the hundreds of thousands of miles Google’s vehicles have logged on public roads in California and Nevada.
One of those analyses showed that when a human was behind the wheel, Google’s cars accelerated and braked significantly more sharply than they did when piloting themselves. Another showed that the cars’ software was much better at maintaining a safe distance from the vehicle ahead than the human drivers were.
“We’re spending less time in near-collision states,” said Urmson. “Our car is driving more smoothly and more safely than our trained professional drivers.”
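The two analyses above boil down to simple statistics over logged telemetry. A hypothetical sketch of that kind of comparison follows — counting "hard brake" events and computing minimum time headway from speed traces. All function names, sample data, and thresholds here are invented assumptions, not Google's actual metrics.

```python
def hard_brake_events(speeds_mps, dt=1.0, threshold=-3.0):
    """Count samples where deceleration exceeds `threshold` (m/s^2).

    speeds_mps: speed samples (m/s) taken every `dt` seconds.
    """
    events = 0
    for prev, cur in zip(speeds_mps, speeds_mps[1:]):
        accel = (cur - prev) / dt
        if accel < threshold:
            events += 1
    return events

def min_time_headway(gaps_m, speeds_mps):
    """Smallest gap-to-lead-vehicle divided by own speed, in seconds."""
    return min(gap / v for gap, v in zip(gaps_m, speeds_mps) if v > 0)

# Toy traces sampled once per second: a human-style stop with two sharp
# braking samples vs. a gentler, machine-style deceleration.
human = [20.0, 20.0, 19.0, 15.0, 11.0, 10.0]
robot = [20.0, 19.5, 19.0, 18.5, 18.0, 17.5]

print(hard_brake_events(human))  # 2
print(hard_brake_events(robot))  # 0
```

The same log could feed the headway metric: a smaller minimum time headway means more time spent in "near-collision states," the quantity Urmson cites.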
In addition to painting a rosy picture of his vehicles’ autonomous capabilities, Urmson showed a new dashboard display that his group has developed to help people understand what an autonomous car is doing and when they might want to take over. “Inside the car we’ve gone out of our way to make the human factors work,” he said.
Although that might suggest the company is thinking about how to translate its research project into something used by real motorists, Urmson dodged a question about how that might happen. “We’re thinking about different ways of bringing it to market,” he said. “I can’t tell you any more right now.”
Urmson did say that he is in regular contact with automakers. Many of those companies are independently working on self-driving cars themselves (see “Driverless Cars Are Further Away Than You Think”).
Google has been testing its cars on public roads since 2010 (see “Look, No Hands”), always with a human in the driver’s seat who can take over if necessary.
Urmson dismissed claims that legal and regulatory problems pose a major barrier to cars that are completely autonomous. He pointed out that California, Nevada, and Florida have already adjusted their laws to allow tests of self-driving cars. And existing product liability laws make it clear that a car’s manufacturer would be at fault if the car caused a crash, he said. He also said that when the inevitable accidents do occur, the data autonomous cars collect in order to navigate will provide a powerful and accurate picture of exactly who was responsible.
Urmson showed data from a Google car that was rear-ended in traffic by another driver. Examining the car’s annotated map of its surroundings clearly showed that the Google vehicle smoothly halted before being struck by the other vehicle.
“We don’t have to rely on eyewitnesses that can’t be trusted as to what happened—we actually have the data,” he said. “The guy around us wasn’t paying enough attention. The data will set you free.”
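The check Urmson describes could, in principle, be as simple as the sketch below: confirming from the ego vehicle's own speed log that it had come to a stop before being struck. The log format, names, and thresholds are invented for illustration.

```python
def stopped_before_impact(ego_speeds_mps, impact_index, window=4, eps=0.1):
    """True if the ego vehicle's logged speed was effectively zero for
    `window` samples immediately before the impact sample."""
    start = max(0, impact_index - window)
    pre_impact = ego_speeds_mps[start:impact_index]
    return len(pre_impact) > 0 and all(abs(v) < eps for v in pre_impact)

# Toy log: the car brakes smoothly to a halt, then is struck at index 7.
log = [8.0, 5.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0]
print(stopped_before_impact(log, impact_index=7))  # True
```

In practice the annotated map would also carry positions and headings of surrounding vehicles, but the fault question in a rear-end collision often reduces to exactly this kind of check.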
http://www.technologyreview.com/news...than-you-or-i/
#4
Rennlist Member
Yes. I have a lot more faith in the fundamental human instinct of self-preservation, and in my own ability to communicate with drivers, than I do in a computer program that may be amazingly similar to the one currently in place for Obamacare.
#6
While VR's reaction takes a vibrant leap off subject, he does raise an interesting point. The developers of the Affordable Care website admit that the project suffered from insufficient prior testing.
Google, on the other hand, has extensively tested its robotic cars...more than three years and "hundreds of thousands of miles... on public roads in California and Nevada."
Perhaps, if Google had developed the Affordable Care website, it might have worked better? At any rate, just about any alternative is guaranteed to improve the quality of driving on our nation's highways.
#7
Rennlist Member
The day a "Google Car" can see an approaching car, look at the driver's eyes (are they texting or paying attention?) and assign an appropriate threat level to it, will be the day I ride in one...
#8
Rennlist Member
While VR's reaction takes a vibrant leap off subject, he does raise an interesting point. The developers of the Affordable Care website admit that the project suffered from insufficient prior testing.
Google, on the other hand, has extensively tested its robotic cars...more than three years and "hundreds of thousands of miles... on public roads in California and Nevada."
With respect, insufficient "testing" is not the problem with that $700 million **** sandwich.
I guess maybe I am in the minority in being extremely skeptical of any gigantic all-encompassing hairball of computer code for life-and-death situations...like driving. I flat-out don't trust them...and surely don't trust Google. Imagine getting the blue screen of death at 75 MPH...
This is EXACTLY my point.
#10
Rennlist Member
I'd like to intentionally swerve toward a robot car just to see if it takes a massive data dump on itself.
#12
Addict
Rennlist Member
I'd also love to hear more about the details of their training program for their "professional drivers"...
#13
Airbus' self flying planes have a reputation for flying themselves into the ground.
And calling the 0care website a website or computer program is a stretch.
-Mike
#14
The problem with using computers for this is computers are incredibly dumb. They literally only do what they are told to, and if circumstances arise that have not been taken into account by the programming, bad things can happen. The Airbus reference in the previous post is from an incident around 1990 where an Airbus at an airshow was to be flown low over a runway in landing configuration. They got to the end of the runway, a couple hundred feet up, and it was time to climb, but the computers in the plane wouldn't let the pilot pull up or add power, and the plane gracefully flew into the trees at the airport boundary. The dumb *** computers responded to the pilot's inputs thinking "oh no, we are not going up, we are landing."
Failure possibilities are on many levels. Code bugs will still get through, but one can reduce these by code reviews and programming standards. However, hardware failures can also happen and when they do, the circumstances change, and bad things can happen.
Things like lane departure and blind spot warnings can help improve a poor driver. Radar for detecting clears in traffic could help with t-bones. The bottom line is aids like this are much less risky because the human driving the car still has override capability.
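That override principle can be made concrete with a minimal sketch — an invented illustration, not any production system: the aid contributes a small correction only when the driver isn't clearly commanding the wheel, and yields entirely once driver input is deliberate.

```python
def steering_torque(assist_nm, driver_nm, override_threshold_nm=1.0):
    """Blend a lane-keep assist torque with driver input, yielding
    entirely to the driver once their input is clearly deliberate."""
    if abs(driver_nm) >= override_threshold_nm:
        return driver_nm          # human override: the aid backs off
    return driver_nm + assist_nm  # otherwise, a gentle assist nudge

print(steering_torque(0.5, 0.0))  # 0.5  (aid alone keeps the lane)
print(steering_torque(0.5, 2.0))  # 2.0  (driver input wins outright)
```

The design choice here is the one the post argues for: the human retains final authority, so a software fault degrades to "no assist" rather than "car does something on its own."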
Society could improve driver safety by simply requiring better driver education. I strongly feel track driving has made me a better street driver. I'm more aware and drive more cautiously than I used to and I have better skills for dealing with emergencies. There is literally no temptation to hot rod on the street because it's either too lame compared to the track or too dangerous to attempt or it would be jail time if caught.
My $0.02.
-Mike
#15
Rennlist Member
It would be very interesting to see a self-driving track car. On an open, dry track with no traffic, I bet it could be programmed to drive faster than many/most good drivers. With traffic, it would be a whole other story.
Give it time.