VinNews.com

Los Angeles - Google Self-Driving Car Strikes Bus On California Street

Published on: February 29, 2016 07:04 PM
By: AP
FILE - A line of Lexus SUVs equipped with Google self-driving sensors await test riders during a media preview of Google's prototype autonomous vehicles in Mountain View, California September 29, 2015.  REUTERS/Elijah Nouvelage

Los Angeles - A self-driving car being tested by Google struck a public bus on a Silicon Valley street, a fender-bender that appears to be the first time one of the tech company’s vehicles caused a crash during testing.

Google accepted at least some responsibility for the collision, which occurred on Valentine’s Day when one of the Lexus SUVs it has outfitted with sensors and cameras hit the side of the bus near the company’s headquarters in Mountain View, California.


No one was injured, according to an accident report Google wrote and submitted to the California Department of Motor Vehicles. It was posted online Monday.

According to the report, Google’s car intended to turn right off a major boulevard when it detected sandbags around a storm drain at the intersection.

The right lane was wide enough to let some cars turn and others go straight, but the Lexus needed to slide to its left within the right lane to get around the obstruction.

The Lexus was going 2 mph when it made the move and its left front struck the right side of the bus, which was going straight at 15 mph.

The car’s test driver — who under state law must be in the front seat to grab the wheel when needed — thought the bus would yield and did not have control before the collision, Google said.

While the report does not address fault, Google said in a written statement, “We clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.”

Chris Urmson, the head of Google’s self-driving car project, said in a brief interview that he believes the Lexus was moving before the bus started to pass.

“We saw the bus, we tracked the bus, we thought the bus was going to slow down, we started to pull out, there was some momentum involved,” Urmson told The Associated Press.

He acknowledged that Google’s car did have some responsibility but said it was “not black and white.”

The Santa Clara Valley Transportation Authority said none of the 15 passengers or the driver of the bus was injured.

An internal investigation by the transit agency was ongoing and “no determination of liability has been made,” spokeswoman Stacey Hendler Ross said in a written statement.

There may never be a legal decision on liability, especially if damage was negligible — as both sides indicated it was — and neither Google nor the transit authority pushes the case.

Still, the collision could be the first time a Google car in autonomous mode caused a crash.

Google cars have been involved in nearly a dozen collisions in or around Mountain View since starting to test on city streets in the spring of 2014. In most cases, Google’s cars were rear-ended. No one has been seriously injured.

Google’s written statement called the Feb. 14 collision “a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements.”

Google said its computers have reviewed the incident and engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles.

A spokeswoman for California’s DMV, which regulates Google’s testing of about two dozen Lexus SUVs in the state, said the agency hoped to speak Monday with Google. Under state law, Google must retain data from the moments before and after any collision.

“As far as he-said she-said, there shouldn’t be any of that. It’s all there,” said Robert W. Peterson, an insurance law expert at Santa Clara University who has studied self-driving cars.

A critic of Google’s self-driving car efforts said the collision shows the tech giant should be barred from taking its self-driving prototypes, built without a steering wheel or pedals, onto public streets.

Google sees that as the next natural step for the technology, and has pressed California’s DMV and federal regulators to authorize cars in which humans have limited means of intervening.

“Clearly Google’s robot cars can’t reliably cope with everyday driving situations,” said John M. Simpson of the nonprofit Consumer Watchdog. “There needs to be a licensed driver who can take over, even if in this case the test driver failed to step in as he should have.”



Comments (3)

1

 Feb 29, 2016 at 07:33 PM StevenWright Says:

Google says chances that the driver was injured are remote.
Technicians at Google are searching for the cause of the crash….

2

 Feb 29, 2016 at 10:16 PM Realistic Says:

Self-driving cars need a steering wheel and a driver who can take over. Otherwise, should the passenger decide to make an unscheduled stop, he wouldn't be able to. Also, even the best GPS directions don't always get the address quite right; they can be off by a few houses, since the algorithm for determining the location of an address doesn't always reflect reality.
Driverless cars in principle would be a boon for older people and would allow many to remain independent for far longer, but the driver still needs the ability to override the car.

3

 Mar 01, 2016 at 02:04 PM Anonymous Says:

Reply to #2 (Realistic):

None of the articles about the Google cars discuss what I consider to be the most important factor - the choices (or ethics, if you prefer) that are programmed into the system. Here's an example - your brakes fail on a hill. You have three choices: you can hit a tree, which would kill you and your 4 passengers; you can hit a pedestrian, which would kill the pedestrian but save the 5 people in your car; or you could veer into the other lane, which would cause several other cars to crash with unknown consequences. I have no clue how I would respond in such a situation, never having been in one. I don't even know how one begins to assess the morals of each consequence and which one would be better (any experts on this subject, please comment!). I just know that I would have to pick one of the above, as would the Google car. Which one will the car pick? Who gets to decide which is the appropriate action? What if the Torah would mandate that I take option 1 but Google has decided that option 2 is preferred? Do I potentially violate halacha by buying a Google car, and how would I even know? I am not saying that I can answer these questions, but I would certainly like them discussed!

