Google-Owned Self-Driving Car Involved In Another Accident, Sends Motorcyclist To Hospital

Waymo, a subsidiary of Alphabet, the parent company of Google, is the biggest name in self-driving cars at the moment. The reputation it is building, however, is a mixed bag. There has been yet another report of an accident involving one of its self-driving cars, and once again the human driver has been thrown under the bus.

Engadget described it this way, “Waymo blames self-driving collision on pesky human.”

“Waymo has admitted in a blog post that one of its test vehicles hit a motorcycle in Mountain View. The company has defended its technology in the post, though, clarifying that the event was caused by human error. Apparently, the test driver took control of the vehicle after seeing a passenger car to the left moving into their lane. Waymo says they moved the car to the right lane without noticing that a motorcycle had moved from behind to pass the test vehicle. The test car sustained minor damage, but the collision was unfortunately serious enough to send the motorcyclist to the hospital.”

This is different from the earlier accident in which a test driver fell asleep at the wheel of a Waymo self-driving car. In that instance, the car had a minor collision that involved no other vehicles and left no one injured. It could have been worse.

The common factor is that the test driver, or the driver of another vehicle, is always to blame in accidents involving self-driving cars. Sometimes the driver is not attentive enough. This time, the driver did what humans are supposed to do: he stayed attentive and took the wheel based on his judgment of the situation.

Unfortunately, his judgment was deemed faulty. Waymo determined that had the driver simply allowed the system to react the way it was designed to, the car would have slowed down and avoided the accident altogether. That said, there is no accusation of wrongdoing on the part of the test driver.

Other self-driving incidents have resulted in fatalities. Cameras can misidentify objects in the road that an attentive human driver could account for. Tesla’s Autopilot has also been at the center of deadly collisions. Again, it is always determined that the driver put too much trust in the system.

Test drivers are in a no-win situation when self-driving accidents occur. They are either not attentive enough, or they take control when they should have trusted the system. Ultimately, self-driving systems will have to learn to deal with human drivers. It might be a long time before these types of issues are worked out.
