5 Reasons Why a Self-Driving Car Is a Bad Driver

People have been looking forward to self-driving cars ever since David Hasselhoff let KITT take over in Knight Rider. Imagine what we could get done if we didn’t have to spend two hours commuting every day. We could write the next great American novel or prepare for the big meeting. Google is testing a self-driving car now, and while it doesn’t have road rage or text its boyfriend, we can’t help but think that, given the complexities of the road, it’s going to be a bad driver.

1. Fair weather friend


Alex Wong/Getty Images

How long did it take you to learn how to drive in a snowstorm or heavy rain? I remember getting stuck in snowdrifts and hydroplaning through a few stop lights as a youth in a crappy ’87 Ford Thunderbird. Thanks to that, I know how to drive in snow and rain, so when something happens, I know what to do. Sensors can tell if there is a car in front of me, but can they tell if there is a patch of black ice? How will the Google car compensate for slippery conditions, or for the 50-mile-per-hour winds and whiteout conditions of a blizzard? My guess is that it will respond much like my 16-year-old self did when I first encountered those conditions, but unlike me, it won’t learn. Fortune also wondered what happens when the Google car gets stuck.

2. Construction and detours


Source: Thinkstock

In many places, construction season is year-round, with projects starting and stopping several times throughout the year. There are delays, lane closures, and even road closures that are annoying but something we live with. A self-driving car uses mapping software and global positioning systems that are updated regularly, but what if your wireless connection goes down or the update didn’t include a construction site?

MIT Technology Review gives a great example of what might happen if a stop light isn’t included in the mapping program. The car might not register the light, which could mean at best a ticket for blowing the light or at worst a collision. We can see a light without the need for programming, but a self-driving car is at the mercy of the latest upload. Google was quick to interject that the car can see most unmapped signage.

3. Too cautious


Source: iStock

When the Google car doesn’t recognize or is confused by something, it goes into caution mode. It relies on its sensors and slows down to make sure there are no cars or other problems in the immediate vicinity. This can be very troublesome if you’re in the middle of traffic or an intersection. It’s like being behind the 3-foot-tall, blue-haired granny who’s going 30 down the highway and can’t see over the steering wheel. The sensors cannot distinguish between a real threat and something harmless: if the car senses something in the road, it will slow down or stop, whether that something is a car, an animal, or a paper bag. How many of us have driven over an animal carcass or a piece of blown tire from a semi, knowing it wouldn’t hurt the car?

4. Not good with people

Source: Thinkstock


By far, the biggest variable when it comes to driving is, and always shall be, people. People are reckless, oftentimes stupid, and prone to doing completely irrational things. The car’s sensors aren’t calibrated to take into account the homeless guy coming toward your car with a dirty towel and a spray bottle. It doesn’t know what to do when the local American Legion holds a tag day and stands in the middle of the street collecting spare change. Some people even walk out into traffic with the specific intent of being hit.

5. The human factor


Source: Google

One of the most poignant scenes in I, Robot is when Will Smith’s character recounts the time a robot saved him instead of a little girl. The robot had no emotions, and the data showed saving Smith’s character was the best option. Data can’t compare to the split-second decision-making of the human brain. In a situation where the outcome could be either the death of a group of people or the death of a single pedestrian, the human brain factors in both emotional and ethical components before making a decision: a person would likely swerve toward the single pedestrian rather than hit the group. A robot or self-driving car simply sees the data and makes a decision based on the best outcome for the car, and that may not be the one you would make.

Follow Brock on Twitter @brockcooper

