Clare Goldsberry

July 12, 2016

When self-driving cars fail their driver’s test—and their drivers

Self-driving cars appear to be all the rage, and leave it to a techie to be the eager purchaser of a Tesla Model S electric vehicle with self-driving capabilities. Unfortunately, something went terribly wrong and Joshua Brown, a 40-year-old owner of a tech company, was killed May 7 when his vehicle raced under the trailer of a tractor-trailer rig that was making a left turn. The car’s Autopilot system should have seen the large trailer, but evidently didn’t because of some sort of glitch.

First reports are that the camera in the Tesla mistook the trailer's large white side for the bright sky ahead and did not brake, so the car drove under the trailer at high speed. According to the truck driver, who was the first to reach the wrecked vehicle, Brown may have been watching a Harry Potter movie and most likely wasn't aware of the large tractor-trailer crossing the road ahead. The investigation is ongoing.

Tesla stated in several news accounts that it warns drivers they must stay alert with both hands on the steering wheel, even when the car is in autonomous mode. Autopilot for automobiles is fairly uncharted territory; until now, autopilot was reserved for airplanes flying high above the crowded freeways and byways of ground transportation.

When it comes to automation technology, the Holy Grail is a robot that actually thinks like a human being. Cameras installed in robots and vehicles can be programmed to trigger an action when what they see falls outside preset bounds. A robot with a camera eye, for example, can be programmed to spot a defect in a plastic part when that part passes through its sightline and reject it. But what if the camera sees something that isn't really there?
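The kind of camera-triggered reject rule described above can be sketched in a few lines. Everything here is illustrative assumption, not any vendor's actual vision software: the threshold, the tolerance, and the function names are invented, and a real inspection system would do far more sophisticated image processing.

```python
# Minimal sketch of a camera-triggered reject rule. Assumes a grayscale
# image has already been captured as a 2D list of pixel intensities
# (0-255). All names and numbers here are hypothetical.

DEFECT_THRESHOLD = 60   # pixels darker than this are treated as flaws
MAX_FLAW_PIXELS = 5     # tolerate a few noisy pixels before rejecting

def inspect_part(image):
    """Return 'accept' or 'reject' based on a simple dark-pixel count."""
    flaw_pixels = sum(
        1 for row in image for pixel in row if pixel < DEFECT_THRESHOLD
    )
    return "reject" if flaw_pixels > MAX_FLAW_PIXELS else "accept"

# A clean part: uniformly bright pixels.
clean = [[200] * 10 for _ in range(10)]

# A flawed part: a dark scratch across one row.
flawed = [row[:] for row in clean]
flawed[4] = [30] * 10

print(inspect_part(clean))   # accept
print(inspect_part(flawed))  # reject
```

Note what such a rule cannot do: it only reacts to conditions its programmer anticipated. A bright object against a bright background produces no signal at all, which is exactly the gap between seeing and perceiving that this column describes.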

The cameras in autonomous vehicles can be programmed to see many things, like the rear end of the vehicle in front of them. But a camera can't be programmed to "perceive" what lies ahead. And perception, as they say, is reality. Humans can see, perceive and interpret an object in a split second. Robot cameras can see an object and make a determination based on pre-programmed information, but they cannot perceive and interpret. Thus, the camera may have read the trailer's white side as the whiteness of the sky ahead. Further investigation will tell.

In one test situation with an autonomous vehicle in California, the car’s camera saw another vehicle and began following it, as the software dictated. Only it wasn’t the rear end of a car traveling in the same direction as the test car—it was the front end of an oncoming vehicle. The alert driver quickly overrode the autopilot and swerved back into the correct lane, averting what could have been a serious crash. I say if you have to keep your hands on the steering wheel and your eyes on the road, you don’t need a self-driving car.

Sales of Tesla's electric autonomous vehicles aren't doing well, and the company is not expected to reach its projected goal of 90,000 sales for 2016. That's not surprising given that Elon Musk, Tesla's CEO, has bet the farm (and taxpayer money) on a vehicle that most people don't really want. Musk is attempting to solve a problem most people acknowledge, namely driver error causing vehicle crashes, but few are convinced that autonomous vehicles are the answer.

Maybe the joys of driving have worn off. Millennials, certainly, are not enamored with fast cars, unlike baby boomers for whom owning an eight-cylinder, 426-engine muscle car was the most exciting thing going. Still, I think Musk might be missing the point, and because of that he is saddled with "push" marketing: convincing consumers that they need something when most of them don't know why they would need it and are not willing to pay for it.

The old saying from a plastics process inventor keeps coming to my mind (sorry if I’m repeating myself, but it’s so appropriate): “Build for the masses, eat with the classes. Build for the classes, eat with the masses.”

Autonomous cars will raise another specter—liability. I remember talking to an automotive industry attorney one time who said that attorneys generally are excited about autonomous cars and the potential for crashes that will yield big money. “With a self-driving car, we don’t have to worry about suing some 16-year-old teenager who has no money when he causes a wreck,” he said. “Attorneys can go after the car manufacturers, the software developers and others in the supply chain with deep pockets.”

The last I heard, Brown’s family hadn’t decided whether or not to sue Tesla as they await the results of the investigation.

A Wall Street Journal headline in the July 5 edition said it best: "Tesla's Problem: Pushing Boundaries Too Far." This just might be a case of experimenting with technology because it can be done. But, as Bill Banholzer, an engineer and acquaintance in the plastics industry, once said in an excellent article, "Just because something can be done doesn't mean we should do it."

Do you agree? Take the PlasticsToday poll on the home page and share your opinion with our community.


Joshua Brown posted a video on YouTube in October 2015 showing his Tesla in Autopilot mode.

About the Author(s)

Clare Goldsberry

Until she retired in September 2021, Clare Goldsberry reported on the plastics industry for more than 30 years. By her own estimation she wrote more than 10,000 articles, and she is the author of several books, including The Business of Injection Molding: How to succeed as a custom molder and Purchasing Injection Molds: A buyers guide. Goldsberry is a member of the Plastics Pioneers Association. She reflected on her long career in "Time to Say Good-Bye."

