Another Tesla reportedly using Autopilot hits a parked police car
By Chris Isidore
CNN Business
August 30, 2021
Tesla Model 3
New York (CNN Business) — Another Tesla has hit an emergency vehicle, apparently while using the Autopilot driver-assist feature, adding to a problem that is already the subject of a federal safety probe.
The Florida Highway Patrol reported the accident just before 5 a.m. Saturday along Interstate 4 in Orlando. No one was seriously injured in the crash, though the Tesla narrowly missed hitting a state trooper who had left his car to assist another driver whose vehicle had broken down on the highway.
The broken-down car was a Mercedes that had come to a stop in a travel lane. The police cruiser was stopped behind it with its emergency lights flashing. The left front of the Tesla Model 3 crashed into the side of the police car, then hit the Mercedes.
"The driver stated that [the Tesla] was in Autopilot mode," said the report from the Florida Highway Patrol.
The National Highway Traffic Safety Administration disclosed earlier this month that it is investigating at least 11 accidents involving Teslas that have hit police cars, ambulances or other emergency vehicles while they were responding to traffic accidents. The crashes under investigation occurred between January 22, 2018, and July 10, 2021, across nine states. They took place mostly at night, and the accident response scenes were all outfitted with control measures such as emergency vehicle lights, flares, illuminated arrow boards and road cones, according to NHTSA.
Florida police said they would report the crash to the NHTSA and to Tesla.
The highway safety agency said it is important that Tesla owners using Autopilot remain alert and ready to take control of the car in order to avoid obstacles.
"NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves," the agency said in a statement. "Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles."
Tesla did not respond to a request for comment on the latest crash or on the NHTSA investigation. Although the company says its data show cars using Autopilot have fewer accidents per mile than cars being driven by humans, it does warn that "current Autopilot features require active driver supervision and do not make the vehicle autonomous."
In addition to the NHTSA probe, Senators Richard Blumenthal of Connecticut and Edward Markey of Massachusetts, Democrats who have been critical of Tesla in the past, have asked the Federal Trade Commission to investigate whether Tesla's use of the term "Autopilot" and its claims about the car's self-driving capabilities amount to misleading advertising. The FTC has not commented on whether it has launched the requested probe.
Driver-assist options such as Tesla's Autopilot or adaptive cruise control, which is available in a wide range of vehicles from various automakers, do a good job of slowing a car down when traffic ahead slows, said Sam Abuelsamid, an expert in self-driving vehicles and principal analyst at Guidehouse Insights.
But Abuelsamid said those vehicles are designed to ignore stationary objects when traveling at more than 40 mph so they don't slam on the brakes when approaching overpasses or objects on the side of the road, such as a car stopped on the shoulder. Fortunately, most of these automatic braking systems do stop for stationary objects when traveling at slower speeds.
The bigger problem, according to Abuelsamid, is that many more Tesla owners appear to assume their cars can, in fact, drive themselves than do drivers of other vehicles with driver-assist features. Moreover, the cues a driver would see when approaching an accident site, such as road flares or flashing lights, make more sense to a human than they might to an automated driving system.
"When it works, which can be most of the time, it can be very good," Abuelsamid said of Tesla's Autopilot feature. "But it can easily be confused by things that humans would have no problem with. Machine visions are not as adaptive as humans'. And the problem is that all machine systems sometimes make silly errors."