If you haven't bought a new car in a few years, you might be surprised at how many driving tasks are now automated — speed control, braking, lane-keeping and even changing lanes.
Why it matters: Carmakers keep adding automated features in the name of safety. But now authorities want to find out whether assisted-driving technology is itself dangerous because it makes misuse too easy.
- The more sophisticated the assisted-driving system, the more complacent drivers can become, abdicating their own responsibility for operating the car.
- This can lead to avoidable crashes and dangerous incidents that undermine public confidence in automated driving.
- Even with the latest technology, drivers still need to watch where they're going and be prepared to take the wheel; fully autonomous vehicles are years from widespread deployment.
Context: Federal regulators have taken a mostly hands-off approach to automated vehicle technologies, offering only guidelines for fully driverless cars like robotaxis, which are still in development.
- Now the Biden administration is stepping up its scrutiny of assisted-driving systems available today, like Tesla's Autopilot.
What's happening: The National Highway Traffic Safety Administration said recently that companies must report serious crashes involving driver-assistance and automated-driving systems to authorities within a day of learning about them.
- This week NHTSA opened a formal investigation into Tesla Autopilot after a series of crashes involving emergency vehicles.
- The agency said it had identified 11 crashes since 2018 in which Tesla vehicles operating on Autopilot struck emergency vehicles, despite the presence of flashing lights, flares or road cones.
- At least 17 people were injured and one person died in the crashes, according to NHTSA.
Between the lines: While the focus on crashes with emergency vehicles is fairly narrow, NHTSA will be looking carefully at where and how Autopilot functions, including how it identifies and reacts to obstacles in the road.
- Importantly, it will also examine how Autopilot monitors and assists drivers, and how it enforces the driver's engagement while the system is operating.
Be smart: Tesla Autopilot is not an autonomous driving system. It is an advanced driver assistance system (ADAS) that allows the car to maintain its speed and stay in its lane.
- Tesla is gradually adding more features to a package it calls "full self-driving," but such labels are confusing to consumers because they misrepresent the car's capabilities, safety advocates say.
What to watch: NHTSA will consider whether Tesla's Autopilot system has a defect stemming from "foreseeable misuse" of the technology, and whether all 765,000 affected vehicles should be recalled.
- "If NHTSA takes this all the way and decides there’s a defect, I think it will up the bar for the industry, and make people more confident in these technologies," David Friedman, vice president of advocacy at Consumer Reports, tells Axios.
- But that could be a "double-edged sword" if it results in stricter AV regulations that hurt U.S. competitiveness, warns AV expert Grayson Brulte.
The bottom line: Authorities are reviewing not just whether assisted-driving technology works, but also its effects on human behavior.
"behind" - Google News
August 18, 2021 at 05:00PM
https://ift.tt/3D6X4JV
Why drivers are zoning out behind the wheel - Axios
"behind" - Google News
https://ift.tt/2YqUhZP
https://ift.tt/2yko4c8
Bagikan Berita Ini
0 Response to "Why drivers are zoning out behind the wheel - Axios"
Post a Comment