
To be fair, this is mostly a problem with the general public's understanding of the term "autopilot," not with the actual systems that have used that name. Aeronautical autopilots would probably only have been classified as level 1, or maybe level 2, when the term was first coined to describe them. Even now I'm not sure we would classify the average autopilot found on a commercial airliner as fully level 4.


Modern aircraft autopilot systems are often used in an L1-L3 way on that scale, but are capable of at least L4.

The flight-management systems of modern commercial aircraft can, for example, receive a flight plan expressed as a set of waypoints and information about the destination airport, and then fly the plane along that route and conduct a fully automated landing at the destination.


I think the concepts of L1-L4 don't have much meaning in the absence of an environment where the machine must constantly react to the behavior of other actors and to environmental hazards.


> this is mostly a problem with the general public's understanding of the term autopilot

In other words, it's the customer's fault. And it's arguable that Tesla benefited from this misunderstanding.


There is a difference between a customer being at fault in controlling the car and a customer being at fault for not learning the basics of the car's features. I don't think we can make the first a universal rule, but I wouldn't think it controversial to make the second one universal.


I agree that customers should learn how their cars work.

But if we all engineered products based on how customers should behave, the human race would be extinct.



