odysseus2000 wrote:BobbyD
...because some people consider taking a system which occupies a dangerous no mans land between machine control and human control, running on insufficient data, putting it in to the hands of real people on real roads and telling them it is an autopilot to be incredibly irresponsible. But hey, health and safety gone mad right?
You mean like human-driven cars and all the folk they have killed.
All Tesla cars have human drivers. Teslas are not autonomous. They are arguably more dangerous because at Level 2 the locus of control sits on a blurred edge: good enough for a human to get bored monitoring it, not good enough to spot a stationary car and decide not to drive into it.
At Level 0 the human is engaged; they are driving. At Level 4 and above there is a clean delineation of responsibility: if the car is driving then the car is driving, and if the human is driving then the human is driving. The in-between is dangerous, particularly when the car involved isn't suitably kitted out with sensors.
odysseus2000 wrote:
There is a point where one looks at the greater good as well as the downside.
The greater good is being pursued by those who are chasing Level 4/5 in a responsible manner, not by selling an overdeveloped cruise control as an executive toy.