Every Tesla Accident Resulting in Death...

On 01/04/22 08:46, Don Y wrote:
On 3/31/2022 5:19 PM, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works.  It doesn\'t fly the plane.  It
simply maintains a heading and altitude.

They have been doing more than that for > 50 years.
Cat 3b landings were in operation when I was a kid.

Someone still has to be watching
for other aircraft and otherwise flying the plane.  In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot.  But modern ones are more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically.  And more important than what plane
autopilots actually /do/, is what people /think/ they do - and remember we
are talking about drivers that think their Tesla \"autopilot\" will drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".

\"Pilots\" and \"drivers\" approach their efforts entirely differently
and with different mindsets.

They should do in one sense (differing machine/automation)
and shouldn\'t in another (both are lethal instruments).

Problem starts with the marketing.


ANYONE can drive a car.  By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.

Not entirely sure about that. 14yo can be solo, and a
very few are even aerobatic pilots.

The main difference is that you can\'t stop and catch
your breath, or stop and have a pee.

Overall learning to fly a glider is pretty much similar
to learning to drive - in cost, time and skill.

The training
is more rigorous, though, and isn\'t a one-off event.


I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) \"autopilot\".

Pilots often don\'t understand what\'s going on; just
listen to the accident reports on the news :(
 
On 4/1/2022 3:44 AM, Tom Gardner wrote:
On 01/04/22 08:46, Don Y wrote:
On 3/31/2022 5:19 PM, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works. It doesn\'t fly the plane. It
simply maintains a heading and altitude.

They have been doing more than that for > 50 years.
Cat 3b landings were in operation when I was a kid.

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones are more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and remember we
are talking about drivers that think their Tesla \"autopilot\" will drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".

\"Pilots\" and \"drivers\" approach their efforts entirely differently
and with different mindsets.

They should do in one sense (differing machine/automation)
and shouldn\'t in another (both are lethal instruments).

Problem starts with the marketing.

Cars are far more ubiquitous. And, navigation is a 2-dimensional activity.

An \"average joe\" isn\'t likely to think hes gonna \"hop in a piper cub\" and
be off on a jaunt to run errands. And, navigation is a 3-dimensional
undertaking (you don\'t worry about vehicles \"above\" or \"below\", when driving!)

ANYONE can drive a car. By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.

Not entirely sure about that. 14yo can be solo, and a
very few are even aerobatic pilots.

And a \"youngster\" can drive a car (or other sort of motorized vehicle, e.g., on
a farm or other private property). The 16yo (15.5) restriction only applies to
the use on public roadways.

<https://www.abc4.com/news/local-news/underage-utah-boy-caught-driving-wrong-way-in-slc/>

<https://www.kgun9.com/news/local-news/cochise-county-four-smuggling-busts-within-five-hours-14-year-old-driver-involved>

Cars are \"simple\" to operate; can-your-feet-reach-the-pedals being the only
practical criterion. I'd wager *I* would have a hard time walking up to
an aircraft, \"cold\", and trying to sort out how to get it off the ground...

The main difference is that you can\'t stop and catch
your breath, or stop and have a pee.

Overall learning to fly a glider is pretty much similar
to learning to drive - in cost, time and skill.

But not opportunity. I\'d have to spend a fair bit of effort researching
where to gain access to any sort of aircraft. OTOH, I can readily \"borrow\"
(with consent) any of my neighbors\' vehicles and operate all of them in
a fairly consistent manner: sports cars, trucks, commercial trucks, even
motorcycles (though never having driven one, before!).

The training
is more rigorous, though, and isn\'t a one-off event.

It\'s likely more technical, too. Most auto-driving instruction deals
with laws, not the technical \"piloting\" of the vehicle. The driving test
is similarly focused on whether or not you put that law knowledge into
effect (did you stop *at* the proper point? did you observe the speed
limit and other posted requirements?)

[When taking the test for my *first* DL, the DMV was notorious for
having a stop sign *in* the (tiny) parking lot -- in an unexpected
place. Folks who weren\'t observant -- or tipped off to this ahead
of time -- were \"failed\" before ever getting out on the roadway!]

Testing for a CDL (commercial) is considerably different; you are
quizzed on technical details of the vehicle that affect the safety of
you and others on the roadway -- because you are operating a much more
\"lethal\" vehicle (< 26,000 pounds GVW). You also have to prove yourself
medically *fit* to operate (not color blind, not an insulin user,
\"controlled\" blood pressure, nonepileptic, alchoholic, etc.!

And, other \"endorsements\" have further requirements (e.g., hauling
tandem/triples, hazardous products, etc.)

I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) \"autopilot\".

Pilots often don\'t understand what\'s going on; just
listen to the accident reports on the news :(

I think those events are caused by cognitive overload, not ignorance.
 
On 01/04/22 12:32, Don Y wrote:
On 4/1/2022 3:44 AM, Tom Gardner wrote:
On 01/04/22 08:46, Don Y wrote:
On 3/31/2022 5:19 PM, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works.  It doesn\'t fly the plane.  It
simply maintains a heading and altitude.

They have been doing more than that for > 50 years.
Cat 3b landings were in operation when I was a kid.

Someone still has to be watching
for other aircraft and otherwise flying the plane.  In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot.  But modern ones are more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically.  And more important than what plane
autopilots actually /do/, is what people /think/ they do - and remember we
are talking about drivers that think their Tesla \"autopilot\" will drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".

\"Pilots\" and \"drivers\" approach their efforts entirely differently
and with different mindsets.

They should do in one sense (differing machine/automation)
and shouldn\'t in another (both are lethal instruments).

Problem starts with the marketing.

Cars are far more ubiquitous.  And, navigation is a 2-dimensional activity.

An \"average joe\" isn\'t likely to think hes gonna \"hop in a piper cub\" and
be off on a jaunt to run errands.  And, navigation is a 3-dimensional
undertaking (you don\'t worry about vehicles \"above\" or \"below\", when driving!)

True, but it doesn\'t change any of my points.



ANYONE can drive a car.  By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.

Not entirely sure about that. 14yo can be solo, and a
very few are even aerobatic pilots.

And a \"youngster\" can drive a car (or other sort of motorized vehicle, e.g., on
a farm or other private property).  The 16yo (15.5) restriction only applies to
the use on public roadways.

12yo fly across the country with an instructor behind.
14yo can do it on their own.

Daughter was driving my car and a double decker bus at 15yo,
on the runway and peritrack :)


Cars are \"simple\" to operate; can-your-feet-reach-the-pedals being the only
practical criterion. I'd wager *I* would have a hard time walking up to
an aircraft, \"cold\", and trying to sort out how to get it off the ground...

Same is true of a glider. There are only 4 controls: rudder,
stick, airbrake, cable release. Two instruments, airspeed
and barometer (i.e. height differential).

You are taught to do without them, because they all lie to
you.



The main difference is that you can\'t stop and catch
your breath, or stop and have a pee.

Overall learning to fly a glider is pretty much similar
to learning to drive - in cost, time and skill.

But not opportunity.  I\'d have to spend a fair bit of effort researching
where to gain access to any sort of aircraft.  OTOH, I can readily \"borrow\"
(with consent) any of my neighbors\' vehicles and operate all of them in
a fairly consistent manner: sports cars, trucks, commercial trucks, even
motorcycles (though never having driven one, before!).

True, but it doesn\'t change any of my points.



The training
is more rigorous, though, and isn\'t a one-off event.

It\'s likely more technical, too.  Most auto-driving instruction deals
with laws, not the technical \"piloting\" of the vehicle.  The driving test
is similarly focused on whether or not you put that law knowledge into
effect (did you stop *at* the proper point?  did you observe the speed
limit and other posted requirements?)

Not much is required to go solo.

Does the glider\'s responsiveness indicate you are flying
fast enough; are you at a reasonable height in the circuit;
what to do when you find you aren\'t and when the cable snaps.



[When taking the test for my *first* DL, the DMV was notorious for
having a stop sign *in* the (tiny) parking lot -- in an unexpected
place.  Folks who weren\'t observant -- or tipped off to this ahead
of time -- were \"failed\" before ever getting out on the roadway!]

Pre-solo tests include the instructor putting you in a
stupid position, and saying \"now get us back safely\".



Testing for a CDL (commercial) is considerably different; you are
quizzed on technical details of the vehicle that affect the safety of
you and others on the roadway -- because you are operating a much more
\"lethal\" vehicle (< 26,000 pounds GVW).  You also have to prove yourself
medically *fit* to operate (not color blind, not an insulin user,
\"controlled\" blood pressure, nonepileptic, alchoholic, etc.!

Ditto being an instructor or having a passenger.



And, other \"endorsements\" have further requirements (e.g., hauling
tandem/triples, hazardous products, etc.)

Ditto flying cross country or in clouds.


I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) \"autopilot\".

That's true for the aircraft, but nobody has developed
an autopilot for a glider. You have to stay awake and feel (literally,
by the seat of your pants) what's happening. The nearest
thing to an autopilot is a moving map airspace display.


Pilots often don\'t understand what\'s going on; just
listen to the accident reports on the news :(

I think those events are caused by cognitive overload, not ignorance.

Not always, e.g. the recent 737 crashes.
 
On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
snip
No, it is not \"too strong\". It is basic statistics. Bayes\' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).

A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That\'s an inherent bias because while
non-autopilot driving must include all situations, autopilot simply
doesn\'t work in most environments.

Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.
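To make that bias concrete, here is a minimal sketch (Python) with numbers
invented purely for illustration -- these are NOT Tesla figures. Within each
road type the per-mile fatality rate is identical for autopilot and manual
driving, yet the pooled comparison still makes autopilot look roughly three
times safer, simply because its miles are concentrated on the safer highways:

# Invented strata: (road type, mode) -> billions of miles / fatalities
miles = {
    ("highway", "autopilot"): 4.0, ("highway", "manual"): 1.0,
    ("city",    "autopilot"): 0.2, ("city",    "manual"): 3.0,
}
deaths = {
    ("highway", "autopilot"): 12, ("highway", "manual"): 3,
    ("city",    "autopilot"): 3,  ("city",    "manual"): 45,
}

def pooled_rate(mode):
    """Deaths per billion miles, ignoring road type."""
    m = sum(v for (road, md), v in miles.items() if md == mode)
    d = sum(v for (road, md), v in deaths.items() if md == mode)
    return d / m

def stratified_rate(road, mode):
    """Deaths per billion miles within a single road type."""
    return deaths[(road, mode)] / miles[(road, mode)]

for mode in ("autopilot", "manual"):
    print("pooled   %-9s: %5.1f per 1e9 miles" % (mode, pooled_rate(mode)))
for road in ("highway", "city"):
    for mode in ("autopilot", "manual"):
        print("%-8s %-9s: %5.1f per 1e9 miles"
              % (road, mode, stratified_rate(road, mode)))

The stratified (apples-to-apples) rates are the numbers worth arguing about,
not the pooled headline ratio.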

I suspect - without statistical justification -

Yes, without justification, at all.
Which do /you/ think is most likely? Autopilot crashes on the motorway,
or autopilot crashes on smaller roads?

Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes are more often on highways.

I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving the limitations are at a level that no one would try to use it.


that the accidents
involving autopilot use are precisely cases where you don\'t have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
\"This is not a good road for me - you have to drive yourself\" and switch
itself off. (It would be more controversial, but probably statistically
safer, if it also sometimes said \"I\'m better at driving on this kind of
road than you are\" and switching itself on!)

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam \"Tesla\'s making mistakes\" videos on yootoob
aren\'t confidence inspiring. Based on one I saw, I certainly
wouldn\'t dare let a Tesla drive itself in an urban environment,

I suspect there isn\'t sufficient experience to assess relative
dangers between \"artificial intelligence\" and \"natural
stupidity\".
I don\'t doubt at all that the Tesla autopilot makes mistakes.

Which depends on how you define \"mistakes\".
Of course.
It\'s a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left-turn lane on a 50
mph road that the car would want to turn into when I intended to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?

There are a few possibilities here (though I am not trying to claim that
any of them are \"right\" in some objective sense). You might say they
had believed that the \"autopilot\" was like a plane autopilot -

It is exactly like an airplane autopilot.


you can
turn it on and leave it to safely drive itself for most of the journey
except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake
from the salesmen, advertisers, user-interface designers, or just the
driver\'s mistake.

Sorry, that\'s not how an autopilot works. It doesn\'t fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/, is what people /think/ they do - and
remember we are talking about drivers that think their Tesla \"autopilot\"
will drive their car while they watch a movie or nap in the back seat.

Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like in the car, there is a pilot whose job is to fly/drive and assure safety.

As to the movie idea, no, people don\'t think that. People might \"pretend\" that, but there\'s no level of \"thinking\" that says you can climb in the back seat while driving. Please don\'t say silly things.


And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.

The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. \"Daft\" is not a very useful term, as it means what you want it to mean. \"I know it when I see it.\" Hard to design to that sort of specification.

Well, \"does something daft\" is no worse than \"acts like a spaz\", and
it\'s a good deal more politically correct!

Bzzzz. Sorry, you failed.

--

Rick C.

++ Get 1,000 miles of free Supercharging
++ Tesla referral code - https://ts.la/richard11209
 
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works. It doesn\'t fly the plane. It
simply maintains a heading and altitude.
They have been doing more than that for > 50 years.
Cat 3b landings were in operation when I was a kid.
Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones are more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and remember we
are talking about drivers that think their Tesla \"autopilot\" will drive their
car while they watch a movie or nap in the back seat.
And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".

That\'s Tom Gardner level misinformation. Comments about what people think are spurious and unsubstantiated. A class of \"twats\" can be invented that think anything. Nothing matters other than what Tesla owners think. They are the ones driving the cars.

--

Rick C.

--- Get 1,000 miles of free Supercharging
--- Tesla referral code - https://ts.la/richard11209
 
On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones
are more
sophisticated and handle course changes along the planned route, as
well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and
remember we
are talking about drivers that think their Tesla \"autopilot\" will
drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".
You don\'t. Twats will always be twats. You fix the cars.

You start by changing the name. \"Driver assistance\" rather than
\"autopilot\".

You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)

Do you know anything about how the Tesla autopilot actually works? Anything at all?

--

Rick C.

--+ Get 1,000 miles of free Supercharging
--+ Tesla referral code - https://ts.la/richard11209
 
On Friday, April 1, 2022 at 3:17:32 AM UTC-4, Tom Gardner wrote:
On 01/04/22 08:08, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones
are more
sophisticated and handle course changes along the planned route, as
well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and
remember we
are talking about drivers that think their Tesla \"autopilot\" will
drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".

You don\'t. Twats will always be twats. You fix the cars.

You start by changing the name. \"Driver assistance\" rather than
\"autopilot\".
That\'s one of the things I was thinking of.
You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention.
I\'ve wondered why they don\'t implement that, then realised
it would directly contradict their advertising.

Please tell us what the Tesla advertising says that would be contradicted?

--

Rick C.

-+- Get 1,000 miles of free Supercharging
-+- Tesla referral code - https://ts.la/richard11209
 
On 4/1/2022 5:13 AM, Tom Gardner wrote:
On 01/04/22 12:32, Don Y wrote:
On 4/1/2022 3:44 AM, Tom Gardner wrote:

Sorry, that\'s not how an autopilot works. It doesn\'t fly the plane. It
simply maintains a heading and altitude.

They have been doing more than that for > 50 years.
Cat 3b landings were in operation when I was a kid.

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones are
more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and remember we
are talking about drivers that think their Tesla \"autopilot\" will drive
their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".

\"Pilots\" and \"drivers\" approach their efforts entirely differently
and with different mindsets.

They should do in one sense (differing machine/automation)
and shouldn\'t in another (both are lethal instruments).

Problem starts with the marketing.

Cars are far more ubiquitous. And, navigation is a 2-dimensional activity.

An \"average joe\" isn\'t likely to think hes gonna \"hop in a piper cub\" and
be off on a jaunt to run errands. And, navigation is a 3-dimensional
undertaking (you don\'t worry about vehicles \"above\" or \"below\", when driving!)

True, but it doesn\'t change any of my points.

If access to a technology -- or ability to make use of that technology -- is
limited, then it diminishes as a source of problems.

I don\'t worry about someone (young/old/qualified/not) climbing into the cockpit
of an A-10 at the military base down the street and strafing my neighborhood
with its 30mm Gatling gun -- despite the fact that there are many dozens of
them sitting on the tarmac. And, nothing prevents an airman from getting
unhinged and taking out his frustrations with his \"vehicle of choice\".
Access is constrained as well as the know-how required to put it into use.

OTOH, a 14 year old climbing in a (stolen?) vehicle presents a very real danger
to me on my local roadways (note the articles cited in previous post). There
are hundreds of such vehicles within a stone\'s throw -- regardless of where you
happen to be throwing the stone!

Even \"heavy equipment\" is operated (driven) in virtually the same way as cars
(as we\'ve had cases of joyriders in dump trucks, back hoes, graders, etc.)

Can\'t recall any \"average joe\" taking an aircraft for a joyride, though!
(they wouldn\'t know HOW)

ANYONE can drive a car. By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.

Not entirely sure about that. 14yo can be solo, and a
very few are even aerobatic pilots.

And a \"youngster\" can drive a car (or other sort of motorized vehicle, e.g., on
a farm or other private property). The 16yo (15.5) restriction only applies to
the use on public roadways.

12yo fly across the country with an instructor behind.
14yo can do it on their own.

Daughter was driving my car and a double decker bus at 15yo,
on the runway and peritrack :)

It doesn\'t matter what LEGALLY can be done. What matters is what can be
TECHNICALLY performed. The 14 yo\'s in the articles I cited were each
breaking the law. But, were still ABLE to access and operate the vehicles
in question. Invite them to take your aircraft for a joyride and you\'ll
find them sitting on the tarmac, hours later, still trying to figure out how
to take off!

I was driving (on private property) at 10. As were most of the (males!) in
my extended family. Grandpa owned a large car business so all of the cousins
would \"work\" at the shop. It was not uncommon to be handed a set of keys and
told to bring the \"white chevy\" into bay #6 for new tires. And, once the
new rubber was mounted, told to drive the car with the ass-end raised so they
could be spin-balanced (<https://i.ytimg.com/vi/NJd-AnU71nQ/maxresdefault.jpg>
in lieu of dynamic balancers). Then, around to the alignment pit. Finally,
gassed up and parked waiting for the customer to pick it up.

[Grandpa had a rather loose interpretation of what was \"legal\" -- and spent
a fair bit of time behind bars for other \"misinterpretations\" :> ]

Cars are \"simple\" to operate; can-your-feet-reach-the-pedals being the only
practical criterion. I'd wager *I* would have a hard time walking up to
an aircraft, \"cold\", and trying to sort out how to get it off the ground...

Same is true of a glider. There are only 4 controls: rudder,
stick, airbrake, cable release. Two instruments, airspeed
and barometer (i.e. height differential).

And a piper cub? Lear jet? Not all aircraft are gliders. And, a glider
requires a \"co-conspirator\" to get it airborn! A car just requires
\"opportunity\".

You are taught to do without them, because they all lie to
you.

The main difference is that you can\'t stop and catch
your breath, or stop and have a pee.

Overall learning to fly a glider is pretty much similar
to learning to drive - in cost, time and skill.

But not opportunity. I\'d have to spend a fair bit of effort researching
where to gain access to any sort of aircraft. OTOH, I can readily \"borrow\"
(with consent) any of my neighbors\' vehicles and operate all of them in
a fairly consistent manner: sports cars, trucks, commercial trucks, even
motorcycles (though never having driven one, before!).

True, but it doesn\'t change any of my points.

The number of \"flying\" accidents vs. the number of \"auto\" accidents makes
the point very well.

The training
is more rigorous, though, and isn\'t a one-off event.

It\'s likely more technical, too. Most auto-driving instruction deals
with laws, not the technical \"piloting\" of the vehicle. The driving test
is similarly focused on whether or not you put that law knowledge into
effect (did you stop *at* the proper point? did you observe the speed
limit and other posted requirements?)

Not much is required to go solo.

Does the glider\'s responsiveness indicate you are flying
fast enough; are you at a reasonable height in the circuit;
what to do when you find you aren\'t and when the cable snaps.

Piper cub? Lear jet?

[When taking the test for my *first* DL, the DMV was notorious for
having a stop sign *in* the (tiny) parking lot -- in an unexpected
place. Folks who weren\'t observant -- or tipped off to this ahead
of time -- were \"failed\" before ever getting out on the roadway!]

Pre-solo tests include the instructor putting you in a
stupid position, and saying \"now get us back safely\".

Automobile tests just try to catch you breaking a law. Damn near anyone who
has driven a vehicle prior to being tested can get it from point A to
point B.

Commercial vehicles focus more on safety (and any ADDITIONAL laws that
may apply -- e.g., a commercial vehicle must clearly be labeled and there
are special enforcement units that will ticket for such violations) because
they assume you already understand the basics of the legal requirements
for a motor vehicle.

I can\'t recall ANY \"legal\" issues in my forklift certification. But, lots
of technical issues regarding how to safely operate the vehicle, transport
loads, derate lifting capacity based on lift height, etc. And, a strong
emphasis on how NOT to suffer a \"crush injury\"!

Testing for a CDL (commercial) is considerably different; you are
quizzed on technical details of the vehicle that affect the safety of
you and others on the roadway -- because you are operating a much more
\"lethal\" vehicle (< 26,000 pounds GVW). You also have to prove yourself
medically *fit* to operate (not color blind, not an insulin user,
\"controlled\" blood pressure, nonepileptic, alchoholic, etc.!

Ditto being an instructor or having a passenger.

And, other \"endorsements\" have further requirements (e.g., hauling
tandem/triples, hazardous products, etc.)

Ditto flying cross country or in clouds.

Do you really think opportunists are going to hijack an aircraft
and \"hope\" there are clear skies?

There are simply too many impediments to aircraft being misused/abused
to make it a real issue.

I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) \"autopilot\".

That's true for the aircraft, but nobody has developed
an autopilot for a glider. You have to stay awake and feel (literally,
by the seat of your pants) what's happening. The nearest
thing to an autopilot is a moving map airspace display.

Commercial aircraft rely on autopilots. In a sense, it is
an easier (navigation) problem to solve -- there\'s no real \"traffic\"
or other obstacles beyond the airports (assuming you maintain your
assigned flight corridor/speed). The same is true of railways
and waterways (more or less).

Cars operate in a much more challenging environment. Even \"on the open
road\", a condition can arise that needs immediate driver attention
(witness these 50-car pileups).

Note how poorly \"seasoned\" drivers adapt to the first snowfall of
the season. (Really? Did you FORGET what this stuff was like??)
Do they do any special (mental?) prep prior to getting behind the
wheel, in those cases? Or, just \"wing it\", assuming \"it will
come back to them\"?

Pilots often don\'t understand what\'s going on; just
listen to the accident reports on the news :(

I think those events are caused by cognitive overload, not ignorance.

Not always, e.g. the recent 737 crashes.

So, a defect in an autopilot implementation can be similarly excused?
 
On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)

All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.

\"Assistance\" should be intuitive. You don\'t even NOTICE the power
steering, brakes, autotranny, etc. \"assistants\" in a vehicle.
Because, for the most part, the way they operate is largely invariant
of driver, driving conditions, etc. (how often do folks use anything
other than \"D(rive)\" and \"R(everse)\"? Is there a way to *disable*
the power steering? Or brakes? Should there be?

I recently got a car that came standard with \'lane assist\'. I
hate it. It\'s like having a passenger tugging on the steering wheel,
absolutely intolerable. It also can\'t be switched off permanently.
For the first week or two, I just blindfolded the camera it uses to
watch the road, until I found out how to switch it off with a single
button press. (There are far too many buttons, for that matter, and
all with multiple functions, too. Bad!)

When shopping for SWMBO\'s vehicle, we were waiting for the salesman for
a test drive. Vehicle running, me behind the wheel.

One by one, error indicators came on -- all pertaining to the forward
looking \"assistants\" (too close to vehicle in front of you, maintain
your lane, etc.). Each error indicated the associated system was
offline due to a fault.

When I questioned the salesman (did *I* do something to cause that?),
he dismissed it as a consequence of the high temperatures (40+C is
common, here -- at least 1 out of 6 days). So, tell me why I should
pay extra for this feature? And, how much faith I should have in it
performing as advertised?? <frown>

That said, some automatic functions /are/ good. Climate control with
a real thermostat, auto-darkening rear view mirrors, mostly functions
that have nothing to do with driving per se. The only /good/ automatic
functions are those you don\'t notice until they /stop/ working.

These are all functions that aren\'t interactive -- like my brakes/tranny
examples. You don\'t expect to have to make changes to the mechanism,
especially while driving.

I do like certain steering wheel mounted controls (e.g., radio/music)
as it helps me keep my eyes on the road (instead of reaching over
to adjust volume, select different source material, etc.) But, have
yet to find a use for the paddle shifters -- and the stalk-mounted
controls are too numerous on too few stalks!

I also like the GPS with head-up display.

My favorite is the side mirrors tilting downwards (to afford a view
of the ground) when backing up. The backup camera is a win as we back into
our garage and it helps avoid backing INTO something. These would be less
necessary with a \"lower profile\" vehicle, though.

[I also like the trip computer automatically resetting at each trip
and \"fill up\"]
 
On 01/04/2022 14:44, Ricky wrote:
On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones
are more
sophisticated and handle course changes along the planned route, as
well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and
remember we
are talking about drivers that think their Tesla \"autopilot\" will
drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".
You don\'t. Twats will always be twats. You fix the cars.

You start by changing the name. \"Driver assistance\" rather than
\"autopilot\".

You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)

Do you know anything about how the Tesla autopilot actually works? Anything at all?

A little - but not a lot, and no personal experience.

So fill in the details here.

You\'ve already told us that it is designed for things like motorway
driving (or \"highway\" driving). Presumably you stick by that, and
therefore agree that any restrictions on the autopilot should be lower
for motorway driving than for more \"challenging\" driving such as town
roads or small, twisty country roads.

People already manage to read newspapers or eat their breakfast in
traffic queues, in purely manual cars. Do you think autopilot can
handle that kind of traffic?

My suggestion is that a way to ensure people have more focus on driving
is to require contact with the steering wheel. I am happy to hear your
objections to that idea, or to alternative thoughts.


Improper use of autopilot (and other automation in all kinds of cars)
leads to a higher risk of accidents. I expect that proper use can lower
risk. Do you disagree with these two claims?

Do you think Tesla\'s autopilot is perfect as it is, or is there room for
improvement?

Do you actually want to contribute something to this thread, or do you
just want to attack any post that isn\'t Tesla fanboy support? (Your
answers to the previous questions will cover this one too.)
 
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works. It doesn\'t fly the
plane. It simply maintains a heading and altitude.
They have been doing more than that for > 50 years. Cat 3b
landings were in operation when I was a kid.
Someone still has to be watching for other aircraft and
otherwise flying the plane. In other words, the pilot is
responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern
ones are more sophisticated and handle course changes along the
planned route, as well as being able to land automatically. And
more important than what plane autopilots actually /do/, is what
people /think/ they do - and remember we are talking about
drivers that think their Tesla \"autopilot\" will drive their car
while they watch a movie or nap in the back seat.
And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept into the
heads of twats that think \"autopilot\" means \"it does it for me\".

That\'s Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of \"twats\" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.

Are you suggesting that none of the people who drive Teslas are twats?
(Maybe that term is too British for you.)

And are you suggesting that only Tesla drivers are affected by Tesla
crashes? Obviously they will be disproportionally affected, but motor
accidents often involve other people and other cars. And while Tesla
may be leading the way in car \"autopiloting\", others are following - the
strengths and weaknesses of Tesla\'s systems are relevant to other car
manufacturers.
 
On 2022-04-01 15:38, Don Y wrote:
On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)

All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.

\"Assistance\" should be intuitive. You don\'t even NOTICE the power
steering, brakes, autotranny, etc. \"assistants\" in a vehicle.
Because, for the most part, the way they operate is largely invariant
of driver, driving conditions, etc. (how often do folks use anything
other than \"D(rive)\" and \"R(everse)\"? Is there a way to *disable*
the power steering? Or brakes? Should there be?

I much prefer a simple stick shift. I can tell what state it's in
by touch, and there is not the slightest doubt about it. That isn't
true for an automatic. You need to /look/ at what state it's in. They're
too temperamental to my taste, refusing to change state under certain
conditions. Same for the electric parking brake. It took me a while to
figure out it refuses to disengage when I\'m not wearing seat belts.
Sheesh! Talk about weird interactions!

Power steering and brakes are in the set of assists that normally
go unnoticed until they fail. (Provided they are essentially linear,
smooth, without discontinuity or other surprise behaviour.)

[...]
My favorite is the side mirrors tilting downwards (to afford a view
of the ground) when backing up. The backup camera is a win as we back into
our garage and it helps avoid backing INTO something. These would be less
necessary with a \"lower profile\" vehicle, though.

[I also like the trip computer automatically resetting at each trip
and \"fill up\"]

Yes, got that too, and I agree those are good features.

Jeroen Belleman
 
On 01/04/2022 14:38, Ricky wrote:
On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
snip
No, it is not \"too strong\". It is basic statistics. Bayes\' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).

A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That\'s an inherent bias because while
non-autopilot driving must include all situations, autopilot simply
doesn\'t work in most environments.

Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.

I suspect - without statistical justification -

Yes, without justification, at all.
Which do /you/ think is most likely? Autopilot crashes on the motorway,
or autopilot crashes on smaller roads?

Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes are more often on highways.

I was not aware of that limitation. Thanks for providing some relevant
information.

I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving the limitations are at a level that no one would try to use it.


that the accidents
involving autopilot use are precisely cases where you don\'t have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
\"This is not a good road for me - you have to drive yourself\" and switch
itself off. (It would be more controversial, but probably statistically
safer, if it also sometimes said \"I\'m better at driving on this kind of
road than you are\" and switching itself on!)

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam \"Tesla\'s making mistakes\" videos on yootoob
aren\'t confidence inspiring. Based on one I saw, I certainly
wouldn\'t dare let a Tesla drive itself in an urban environment,

I suspect there isn\'t sufficient experience to assess relative
dangers between \"artificial intelligence\" and \"natural
stupidity\".
I don\'t doubt at all that the Tesla autopilot makes mistakes.

Which depends on how you define \"mistakes\".
Of course.
It\'s a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left-turn lane on a 50
mph road that the car would want to turn into when I intended to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?

There are a few possibilities here (though I am not trying to claim that
any of them are \"right\" in some objective sense). You might say they
had believed that the \"autopilot\" was like a plane autopilot -

It is exactly like an airplane autopilot.


you can
turn it on and leave it to safely drive itself for most of the journey
except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake
from the salesmen, advertisers, user-interface designers, or just the
driver\'s mistake.

Sorry, that\'s not how an autopilot works. It doesn\'t fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/, is what people /think/ they do - and
remember we are talking about drivers that think their Tesla \"autopilot\"
will drive their car while they watch a movie or nap in the back seat.

Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like in the car, there is a pilot whose job is to fly/drive and assure safety.

I am fully aware that plane autopilots are limited. I am also aware
that they are good enough (in planes equipped with modern systems) to
allow pilots to let the system handle most of the flight itself, even
including landing. The pilot is, of course, expected to be paying
attention, watching for other aircraft, communicating with air traffic
controllers and all the rest of it. But there have been cases of pilots
falling asleep, or missing their destination because they were playing
around on their laptops. What people /should/ be doing, and what they
are /actually/ doing, is not always the same.

As to the movie idea, no, people don\'t think that. People might \"pretend\" that, but there\'s no level of \"thinking\" that says you can climb in the back seat while driving. Please don\'t say silly things.

You can google for \"backseat Tesla drivers\" as well as I can. I am
confident that some of these are staged, and equally confident that some
are not. There is no minimum level of \"thinking\" - no matter how daft
something might be, there is always a dafter person who will think it\'s
a good idea.

And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.

The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. \"Daft\" is not a very useful term, as it means what you want it to mean. \"I know it when I see it.\" Hard to design to that sort of specification.

Well, \"does something daft\" is no worse than \"acts like a spaz\", and
it\'s a good deal more politically correct!

Bzzzz. Sorry, you failed.

Really? You think describing the autopilot\'s actions as \"acts like a
spaz\" is useful and specific, while \"does something daft\" is not? As
for the political correctness - find a real spastic and ask them what
they think of your phrase.
 
On 4/1/2022 7:17 AM, Jeroen Belleman wrote:
On 2022-04-01 15:38, Don Y wrote:
On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)

All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.

\"Assistance\" should be intuitive. You don\'t even NOTICE the power
steering, brakes, autotranny, etc. \"assistants\" in a vehicle.
Because, for the most part, the way they operate is largely invariant
of driver, driving conditions, etc. (how often do folks use anything
other than \"D(rive)\" and \"R(everse)\"? Is there a way to *disable*
the power steering? Or brakes? Should there be?

I much prefer a simple stick shift. I can tell what state it's in
by touch, and there is not the slightest doubt about it. That isn't
true for an automatic. You need to /look/ at what state it's in.

Why do you care? You can tell if it is in R/N/D simply by the feel of
where the control is presently positioned. If I care what \"gear\"
the transmission is currently operating in, I can look at the display
located between the speedo and tach (a place that your eyes will
always consult).

In most of the places I\'ve lived, there is little need to be able
to \"force\" the transmission into a particular gear -- that it wouldn\'t
have already assumed, on its own. And, in difficult driving conditions
(e.g., coming out of the mountains, here), you\'d likely want the vehicle
to manage most of those decisions (e.g., overheating the transmission
on a long downgrade).

SWMBO used to prefer a stick. I convinced her to move to an automatic
as it would be one less control to deal with as she aged. And, migrated
her to a larger displacement engine -- for similar reasons. She now often
relies on these two changes to extricate herself from dangerous
situations (e.g., when oncoming traffic isn\'t yielding as it should).
(this isn\'t always a pleasant experience when I\'m a passenger! :< )

They\'re
too temperamental to my taste, refusing to change state under certain
conditions. Same for the electric parking brake. It took me a while to
figure out it refuses to disengage when I\'m not wearing seat belts.
Sheesh! Talk about weird interactions!

Hmmm... that's a new one, for me. I'm more bothered by all of the "alarms"
or warnings: "You haven't put the vehicle in PARK but turned off the
ignition", "The headlights are still on", and my personal peeve, "You have
exited the vehicle -- WITH the keyfob -- while it is still running"
(what the hell are they trying to tell me that I don't already know?
That I left the car *running*?? I can understand an alert if I've left
the key *in* the running car...)

The most annoying aspect of all this is that there is not a unified means of
presenting this information! Sometimes it appears as text on a display
(I don\'t recall where I told the car that I prefer ENGLISH!), sometimes
a cryptic idiot light on the dash, sometimes a coded audio annunciator
(what the hell does THAT noise mean???), etc.

(There are three full graphic displays in the car. Can\'t you sort out how
to use them to TELL me what you think I\'ve done wrong?)
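The kind of unified scheme being asked for might look something like this
sketch (Python; the severities and wording are hypothetical, not any
manufacturer's actual design) -- one path, one place, plain language:

from enum import Enum

class Severity(Enum):
    INFO = 1        # e.g. "The headlights are still on"
    WARNING = 2     # e.g. "Not in PARK, ignition off"
    CRITICAL = 3    # e.g. "Exited with keyfob, engine still running"

def present_alert(text, severity):
    """Every nag goes through here: plain-language text in one fixed place,
    plus a chime only when the severity warrants it."""
    print("[%s] %s" % (severity.name, text))         # always the same display
    if severity is not Severity.INFO:
        print("(chime: %d beeps)" % severity.value)  # stand-in for the audio

present_alert("The headlights are still on", Severity.INFO)
present_alert("Vehicle not in PARK, ignition off", Severity.WARNING)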

Power steering and brakes are in the set of assists that normally
go unnoticed until they fail. (Provided they are essentially linear,
smooth, without discontinuity or other surprise behaviour.)

Exactly. You wouldn\'t want to have a switch to turn them on or off.
I\'m not convinced of the utility of having the headlights come on
automatically. Or, the windshield wipers. But, those can be disabled.
Likewise for the tilt-down mirrors (SWMBO took a long time to warm to
that idea)

[...]

My favorite is the side mirrors tilting downwards (to afford a view
of the ground) when backing up. The backup camera is a win as we back into
our garage and it helps avoid backing INTO something. These would be less
necessary with a \"lower profile\" vehicle, though.

[I also like the trip computer automatically resetting at each trip
and \"fill up\"]

Yes, got that too, and I agree those are good features.

An EV driver wouldn\'t see the need for that sort of trip computer.
But, might see the need for one that shows battery consumption
since last recharge -- or, \"on this trip\".

It\'s also annoying when the car plays nanny and won\'t let THE PASSENGER
use certain controls while the vehicle is in motion.

And, these restrictions (intended on the *driver*) are inconsistent.
So, a driver is \"distracted\" by a control that may -- or may not -- be
operable in a certain driving condition (why can I twiddle with the
radio presets while driving but not specify a new GPS destination?).

And, the quality of the implementations is really piss poor (in every
vehicle that we auditioned!). I recall typing in the name of a store
to which I wanted to drive. I apparently misspelled it as the destination
it selected was 1500 miles away (whereas the actual store was just a few
miles away -- but I couldn\'t recall on which of several parallel roads it
was located!). C\'mon, do you REALLY think I want to lay in a course to
a destination that far from here? And, that I would do so often enough
that you should blindly assume that to be the case?? E.g., maybe a
prompt saying \"We found a location by that name 1500 miles from here.
Is that what you intended (in case you haven\'t NOTICED that to be the
case)? Or would you like to look for similar names, *locally*?\"
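
A sanity check of that sort is cheap. A toy sketch (the 100-mile threshold
and the prompt wording are invented for illustration, not anyone's actual
navigation code):

/* Hypothetical destination sanity check: if the best match is far
 * beyond a plausible local trip, ask before routing and offer a
 * nearby-name search instead.  The threshold is an invented example. */
#include <stdio.h>

#define SANITY_RADIUS_MILES 100.0

/* Returns 1 to program the route, 0 to search for similar names nearby. */
static int confirm_destination(double distance_miles)
{
    char answer;

    if (distance_miles <= SANITY_RADIUS_MILES)
        return 1;                 /* plausible local trip: just route it */

    printf("Best match is %.0f miles away. Route to it (y), or search for "
           "similar names nearby (n)? ", distance_miles);
    if (scanf(" %c", &answer) != 1)
        return 0;
    return (answer == 'y' || answer == 'Y');
}

int main(void)
{
    if (confirm_destination(1500.0))
        printf("Routing to the distant destination.\n");
    else
        printf("Searching for similar names locally.\n");
    return 0;
}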

Or, the system being completely unresponsive to \"events\" (button presses)
for 15-30 seconds? Or, the backup camera taking a few seconds to come
online (so you have to WAIT before taking your foot off the brake).

Or, displays being limited to N digits (e.g., total miles traveled on this
trip) so you never know if you are seeing the results of saturated math
or some other USEFUL presentation?
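
Even a trivial formatter could at least admit that it has clipped the value.
A made-up sketch (a hypothetical 4-digit field, not any particular
instrument cluster):

/* Hypothetical trip-distance formatter for a 4-digit field: rather than
 * silently pegging at 9999, mark the value so the driver knows it is a
 * display limit, not a measurement. */
#include <stdio.h>

#define FIELD_MAX 9999L    /* largest value a 4-digit field can show */

static void format_trip_miles(long miles, char *buf, size_t len)
{
    if (miles > FIELD_MAX)
        snprintf(buf, len, ">%ld", FIELD_MAX);  /* explicit "clipped" marker */
    else
        snprintf(buf, len, "%ld", miles);
}

int main(void)
{
    char buf[8];

    format_trip_miles(1234, buf, sizeof buf);
    printf("trip: %s\n", buf);       /* "1234"                       */

    format_trip_miles(12345, buf, sizeof buf);
    printf("trip: %s\n", buf);       /* ">9999" -- clearly saturated */
    return 0;
}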

[You would think a car manufacturer with all the re$ource$ at its disposal
would come up with a better implementation! Likely the folks making the
technical decisions aren\'t skilled in the art...]
 
On 01/04/22 14:07, Don Y wrote:

<snipped many points where we are talking about
different classes of aircraft and air traffic>

I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) \"autopilot\".

That's true for the aircraft you have in mind, but nobody has developed
an autopilot for that class of flying. You have to stay awake and feel
(literally, by the seat of your pants) what's happening. The nearest
thing to an autopilot is a moving-map airspace display.

Commercial aircraft rely on autopilots.  In a sense, it is
an easier (navigation) problem to solve -- there\'s no real \"traffic\"
or other obstacles beyond the airports (assuming you maintain your
assigned flight corridor/speed).  The same is true of railways
and waterways (more or less).

Er, no.

You are considering a small part of air traffic, that
in controlled airspace.

Many flights, powered and unpowered, happen outside
controlled airspace, where the rule is to look out
of the cockpit for converging traffic.

On one occasion I watched a commercial airliner
taking off a thousand feet below me. Hercules buzz
around too. Then there are balloons, hang gliders
and the like.

There are even rules as to which side of roads and
railways you should fly on, so that there aren't
head-on collisions between aircraft following the
same ground feature in opposite directions.

Gliders frequently operate very near each other,
especially in thermals and when landing. They
also have to spot other gliders coming straight
at them when ridge flying; it is not trivial to spot
a white blob with the head-on profile of a motorbike
converging at 120 mph.

To help cope with that, some gliders are equipped
with FLARMs - short-range radio transceivers that
indicate the direction of other FLARM-equipped gliders
and whether you are likely to hit them.
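
The core of such a warning is simple geometry. A rough sketch (invented
positions, speeds and thresholds; nothing to do with the real FLARM
protocol) of how "direction of the other glider" and "likely to hit"
fall out of two position/velocity estimates:

/* Toy traffic-warning geometry: given the other glider's position and
 * velocity relative to mine (flat-earth metres and m/s), compute the
 * bearing to the traffic and the miss distance at closest approach.
 * All numbers and thresholds are invented; real FLARM logic is far
 * more involved. */
#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    double rx = 800.0, ry = 50.0;  /* relative position: right, ahead (m)       */
    double vx = -52.0, vy = -2.0;  /* relative velocity (m/s), ~120 mph closing */

    double bearing = atan2(rx, ry) * 180.0 / M_PI;  /* 0 = dead ahead, +ve right */

    /* Time of closest approach: t = -(r.v)/(v.v) */
    double vv = vx * vx + vy * vy;
    double t  = (vv > 0.0) ? -(rx * vx + ry * vy) / vv : INFINITY;

    double mx = rx + vx * t, my = ry + vy * t;
    double miss = sqrt(mx * mx + my * my);

    printf("traffic bearing %.0f deg, closest approach %.0f m in %.1f s\n",
           bearing, miss, t);
    if (t > 0.0 && t < 20.0 && miss < 100.0)
        printf("WARNING: converging traffic\n");
    return 0;
}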



Cars operate in a much more challenging environment.  Even \"on the open
road\", a condition can arise that needs immediate driver attention
(witness these 50-car pileups).

Note how poorly \"seasoned\" drivers adapt to the first snowfall of
the season.  (Really?  Did you FORGET what this stuff was like??)
Do they do any special (mental?) prep prior to getting behind the
wheel, in those cases?  Or, just \"wing it\", assuming \"it will
come back to them\"?

Pilots often don\'t understand what\'s going on; just
listen to the accident reports on the news :(

I think those events are caused by cognitive overload, not ignorance.

Not always, e.g. the recent 737 crashes.

So, a defect in an autopilot implementation can be similarly excused?

Que? Strawman.
 
On Friday, April 1, 2022 at 10:08:31 AM UTC-4, David Brown wrote:
On 01/04/2022 14:44, Ricky wrote:
On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones
are more
sophisticated and handle course changes along the planned route, as
well as
being able to land automatically. And more important than what plane
autopilots actually /do/, is what people /think/ they do - and
remember we
are talking about drivers that think their Tesla \"autopilot\" will
drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept
into the heads of twats that think \"autopilot\" means \"it does
it for me\".
You don\'t. Twats will always be twats. You fix the cars.

You start by changing the name. \"Driver assistance\" rather than
\"autopilot\".

You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)
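
As a sketch only -- the mode names and timeouts below are invented to
illustrate the escalation, not taken from any shipping system -- the logic
is little more than a timer and a couple of thresholds:

/* Hypothetical hands-off escalation: warn after a mode-dependent delay,
 * then demand a safe stop if the driver still hasn't touched the wheel.
 * All timings are invented for illustration. */
#include <stdio.h>

typedef enum { MODE_NORMAL, MODE_MOTORWAY, MODE_QUEUE } drive_mode_t;

static double warn_delay_s(drive_mode_t m)
{
    switch (m) {
    case MODE_MOTORWAY: return 10.0;   /* assistance copes better here */
    case MODE_QUEUE:    return 30.0;   /* stop-and-go traffic          */
    default:            return 2.0;    /* everything else: very short  */
    }
}

/* Call once per control cycle with the time since hands were last on
 * the wheel. */
static void check_attention(drive_mode_t m, double hands_off_s)
{
    double warn = warn_delay_s(m);

    if (hands_off_s < warn)
        return;                                 /* driver engaged enough */
    else if (hands_off_s < warn + 5.0)
        printf("BEEP: hands on the wheel!\n");  /* escalating warning    */
    else
        printf("Initiating pull-over and stop.\n");
}

int main(void)
{
    check_attention(MODE_NORMAL,    1.0);  /* fine                    */
    check_attention(MODE_NORMAL,    3.0);  /* beeps                   */
    check_attention(MODE_MOTORWAY, 12.0);  /* beeps, longer allowance */
    check_attention(MODE_NORMAL,   10.0);  /* pulls over              */
    return 0;
}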

Do you know anything about how the Tesla autopilot actually works? Anything at all?

A little - but not a lot, and no personal experience.

So fill in the details here.

You\'ve already told us that it is designed for things like motorway
driving (or \"highway\" driving). Presumably you stick by that, and
therefore agree that any restrictions on the autopilot should be lower
for motorway driving than for more \"challenging\" driving such as town
roads or small, twisty country roads.

I don\'t really understand what you mean about \"restrictions\". Again, I think your image of how it works is not how it works. I don\'t know enough of your image to know how to explain to you what you have wrong.

Autopilot will try to keep the car in a lane, recognize lights, stop signs, exit ramps and vehicles. When on appropriate highways, it will work in Navigate on Autopilot, where it can change lanes (pass slow vehicles, get out of the passing lane, etc.) and take exits. It will stop for traffic lights, but cannot navigate turns at intersections or even twisty roads. When it sees something that upsets it, it will sound the alarm (that should be ALARM) and insist you take over. One such situation is blinking yellow lights at an intersection with light traffic. The autopilot never understands this light can be driven through.


People already manage to read newspapers or eat their breakfast in
traffic queues, in purely manual cars. Do you think autopilot can
handle that kind of traffic?

No, the autopilot won\'t read the newspaper. Otherwise I have no idea what you are asking. What is \"that kind of traffic\"? You mean stop and go? Yes, it does that very well. That\'s one situation I would not worry much about taking a nap.


My suggestion is that a way to ensure people have more focus on driving
is to require contact with the steering wheel. I am happy to hear your
objections to that idea, or to alternative thoughts.

Teslas already do that. Please, go to a Tesla forum and read about the cars a bit. It would save me a lot of typing.


Improper use of autopilot (and other automation in all kinds of cars)
leads to a higher risk of accidents. I expect that proper use can lower
risk. Do you disagree with these two claims?

\"Higher\" and \"lower\" than what???


Do you think Tesla\'s autopilot is perfect as it is, or is there room for
improvement?

Of course there is room for improvement. When I first got my car it wouldn't take an exit ramp. Then it would take the exit, but would enter it at full speed! Now it is better, but you still need to watch it. It's also poor at slowing before traffic lights. Often the lights are around a bend and until it sees the light is red, it's barreling along. Then it has to hit the brakes, not just the regenerative braking. This is how most people drive and it wastes a lot of fuel.


Do you actually want to contribute something to this thread, or do you
just want to attack any post that isn\'t Tesla fanboy support? (Your
answers to the previous questions will cover this one too.)

This is your BS. I'm not criticizing any criticism of Tesla. You don't pay enough attention to understand that. I'm criticizing your remarks, which are based on ignorance of Teslas -- ignorance that doesn't stop you and Tom from forming opinions based on it rather than on knowledge.

Please just go read a bit about them. There is tons of info. Even weighing just the electrons to read it all, it\'s still tons! How many electrons in a ton, anyway?

--

Rick C.

-++ Get 1,000 miles of free Supercharging
-++ Tesla referral code - https://ts.la/richard11209
 
On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works. It doesn\'t fly the
plane. It simply maintains a heading and altitude.
They have been doing more than that for > 50 years. Cat 3b
landings were in operation when I was a kid.
Someone still has to be watching for other aircraft and
otherwise flying the plane. In other words, the pilot is
responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern
ones are more sophisticated and handle course changes along the
planned route, as well as being able to land automatically. And
more important than what plane autopilots actually /do/, is what
people /think/ they do - and remember we are talking about
drivers that think their Tesla \"autopilot\" will drive their car
while they watch a movie or nap in the back seat.
And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept into the
heads of twats that think \"autopilot\" means \"it does it for me\".

That\'s Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of \"twats\" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.

Are you suggesting that none of the people who drive Teslas are twats?
(Maybe that term is too British for you.)

The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?


And are you suggesting that only Tesla drivers are affected by Tesla
crashes? Obviously they will be disproportionally affected, but motor
accidents often involve other people and other cars. And while Tesla
may be leading the way in car \"autopiloting\", others are following - the
strengths and weaknesses of Tesla\'s systems are relevant to other car
manufacturers.

Now I have no idea why you have brought this up from left field. Is \"left field\" too American for you? That\'s from a sport called \"baseball\", not to be confused with \"blernsball\".

--

Rick C.

+-- Get 1,000 miles of free Supercharging
+-- Tesla referral code - https://ts.la/richard11209
 
On Friday, April 1, 2022 at 10:17:55 AM UTC-4, Jeroen Belleman wrote:
On 2022-04-01 15:38, Don Y wrote:
On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
You turn the steering wheel into a dead-man\'s handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have \"motorway mode\" that allows a longer
delay time, since autopilot works better there, and perhaps also a
\"traffic queue\" mode with even longer delays.)

All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.

"Assistance" should be intuitive. You don't even NOTICE the power
steering, brakes, autotranny, etc. "assistants" in a vehicle,
because, for the most part, the way they operate is largely invariant
of driver, driving conditions, etc. (How often do folks use anything
other than "D(rive)" and "R(everse)"?) Is there a way to *disable*
the power steering? Or brakes? Should there be?
I much prefer a simple stick shift. I can tell what state it's in
by touch, and there is not the slightest doubt about it.

My Tesla is a manual. The transmission never controls the gear it is in. I also know exactly what gear it is in without looking or even feeling. It only has one speed. Well, I guess it has two actually, + and -.


That isn't
true for an automatic. You need to /look/ at what state it's in. They're
too temperamental to my taste, refusing to change state under certain
conditions. Same for the electric parking brake. It took me a while to
figure out it refuses to disengage when I\'m not wearing seat belts.
Sheesh! Talk about weird interactions!

The Tesla is also pretty good about that. I never have to worry about the parking brake, as it is automatically set when in park and also when at a stop light. Stepping on the brake until you are stopped sets the brake, and stepping on the gas... accelerator releases it.


Power steering and brakes are in the set of assists that normally
go unnoticed until they fail. (Provided they are essentially linear,
smooth, without discontinuity or other surprise behaviour.)

I hit the starter a bit too briefly in my Kia and put it in reverse, only to find the engine had not actually started and I'm rolling backwards with no brakes or steering. lol


My favorite is the side mirrors tilting downwards (to afford a view
of the ground) when backing up. The backup camera is a win as we back into
our garage and it helps avoid backing INTO something. These would be less
necessary with a \"lower profile\" vehicle, though.

[I also like the trip computer automatically reseting at each trip
and \"fill up\"]
Yes, got that too, and I agree those are good features.

Both the Kia and Tesla start a trip odometer on fueling. Any charging on the Tesla restarts it. I don\'t know about the Kia, once I\'ve driven to a gas station I\'m not leaving until the tank is full.

--

Rick C.

+-+ Get 1,000 miles of free Supercharging
+-+ Tesla referral code - https://ts.la/richard11209
 
On Friday, April 1, 2022 at 10:29:58 AM UTC-4, David Brown wrote:
On 01/04/2022 14:38, Ricky wrote:
On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
snip
No, it is not \"too strong\". It is basic statistics. Bayes\' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).

A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That\'s an inherent bias because while
non-autopilot driving must include all situations, autopilot simply
doesn\'t work in most environments.

Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.
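
A toy calculation makes the point. The figures below are invented purely to
show the shape of the bias, not real Tesla data: the per-mile fatality rate
is identical with and without autopilot on each road type, yet the aggregate
comparison still makes autopilot look much "safer", simply because its miles
are concentrated on motorways:

/* Invented figures showing why raw comparisons mislead when autopilot
 * miles are concentrated on the (safer) highway.  By construction the
 * per-mile fatality rates are identical for both driving modes on each
 * road type, yet the overall rates differ by ~2.4x. */
#include <stdio.h>

int main(void)
{
    /* Hypothetical fleet mileage split (millions of miles). */
    double ap_hwy = 900.0, ap_city = 100.0;    /* autopilot: mostly highway */
    double man_hwy = 300.0, man_city = 700.0;  /* manual: mostly in town    */

    /* Hypothetical fatalities per million miles, same for both modes. */
    double rate_hwy = 0.005, rate_city = 0.02;

    double ap_deaths  = ap_hwy  * rate_hwy + ap_city  * rate_city;
    double man_deaths = man_hwy * rate_hwy + man_city * rate_city;

    printf("autopilot: %.1f deaths / %.0fM miles = %.4f per M\n",
           ap_deaths, ap_hwy + ap_city, ap_deaths / (ap_hwy + ap_city));
    printf("manual:    %.1f deaths / %.0fM miles = %.4f per M\n",
           man_deaths, man_hwy + man_city, man_deaths / (man_hwy + man_city));
    return 0;
}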

I suspect - without statistical justification -

Yes, without justification, at all.
Which do /you/ think is most likely? Autopilot crashes on the motorway,
or autopilot crashes on smaller roads?

Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes are more often on highways.

I was not aware of that limitation. Thanks for providing some relevant
information.
I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving the limitations are such that no one would try to use it.


that the accidents
involving autopilot use are precisely cases where you don\'t have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
\"This is not a good road for me - you have to drive yourself\" and switch
itself off. (It would be more controversial, but probably statistically
safer, if it also sometimes said \"I\'m better at driving on this kind of
road than you are\" and switching itself on!)

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam \"Tesla\'s making mistakes\" videos on yootoob
aren\'t confidence inspiring. Based on one I saw, I certainly
wouldn\'t dare let a Tesla drive itself in an urban environment,

I suspect there isn\'t sufficient experience to assess relative
dangers between \"artificial intelligence\" and \"natural
stupidity\".
I don\'t doubt at all that the Tesla autopilot makes mistakes.

Which depends on how you define \"mistakes\".
Of course.
It\'s a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left-turn lane on a 50
mph road that the car would want to turn into when I intended to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?

There are a few possibilities here (though I am not trying to claim that
any of them are \"right\" in some objective sense). You might say they
had believed that the \"autopilot\" was like a plane autopilot -

It is exactly like an airplane autopilot.


you can
turn it on and leave it to safely drive itself for most of the journey
except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake
from the salesmen, advertisers, user-interface designers, or just the
driver\'s mistake.

Sorry, that\'s not how an autopilot works. It doesn\'t fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/, is what people /think/ they do - and
remember we are talking about drivers that think their Tesla \"autopilot\"
will drive their car while they watch a movie or nap in the back seat.

Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like in the car, there is a pilot whose job is to fly/drive and assure safety.

I am fully aware that plane autopilots are limited. I am also aware
that they are good enough (in planes equipped with modern systems) to
allow pilots to let the system handle most of the flight itself, even
including landing. The pilot is, of course, expected to be paying
attention, watching for other aircraft, communicating with air traffic
controllers and all the rest of it. But there have been cases of pilots
falling asleep, or missing their destination because they were playing
around on their laptops. What people /should/ be doing, and what they
are /actually/ doing, is not always the same.

Exactly like the Tesla autopilot. The pilot is still in charge and responsible.


As to the movie idea, no, people don\'t think that. People might \"pretend\" that, but there\'s no level of \"thinking\" that says you can climb in the back seat while driving. Please don\'t say silly things.

You can google for \"backseat Tesla drivers\" as well as I can. I am
confident that some of these are staged, and equally confident that some
are not. There is no minimum level of \"thinking\" - no matter how daft
something might be, there is always a dafter person who will think it\'s
a good idea.

The fact that someone pulled a stunt doesn\'t mean they thought that was an ok thing to do. You know that. So why are we discussing this?


And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.

The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. \"Daft\" is not a very useful term, as it means what you want it to mean. \"I know it when I see it.\" Hard to design to that sort of specification.

Well, \"does something daft\" is no worse than \"acts like a spaz\", and
it\'s a good deal more politically correct!

Bzzzz. Sorry, you failed.

Really? You think describing the autopilot\'s actions as \"acts like a
spaz\" is useful and specific, while \"does something daft\" is not? As
for the political correctness - find a real spastic and ask them what
they think of your phrase.

How do you know what is meant by \"spaz\"? That\'s my point. Words like that are not well defined. I intended the word to be colorful, with no particular meaning. Your use of daft was in a statement that needed much more detail to be meaningful. Besides, if I jump off a cliff, are you going to jump as well?

--

Rick C.

++- Get 1,000 miles of free Supercharging
++- Tesla referral code - https://ts.la/richard11209
 
On 01/04/2022 18:39, Ricky wrote:
On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that\'s not how an autopilot works. It doesn\'t fly the
plane. It simply maintains a heading and altitude.
They have been doing more than that for > 50 years. Cat 3b
landings were in operation when I was a kid.
Someone still has to be watching for other aircraft and
otherwise flying the plane. In other words, the pilot is
responsible for flying the plane, with or without the
autopilot.

Yes, that\'s the original idea of a plane autopilot. But modern
ones are more sophisticated and handle course changes along the
planned route, as well as being able to land automatically. And
more important than what plane autopilots actually /do/, is what
people /think/ they do - and remember we are talking about
drivers that think their Tesla \"autopilot\" will drive their car
while they watch a movie or nap in the back seat.
And, to put it kindly, aren\'t discouraged in that misapprehension
by the statements of the cars\' manufacturers and salesdroids.

Now, what\'s the best set of techniques to get that concept into the
heads of twats that think \"autopilot\" means \"it does it for me\".

That\'s Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of \"twats\" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.

Are you suggesting that none of the people who drive Teslas are twats?
(Maybe that term is too British for you.)

The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?

It means \"a stupid person\" or \"someone who does stupid things\". No, not
everyone who drives a Tesla is a twat - but /some/ are, such as those
that think their autopilot will drive the car without them paying attention.

And are you suggesting that only Tesla drivers are affected by Tesla
crashes? Obviously they will be disproportionally affected, but motor
accidents often involve other people and other cars. And while Tesla
may be leading the way in car \"autopiloting\", others are following - the
strengths and weaknesses of Tesla\'s systems are relevant to other car
manufacturers.

Now I have no idea why you have brought this up from left field. Is \"left field\" too American for you? That\'s from a sport called \"baseball\", not to be confused with \"blernsball\".

You said that Tesla autopilots are only relevant to Tesla drivers.
That\'s wrong. I usually prefer to give a bit of explanation as to why I
think someone is wrong.
 
