Every Tesla Accident Resulting in Death...

Tom Gardner
From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

The website referred to appears to be collating information in
a reasonable and unemotional way.


Every Tesla Accident Resulting in Death (Tesla Deaths)
Gabe Goldberg <gabe@gabegold.com>
Thu, 24 Mar 2022 01:53:39 -0400

We provide an updated record of Tesla fatalities and Tesla accident deaths
that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.). This sheet also tallies
claimed and confirmed Tesla autopilot crashes, i.e. instances when
Autopilot was activated during a Tesla crash that resulted in death. Read
our other sheets for additional data and analysis on vehicle miles traveled,
links and analysis comparing Musk's safety claims, and more.

Tesla Deaths Total as of 3/23/2022: 246
Tesla Autopilot Deaths Count: 12

https://www.tesladeaths.com/
 
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

The website referred to appears to be collating information in
a reasonable and unemotional way.


<snip>

Yeah, it's raw data. Did you have a point?

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209
 
On 29/03/2022 15:00, Rickster wrote:
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
<snip>

Yeah, it's raw data. Did you have a point?

Without comparisons to other types of car, and correlations with other
factors, such raw data is useless. You'd need to compare to other
high-end electric cars, other petrol cars in similar price ranges and
styles. You'd want to look at statistics for "typical Tesla drivers"
(who are significantly richer than the average driver, but I don't know
what other characteristics might be relevant - age, gender, driving
experience, etc.) You'd have to compare statistics for the countries
and parts of countries where Teslas are common.

And you would /definitely/ want to anonymise the data. If I had a
family member who was killed in a car crash, I would not be happy about
their name and details of their death being used for some sort of absurd
Tesla hate-site.

I'm no fan of Teslas myself. I like a car to be controlled like a car,
not a giant iPhone (and I don't like iPhones either). I don't like the
heavy tax breaks given by Norway to a luxury car, and I don't like the
environmental costs of making them (though I am glad to see improvements
on that front). I don't like some of the silly claims people make about
them - like Apple gadgets, they seem to bring out the fanboy in some of
their owners. But that's all just me and my personal preferences and
opinions - if someone else likes them, that's fine. Many Tesla owners
are very happy with their cars (and some are unhappy - just as for any
other car manufacturer). I can't see any reason for trying to paint
them as evil death-traps - you'd need a very strong statistical basis
for that, not just a list of accidents.
 
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
<snip>

Yeah, it's raw data. Did you have a point?

I have no point.

I am curious about the causes of crashes when "autopilot" is engaged.


Without comparisons to other types of car, and correlations with other
factors, such raw data is useless.

<snip>

There is an attempt at comparisons, as stated in the FAQ.
 
Tom Gardner wrote:

This sheet also tallies claimed and confirmed Tesla autopilot crashes,
i.e. instances when Autopilot was activated during a Tesla crash

Would be better without that "i.e."
Maybe "active" but not "activated".
 
On Tuesday, March 29, 2022 at 11:17:49 AM UTC-4, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
<snip>

Yeah, it's raw data. Did you have a point?
I have no point.

I am curious about the causes of crashes when "autopilot" is engaged.

What do you expect to learn by posting this here? Autopilot is not perfect, by any means. They tell you to remain alert just as if you were driving and, in fact, the car monitors your grip on the wheel, alerting you if you relax too much.

The point is that when the car crashes on autopilot, it is the driver's fault, not the car's, because the car is just a driving assistance tool, like a blind spot warning device. If you smack someone in your blind spot, whose fault is that? Yours, because the tool is not perfect.

I know of one accident that occurred at a highway divide where the guy had previously had the car try to go up the middle rather than left or right. He even posted that the car was trying to kill him. One day he did something wrong at that same spot, and the crash wrecked the car and killed him.

Have you learned anything new yet?

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209
 
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data.  Did you have a point?

I have no point.

Fair enough, I suppose. But was there a reason for the post then?

I am curious about the causes of crashes when "autopilot" is engaged.

That's a reasonable thing to wonder about. The more we (people in
general, Tesla drivers, Tesla developers, etc.) know about such crashes,
the better the possibilities for fixing weaknesses or understanding how
to mitigate them. Unfortunately, the main mitigation - "don't rely on
autopilot - stay alert and focused on the driving" - does not work. For
one thing, many people don't obey it - people have been found in the
back seat of crashed Teslas where they were having a nap. And those
that try to follow it are likely to doze off from boredom.

However, there is no need for a list of "crashes involving Teslas",
names of victims, and a site with a clear agenda to "prove" that Teslas
are not as safe as they claim. It is counter-productive to real
investigation and real learning.

There is an attempt at comparisons, as stated in the FAQ.

It is a pretty feeble attempt, hidden away.

Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.

The whole post just struck me as a bit below par for your usual high
standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they
compare to average drivers. But your post was not a great starting
point for that.
 
On 29/03/22 20:18, David Brown wrote:
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data.  Did you have a point?

I have no point.


Fair enough, I suppose. But was there a reason for the post then?

Primarily to provoke thought and discussion, and
secondarily to point to occurrences that Tesla fanbois
and Musk prefer to sweep under the carpet.


I am curious about the causes of crashes when "autopilot" is engaged.


That's a reasonable thing to wonder about. The more we (people in
general, Tesla drivers, Tesla developers, etc.) know about such crashes,
the better the possibilities for fixing weaknesses or understanding how
to mitigate them. Unfortunately, the main mitigation - "don't rely on
autopilot - stay alert and focused on the driving" - does not work. For
one thing, many people don't obey it - people have been found in the
back seat of crashed Teslas where they were having a nap. And those
that try to follow it are likely to doze off from boredom.

Agreed.

Musk and his /very/ carefully worded advertising don't help
matters. That should be challenged by evidence.

I haven't seen such evidence collated anywhere else.



However, there is no need for a list of "crashes involving Teslas",
names of victims, and a site with a clear agenda to "prove" that Teslas
are not as safe as they claim. It is counter-productive to real
investigation and real learning.

As far as I can see the website does not name the dead.
The linked references may do.

Musk makes outlandish claims about his cars, which need
debunking in order to help prevent more unnecessary
accidents.

From https://catless.ncl.ac.uk/Risks/33/11/#subj3
"Weeks earlier, a Tesla using the company's advanced
driver-assistance system had crashed into a tractor-trailer
at about 70 mph, killing the driver. When National Highway
Traffic Safety Administration officials called Tesla
executives to say they were launching an investigation,
Musk screamed, protested and threatened to sue, said a
former safety official who spoke on the condition of
anonymity to discuss sensitive matters.

"The regulators knew Musk could be impulsive and stubborn;
they would need to show some spine to win his cooperation.
So they waited. And in a subsequent call, 'when tempers were
a little bit cool, Musk agreed to cooperate: He was a
changed person.'"
https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation



There is an attempt at comparisons, as stated in the FAQ.

It is a pretty feeble attempt, hidden away.

Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.

That's too strong, but I agree most ratios (including that one)
aren't that enlightening.


The whole post just struck me as a bit below par for your usual high
standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they
compare to average drivers. But your post was not a great starting
point for that.

Real world experiences aren't a bad /starting/ point, but
they do have limitations. Better starting points are to
be welcomed.

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment.

I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".
 
On 3/29/2022 3:54 PM, Tom Gardner wrote:
Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment.

+1

I'm not sure I'd rely on any of these technologies to do
more than *help* me (definitely not *replace* me!)

E.g., I like the LIDAR warning me that a vehicle is
about to pass behind my parked car as I'm backing out...
because I often can't see "a few seconds" off to each
side, given the presence of other vehicles parked on each
side of mine. But, I still look over my shoulder AND
watch the backup camera as I pull out.

I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".

I'm not sure it can all be distilled to "natural stupidity".

When we last looked for a new vehicle, one of the salespersons
commented on some of this "advisory tech" with such exuberance:
"Oh, yeah! It works GREAT! I don't even bother to *look*,
anymore!"

And, to the average Joe, why should they HAVE to "look" if
the technology was (allegedly) performing that function?
("Oh, do you mean it doesn't really *work*? Then why are
you charging me for it? If I couldn't rely on the engine,
would you tell me to always wear good WALKING SHOES when
I set out on a drive???!")

And, \"laziness\" is often an issue.

I designed a LORAN-C-based autopilot (boat) in the 70's. You
typed in lat-lons of your destinations (a series) and the autopilot
would get you to them, correcting for drift ("cross-track error")
to ensure straight-line travel (a conventional autopilot just
kept the vessel pointed in the desired direction so ocean currents
would steadily push you off your desired course).
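
(For illustration only - a sketch in C with invented names, using a
flat-earth approximation, not the original instrument's code. The
cross-track error for a straight leg is the signed perpendicular
distance of the vessel from the line joining the two waypoints:)

    #include <math.h>

    /* Signed cross-track error, planar approximation.
     * A and B are the previous and next waypoints, P the vessel's
     * position, all in metres in a local tangent plane. Assumes A
     * and B are distinct. Positive result: vessel is left of track;
     * negative: right of track. */
    double cross_track_error(double ax, double ay,
                             double bx, double by,
                             double px, double py)
    {
        double dx = bx - ax, dy = by - ay;   /* leg vector A->B */
        double len = hypot(dx, dy);          /* leg length      */
        /* 2-D cross product (A->B) x (A->P), normalised by |A->B| */
        return (dx * (py - ay) - dy * (px - ax)) / len;
    }

The steering loop then drives the helm to null this error, instead of
merely holding a heading.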

There was considerable debate about how to handle the sequencing
of destinations:
- should you automatically replace the current destination with
the *next* in the series, having reached the current? (and, what
do you use as criteria for reaching that current destination)
- should you require manual intervention to advance to the next
destination, having reached the current? And, if so, how will
the skipper know what the vessel's path will be AFTER overshooting
the destination? The autopilot will keep trying to return the vessel
to that position -- no control over throttle -- but how do you
anticipate the path that it will take in doing so?
- should you be able to alert the skipper to reaching the current
destination (in case he\'s in the stern of the vessel prepping
lobster pots for deployment)?
- should you incorporate throttle controls (what if you cut the
throttle on reaching the destination and the vessel then drifts
away from that spot)?
- should you "tune" the instrument to the vessel's characteristics
(helm control of a speedboat is considerably more responsive than
of a fishing trawler!)

There's no real "right" answer -- short of taking over more control
of the vessel (which then poses different problems).

So, you recognize the fact that skippers will act in whatever way
suits them -- at the moment -- and don't try to be their "nanny"
(cuz anything you do in that regard they will UNdo)
 
On Tuesday, March 29, 2022 at 6:54:41 PM UTC-4, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data. Did you have a point?

I have no point.


Fair enough, I suppose. But was there a reason for the post then?
Primarily to provoke thought and discussion, and
secondarily to point to occurrences that Tesla fanbois
and Musk prefer to sweep under the carpet.

You haven't pointed out anything useful. You posted a link to what is really a pretty crappy web page. Where's the thought, where's the discussion? You said yourself you had nothing to say about it. Ok, thanks for the link. Bye.


<snip>

Musk and his /very/ carefully worded advertising don't help
matters. That should be challenged by evidence.

Ok, where's the evidence?


I haven't seen such evidence collated anywhere else.

I still haven't seen any evidence, although I'm not sure what it is supposed to be evidence of.


<snip>

Musk makes outlandish claims about his cars, which need
debunking in order to help prevent more unnecessary
accidents.

From https://catless.ncl.ac.uk/Risks/33/11/#subj3
"Weeks earlier, a Tesla using the company's advanced
driver-assistance system had crashed into a tractor-trailer
at about 70 mph, killing the driver. When National Highway
Traffic Safety Administration officials called Tesla
executives to say they were launching an investigation,
Musk screamed, protested and threatened to sue, said a
former safety official who spoke on the condition of
anonymity to discuss sensitive matters.

"The regulators knew Musk could be impulsive and stubborn;
they would need to show some spine to win his cooperation.
So they waited. And in a subsequent call, 'when tempers were
a little bit cool, Musk agreed to cooperate: He was a
changed person.'"
https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation

Ok, so???

I think we all know Musk is a jerk. He's a huge P.T. Barnum salesperson too. Who didn't know that? What's your point?


There is an attempt at comparisons, as stated in the FAQ.

It is a pretty feeble attempt, hidden away.

Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.
That's too strong, but I agree most ratios (including that one)
aren't that enlightening.

I'm happy to see any ratios that mean anything, but I didn't see them. I saw a table of incidents which included at least one death. Where are the comparisons?


<snip>

Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment.

You aren't supposed to let a Tesla drive itself in any environment. You are the driver. Autopilot is just a driving assistance tool. You seem to think autopilot is autonomous driving. It's not even remotely close. If that's what you are looking for, you won't find anyone from Tesla claiming autopilot is anything other than an "assist", including Musk.


I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".

I'm not sure what you wish to measure. That's what a comparison does; it measures one thing vs. another in terms of some measurement. What exactly do you want to measure? Or are you just on a fishing trip, looking for something damning about Musk or Tesla?

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209
 
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data.  Did you have a point?

I have no point.


Fair enough, I suppose.  But was there a reason for the post then?

Primarily to provoke thought and discussion, and
secondarily to point to occurrences that Tesla fanbois
and Musk prefer to sweep under the carpet.


<snip>

Musk and his /very/ carefully worded advertising don't help
matters. That should be challenged by evidence.

I haven't seen such evidence collated anywhere else.

But that site does not have evidence of anything relevant. It shows
that people sometimes die on the road, even in Teslas. Nothing more.

If the Tesla people are using false or misleading advertising, or making
safety claims that can't be verified, then I agree they should be held
accountable. Collect evidence to show that - /real/ comparisons and
/real/ statistics.

Progress was not made against tobacco companies by compiling lists of
people who smoked and then died. It was done by comparing the death
rates of people who smoked to those of people who don\'t smoke.

<snip>

As far as I can see the website does not name the dead.
The linked references may do.

From your initial post (you read what you quoted, didn't you?):

\"\"\"
We provide an updated record of Tesla fatalities and Tesla accident deaths
that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.).
\"\"\"

Musk makes outlandish claims about his cars, which need
debunking in order to help prevent more unnecessary
accidents.

From https://catless.ncl.ac.uk/Risks/33/11/#subj3
  \"Weeks earlier, a Tesla using the company\'s advanced
  driver-assistance system had crashed into a tractor-trailer
  at about 70 mph, killing the driver. When National Highway
  Traffic Safety Administration officials called Tesla
  executives to say they were launching an investigation,
  Musk screamed, protested and threatened to sue, said a
  former safety official who spoke on the condition of
  anonymity to discuss sensitive matters.

  \"The regulators knew Musk could be impulsive and stubborn;
  they would need to show some spine to win his cooperation.
  So they waited. And in a subsequent call, “when tempers were
  a little bit cool, Musk agreed to cooperate: He was a
  changed person.\'\' \"
 
https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation

So people who know how to investigate these things are investigating
them. That's great. (It is also - in theory, at least - unbiased. The
autopilot might not have been at fault.) It's a lot better than some
amateur with a grudge, an ignorance of statistics and a Google document
page.

There is an attempt at comparisons, as stated in the FAQ.

It is a pretty feeble attempt, hidden away.

Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.

That's too strong, but I agree most ratios (including that one)
aren't that enlightening.

No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but only a
small fraction of the deaths had the autopilot on, then clearly the
autopilot reduces risks and saves lives (of those that drive Teslas - we
still know nothing of other car drivers).
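
To make that concrete with a deliberately invented number: suppose half
of all Tesla miles (call the total M) were driven with autopilot
engaged. If autopilot were neutral, about half of the 246 recorded
deaths should have had it on; the site counts 12. The per-mile death
rate ratio would then be (12 / 0.5M) / (234 / 0.5M) = 12/234, about
0.05 - autopilot-on driving would look roughly twenty times safer. At
a 5% usage share instead, the same 12 deaths would make autopilot
exactly neutral. The usage proportion is the whole story, and the site
doesn't give it.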

The whole post just struck me as a bit below par for your usual high
standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they
compare to average drivers.  But your post was not a great starting
point for that.

Real world experiences aren't a bad /starting/ point, but
they do have limitations. Better starting points are to
be welcomed.

Real world experiences are enough to say "this might be worth looking
at" - but no more than that.

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment.

I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".

I don't doubt at all that the Tesla autopilot makes mistakes. So do
human drivers. The interesting question is who makes fewer mistakes, or
mistakes with lower consequences - and that is a question for which no
amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
The only evidence you have so far is that people love to show that
something fancy and expensive is not always perfect, and I believe we
knew that already.
 
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
<snip>

No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but only a
small fraction of the deaths had the autopilot on, then clearly the
autopilot reduces risks and saves lives (of those that drive Teslas - we
still know nothing of other car drivers).

A simple comparison of numbers is not sufficient. Most Tesla autopilot usage is on highways which are much safer per mile driven than other roads. That's an inherent bias because while non-autopilot driving must include all situations, autopilot simply doesn't work in most environments.
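
To put invented numbers on that bias: say human drivers average 1 death
per 100M miles on highways and 3 per 100M on other roads. If autopilot
is only ever engaged on highways and manages 1.2 there - worse than
humans like-for-like - the aggregate still shows autopilot at 1.2
against a mixed human average of around 2, so autopilot looks much
safer. The road mix, not the software, does the work; it's Simpson's
paradox in miniature.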


<snip>

I don't doubt at all that the Tesla autopilot makes mistakes.

Which depends on how you define "mistakes". It's a bit like asking if your rear view mirror makes mistakes by not showing cars in the blind spot. The autopilot is not designed to drive the car. It is a tool to assist the driver. The driver is required to be responsible for the safe operation of the car at all times. I can point out to you the many, many times the car acts like a spaz and requires me to manage the situation. Early on, there was a left turn lane on a 50 mph road that the car would want to turn into when I intended to drive straight. Fortunately they have ironed out that level of issue. But it was always my responsibility to prevent it from causing an accident. So how would you say anything was the fault of the autopilot?


So do
human drivers. The interesting question is who makes fewer mistakes, or
mistakes with lower consequences - and that is a question for which no
amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
The only evidence you have so far is that people love to show that
something fancy and expensive is not always perfect, and I believe we
knew that already.

That's where they are headed with full self-driving. But gauging the breadth of issues the car has problems with, I think it will be a long, long time before we can sit back and relax while the car drives us home.

--

Rick C.

-+ Get 1,000 miles of free Supercharging
-+ Tesla referral code - https://ts.la/richard11209
 
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:

<snip>

No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).

A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That's an inherent bias because while
non-autopilot driving must include all situations, autopilot simply
doesn't work in most environments.

Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.

I suspect - without statistical justification - that the accidents
involving autopilot use are precisely cases where you don't have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
"This is not a good road for me - you have to drive yourself" and switch
itself off. (It would be more controversial, but probably statistically
safer, if it also sometimes said "I'm better at driving on this kind of
road than you are" and switched itself on!)

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam "Tesla's making mistakes" videos on yootoob
aren't confidence inspiring. Based on one I saw, I certainly
wouldn't dare let a Tesla drive itself in an urban environment.

I suspect there isn't sufficient experience to assess relative
dangers between "artificial intelligence" and "natural
stupidity".
I don't doubt at all that the Tesla autopilot makes mistakes.

Which depends on how you define "mistakes".

Of course.

It's a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left turn lane on a 50
mph road that the car would want to turn into when I intended to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?

There are a few possibilities here (though I am not trying to claim that
any of them are "right" in some objective sense). You might say they
had believed that the "autopilot" was like a plane autopilot - you can
turn it on and leave it to safely drive itself for most of the journey
except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake
from the salesmen, advertisers, user-interface designers, or just the
driver's mistake.

And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot. Tesla autopilots are
not alone in this, of course. I have heard of several cases where
\"smart\" cruise controls on cars have been confused by things like
changes to road layouts or when driving in tunnels underneath parts of a
city, and suddenly braking hard due to speed limit changes on surface
roads that don\'t apply in the tunnel.

So do human drivers. The interesting question is who makes fewer
mistakes, or mistakes with lower consequences - and that is a
question for which no amount of anecdotal yootoob videos or
Tesla/Musk hate sites will help. The only evidence you have so far
is that people love to show that something fancy and expensive is
not always perfect, and I believe we knew that already.

That's where they are headed with full self-driving. But gauging
the breadth of issues the car has problems with, I think it will be a
long, long time before we can sit back and relax while the car drives
us home.

Yes. Automatic driving is progressing, but it has a long way to go as yet.
 
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
<snip>

I suspect - without statistical justification -

Yes, without justification, at all.

<snip>

There are a few possibilities here (though I am not trying to claim that
any of them are "right" in some objective sense). You might say they
had believed that the "autopilot" was like a plane autopilot -

It is exactly like an airplane autopilot.


you can
turn it on and leave it to safely drive itself for most of the journey
except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake
from the salesmen, advertisers, user-interface designers, or just the
driver's mistake.

Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.


And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.

The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

--

Rick C.

+- Get 1,000 miles of free Supercharging
+- Tesla referral code - https://ts.la/richard11209
 
On 01/04/2022 00:29, Ricky wrote:
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
<snip>

I suspect - without statistical justification -

Yes, without justification, at all.

Which do /you/ think is most likely? Autopilot crashes on the motorway,
or autopilot crashes on smaller roads?

<snip>

Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.

Yes, that's the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/ is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.

And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.

The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

Well, \"does something daft\" is no worse than \"acts like a spaz\", and
it\'s a good deal more politically correct!
 
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that's not how an autopilot works. It doesn't fly the plane. It
simply maintains a heading and altitude.

They have been doing more than that for more than 50 years.
Cat 3b landings were in operation when I was a kid.


Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that's the original idea of a plane autopilot. But modern ones are more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically. And more important than what plane
autopilots actually /do/ is what people /think/ they do - and remember we
are talking about drivers that think their Tesla "autopilot" will drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.

Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?
 
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane.  In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that's the original idea of a plane autopilot.  But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically.  And more important than what
plane autopilots actually /do/ is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.

And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.

Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?

You don't. Twats will always be twats. You fix the cars.

You start by changing the name. "Driver assistance" rather than
"autopilot".

You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)
 
On 01/04/22 08:08, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane.  In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that's the original idea of a plane autopilot.  But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically.  And more important than what
plane autopilots actually /do/ is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.

And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.

Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?

You don't. Twats will always be twats. You fix the cars.

You start by changing the name. "Driver assistance" rather than
"autopilot".

That's one of the things I was thinking of.


You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention.

I've wondered why they don't implement that, then realised
it would directly contradict their advertising.


(Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)

Modes are a pain[1]. Too often plane crash investigators hear
"what's it doing /now/?" on the CVR.

There was also the case that wheel brakes should not be applied
until after landing, and that was defined by "wheels are rotating".
Then an aquaplaning aircraft skidded off the end of the runway!

[1] Remember the early Smalltalk T-shirt drawing attention
to the novel concept of the WIMP interface using the motto
"don't mode me in"?
 
On 3/31/2022 5:19 PM, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Sorry, that's not how an autopilot works. It doesn't fly the plane. It
simply maintains a heading and altitude.

They have been doing more than that for more than 50 years.
Cat 3b landings were in operation when I was a kid.

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the autopilot.

Yes, that's the original idea of a plane autopilot. But modern ones are more
sophisticated and handle course changes along the planned route, as well as
being able to land automatically. And more important than what plane
autopilots actually /do/ is what people /think/ they do - and remember we
are talking about drivers that think their Tesla "autopilot" will drive their
car while they watch a movie or nap in the back seat.

And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.

Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?

\"Pilots\" and \"drivers\" approach their efforts entirely differently
and with different mindsets.

ANYONE can drive a car. By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.

I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) "autopilot".
 
On 2022-04-01 09:08, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:

Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the
autopilot.

Yes, that's the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/ is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.

And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.

Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?

You don't. Twats will always be twats. You fix the cars.

You start by changing the name. "Driver assistance" rather than
"autopilot".

You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)

All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.

I recently got a car that came standard with 'lane assist'. I
hate it. It's like having a passenger tugging on the steering wheel,
absolutely intolerable. It also can't be switched off permanently.
For the first week or two, I just blindfolded the camera it uses to
watch the road, until I found out how to switch it off with a single
button press. (There are far too many buttons, for that matter, and
all with multiple functions, too. Bad!)

That said, some automatic functions /are/ good. Climate control with
a real thermostat, auto-darkening rear view mirrors, mostly functions
that have nothing to do with driving per se. The only /good/ automatic
functions are those you don't notice until they /stop/ working. I also
like the GPS with head-up display.

Jeroen Belleman
 
