nightmare

On Thursday, September 5, 2019 at 12:36:46 PM UTC-4, Lasse Langwadt Christensen wrote:
On Thursday, September 5, 2019 at 4:00:34 PM UTC+2, Lasse Langwadt Christensen wrote:
On Thursday, September 5, 2019 at 9:41:06 AM UTC-4, DecadentLinux...@decadence.org wrote:
tabbypurr@gmail.com wrote in
news:6d6259b2-1624-4eb3-a906-fd84f85a77e9@googlegroups.com:


depends how you define wrong. Plenty of people go to McDs when
they could have a better burger a few doors away


I find it lame that those going to McD's or the others failed to
notice them creeping the burger price up to $3.99 and even $4.99!

It is so lame, and their app 'offers' turned lame too.

Right now... the best fast food burger in this region of the
country anyway... is the Checker's (Rally's) Texas Garlic Bread
Toast Double Burger at $2.99.

A $3 burger is closer to the right price point, far closer than
the others'. If I want to pay that much for a burger, I'll go down to
the popular sports bar and get their $7 burger, and actually get a
real burger, cooked on a grill by a cook when you order it, and dressed
amazingly. That usually comes with a side salad too... for that
$7. And some of them have POOL TABLES! And jukeboxes and sports
playing on the TVs all over the place. Far more worth the extra two
bucks when the alternative is standing in line on a sticky, slop-mopped
floor to buy economized, max-profit process fast 'food', or the
"drive-through" method. The roller-skate servers and stalls were
a far better way. Frisch's Big Boy should be very successful, but
jackasses want food fast so they can keep driving down the road
with their faces planted in a cell phone, except now they can fuss
with a bag of food at the same time. How quaint... NOT!

The sourdough garlic bread toast is an excellent change of pace for
fast food. The Philly Cheese Steak Burgers they put out for a
short time were pretty darn good too.

I find it amusing that people are talking about dollar-store burgers as if there were any way they could be "good" food. Virtually all the meat comes from other-than-contented cows, processed at the same smelly abattoirs and shipped frozen through the same channels, only to be thawed out and cooked by someone barely old enough to drive.


the cheap meat might be from old dairy cows, which is fine for a burger,
but I don't know what third-world country you'd have to be in for abattoirs
and such not to be strictly controlled by vets and food safety authorities

I'm not familiar with regulations of any country that require quality food rather than simply food that is safe enough that it doesn't make you sick. Back when I ate meat, there were many times I had crappy meals: meat with no real taste, or in some cases not even a good texture. I also am not aware of any country with laws requiring anything more than a minimal level of not being cruel to animals. Have you ever even been near a chicken house? They are literally some of the most disgusting places I've ever seen.

Yeah, they are "controlled" but that has very little to do with the quality of the meat or the life it had before it became "meat".

The cheapest meat meals are the worst in terms of what they've done to produce that meal.

--

Rick C.

--+ Get 1,000 miles of free Supercharging
--+ Tesla referral code - https://ts.la/richard11209
 
On 9/5/19 3:18 PM, bitrex wrote:
On 9/5/19 2:53 PM, Michael Terrell wrote:
On Thursday, September 5, 2019 at 2:04:32 AM UTC-4, bitrex wrote:

For the tasks the kiddos were attempting to use the Mac Quadra-class
machines for at the time (mid-1990s) at ART COLLEGE, e.g. standard-
definition video editing, they were sort of useless when they were
new, too!

You set up the edits and effects you wanted to apply, let the pizza
box crank overnight, and hope to God it hadn't crashed or locked up in
the morning.

A mid-'80s Vital Industries 'Squeeze Zoom' video effects system did
real-time video effects with a Z80B and 768KB of RAM in broadcast-
quality System M color. The edit controllers for the Sony U-matic (3/4")
and the 2" reel-to-reel VTRs used 6502 processors. Those Mac computers
were always crap for real work. The Commodore Amiga with the external
Video Toaster hardware was much better, and cheaper.


Budget, budget. Small colleges weren't and still aren't drowning in
Federal funds. I think there might have been one Real Professional video
editing machine in the lab, I don't recall. It might have cost 10 or 15
grand, new.

It's a tough swing to even have one because if you blow all the dough on
the film department's gear the comp sci and student library dept/e-mail
dept start bitchin' that they're still using 486DXes from '91 and what
the hell is going on?

Mac Color Classics and 486DXes were still being used as workhorse
machines for email and word-processing at my school well into the late
1990s, early 2000s probably.
 
On 9/5/19 2:53 PM, Michael Terrell wrote:
On Thursday, September 5, 2019 at 2:04:32 AM UTC-4, bitrex wrote:

For the tasks the kiddos were attempting to use the Mac Quadra-class
machines for at the time (mid-1990s) at ART COLLEGE, e.g. standard-
definition video editing, they were sort of useless when they were new, too!

[...]

The early Quadras particularly had some problems:

<https://en.wikipedia.org/wiki/Macintosh_Quadra#Processor>
 
On Thu, 5 Sep 2019 11:42:54 -0700 (PDT), whit3rd <whit3rd@gmail.com>
wrote:

On Thursday, September 5, 2019 at 7:32:42 AM UTC-7, jla...@highlandsniptechnology.com wrote:

Dorian is now headed about 140 degrees away from the path that most
models predicted a week ago.

Why is that meaningful? The storm is over a hundred miles wide, takes over
a day to move its own diameter, and the 'heading' is just the moment-by-moment
movement of an imaginary center point.

If you wish to imagine that center doing a 100 yard circular orbit inside
the storm, you can easily contrive the 'heading' to change 180 degrees in
a few hours, but that doesn't say which folk are getting wet. It's an irrelevance.

If you are faced with life or death, it's relevant.

The storm's position today is not notably different from the estimate two days ago.

The time lapse for weather predictions to degrade to zero accuracy
seems to be 5 to maybe 7 days. Here on the West coast, it's usually
less.
 
On Thursday, September 5, 2019 at 3:02:58 PM UTC-4, John Larkin wrote:
[...]

The time lapse for weather predictions to degrade to zero accuracy
seems to be 5 to maybe 7 days. Here on the West coast, it's usually
less.

Some people have quite a remarkable grasp of the obvious, although his data is not actually correct. More often than not, a week-out forecast is reasonably accurate, although you need error bars not only on the amount of rain, sun, and temperature, but also on the timing. If the week-out forecast is for the hurricane to make landfall in your backyard on Tuesday and it doesn't come until Wednesday, that's not a failure of prediction in my book, although obviously the pedantic might consider it such.

--

Rick C.

-+- Get 1,000 miles of free Supercharging
-+- Tesla referral code - https://ts.la/richard11209
 
bitrex wrote:
On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual
hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused.  You often describe your technique of
designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly), there is far too much of a
ship-it-and-be-damned macho business culture in shrink-wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea,
swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?



How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits.  Which one do you
want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.
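
To make the fence-post point concrete, here is a minimal illustration in C++ (hypothetical code, not from any program under discussion):

#include <cstddef>

// Sum the first n elements of buf. The classic fence-post mistake is
// writing <= where < was meant: the loop then visits n+1 elements and
// reads one element past the end of the buffer.
long sum_first_n(const int* buf, std::size_t n) {
    long sum = 0;
    // for (std::size_t i = 0; i <= n; ++i) sum += buf[i];  // off by one
    for (std::size_t i = 0; i < n; ++i) sum += buf[i];      // what was meant
    return sum;
}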

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.



It's been proved that there's no such thing as an "ultimate secure
computer"; there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...
And yet... there is such a thing as a totally bug-free program.
Tom Pittman wrote a BASIC interpreter for the RCA COSMAC and it was
bug-free.
Then RCA bought it and compromised it (Tom's version was still OK).

Moral? Your customer may become your worst enemy.
 
John Larkin wrote:
On Wed, 4 Sep 2019 14:37:24 -0400, bitrex <user@example.net> wrote:

[...]

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of C sure doesn't help.

Address space layout randomization is hilarious, the moral equivalent
of hiding under the bed.
Tell me. Ages ago, I wrote a goodly set of multi-million-digit arithmetic
routines (you know, My Dear Aunt Sally) that required multiple re-use of
memory address space.
In the process, I discovered that "buffer overflow" is impossible
with easy programming.
Ditto concerning "memory leaks".
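
By way of illustration, schoolbook addition over digit vectors shows the discipline: size the result to the worst case before writing a single digit, and no store can land outside the buffer. This is only a sketch assuming base-10, little-endian digit storage, not Baer's actual routines:

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Add two little-endian base-10 digit vectors. The result buffer is
// allocated at its worst-case size (max length + 1 carry digit) up
// front, so every index written is provably in range.
std::vector<std::uint8_t> add_digits(const std::vector<std::uint8_t>& a,
                                     const std::vector<std::uint8_t>& b) {
    const std::size_t n = std::max(a.size(), b.size());
    std::vector<std::uint8_t> sum(n + 1, 0);
    unsigned carry = 0;
    for (std::size_t i = 0; i < n; ++i) {
        unsigned d = carry;
        if (i < a.size()) d += a[i];
        if (i < b.size()) d += b[i];
        sum[i] = static_cast<std::uint8_t>(d % 10);
        carry = d / 10;
    }
    sum[n] = static_cast<std::uint8_t>(carry);
    if (sum.size() > 1 && sum.back() == 0) sum.pop_back();  // trim leading zero
    return sum;
}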
 
On 9/5/19 6:49 PM, Robert Baer wrote:
bitrex wrote:
On 9/4/19 2:53 PM, John Larkin wrote:

[...]

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of C sure doesn't help.

No, that's why C++ is a great idea even for small embedded
applications. You can write insecure code as easily as you can in C,
but it's easier to write secure code with it, since it makes it easy
to enforce a set of common-sense, generally-agreed-upon restrictions.

e.g. you may never directly write to a raw storage array of fixed size
without bounds-checking, or allow some quantity measured in positive
integers to ever have something that's not a positive integer assigned
to it. And it can enforce stuff like that with no runtime resource
overhead.

There are other languages that do the same out of the box, but the
runtime overhead tends to be higher making them inappropriate for
embedded work.
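
For instance, here is a minimal sketch of that kind of zero-overhead enforcement; the class names and checks are illustrative assumptions, not any particular library's API:

#include <array>
#include <cassert>
#include <cstddef>

// Fixed-size storage whose compile-time-indexed access cannot go out
// of bounds: a bad index is a compile error, not a memory corruption.
template <typename T, std::size_t N>
struct Buffer {
    std::array<T, N> data{};
    template <std::size_t I>
    T& at() {
        static_assert(I < N, "index out of bounds");
        return data[I];
    }
};

// A count that is positive by construction: the invariant is checked
// once at the boundary, and the assert compiles out of release builds.
class PositiveCount {
    unsigned value_;
public:
    explicit PositiveCount(long v) : value_(static_cast<unsigned>(v)) {
        assert(v > 0 && "PositiveCount requires a positive value");
    }
    unsigned get() const { return value_; }
};

int main() {
    Buffer<int, 4> buf;
    buf.at<3>() = 42;    // fine
    // buf.at<4>() = 0;  // refuses to compile
    PositiveCount widgets(3);
    return buf.at<3>() - 42 + static_cast<int>(widgets.get()) - 3;  // 0
}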
Do not forget: with some languages and some machines, one can have
PLUS zero, MINUS zero, and an UNSIGNED zero, and still conform to
IEEE technical machine representation standards.

Does anyone still use those machines for anything?
 
bitrex wrote:
[...]



There's a little MINIX machine running inside every Intel processor sold
- who knows exactly what it does. It's known to have the capability to
flash the BIOS even when the machine is powered down (but still plugged
in/available battery) and can run self-modifying code.

...and in the past, FPU multiply and/or divide errors.
 
On 9/5/19 6:34 PM, Robert Baer wrote:
bitrex wrote:
[...]

And yet... there is such a thing as a totally bug-free program.
Tom Pittman wrote a BASIC interpreter for the RCA COSMAC and it was
bug-free.
Then RCA bought it and compromised it (Tom's version was still OK).

Moral? Your customer may become your worst enemy.

Even the Space Shuttle software probably (almost surely) had bugs;
it just never had enough uptime to encounter them. They didn't think
it would be economical to try to make it totally bug-free, which is
why they went with a voting system of four (five, actually) GPCs instead.
 
On Thu, 5 Sep 2019 14:34:46 -0800, Robert Baer
<robertbaer@localnet.com> wrote:

bitrex wrote:

[...]

The original version of NT had a small, clean, pretty secure kernel.

https://www.amazon.com/Showstopper-Breakneck-Windows-Generation-Microsoft/dp/1497638836/ref=sr_1_1?keywords=showstopper&qid=1567726131&s=books&sr=1-1

Then some Micro-idiots took over and broke it.
 
On Friday, September 6, 2019 at 12:36:57 AM UTC+2, bitrex wrote:
On 9/5/19 6:49 PM, Robert Baer wrote:
[...]

Do not forget: with some languages and some machines, one can have
PLUS zero, MINUS zero, and an UNSIGNED zero, and still conform to
IEEE technical machine representation standards.



Does anyone still use those machines for anything?

https://en.wikipedia.org/wiki/Signed_zero
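
A quick demonstration of the two zeros in C++, assuming IEEE 754 doubles (which the code checks for):

#include <cmath>
#include <cstdio>
#include <limits>

int main() {
    static_assert(std::numeric_limits<double>::is_iec559,
                  "demo assumes IEEE 754 doubles");
    double pz = 0.0, nz = -0.0;
    std::printf("pz == nz    : %d\n", pz == nz);          // 1: the zeros compare equal
    std::printf("signbit(nz) : %d\n", std::signbit(nz));  // 1: but the sign survives
    std::printf("1/pz = %g, 1/nz = %g\n", 1.0 / pz, 1.0 / nz);  // inf, -inf
}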
 
On Thursday, September 5, 2019 at 3:40:47 PM UTC-4, Rick C wrote:
[...] Have you ever even been near a chicken house? They are literally some of the most disgusting places I've ever seen.

Yes, I have been to a chicken farm. Some of my relatives raised 10,000 at a time. The buildings were clean, the chickens weren't crammed into tiny cages, and they hosed out the buildings daily to wash away their crap. The buildings were temperature controlled, and they were rated as the best-run operation in Kentucky at the time. They had an outbreak of some disease, once. All of the chicks were destroyed. The buildings were sterilized and every piece of equipment was inspected before that building was used again. This was over 50 years ago, when I was a teenager.
 
On Thursday, September 5, 2019 at 3:18:53 PM UTC-4, bitrex wrote:
On 9/5/19 2:53 PM, Michael Terrell wrote:
On Thursday, September 5, 2019 at 2:04:32 AM UTC-4, bitrex wrote:

[...]


Budget, budget. Small colleges weren't and still aren't drowning in
Federal funds. I think there might have been one Real Professional video
editing machine in the lab, I don't recall. It might have cost 10 or 15
grand, new.

It's a tough swing to even have one because if you blow all the dough on
the film department's gear the comp sci and student library dept/e-mail
dept start bitchin' that they're still using 486DXes from '91 and what
the hell is going on?

So, the film department got all new equipment, every term? Or were they lucky to get some repairs on 40-year-old cameras and film processing equipment?

An Amiga with the Video Toaster was around $3,000, and it didn't need all night to process video. A pair of U-matics and an edit controller were about $5,000, including two 19-inch monitors. The CATV headend I maintained in the early '80s had a system from Panasonic. They pinched pennies tighter than any college.

So what if old computers were used for email servers? The secretaries had nice new computers at Microdyne, but Production was still using machines going back to the XT. Some because they were the only ones available, and others to support programming of obsolete PLAs and very early EPROMs.

I had to laugh at the Y2K 'expert' hired to make sure that every system was compatible. He was complaining that he couldn't get into the BIOS of a Zenith computer. I told him that it was a dedicated XT, with no real-time clock. I had to laugh even harder when he said that we would have all new computers within a month. He was the same fool who 'upgraded' a Windows 2.0 system to Win 95 without asking. He wiped out all of the test software that would only run under 2.0, without making a backup. It took me almost a full week to rebuild the files from multiple, partial backups. The test software was written by Scientific-Atlanta, but they had disposed of all traces of the automated test system after they lost a patent infringement lawsuit, and contracted with us to build the rest of the equipment they had contracts for. The settlement included them giving us all files, and the test system.
 
On Thursday, September 5, 2019 at 12:02:58 PM UTC-7, John Larkin wrote:
On Thu, 5 Sep 2019 11:42:54 -0700 (PDT), whit3rd <whit3rd@gmail.com> wrote:

If you wish to imagine that center doing a 100 yard circular orbit inside
the storm, you can easily contrive the 'heading' to change 180 degrees in
a few hours, but that doesn't say which folk are getting wet. It's an irrelevance.

If you are faced with life or death, it's relevant.

Something is relevant, but not the silly 'heading' information. You want
a few days' notice for heavy weather, and you got it. Modeling works; you
just want to quibble about error budgets, and you aren't good at it.


The time lapse for weather predictions to degrade to zero accuracy
seems to be 5 to maybe 7 days. Here on the West coast, it's usually
less.


Huh? Hurricane paths are tricky, but Seattle is 45 degrees and rainy,
late September through May. That's MORE than seven days from now...

"Zero accuracy"... is not a sensible measure.
 
bitrex wrote:
On 9/5/19 6:49 PM, Robert Baer wrote:
bitrex wrote:
[...]

Do not forget: with some languages and some machines, one can have
PLUS zero, MINUS zero, and an UNSIGNED zero, and still conform to
IEEE technical machine representation standards.



Does anyone still use those machines for anything?
Why yes; the Pentium X86..
 
On Thursday, September 5, 2019 at 9:55:41 PM UTC-4, Michael Terrell wrote:
On Thursday, September 5, 2019 at 3:40:47 PM UTC-4, Rick C wrote:

[...]

Yes, I have been to a chicken farm. Some of my relatives raised 10,000 at a time. The buildings were clean, the chickens weren't crammed into tiny cages, and they hosed out the buildings daily to wash away their crap. The buildings were temperature controlled, and they were rated as the best-run operation in Kentucky at the time. They had an outbreak of some disease, once. All of the chicks were destroyed. The buildings were sterilized and every piece of equipment was inspected before that building was used again. This was over 50 years ago, when I was a teenager.

You appear to be bragging, but you are helping to make my point, especially the part about the conditions leading to an infectious outbreak that resulted in all the animals being "destroyed". Farms brag that the chickens are "free range" because they can go outside the building... into a pen that is a tiny fraction of the size of the building. The chickens don't seem to have any preference for that tiny screened-in porch, because it's all crowded, inside and out.

Yep, no small part of why I don't eat chicken anymore. I used to support Perdue with my chicken consumption, but I'm actually a lot happier and healthier eating little meat, only seafood really. If there were more options that didn't require cooking every meal, and especially in restaurants, I think I wouldn't even eat seafood. Yeah, you can eat vegetarian in a restaurant, but you end up having the same meal everywhere you go, other than in places that are a bit more aware. One of the best places was an Afghan restaurant which had a number of veggie dishes which weren't some contrived recipe. They were authentic dishes. One was pumpkin, and it was excellent.

--

Rick C.

-++ Get 1,000 miles of free Supercharging
-++ Tesla referral code - https://ts.la/richard11209
 
