nightmare

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

> running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.
How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!
A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.
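
[For illustration, a minimal C++ sketch of the fence-post (off-by-one) error
described above; zero_buffer is a made-up example, not anything from the
thread. The <= bound runs the loop one iteration too far and writes past the
end of the array, exactly the kind of out-of-bounds write static analysis
can flag before it becomes a privilege-escalation hole.]

    void zero_buffer()
    {
        char buf[64];
        // BUG: <= lets i reach 64, so the last pass writes buf[64],
        // one element past the end (valid indices are 0..63).
        for (int i = 0; i <= 64; ++i)
            buf[i] = 0;
    }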

--
Regards,
Martin Brown
 
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?


How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.
 
On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?



How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.

It's been proved that there's no such thing as an "ultimate secure
computer": there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...
 
On 9/4/19 12:33 PM, jlarkin@highlandsniptechnology.com wrote:

Sometimes older guys wax nostalgic about the big iron they used in their
teens and 20s in the fashion that you know they thought it was really
something else at the time.

It was the best thing around, when slide rules were what most everyone
used. Why not always work with the best available technology? A PDP-8
allowed me to simulate steamship propulsion systems and get business
and go out on sea trials with a private cabin, at the age of 20. Why
not?



You won't find me waxing nostalgic much about any "In my day" computers
because in the same time period for me what the kids had available was
early Pentiums and some Mac Quadras and stuff. These machines were
unpleasant to use, the operating systems stunk, they were under-powered,
overpriced,

I can't help having a lot of experience. I've been fascinated with
electronics and girls all my life.

and generally sucked balls.

Of course, some people have different interests.

Oh. Oh I see. You thought I was talking about you, specifically.

Well I can't really help that if you see it that way.
 
On 9/4/19 12:33 PM, jlarkin@highlandsniptechnology.com wrote:

Sometimes older guys wax nostalgic about the big iron they used in their
teens and 20s in the fashion that you know they thought it was really
something else at the time.

It was the best thing around, when slide rules were what most everyone
used. Why not always work with the best available technology? A PDP-8
allowed me to simulate steamship propulsion systems and get business
and go out on sea trials with a private cabin, at the age of 20. Why
not?



You won't find me waxing nostalgic much about any "In my day" computers
because in the same time period for me what the kids had available was
early Pentiums and some Mac Quadras and stuff. These machines were
unpleasant to use, the operating systems stunk, they were under-powered,
overpriced,

I can't help having a lot of experience. I've been fascinated with
electronics and girls all my life.

and generally sucked balls.

Of course, some people have different interests.

I don't quite follow are you insinuating that by use of that colloquial
expression that I'm gay and love gay sex all the time? That I like BIG
GAY DICKS and looking at big sexy men with big dicks routinely?

C'mon, I think you're better than that passive-aggressive stuff; it's not
becoming of you.
 
On Wed, 4 Sep 2019 14:37:24 -0400, bitrex <user@example.net> wrote:

On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?



How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.



It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of c sure doesn't help.

Address space layout randomization is hilarious, the moral equivalent
of hiding under the bed.
 
On 04/09/2019 17:51, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:


Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Apart from the initial bias, most of the interesting behaviour in Spice
comes from its solution of non-linear component models.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?

No, but it is unreasonable to decry one sort of simulation because what
it predicts is inconvenient when you also rely on another simulation.
How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.

Even with an entirely separate code and data address space (a strict
Harvard architecture, so that data can never get executed) there are still
ways to subvert an operating system. It is just a bit harder to do.

That isn't to say that stuff could not be done better. OS/2 was very
much technically superior to Windows when it was launched but IBM made
such an awful hash of marketing it with PS/2 MCA hardware lock-in that
apart from in a handful of niche applications it sank without trace.

Segmented modified Harvard architectures can go a long way to defending
against most of the common software failings. Unfortunately flat memory
models of interspersed code and data have become the norm, and some nasty
go-faster compromises have been made to aid gaming speeds on PC video drivers.

I am inclined to the view that hardware safety critical interlocks
should never depend on software working and I much prefer it if there is
a very visible physical interlock that prevents someone from carelessly
firing a big laser or lighting a plasma when I am inside the beam path.

I once returned to a piece of kit to find that the only process still
running was the small one that pumped the dead man's handle.

Life gets very difficult for a machine like the 9900 series CPU that
suddenly finds its registers (including the program counter) are in ROM!
8kV flashovers do very strange things to control electronics.

--
Regards,
Martin Brown
 
On Wednesday, September 4, 2019 at 2:53:39 PM UTC-4, John Larkin wrote:
Address space layout randomization is hilarious, the moral equivalent
of hiding under the bed.

It worked for Corazon Amurao.

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209
 
On Wednesday, September 4, 2019 at 12:34:09 PM UTC-4, jla...@highlandsniptechnology.com wrote:
I can't help having a lot of experience. I've been fascinated with
electronics and girls all my life.

LOL The lady doth protest too much, methinks

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209
 
On 9/4/19 3:56 PM, bitrex wrote:
On 9/4/19 2:53 PM, John Larkin wrote:

It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of c sure doesn't help.

No, that's why C++ is a great idea even for small embedded applications.
You can write insecure code as easily as you can in C, but it's easier to
write secure code with it, since it lets you enforce a set of common-sense,
generally agreed-upon restrictions.

e.g. you may never directly write to a raw storage array of fixed size
without bounds-checking, or allow some quantity measured in positive
integers to ever have something that's not a positive integer assigned
to it. And it can enforce stuff like that with no runtime resource
overhead.

Or rather, no more overhead than building in the same assurances
manually in C.
 
On 9/4/19 2:53 PM, John Larkin wrote:

It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of c sure doesn't help.

No, that's why C++ is a great idea even for small embedded applications.
You can write insecure code as easily as you can in C, but it's easier to
write secure code with it, since it lets you enforce a set of common-sense,
generally agreed-upon restrictions.

e.g. you may never directly write to a raw storage array of fixed size
without bounds-checking, or allow some quantity measured in positive
integers to ever have something that's not a positive integer assigned
to it. And it can enforce stuff like that with no runtime resource overhead.

There are other languages that do the same out of the box, but the
runtime overhead tends to be higher, making them inappropriate for
embedded work.
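
[For illustration, a minimal C++17 sketch of the kind of zero-overhead
enforcement meant above; PositiveCount, regs and read_status are made-up
names, not anything from the thread. The invariant on the count is checked
once at construction, and the compile-time index check in std::get<> costs
nothing at runtime.]

    #include <array>
    #include <cstdint>
    #include <stdexcept>

    // A quantity that can never hold a non-positive value: the check runs
    // once, when the object is built, instead of at every use site.
    class PositiveCount {
    public:
        explicit PositiveCount(std::int32_t v)
            : value_{v > 0 ? v : throw std::invalid_argument{"must be > 0"}} {}
        std::int32_t get() const { return value_; }
    private:
        std::int32_t value_;
    };

    std::array<std::uint8_t, 8> regs{};

    std::uint8_t read_status()
    {
        return std::get<7>(regs);    // compiles: index 7 is in range
        // return std::get<8>(regs); // would be rejected at compile time
    }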
 
On 9/4/19 2:53 PM, John Larkin wrote:
On Wed, 4 Sep 2019 14:37:24 -0400, bitrex <user@example.net> wrote:

On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?



How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.



It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of c sure doesn't help.

Address space layout randomization is hilarious, the moral equivalent
of hiding under the bed.

There's a little MINIX machine running inside every Intel processor sold
- who knows exactly what it does. It's known to have the capability to
flash the BIOS even when the machine is powered down (but still plugged
in/available battery) and can run self-modifying code.
 
On 04/09/2019 19:53, John Larkin wrote:
On Wed, 4 Sep 2019 14:37:24 -0400, bitrex <user@example.net> wrote:

It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

I believe that absolute hardware protection is possible. But people

It is if you don't mind having computers that can only do one job.

Your problem is always that at some point you have to load the program
code in from external storage as data and then flip a bit to allow it to
execute. Controlling that executable transition is the key.
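
[For illustration, a minimal sketch of that executable transition on a POSIX
system, assuming mmap/mprotect; load_code is a made-up name and error
handling is omitted. The loaded image starts life as plain writable data,
and a deliberate permission flip is what makes it executable - never
writable and executable at the same time.]

    #include <cstring>
    #include <sys/mman.h>

    void* load_code(const void* image, std::size_t len)
    {
        // Stage 1: the code arrives as ordinary data - readable and
        // writable, but not executable.
        void* buf = mmap(nullptr, len, PROT_READ | PROT_WRITE,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        std::memcpy(buf, image, len);

        // Stage 2: flip the bit - executable now, writable no longer.
        mprotect(buf, len, PROT_READ | PROT_EXEC);
        return buf;
    }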

Sadly the likes of Windows have far too much code executing with the
highest level of privileges, for very minor speed gain and an insanely high
risk of buffer overrun attacks.

don't even use the protection mechanisms that are provided. The
structure of c sure doesn't help.

C is unfortunate, but we are kind of stuck with it now. Attempts to
improve it have made the syntax more complicated but left in the
tendency for cryptic obfuscated code that can easily go wrong.

The ability to coerce an integer into a pointer to anything you like is
an intrinsic weakness (avid practitioners consider it a strength).
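
[For illustration, a short C++ sketch of the coercion being described; poke
is a made-up name. Nothing in the language stops an arbitrary integer from
being turned into a pointer and dereferenced - which is exactly what
memory-mapped I/O wants, and exactly what an exploit wants too.]

    #include <cstdint>

    void poke(std::uintptr_t raw, std::uint8_t value)
    {
        // The cast compiles without complaint; whatever lives at that
        // address gets overwritten.
        auto* p = reinterpret_cast<volatile std::uint8_t*>(raw);
        *p = value;
    }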

Address space layout randomization is hilarious, the moral equivalent
of hiding under the bed.

It makes life harder for the aggressors, but they have come up with
clever ways to allocate multiple do nowt threads with synchronised
release to defeat even the rather more well crafted Apple OS.

Handing programs memory pointers in such a way that the process will be
killed if it ever attempts to write outside array bounds is one
effective defence but until recently there were overheads on it.
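
[For illustration, the software version of that policy in C++, carrying the
overhead the post mentions; table and store are made-up names. Checked
access throws on an out-of-bounds index, and an uncaught exception
terminates the process instead of letting it corrupt memory. Hardware
bounds checking aims at the same behaviour without the per-access cost.]

    #include <array>
    #include <cstddef>

    std::array<int, 16> table{};

    void store(std::size_t i, int v)
    {
        table.at(i) = v;  // i >= 16 throws std::out_of_range; uncaught,
                          // that ends in std::terminate() - process killed.
    }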

Security has been so often sacrificed for go faster stripes and the
cache line attacks demonstrate just how tricky these things can be.

--
Regards,
Martin Brown
 
On Wed, 4 Sep 2019 15:50:02 -0400, bitrex <user@example.net> wrote:

On 9/4/19 2:53 PM, John Larkin wrote:
On Wed, 4 Sep 2019 14:37:24 -0400, bitrex <user@example.net> wrote:

On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 03/09/2019 21:03, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?

The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling...

I hate to say it but on this particular point I tend to agree with JL.

Good software can be defined as something which remains useful,
relatively bug free and in service five years after it was launched.
That was about the timescale where overly enthusiastic large medieval
buildings tended to first show signs of subsidence and failure too.

Although there is some excellent software about and best practice is
improving gradually (though IMHO too slowly) there is far too much of a
ship it and be damned macho business culture in shrink wrap software.

Win10 updates that bricked certain brands of portable for example.

running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

I find it very odd that he trusts Spice simulation predictions when at
the same time he rails incessantly against climate change simulations.

I mostly - certainly not always - am guided by simulations of stable
linear systems where I can trust the component models and the sim
software. I don't trust future-state simulations of unstable or
chaotic systems, especially if I don't understand the component
behavior, the forcings, or the initial states.

Simulation mostly helps me to think.

And of course, being in business to sell stuff, my simulations are
rapidly, often concurrently, verified by experiment, which also guides
future expectations of simulations.

Is that unreasonable?



How is this different?

The individual components in electronics hardware are generally much
better characterised and do more or less what they say on the tin.
Software developers have a bad habit of re-inventing the wheel and not
always putting the axle at the centre or making the damn thing round!

A PC operating system has some 100 billion bits. Which one do you want changed?

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.



It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

I believe that absolute hardware protection is possible. But people
don't even use the protection mechanisms that are provided. The
structure of c sure doesn't help.

Address space layout randomization is hilarious, the moral equivalent
of hiding under the bed.



There's a little MINIX machine running inside every Intel processor sold
- who knows exactly what it does. It's known to have the capability to
flash the BIOS even when the machine is powered down (but still plugged
in/available battery) and can run self-modifying code.

I believe it has been hacked already.
 
On Wednesday, 4 September 2019 19:37:29 UTC+1, bitrex wrote:
On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.



It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

But you can still do a huge lot better than MS Windows


NT
 
On Wednesday, 4 September 2019 16:11:26 UTC+1, bitrex wrote:
On 9/3/19 4:03 PM, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?



The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling... running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

How is this different?

A PC operating system has some 100 billion bits. Which one do you want changed?


Sometimes older guys wax nostalgic about the big iron they used in their
teens and 20s in the fashion that you know they thought it was really
something else at the time.

You won't find me waxing nostalgic much about any "In my day" computers
because in the same time period for me what the kids had available was
early Pentiums and some Mac Quadras and stuff. These machines were
unpleasant to use, the operating systems stunk, they were under-powered,
overpriced, and generally sucked balls.

Yes - computers are different to other stuff. A computer that old is pretty much useless, a hifi amp from 1959 can (occasionally) be excellent.


NT
 
On Wednesday, September 4, 2019 at 8:24:25 PM UTC-4, tabb...@gmail.com wrote:
On Wednesday, 4 September 2019 16:11:26 UTC+1, bitrex wrote:
On 9/3/19 4:03 PM, Rick C wrote:
On Monday, September 2, 2019 at 4:22:22 PM UTC-4, John Larkin wrote:
On Mon, 2 Sep 2019 15:16:23 -0400, bitrex <user@example.net> wrote:

On 9/2/19 11:53 AM, John Larkin wrote:

https://www.theregister.co.uk/2019/09/02/microsoft_roundup/

How could something this crazy happen?



The Feb update broke a previously working audio/visual hardware/software
suite of mine, the mfgr says "talk to Microsoft" and Microsoft says
"talk to the mfgr"

¯\_(ツ)_/¯

A recent Firefox update got tangled with Windows access permissions.
That cost me a few hours of IT consultant time to repair. He basically
fiddled until it got fixed. Better him than me.

We are in the dark ages of computing.

I am a little confused. You often describe your technique of designing as what amounts to fiddling... running some spice sims, breadboarding something to try an idea, swapping parts to see what happens.

How is this different?

A PC operating system has some 100 billion bits. Which one do you want changed?


Sometimes older guys wax nostalgic about the big iron they used in their
teens and 20s in the fashion that you know they thought it was really
something else at the time.

You won't find me waxing nostalgic much about any "In my day" computers
because in the same time period for me what the kids had available was
early Pentiums and some Mac Quadras and stuff. These machines were
unpleasant to use, the operating systems stunk, they were under-powered,
overpriced, and generally sucked balls.

Yes - computers are different to other stuff. A computer that old is pretty much useless, a hifi amp from 1959 can (occasionally) be excellent.

The only reason why old computers are "useless" is because there is something else that is better and cheaper. It's simply not worth continuing to use the old machine. But there is nothing inherently worse about an old computer. It's not like they go bad in any way other than how everything goes bad, by wearing out.

On the other hand, an old amplifier does have inherent shortcomings in the same way as old PCs. Newer gear is smaller, lower power and can be made with features older gear can only dream of. So no one other than collectors has any reason to use old amps. Better stuff is available.

--

Rick C.

-+ Get 1,000 miles of free Supercharging
-+ Tesla referral code - https://ts.la/richard11209
 
On 9/4/19 8:27 PM, tabbypurr@gmail.com wrote:
On Wednesday, 4 September 2019 19:37:29 UTC+1, bitrex wrote:
On 9/4/19 12:51 PM, jlarkin@highlandsniptechnology.com wrote:
On Wed, 4 Sep 2019 17:14:45 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

Problem with binary logic is that a fence post error is the opposite of
what you intended to do. It is pretty clear that modern software could
be made a lot more robust by static analysis to find all the places
where malevolent data packets can target OS privilege escalation.

The ultimate secure computer system will have absolute hardware
protections. Programmers can't be trusted here.



It's been proved that there's no such thing as an "ultimate secure
computer" there's no way to ensure that your compiler isn't compromised
in some way, too. Or that your compiler's compiler wasn't compromised,
or that the compiler that you use to compile the tool you use to check
to see if your compiler is compromised, wasn't compromised.

Or that the hardware that you use to compile the software that you use
to design the hardware for the ultimate secure computer didn't itself
inject a vulnerability into the design software that then compromises
your new hardware.

And so forth...

But you can still do a huge lot better than MS Windows


NT

You can do a lot better than McDonalds too, even in the regime of fast
food but at some point someone realized that brand recognition and
market penetration/availability were more important concepts than the
intrinsic quality of the product.

"quantity has a quality all its own" or that is to say 40,000
restaurants nationwide and billions and billions served can't be "wrong."

Microsoft and McDonalds are either thought of as failures of late-stage
capitalism or rousing success stories, depending on how one looks at
things. They sell a mediocre product at a price a bit too high for
what's on offer and they'll always be there for you. What's not to like?
It seems that over the years and decades investors at least have sure
liked it a lot.
 
On 9/5/19 1:30 AM, bitrex wrote:

But you can still do a huge lot better than MS Windows


NT


You can do a lot better than McDonalds too, even in the regime of fast
food but at some point someone realized that brand recognition and
market penetration/availability were more important concepts than the
intrinsic quality of the product.

"quantity has a quality all its own" or that is to say 40,000
restaurants nationwide and billions and billions served can't be "wrong."

Microsoft and McDonalds are either thought of as failures of late-stage
capitalism or rousing success stories, depending on how one looks at
things. They sell a mediocre product at a price a bit too high for
what's on offer and they'll always be there for you. What's not to like?
It seems that over the years and decades investors at least have sure
liked it a lot.

If both companies were really that terrible at what they do, nobody would
use them; people would find a way, any way, to do something different en
masse.

And if they put way more effort into making their product exceptional
they'd have no choice but to charge more. More than most Americans could
afford on the regular. Someone would move in immediately to undercut them.

These companies are cutting edge, in the field of walking the razor's
edge of good-enough engineering.
 
On 9/4/19 8:24 PM, tabbypurr@gmail.com wrote:

Sometimes older guys wax nostalgic about the big iron they used in their
teens and 20s in the fashion that you know they thought it was really
something else at the time.

You won't find me waxing nostalgic much about any "In my day" computers
because in the same time period for me what the kids had available was
early Pentiums and some Mac Quadras and stuff. These machines were
unpleasant to use, the operating systems stunk, they were under-powered,
overpriced, and generally sucked balls.

Yes - computers are different to other stuff. A computer that old is pretty much useless, a hifi amp from 1959 can (occasionally) be excellent.


NT

For the tasks the kiddos were attempting to use the Mac Quadra-class
machines for at the time (mid-1990s) at ART COLLEGE, e.g. standard-definition
video editing, they were sort of useless when they were new, too!

You set up the edits and effects you wanted to apply, let the pizza
box crank overnight, and hope to God it hadn't crashed or locked up in
the morning.
 
