Where can I buy a large analogue meter?...

On 04/17/2022 01:00 PM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 19:55:20 +0100, rbowman <bowman@montana.com> wrote:

On 04/17/2022 09:46 AM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 16:08:45 +0100, Scott Lurndal <scott@slp53.sl.home
wrote:

rbowman <bowman@montana.com> writes:
On 04/16/2022 05:20 PM, Jasen Betts wrote:


Apple's processor is an ARM, so it's going to be more efficient than
Intel's x86.

When comparing RISC to CISC you have to be careful to specify what
area you're comparing for efficiency. Power consumption is where RISC
has shone. It took a while for compilers to catch up and generate
optimized code. RISC code size is necessarily greater, hence more RAM.

Come now, RISC processors have been used for three decades now,
the compiler guys are really really good at generating quality code
for all of them.

No modern programmer is good at anything, especially tight coding. Give
them a computer from the 80s and they'd have trouble writing a
calculator program to fit into 64KB.

One product I worked on was a handheld pH / ion concentration meter that
used an 8049.

https://en.wikipedia.org/wiki/Intel_MCS-48

I did the pH meter and another programmer did the ion concentration.
Reading the electrode value from the A/D and driving the user interface
was the same for both products but the math was sufficiently different
that 2K wasn't enough to do both.

There was also a benchtop meter/auto-titrator that used a Z-80. 64K was
a real luxury.
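
For a sense of the math involved: an ideal glass electrode follows the
Nernst slope of about 59.16 mV per pH unit at 25 C, and the meter works
back from millivolts to pH after a two-point buffer calibration. A
minimal sketch in Python (the buffer values and helper names are
illustrative assumptions, nothing from the original 8049 firmware):

# Convert a pH electrode reading (millivolts) to pH.
# Ideal Nernst slope at 25 C is ~59.16 mV/pH; a real meter
# derives slope and offset from two buffer solutions.

def calibrate(mv1, ph1, mv2, ph2):
    """Two-point calibration: returns (slope_mv_per_ph, mv_at_ph7)."""
    slope = (mv1 - mv2) / (ph2 - ph1)
    offset = mv1 + slope * (ph1 - 7.0)   # electrode mV at pH 7
    return slope, offset

def ph_from_mv(mv, slope, offset):
    return 7.0 + (offset - mv) / slope

# Example: buffers at pH 4.01 and 7.00 (made-up readings)
slope, offset = calibrate(176.0, 4.01, 0.0, 7.00)
print(round(ph_from_mv(88.0, slope, offset), 2))   # ~5.5

Ion concentration adds a logarithm per ion species plus temperature
compensation, so it's plausible the math alone outgrew 2K.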

In reply to Scott Lurndal, yeah the compiler guys have gotten really
good after 3 decades...

I have a mouse driver that's 130MB. WTF? That's over 3 times the size
of the hard disk on a PC I had in 1991. What does the mouse driver do?
Watch for left and right and a few button presses? In 1991 I think it
was 30KB. 4000 times less efficient programming, we've really come far.

I looked at Java back in the late '90s. It wasn't too bad but as it grew
performance went into the toilet. The answer was 'you need a newer,
faster machine.'

Over twenty years of hardware improvements and Java apps still suck.

I bought an Osborne 1 in '81. It was a CP/M machine and came with 2
single side, single density 5 1/4" floppy drives for a massive 90 KB
each. I later sent it back for the DD upgrade. Somehow 90KB was enough
to hold the WordStar, SuperCalc, or BDS C compiler executables, which
happily ran in 64KB of RAM.

Somehow Turbo Pascal managed to compile so fast that at first I thought
it was broken compared to BDS.
 
On 04/17/2022 01:24 PM, Commander Kinsey wrote:
Security videos can be huge, I have two 4K cameras running continuously,
but I have a core of a Ryzen 9 3900XT allocated to each which only
records when it sees something suspicious. I've even used it to locate
my neighbour's cat, which she found confusing. But it auto deletes
after a month unless I save it.

So they're finding out about body cams. No auto delete after a month
either. It can be years before the case comes to trial and they have to
produce the camera video.
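
For scale: at a typical 4K bitrate of 25 Mbit/s (an assumed figure),
continuous recording is 25e6/8 bytes/s x 86,400 s, roughly 270 GB per
camera per day, which is why motion-triggered recording matters. A
minimal frame-differencing sketch in Python with OpenCV (camera source,
blur kernel, and thresholds are illustrative assumptions, not the
poster's actual setup):

import cv2

CAM = 0                 # device index or RTSP URL (assumption)
MIN_CHANGED = 5000      # changed pixels that count as "suspicious"

cap = cv2.VideoCapture(CAM)
ok, prev = cap.read()
prev = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (21, 21), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    # Pixels that differ from the previous frame beyond a noise floor
    diff = cv2.absdiff(prev, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > MIN_CHANGED:
        pass  # start/continue writing this clip to disk
    prev = gray

A daily job that deletes clips older than 30 days would give the
auto-expiry behaviour described.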
 
On 04/17/2022 01:55 PM, Commander Kinsey wrote:
Agreed apart from "disallow working in the same industry for N years
after leaving". Most people probably work in the same industry for most
of their life. So such a job means if you ever choose to leave, you
can't get another job. I would therefore never take a job with that in
the contract.

Depends on what you mean by industry. For example I've been in the
software 'industry' but went from programming biomedical equipment to
aircraft fuel measurement and management to automated testing of copier
power supplies, to semiconductor sputtering systems to....

My software skills remained the same but the applications had little to
do with each other. I've been in my present job for over 20 years but I
got old and slow and didn't need a new challenge every three or four
years. What I have is plenty challenging as the technology changes.
 
On Sunday, April 17, 2022 at 10:12:09 AM UTC-7, Joe Gwinn wrote:
On Sun, 17 Apr 2022 13:55:55 +0100, "Commander Kinsey"
<C...@nospam.com> wrote:

Anyway, within the x86 architecture they keep adding instructions etc. Can't it be improved out of the mess?
Not without giving up on backward compatibility and making a clean
break. Which has been against Intel theology for a long time.

Apple went through the same thing, and eventually hired a bunch of
market research firms to run focus group sessions...
The question to be answered was whether there had to
be a Motorola processor on the motherboard, or whether a really good
emulator would suffice. The vast majority of those in the focus group
(myself included) said that no Motorola hardware was needed, so long as
the emulation was in fact that good, because we all had essential
software that could not be replaced for one reason or another. I assume
that most of the focus groups came to the same answer, because that's
exactly what happened.

Joe Gwinn

It was a stretch, though; there was a 'toolbox' runtime library, and the
rewrite of that was probably the first need, because it would normally
be cached, and a two-stage emulator-plus-toolbox requirement used
a LOT of cache. Apple had some PowerPC processors made with extra-large
cache in the early days of the 68k-to-Power changeover, and eventually
the OSes became incompatible as emulations were dropped, first 68k
and then Power code in the Intel years.
 
On 04/17/2022 01:56 PM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 20:50:36 +0100, rbowman <bowman@montana.com> wrote:

On 04/17/2022 10:54 AM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 17:39:40 +0100, rbowman <bowman@montana.com> wrote:

On 04/17/2022 07:03 AM, Commander Kinsey wrote:

I have Zen2 (an AMD Ryzen 9 3900XT) and that's also TSMC, but 7nm.

The equivalent CPU on Zen3 (Ryzen 9 5900X) is also 7nm.

Yes, very fast.

I've got a 5500U in my laptop. It's a 7nm Zen2 unlike the 5600U Zen3,
but I have no complaints for a $700 laptop.

That's 0.4 of the speed of my desktop. Laptops suck.

Until the company upgraded my desktop I was using the laptop for some
projects. It beat the hell out of an elderly Core i5 with a hard drive.

I'm not a real fan of laptops but they have their place. I'm using a
company supplied laptop for remote work. Admittedly the HDMI is plugged
into my desktop monitor through a switch and I use a Bluetooth mouse and
keyboard, but it's good enough to VPN in to a real machine.

It's also difficult to travel with a desktop...

I wonder what would happen if you tried to set up a desktop, keyboard,
mouse, monitor on a table on a train?

First you would have to find a train...
 
On Sun, 17 Apr 2022 22:39:32 +0100, whit3rd <whit3rd@gmail.com> wrote:

[snip]

Incompatibility is a very big reason why I would never touch an Apple with a bargepole. Even their stupidity in dropping serial ports the moment they added USB, instead of allowing a crossover period, was absurd.
 
On Sun, 17 Apr 2022 22:24:59 +0100, rbowman <bowman@montana.com> wrote:

[snip]

https://en.wikipedia.org/wiki/Idiocracy
 
On Sun, 17 Apr 2022 22:29:11 +0100, rbowman <bowman@montana.com> wrote:

[snip]

So they're finding out about body cams. No auto delete after a month
either. It can be years before the case comes to trial and they have to
produce the camera video.

You can turn them off.
 
On Sun, 17 Apr 2022 22:38:18 +0100, rbowman <bowman@montana.com> wrote:

[snip]

You could always take a "different" job at another company, but end up handing them some secrets from a neighbouring department!
 
On Sun, 17 Apr 2022 22:39:41 +0100, rbowman <bowman@montana.com> wrote:

[snip]

First you would have to find a train...

You don't have trains? They're annoying things with brakes so shitty they'd get a car taken off the road. They expect everything else to get out of their way. And they never go where you want, when you want. About time we got rid of those useless things, which actually use more fuel per person than a car.
 
On Sun, 17 Apr 2022 18:13:56 +0100, "Commander Kinsey"
<CK1@nospam.com> wrote:

On Sun, 17 Apr 2022 18:11:56 +0100, Joe Gwinn <joegwinn@comcast.net> wrote:

On Sun, 17 Apr 2022 13:55:55 +0100, "Commander Kinsey"
<CK1@nospam.com> wrote:

On Sun, 17 Apr 2022 01:03:59 +0100, Joe Gwinn <joegwinn@comcast.net> wrote:

On Sat, 16 Apr 2022 23:20:48 -0000 (UTC), Jasen Betts
<usenet@revmaps.no-ip.org> wrote:

On 2022-04-16, Commander Kinsey <CK1@nospam.com> wrote:
On Sat, 16 Apr 2022 13:31:06 +0100, RJH <patchmoney@gmx.com> wrote:

On 16 Apr 2022 at 11:52:08 BST, "The Natural Philosopher"
<tnp@invalid.invalid> wrote:

On 16/04/2022 11:35, RJH wrote:
On 16 Apr 2022 at 11:06:34 BST, "The Natural Philosopher"
<tnp@invalid.invalid> wrote:

On 15/04/2022 21:28, Cindy Hamilton wrote:
On 2022-04-15, The Natural Philosopher <tnp@invalid.invalid> wrote:

BEVs are very mature technology. There is only a bit left to improve.
Like aircraft and cars in general.

Yeah, they keep saying that about computers, too. And they're
constantly proved wrong.

They are completely right about computers. They can't be clocked any
faster, they can't be made to work with much less power - all they can
do is add more cores.

The new(ish) Apple processors use a fraction (between a half and a
third) of the power used by an Intel equivalent.

That by itself says nothing.
A Z80 uses way less power than a Pentium.
A motorcycle uses way less power than a Ferrari.

It says everything. Less power for the same load - google Apple M1.

I prefer things designed for adults.

I very much doubt Apple can beat Intel anyway.

It's not Apple vs Intel, it's TSMC vs Intel.

Apple's processor is an ARM, so it's going to be more efficient than
Intel's x86.

The long-term problem with Intel is that they cannot let go of the x86
architecture, and over time this has become severely limiting.

Apple had the same problem, but eventually did transition from
Motorola CPUs to Intel, gaining the ability to run Windows on Apple
desktop and laptop computers. But the Intel architecture had become
too hide-bound, and Apple was more or less forced to escape.

But I wonder how well and how long Apple's new M1 architecture will be
able to support running Windows OS and software, which is exactly what
I'm using as I type these words. (iMac (with lots of memory),
Parallels, Win10, Forte Agent.)

I may stay on Intel for that reason, for desktops, but iPhones and
iPads will go M1, because I have no reason to retain Intel there. But
I will wait for the few apps I use to mature on M1 first.

So a speed change but no compatibility? Bit of a bugger to change every program's coding.

Anyway, within the x86 architecture they keep adding instructions etc. Can't it be improved out of the mess?

Not without giving up on backward compatibility and making a clean
break. Which has been against Intel theology for a long time.

But below you say you can emulate.

Apple went through the same thing, and eventually hired a bunch of
market research firms to run focus group sessions, one of which I was
in. One long wall of our meeting room had a very large mirror, one
that looked a bit odd. It was half-silvered, and there were observers
watching from behind that "mirror".

The questions wandered around, then eventually converged. We all knew
that Apple was moving to Intel, as this had been reported extensively
in the trade press. The question to be answered was whether there had to
be a Motorola processor on the motherboard, or whether a really good
emulator would suffice. The vast majority of those in the focus group
(myself included) said that no Motorola hardware was needed, so long as
the emulation was in fact that good, because we all had essential
software that could not be replaced for one reason or another. I assume
that most of the focus groups came to the same answer, because that's
exactly what happened.

Making an emulation _that_ good is a very big deal, and there has been
no talk from Apple of doing any such thing.

Joe Gwinn
 
On Sun, 17 Apr 2022 14:39:32 -0700 (PDT), whit3rd <whit3rd@gmail.com>
wrote:

[snip]

Yes, but never mind the details, Apple did get it to work very well,
and maintained it for about ten years, then ceased to support it. By
then, most of those critical apps were no longer critical, or had been
killed off by something else.

Joe Gwinn
 
On Sun, 17 Apr 2022 23:15:57 +0100, Joe Gwinn <joegwinn@comcast.net> wrote:

[snip]

Nobody does anything critical with a Mac anyway. They're just for arty folk.
 
On Sunday, April 17, 2022 at 3:36:35 PM UTC-7, Commander Kinsey wrote:

> Nobody does anything critical with a Mac anyway. They're just for arty folk.

Not an uncommon view, but inaccurate. Excel, for example, started life
as Macintosh-only code; the Windows version was an afterthought, ported
over.
 
On Mon, 18 Apr 2022 01:20:29 +0100, whit3rd <whit3rd@gmail.com> wrote:

[snip]

Not an uncommon view, but inaccurate. Excel, for example, started life
as Macintosh-only code; the Windows version was an afterthought, ported
over.

Gotta start somewhere. Things tend to improve.
 
On 04/17/2022 02:52 PM, Vir Campestris wrote:
On 16/04/2022 11:06, The Natural Philosopher wrote:

They are completely right about computers. They can't be clocked any
faster, they can't be made to work with much less power - all they can
do is add more cores.

The only radical breakthrough in the last 20 years has been the solid
state disk.

Curiously not invented by Clive Sinclair or James Dyson, but by real
engineers working in large companies.

Coming into this rather late - that turns out not to be the case.

The megahertz hasn't gone up much, but the instructions per clock have.

As an example of the reasons for this - do you know about speculative
execution?

Once upon a time a processor got to a branch, waited to find out which
way to go, then carried on with the correct instructions.

Then they started to decode the instructions on the non-branch path
early, because they might need them.

Then they added branch predictors, which take an increasingly good guess
as to which way the branch would go, and started on those.

The latest ones start running the instructions on _both_ paths, and
throw away the wrong ones.

All done without increasing the megahertz.

There are lots of other things going on too.

Andy

Leading to the Meltdown, Zombieload and Spectre exploits... Some of
those may have been other side-channel effects.
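
To make that "increasingly good guess" concrete: the textbook baseline
is a two-bit saturating counter per branch. A minimal simulation in
Python (the branch pattern is invented for illustration; real predictors
layer history tables and more on top):

# Two-bit saturating counter: states 0,1 predict "not taken",
# states 2,3 predict "taken". Each outcome nudges the counter,
# so a single anomaly doesn't flip a well-established prediction.

def predict_accuracy(outcomes):
    state = 2                      # start weakly "taken"
    correct = 0
    for taken in outcomes:
        prediction = state >= 2
        correct += (prediction == taken)
        if taken:
            state = min(3, state + 1)
        else:
            state = max(0, state - 1)
    return correct / len(outcomes)

# A loop branch: taken 9 times, then falls through, repeated.
pattern = ([True] * 9 + [False]) * 100
print(f"{predict_accuracy(pattern):.0%}")   # 90% correct

The mispredicted fall-through iteration is exactly where speculatively
executed wrong-path work gets discarded - and, per Meltdown and Spectre,
where its side effects can leak.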

Side note: The only project I have worked on that used Apple equipment
used the original Mac toasters for some purposes. It was the only
machine that met the TEMPEST requirements of the day. I doubt that was
Apple's intention.

That was c. 1985 and the Russkies were out in the bushes trying to steal
technology. Fast forward to 2022 and the Russkies are much more
sophisticated when pwning government data.
 
On 04/17/2022 02:55 PM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 21:18:43 +0100, rbowman <bowman@montana.com> wrote:

On 04/17/2022 10:56 AM, Commander Kinsey wrote:
You sound like a real programmer. As it happens I'm having a lot of
problems with Python. Some idiot managed to make the program require
AVX, when 50% of the users had CPUs predating that.

I've run into that a couple of times. In one case out of about 30
programming and QA machines I found two that could run the program. I
just happened to develop it on one of the two and was fat, dumb, and
happy until I tried to distribute it.
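
A cheap way to fail gracefully instead of dying on an illegal
instruction is to check the CPU flags before loading the AVX-built
code. A minimal sketch, assuming Linux and /proc/cpuinfo (the module
name is hypothetical):

import sys

def cpu_has_avx():
    """Return True if /proc/cpuinfo lists the avx flag (Linux only)."""
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "avx" in line.split()
    except OSError:
        pass
    return False

if not cpu_has_avx():
    sys.exit("This build needs AVX; use the baseline build instead.")

import fast_module  # hypothetical AVX-optimized extension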

Python 3.x I assume? ESRI has been using 2.7 for some GIS scripting but
is moving to 3.x. I can hardly wait to rewrite my scripts.
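
Most of that rewriting boils down to a handful of recurring changes. A
quick sketch of the usual suspects (illustrative snippets, not ESRI's
actual scripts):

# Python 2.7                      ->  Python 3.x
# print "done"                    ->  print("done")
# 7 / 2 == 3 (integer division)   ->  7 / 2 == 3.5; use 7 // 2 for 3
# d.has_key(k)                    ->  k in d
# unicode(s)                      ->  str(s); bytes and str are distinct
print("done")
print(7 // 2)        # 3
d = {"a": 1}
print("a" in d)      # True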

Not sure, they run on a Debian virtual machine using Oracle VirtualBox.
This is the last log output I can find if it means anything to you:

Not a clue. Have I mentioned I hate VMs? Sometimes they're good for a
laugh. Some sites with high availability systems respond to Linux like a
vampire to garlic. What they don't know is that under all those Server
20xx VMs, Red Hat and KVM are holding the whole mess together.
 
On Mon, 18 Apr 2022 01:49:12 +0100, rbowman <bowman@montana.com> wrote:

[snip]

I use them to be able to run Linux shit on my grown-up Windows systems.
 
On 04/17/2022 03:46 PM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 22:29:11 +0100, rbowman <bowman@montana.com> wrote:

[snip]

You can turn them off.

That feature may go away; they'll have to get more creative. Currently
some bodycam vendors turn the camera on when the officer gets within a
specified distance of the incident. That, of course, also implies the
body camera is a radio collar for the cop.
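
That distance trigger is essentially a geofence test against the
dispatched incident location. A minimal sketch in Python using the
haversine formula (the 100 m radius and the coordinates are made-up
values):

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

TRIGGER_M = 100.0                      # activation radius (assumption)
incident = (46.8721, -113.9940)        # dispatched location (made up)
officer = (46.8725, -113.9946)         # from the bodycam's GPS

if haversine_m(*officer, *incident) < TRIGGER_M:
    print("camera on")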
 
On 04/17/2022 03:47 PM, Commander Kinsey wrote:
On Sun, 17 Apr 2022 22:38:18 +0100, rbowman <bowman@montana.com> wrote:

[snip]

You could always take a "different" job at another company, but end up
handing them some secrets from a neighbouring department!

There have been times when I've thought giving the source code to a
competitor could set them back a few years.

That's the purpose of trade shows. You skulk around seeing what everyone
is doing, and if it looks good you steal it. You need to be careful,
though. A few years back cloud-based solutions were all the rage. Then
AWS very visibly went tits up at inopportune moments. Twitter taking a
dump is one thing, but the emergency services people are a little less
enthused by their system going down.

When I first started working I was involved in the machine tool sector.
The Asians were cute. Of course they all had cameras and they would
subtly position one of their guys in the photos so they could go home
and scale everything.

Then the Japanese ate the US machine tool business and I moved on.
 
