GRRRR part 2...

John Larkin wrote:
I'm waiting for a sim to run, so may as well whine. It's running at 12
PPM of real time.

LTspice lets you put the value of a part anywhere on the screen. I
just spent an embarrassing amount of time figuring out why my current
limiter didn't work. It's a switching half-bridge with an output
current sensor, and a pair of P+I opamps that sense positive and
negative over-current and clamp the input demand signal appropriately.

(The current limit will be in an FPGA, but I like to do an analog sim
to get the dynamics close.)

I have a couple of BVs as isolators between the PWM generator and the
floating (+ and - 48 volt supplies) LTC4444 mosfet gate driver. The
equations of the BVs were swapped, so my power stage gain was
reversed. Negative feedback wasn't.
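
A toy discrete-time loop in Python (made-up gains and time step, not
the actual sim) shows why flipping the power-stage sign turns negative
feedback positive:

# Toy illustration, not the actual sim: an integral controller driving
# a power stage whose gain sign comes from the BV equations. With the
# right sign the loop settles; with the swapped sign it runs away.
def run_loop(plant_gain, steps=200, dt=1e-5, ki=2000.0, setpoint=1.0):
    demand = output = 0.0
    for _ in range(steps):
        error = setpoint - output
        demand += ki * error * dt      # the P+I clamp boiled down to pure I
        output = plant_gain * demand   # power stage modeled as a pure gain
    return output

print("correct sign:", run_loop(+1.0))  # converges toward the setpoint
print("swapped sign:", run_loop(-1.0))  # diverges: feedback is now positive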

Some cad software limits how far a ref designator or a value can be
from the part. Or highlights one if you click on the other.

More fun: if you copy and paste a chunk of circuit, the copy has all
the same node names. So everything is shorted to everything until you
find and change the nodes that matter.
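
One workaround is to script the paste yourself. A hypothetical Python
helper (not an LTspice feature, and the pin-count guess is crude) that
suffixes the nets and ref designators of a cloned netlist block:

# Hypothetical helper, not part of LTspice: clone a netlist block while
# suffixing its reference designators and nets, so the pasted copy
# shares nothing with the original except ground.
def clone_block(lines, suffix="_B", shared={"0"}):
    out = []
    for line in lines:
        tok = line.split()
        if not tok or tok[0].startswith(("*", ".")):
            out.append(line)            # comments and directives pass through
            continue
        n = 2 if tok[0][0].upper() in "RCLDVIB" else 4  # crude pin-count guess
        nodes = [t if t in shared else t + suffix for t in tok[1:1 + n]]
        out.append(" ".join([tok[0] + suffix] + nodes + tok[1 + n:]))
    return out

print(clone_block(["R1 in out 1k", "C1 out 0 100n"]))
# -> ['R1_B in_B out_B 1k', 'C1_B out_B 0 100n']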

I'll need to get a new PC soon. People say that a screaming CPU and
lots of RAM and a solid-state C: drive would really speed things up.

Hey, it finished. It made a 5.4 Gbyte RAW file.





--

If a man will begin with certainties, he shall end with doubts,
but if he will be content to begin with doubts he shall end in certainties.
Francis Bacon
 
On a sunny day (Wed, 01 Jun 2022 15:38:20 -0700) it happened John Larkin
<jlarkin@highland_atwork_technology.com> wrote in
<m2pf9hp44ssn7vu45eftf46jncjo4ppbp5@4ax.com>:

<snip>

There was a headline on sciencedaily.com today about whether AI-designed things should be patented, and by whom, or something.

I know I'll probably draw fire if I say that all that SPICE is a dead-end road, at least for many things.

Neural networks, I have done some programming with those, and we are like that ourselves.. they could be the future
in electronic design too.
But an NN (neural net) is trained by building and testing things.
Even if you run SPICE on a supercomputer there is NO guarantee the circuit will even work in reality.

I play with Raspberry Pis these days.. Even those are very much unobtainable due to chip or other shortages..
Would be nice to have LTspice running on ARM processors (maybe there is a port already?)
There must be a break-even point between AI and SPICE-type simulations for electronic design.
I wonder how far away that is.
For the rest I leave the problem solving to my neural net; I have not touched SPICE in years.
It is nice for filters that need a lot of repeated math scribbling, but there are other filter design programs.
Many things do not have a good SPICE model...
 
John Larkin wrote:

> I'll need to get a new PC soon. People say that a screaming CPU and lots
> of RAM and a solid-state C: drive would really speed things up.

I have had NVMe drives for ages. Get good ones, like from Samsung. Spend
more than you used to spend on hard drives; it's well worth it. Primary
(and secondary) storage is not whizbang fun, but as IBM used to put it, it
affects "throughput" more than anything else. NVMe is what I have always
wanted.

If you use a secondary drive, the transfer rate between two premium NVMe
drives is OUTRAGEOUS. Transferring movies from your downloads folder to
the secondary drive is quick. Making backups of Windows is also quick.
 
On Thu, 02 Jun 2022 06:05:25 GMT, Jan Panteltje
<pNaonStpealmtje@yahoo.com> wrote:

On a sunny day (Wed, 01 Jun 2022 15:38:20 -0700) it happened John Larkin
<jlarkin@highland_atwork_technology.com> wrote in
<m2pf9hp44ssn7vu45eftf46jncjo4ppbp5@4ax.com>:

<snip>

> There was a headline on sciencedaily.com today about whether AI-designed things should be patented, and by whom, or something.

> I know I'll probably draw fire if I say that all that SPICE is a dead-end road, at least for many things.

It's sure not dead for people who design real electronics. What I want
is for SPICE to run on an Nvidia parallel compute engine, 200x or so
faster than on an Intel CPU.

> Neural networks, I have done some programming with those, and we are like that ourselves.. they could be the future
> in electronic design too.
> But an NN (neural net) is trained by building and testing things.

Are NNs anything but an academic toy?

> Even if you run SPICE on a supercomputer there is NO guarantee the circuit will even work in reality.

Sometimes things work exactly as Spiced. Good engineers can usually
tell when the sims aren't to be trusted.

> I play with Raspberry Pis these days.. Even those are very much unobtainable due to chip or other shortages..
> Would be nice to have LTspice running on ARM processors (maybe there is a port already?)
> There must be a break-even point between AI and SPICE-type simulations for electronic design.
> I wonder how far away that is.

There have been attempts to use computers to actually design circuits,
or at least to optimize values in a given topology. They tended to be
ludicrous failures.

It's strange that our brains, evolved to be hunter-gatherers, can
design electronics.

> For the rest I leave the problem solving to my neural net; I have not touched SPICE in years.
> It is nice for filters that need a lot of repeated math scribbling, but there are other filter design programs.
> Many things do not have a good SPICE model...

That's the main hazard, not having dependable part models. That's a
serious problem when using RF-type parts in large-signal time-domain sims.

We finally finished our big laser modulator chassis. The
amplifier/fiducial board got to rev C, and we avoided D by adding a
Mini-Circuits SMA DC block in one of the cables. You can't sim this
fast stuff; just guess and etch.

https://www.dropbox.com/s/29ttap9urihhep1/T500_Top_Final.jpg?raw=1

We couldn\'t get LCD driver chips, but their eval boards are available,
so we used eval boards.



--

Anybody can count to one.

- Robert Widlar
 
On Thu, 2 Jun 2022 09:28:24 -0000 (UTC), John Doe
<always.look@message.header> wrote:

<snip>

> If you use a secondary drive, the transfer rate between two premium NVMe
> drives is OUTRAGEOUS. Transferring movies from your downloads folder to
> the secondary drive is quick. Making backups of Windows is also quick.

I guess one can plug an SSD into a PCIe slot. I need new PCs and it
would be cool to have a fast D: drive on them, mostly for Dropbox.



--

Anybody can count to one.

- Robert Widlar
 
jlarkin@highlandsniptechnology.com wrote:
On Thu, 02 Jun 2022 06:05:25 GMT, Jan Panteltje
<pNaonStpealmtje@yahoo.com> wrote:
<snip>

>> Neural networks, I have done some programming with those, and we are
>> like that ourselves.. they could be the future in electronic design
>> too. But an NN (neural net) is trained by building and testing things.
>
> Are NNs anything but an academic toy?

They're great for flagging possibilities for later evaluation. IIRC
they're used in drug discovery for that reason. For control systems,
not so much.

<snip>

> It's strange that our brains, evolved to be hunter-gatherers, can
> design electronics.

It's not at all strange that our minds, being made in the image of the
Maker, can do that. ;)

<snip>

> We couldn't get LCD driver chips, but their eval boards are
> available, so we used eval boards.

There's a lot of that going around.

Cheers

Phil Hobbs

(Who is in the middle of a board spin for our nanowatt photoreceiver,
partly for that reason.)


--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On 6/2/2022 9:36 AM, jlarkin@highlandsniptechnology.com wrote:
On Thu, 2 Jun 2022 09:28:24 -0000 (UTC), John Doe
<always.look@message.header> wrote:

<snip>

> I guess one can plug an SSD into a PCIe slot. I need new PCs and it
> would be cool to have a fast D: drive on them, mostly for Dropbox.

Cross-probing large .raw files even stored on an SSD is still slow because
they use a compressed format; to make it snappy you have to first
convert the data file:

<https://ltwiki.org/LTspiceHelp/LTspiceHelp/Fast_Access_File_Format.htm>

which is also slow. Or with your new machine you can buy a bunch of RAM,
say 32 or 64 gig, and make half of it a RAM drive dedicated to sim data,
which also speeds up probing considerably.

64 gig of DDR5 is about $500, and 64 gig of DDR4 about $250, on Amazon
these days.
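
The .raw header is plain text in front of the data, so a script can
check what it is dealing with. A Python sketch; it assumes LTspice
XVII's UTF-16LE headers, and that converted files show "fastaccess" in
the Flags line, which should be verified against the LTwiki page above:

# Sketch: peek at an LTspice .raw header (the text before "Binary:").
# Assumptions: newer LTspice writes UTF-16LE headers, and Fast Access
# conversion is reflected in the Flags line; check the wiki page first.
def raw_header(path):
    blob = open(path, "rb").read(4096)
    text = blob.decode("utf-16-le" if blob[1:2] == b"\x00" else "ascii",
                       errors="ignore")
    hdr = {}
    for line in text.splitlines():
        if line.startswith("Binary:"):
            break                      # binary waveform data starts here
        if ":" in line:
            k, _, v = line.partition(":")
            hdr[k.strip()] = v.strip()
    return hdr

h = raw_header("sim.raw")              # path is illustrative
print(h.get("Flags"), h.get("No. Variables"), h.get("No. Points"))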
 
On Thursday, June 2, 2022 at 11:36:51 PM UTC+10, jla...@highlandsniptechnology.com wrote:
On Thu, 2 Jun 2022 09:28:24 -0000 (UTC), John Doe
<alway...@message.header> wrote:
John Larkin wrote:

<snip>

> I guess one can plug an SSD into a PCIe slot.

Worked for me. Didn't make SPICE run any faster, but it did make mousing around in the stored waveforms much faster.

> I need new PCs and it would be cool to have a fast D: drive on them, mostly for Dropbox.

--
Bill Sloman, Sydney
 
On a sunny day (Thu, 02 Jun 2022 06:33:32 -0700) it happened
jlarkin@highlandsniptechnology.com wrote in
<5pdh9htc6l399o9esj9ua3t9p145kd0vmo@4ax.com>:

On Thu, 02 Jun 2022 06:05:25 GMT, Jan Panteltje
<pNaonStpealmtje@yahoo.com> wrote:

>> I know I'll probably draw fire if I say that all that SPICE is a dead-end road, at least for many things.
>
> It's sure not dead for people who design real electronics. What I want
> is for SPICE to run on an Nvidia parallel compute engine, 200x or so
> faster than on an Intel CPU.


>> Neural networks, I have done some programming with those, and we are like that ourselves.. they could be the future
>> in electronic design too.
>> But an NN (neural net) is trained by building and testing things.
>
> Are NNs anything but an academic toy?

Many new medicines have been designed by AI now;
it is worth keeping up to date on science by reading sciencedaily.com.
But also technical things, like airplane wings, what not.
How did I learn? From looking at circuits, building and trying those.
A big advantage for me is that I had to do fast fault-finding in very complex systems,
and also simpler ones, so I have seen thousands of designs and HAD to grasp how those worked
to be able to fix them.
So you get an idea of the latest state of the art, and of what works and what does not (when it fails).
This is exactly how AI is trained, say for object recognition for military applications, or diagnostics for medical ones.


<snip>

> https://www.dropbox.com/s/29ttap9urihhep1/T500_Top_Final.jpg?raw=1

Looks neat, what's in that metal box on the right?


> We couldn't get LCD driver chips, but their eval boards are available,
> so we used eval boards.

Yes, I have some eval boards, also lots of small boards from China for cheap.
About triacs: ever used an ACS108S? The third one just blew up in my Whirlpool washing machine.
It does not like spikes on the mains, it seems; ordered 10 for $4 from eBay.
Very strange: if you look at the datasheet it wants a negative drive, so it takes plus as ground, sort of.
 
bitrex <user@example.net> wrote:

<snip>

> I guess one can plug an SSD into a PCIe slot. I need new PCs and it
> would be cool to have a fast D: drive on them, mostly for Dropbox.

The terminology "NVMe" is critical, and some are better than others.

At one point, before buying a motherboard with 2 NVMe slots, I used a PCIe
card, like that, for the additional NVMe drive.

As long as you have one NVMe drive on the motherboard and a second NVMe on
that PCIe card, file transfers are about as (blazing) fast as a motherboard
with 2 NVMe slots, but only when you are in Windows. For something like
restoring backups of Windows that might happen outside of Windows, that card
doesn't provide the same benefit as 2 slots on the motherboard. Probably to
do with drivers loaded when Windows boots.

I just copied a 2.7 GB file from one Samsung NVMe drive to the other. It took
less than two seconds. This is my PC's storage configuration. My NVMe drives
are not state-of-the-art, but they were a HUGE leap, even a big leap over
"SSD"...

https://www.flickr.com/photos/27532210@N04/?

There is an old picture of file transfer there too, but it's a little
faster than that now.

> Cross-probing large .raw files even stored on an SSD is still slow because
> they use a compressed format; to make it snappy you have to first
> convert the data file:
>
> <https://ltwiki.org/LTspiceHelp/LTspiceHelp/Fast_Access_File_Format.htm>
>
> which is also slow. Or with your new machine you can buy a bunch of RAM,
> say 32 or 64 gig, and make half of it a RAM drive dedicated to sim data,
> which also speeds up probing considerably.
>
> 64 gig of DDR5 is about $500, and 64 gig of DDR4 about $250, on Amazon
> these days.

Difficult to believe how out of touch with reality that opinion appears to
be, on something this plainly technical. Or maybe I'm missing something in
the translation.

"Cross-probing"?
Something to do with PCB design?
:D
 
On 6/2/2022 2:41 PM, John Doe wrote:

<snip>

> Difficult to believe how out of touch with reality that opinion appears to
> be, on something this plainly technical. Or maybe I'm missing something in
> the translation.
>
> "Cross-probing"?
> Something to do with PCB design?
> :D

\"This makes cross probing large circuits with huge simulation data files
interactive.\"

It\'s the term the authors of the LTSpice wiki used themselves in the
link you didn\'t read.

It means \"You click on the node in the schematic you want to see the
simulation plot for, and then the data you want to see appears in the
window that shows the plot, or vice versa by clicking alt + left on a
label in the plot pane and it will highlight the associated node\"
 
On Thursday, June 2, 2022 at 9:34:02 AM UTC-4, jla...@highlandsniptechnology.com wrote:
On Thu, 02 Jun 2022 06:05:25 GMT, Jan Panteltje
<pNaonSt...@yahoo.com> wrote:

On a sunny day (Wed, 01 Jun 2022 15:38:20 -0700) it happened John Larkin
<jlarkin@highland_atwork_technology.com> wrote in
<m2pf9hp44ssn7vu45...@4ax.com>:

<snip>

* NNs anything more than an academic toy?
Ummm, they are integral computational entities in many products that require object detection and pattern recognition. Many autonomous vehicles use them. I have worked on various airborne threat-detection systems that have ANN classifiers.

I developed ANN (artificial neural net) based object detection and signal processing back in the early to mid 1990s. One application that I can talk about is object detection and classification in airline luggage. The quality of the ANN was, and still is, directly related to the number of training sets you throw at it. Coming up with a sufficient number of training sets was a challenge. Fast forward 10 years and the explosion of the web: images by the gazillions.
They are more mainstream than you can imagine.
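
The training-set-size effect is easy to reproduce with any modern
toolkit. A toy run with scikit-learn on synthetic data (nothing to do
with the luggage system; all the numbers are illustrative):

# Toy demonstration of the training-set-size effect with a small MLP.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20000, n_features=30,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=5000,
                                          random_state=0)

for n in (100, 1000, 10000):   # grow the training set, watch accuracy climb
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                        random_state=0)
    clf.fit(X_tr[:n], y_tr[:n])
    print(f"{n:6d} training examples -> {clf.score(X_te, y_te):.3f} accuracy")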
J
 
On 02/06/2022 23:33, jlarkin@highlandsniptechnology.com wrote:
> There have been attempts to use computers to actually design circuits,
> or at least to optimize values in a given topology. They tended to be
> ludicrous failures.

I used an optimizer for some chip designs. It was very, very good for
things like choosing the size of the transistors in flipflops for best
toggle frequency per supply current, and optimising a low-pass filter
for best noise and in-band error-vector-magnitude and stop-band
rejection etc., all at the same time. It did way better than I could
have. The trick was to write a script that runs the right simulations
and results in an expression (or several) that correctly describes how
well a circuit meets the goals. Once you've done that, it can twiddle
the knobs much better than any human, and I don't mean just because it
could do it faster and spam the simulations across a thousand CPUs while
you could look at only one at a time; it was also better in that it
could remember many sets of parameters that were good in various ways,
and combine them more efficiently than a human. It was a company-internal
tool, and they will surely have kept it that way.
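
The general shape of such a tool, as a toy Python sketch: fold the
goals into one penalty number and hand it to an optimizer. Here an
analytic RC low-pass response, with made-up goals, stands in for the
real simulator runs:

# Toy version of "a script that runs the right simulations and scores
# them": one penalty number folding several goals together. The
# "simulation" is the analytic response of an RC low-pass; real use
# would invoke the circuit simulator instead.
import numpy as np
from scipy.optimize import minimize

f = np.logspace(2, 7, 200)                       # 100 Hz .. 10 MHz

def penalty(params):
    r, c = np.abs(params)                        # keep values physical
    h = 1.0 / np.sqrt(1.0 + (2 * np.pi * f * r * c) ** 2)
    passband = np.max(np.abs(h[f < 1e4] - 1.0))  # flatness below 10 kHz
    stopband = np.max(h[f > 1e6])                # rejection above 1 MHz
    return 10 * passband + stopband              # weighted sum of goals

best = minimize(penalty, x0=[1e3, 1e-9], method="Nelder-Mead")
print(best.x, penalty(best.x))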
 
Chris Jones wrote:
> On 02/06/2022 23:33, jlarkin@highlandsniptechnology.com wrote:
>> There have been attempts to use computers to actually design
>> circuits, or at least to optimize values in a given topology. They
>> tended to be ludicrous failures.
>
> I used an optimizer for some chip designs. It was very, very good for
> things like choosing the size of the transistors in flipflops for
> best toggle frequency per supply current, and optimising a low-pass
> filter for best noise and in-band error-vector-magnitude and
> stop-band rejection etc., all at the same time. It did way better than
> I could have.

Sounds like a super useful tool. I did something similar for optimizing
plasmonic nanoantennas 15 or so years ago, and like yours, it found good
solutions that weren't at all obvious. So I'm a fan of the general
approach.

Of course, that sort of thing has been done automatically since the
1940s (or earlier, using manual methods--see e.g.
<https://en.wikipedia.org/wiki/Linear_programming#History>).

> The trick was to write a script that runs the right simulations and
> results in an expression (or several) that correctly describes how
> well a circuit meets the goals. Once you've done that, it can twiddle
> the knobs much better than any human, and I don't mean just because it
> could do it faster and spam the simulations across a thousand CPUs
> while you could look at only one at a time; it was also better in
> that it could remember many sets of parameters that were good in
> various ways, and combine them more efficiently than a human. It was
> a company-internal tool, and they will surely have kept it that way.

Numerical optimization based on merit / penalty functions has been well
known for 200 years, since Gauss IIRC. That's a far cry from actual
computer-based design.

Even lenses, which you'd think would be a natural application, have been
resistant to fully-automated design--it's all about finding a suitable
starting point.

There are various approaches that modify topologies, of which the best
known are genetic algorithms.
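
A bare-bones sketch of the topology-as-genome idea, in Python, with
invented attenuation and cost numbers: the genome says which candidate
filter sections are present, and fitness rewards hitting 60 dB of
rejection with the fewest parts.

# Bare-bones genetic algorithm where the genome is a topology choice:
# which of eight candidate filter sections to include. Numbers invented.
import random
random.seed(1)

att  = [5, 8, 12, 15, 18, 22, 25, 30]     # dB contributed by each section
cost = [1, 1,  2,  2,  3,  3,  4,  5]     # relative parts cost

def fitness(genome):
    a = sum(g * x for g, x in zip(genome, att))
    c = sum(g * x for g, x in zip(genome, cost))
    return -(max(0, 60 - a) * 10 + c)      # shortfall hurts more than cost

pop = [[random.randint(0, 1) for _ in att] for _ in range(30)]
for gen in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                     # truncation selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(att))
        child = a[:cut] + b[cut:]          # one-point crossover
        i = random.randrange(len(att))
        child[i] ^= random.random() < 0.1  # occasional bit-flip mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(best, sum(g * x for g, x in zip(best, att)), "dB")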

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On Fri, 3 Jun 2022 10:30:51 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

<snip>

> There are various approaches that modify topologies, of which the best
> known are genetic algorithms.

Yes. The part about "it could remember many sets of parameters that
were good in various ways, and combine them more efficiently than a
human" sounds very much like a genetic programming algorithm, which
is very good at improving from a valid starting point.

Joe Gwinn
 
Joe Gwinn wrote:
On Fri, 3 Jun 2022 10:30:51 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

<snip>

> Yes. The part about "it could remember many sets of parameters that
> were good in various ways, and combine them more efficiently than a
> human" sounds very much like a genetic programming algorithm, which
> is very good at improving from a valid starting point.
>
> Joe Gwinn

Not as far as I know. The point of genetic algos is to change the
topology, not just the values. Unless I'm misunderstanding, Chris's
optimizer was the usual sort that tweaks parameters to minimize some
penalty function. Most of those remember previous values too--for
instance my usual go-to algo, the Nelder-Mead downhill simplex method
('amoeba()' in Numerical Recipes). For N variables, it keeps N+1 sets.

I like Nelder-Mead because most of the things I need to optimize are
either discontinuous themselves, like the number and placement of
rectangular boxes of metal in a nanoantenna, or else need to be
constrained to physically realizable values, as in a filter design code
where the component values need to be positive. (I usually use
mirroring to constrain that sort of thing, which avoids the tendency of
the simplex to collapse along the discontinuity like water along a curb.)
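
A minimal sketch of the mirroring trick, with scipy's Nelder-Mead
standing in for amoeba(): the simplex roams all of R^n, and the penalty
function folds each coordinate back to the physical side with abs().

# Minimal sketch of mirroring: the simplex wanders freely, and abs()
# inside the penalty makes -r behave exactly like +r, so the simplex
# never collapses against a hard boundary at zero.
import numpy as np
from scipy.optimize import minimize

def penalty(x):
    r, c = np.abs(x)                 # mirroring: fold back to positive
    fc = 1.0 / (2 * np.pi * r * c)   # toy goal: put the corner at 100 kHz
    return (np.log10(fc) - 5.0) ** 2

res = minimize(penalty, x0=[-2e3, 3e-9], method="Nelder-Mead")
print(np.abs(res.x))                 # report the mirrored (physical) values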

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On 4/6/22 01:38, Joe Gwinn wrote:
On Fri, 3 Jun 2022 10:30:51 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

<snip>

> Yes. The part about "it could remember many sets of parameters that
> were good in various ways, and combine them more efficiently than a
> human" sounds very much like a genetic programming algorithm, which
> is very good at improving from a valid starting point.

It sounds like multi-dimensional slope-descent to me.

These are very good at finding local optima for a given design. Design
from scratch requires finding global optima, something that
slope-descent isn\'t very good at. Simulated annealing and genetic
programming might have more luck.
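
The skeleton of simulated annealing on a deliberately multimodal toy
penalty (Python; scipy also ships a polished version as
scipy.optimize.dual_annealing):

# Simulated annealing skeleton: accept uphill moves with probability
# exp(-delta/T) so the search can hop out of local optima, then cool T
# so it eventually settles. Global minimum of this toy penalty is (0, 0).
import math, random
random.seed(0)

def penalty(x, y):
    return x * x + y * y + 10 * (2 - math.cos(3 * x) - math.cos(3 * y))

x = y = 4.0
best = (penalty(x, y), x, y)
T = 5.0
while T > 1e-3:
    nx, ny = x + random.gauss(0, 0.5), y + random.gauss(0, 0.5)
    delta = penalty(nx, ny) - penalty(x, y)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x, y = nx, ny
        best = min(best, (penalty(x, y), x, y))
    T *= 0.999                         # slow geometric cooling schedule
print(best)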

CH
 
On Saturday, June 4, 2022 at 9:02:09 AM UTC+10, Clifford Heath wrote:
On 4/6/22 01:38, Joe Gwinn wrote:
On Fri, 3 Jun 2022 10:30:51 -0400, Phil Hobbs <pcdhSpamM...@electrooptical.net> wrote:
Chris Jones wrote:
On 02/06/2022 23:33, jla...@highlandsniptechnology.com wrote:

<snip>

>> Yes. The part about "it could remember many sets of parameters that
>> were good in various ways, and combine them more efficiently than a
>> human" sounds very much like a genetic programming algorithm, which
>> is very good at improving from a valid starting point.

> It sounds like multi-dimensional slope-descent to me.

I actually used non-linear multi-parameter curve fitting in my Ph.D. work back around 1968. It did rely on finite, continuous data to define the surface that it crawled across. I used the Fletcher-Powell algorithm rather than Marquardt, but there are plenty of others.

https://www.sciencedirect.com/science/article/abs/pii/0167715283900494
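
The present-day off-the-shelf version of that machinery: scipy's
curve_fit defaults to Levenberg-Marquardt for unconstrained fits. A toy
exponential-decay fit on synthetic data:

# Toy non-linear least-squares fit: recover amplitude and decay rate
# from noisy synthetic data with scipy.optimize.curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, tau):
    return a * np.exp(-t / tau)

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 100)
y = model(t, 2.0, 1.3) + rng.normal(0, 0.05, t.size)

popt, pcov = curve_fit(model, t, y, p0=[1.0, 1.0])  # p0: rough first guess
print(popt, np.sqrt(np.diag(pcov)))                 # estimates and sigmas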

> These are very good at finding local optima for a given design. Design
> from scratch requires finding global optima, something that
> slope-descent isn't very good at. Simulated annealing and genetic
> programming might have more luck.

That does sound right.

--
Bill Sloman, Sydney
 
On Fri, 3 Jun 2022 10:30:51 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

<snip>

> Numerical optimization based on merit / penalty functions has been well
> known for 200 years, since Gauss IIRC. That's a far cry from actual
> computer-based design.

<snip>

You can do the same thing with a list of components to generate a
simple figure of merit for an application. It can include such esoteric
issues as cost (book, real estate, process).

Don't stock market analysis programs try to do this in real time?
Muddying their own pool . . . . .

RL
 
legg wrote:
On Fri, 3 Jun 2022 10:30:51 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

<snip>


> You can do the same thing with a list of components to generate a
> simple figure of merit for an application. It can include such esoteric
> issues as cost (book, real estate, process).

Sure. My EM simulator can optimize on literally anything expressible in
its input files.

The issue isn't figuring out how good a design is, it's generating good
ones from a blank sheet of paper.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
