
On 1/4/2023 11:46 AM, Phil Hobbs wrote:
bitrex wrote:
On 1/3/2023 7:30 PM, Phil Hobbs wrote:
RichD wrote:
On January 1,  John Larkin wrote:
https://www.theregister.com/2022/07/18/electrical_engineers_extinction/?td=rt-9cp
I've been thinking for some time now that EE schools don't turn out
people who like electricity, but maker culture might.

I advise younguns against an engineering degree, it's over-specialized,
and obsolete in 5 years.

Only if you get sucked into spending all your time on the flavor of
the month.  People who spend their time in school learning
fundamental things that are hard to master on your own (math, mostly)
and then pick up the other stuff as they go along don't get
obsolete.  That's not difficult to do in your average EE program even
today, AFAICT. Signals and systems, electrodynamics, solid state
theory, and a bit of quantum are all good things to know.

Spending all your time in school programming in Javascript or VHDL or
memorizing compliance requirements is not a good career move for an EE.

I tell them to get a physics education.  Study hard.  Then you have the
tools to do anything you want.

Physicists turn up everywhere, it's true.  Folks with bachelor's
degrees in physics can do most kinds of engineering, provided they're
willing to bone up on the specifics.  Of course there are some who
assume they know everything and just bull ahead till they fail, but,
well, human beings are everyplace. ;)  Thing is, the basic
professional qualification for a physicist is a doctorate, whereas in
engineering it's a BSEE.

That is, first the academics, then the vocational training.

I agree that knowing the fundamentals cold is very important.
However, (a) physics isn't for everyone, by a long chalk; and (b)
there's a glorious intellectual heritage in engineering, so calling
it 'vocational training' is pejorative.

Cheers

Phil "Intermediate energy state" Hobbs


Advanced engineering mathematics:

https://www.ebay.com/itm/194964206310

Which is pretty advanced; I don't know how many BS-type EEs know about
the orthogonality of Bessel functions, or regularly use contour
integration for anything.
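For the record, the orthogonality relation in question, in its standard Fourier-Bessel form (with alpha_{nu,m} denoting the m-th positive zero of J_nu):

```latex
\int_0^1 x \, J_\nu(\alpha_{\nu m} x) \, J_\nu(\alpha_{\nu n} x) \, dx
  = \frac{\delta_{mn}}{2} \left[ J_{\nu+1}(\alpha_{\nu m}) \right]^2
```

This is what makes Fourier-Bessel series expansions work, e.g. for fields in cylindrical waveguides.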

You need to be able to do contour integration in a whole lot of signals
and systems.  For instance, the proof that instability in a linear
system is the same as acausal behavior depends on it.

The exp(i omega t) in the Fourier integral means that you have to close
the contour in one half plane for positive time and the other for
negative time.  If there are any poles inside the negative-time contour,
you get acausal response and exponential growth.   (A very pretty result
first proved by E. C. Titchmarsh, I think.)
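One way to sketch that closure argument, using the exp(i omega t) convention above for a transfer function H(omega):

```latex
h(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} H(\omega)\, e^{i\omega t}\, d\omega
```

For t > 0, exp(i omega t) decays as Im(omega) goes to +infinity, so the contour closes in the upper half-plane; for t < 0 it closes in the lower. A pole at omega_0 = a - i*sigma (sigma > 0) is enclosed only by the negative-time contour, so its residue contributes for t < 0 (acausal response); the same factor exp(i omega_0 t) = exp(i a t) exp(sigma t) grows exponentially in forward time.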

But not as advanced as "Advanced Mathematical Methods for Scientists &
Engineers", which is largely about perturbation methods, boundary
layer theory, and WKB approximations. Sounds fun I guess; I just got a
used copy from Amazon for $8.

That's Bender & Orszag, right?  By far my favorite math book of all
time.  I just _love_ that one.  The prof for my (first year grad)
asymptotic methods class was a former EE (Stephanos Venakides, may his
tribe increase).  That helped a lot.  Math classes taught by
mathematicians tend to be dry, because they regard the subject like
philosophy, whereas to a scientist or engineer, math is a technology of
thought.

BITD Arfken's "Mathematical Methods for Physicists" was one of the
standard math books for undergraduate physics, along with Levinson &
Redheffer's complex variables book, Boyce & DiPrima on ODEs, Carrier &
Pearson for PDEs, and something on linear algebra.  My linear algebra class
was taught out of Schaum's Outline, believe it or not--super cheap and
actually a pretty good book.  Oh, and a little book on the theoretical
side of calculus, so that you can prove theorems and stuff if you need to.

Yes, that's the one. I don't understand much beyond Part II (maybe
someday), but the material about ODEs, difference equations, and
asymptotic expansions is worth the price of admission alone.

Fourier analysis, perturbation theory, asymptotic methods, cluster
expansions, tensor calculus, and Feynman path integrals were all taught
in physics classes.  I took four EE classes in grad school--Tony Siegman
on lasers, Steve Harris on nonlinear optics, and Ron Bracewell on how to
think in k-space (aka reciprocal space and Fourier space), and Bernie
Widrow on DSP.

I'm taking an online course in statistical mechanics; it's pretty cool,
connecting the quantum-mechanical micro to the PV = nRT macro.
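That micro-to-macro link can be sketched numerically: for an ideal gas the configurational partition function scales as V^N, and pressure follows from P = kT d(ln Z)/dV. A minimal sketch (the specific mole of gas at 300 K is just an illustrative choice):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def pressure(N, T, V, dV=1e-9):
    """P = kT * d(ln Z)/dV, with ln Z = N*ln(V) + (V-independent terms)."""
    lnZ = lambda v: N * math.log(v)
    # Central finite difference of ln Z with respect to volume
    return k * T * (lnZ(V + dV) - lnZ(V - dV)) / (2 * dV)

# One mole at 300 K in 24.8 L recovers roughly atmospheric pressure,
# matching the ideal-gas law P = NkT/V.
N, T, V = 6.022e23, 300.0, 0.0248
print(pressure(N, T, V))  # ~1.0e5 Pa
print(N * k * T / V)      # same, directly from PV = NkT
```

The point of the exercise: the macroscopic equation of state drops out of nothing but ln Z and a derivative.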

Cheers

Phil
 
 
On 1/7/2023 2:00 PM, Dan Purgert wrote:
On 2023-01-07, Don Y wrote:
On 1/7/2023 6:59 AM, Dan Purgert wrote:
There are several steps to designing a product/solution.
First is figuring out what it needs to be/do.
Next, how to approach it.
Actually *doing* it is often just busywork.

Quite so. I was quite adept at getting "what" pinned down. "How" would

IME, "what" is the biggest problem. For ad hoc design processes,
if you don't know your goal before you start, how can you expect
to hit it accurately?

It certainly is. Few (if any) people really know "what" they want in
concrete terms (me included!); and it takes forever and a day to get to
a point where you can take their ephemeral ideas and distill them into
something even approaching the "what".

The amount of time it takes isn\'t a problem. What *is* a problem
is the work done before the destination is known. Are you 100.00%
sure that you haven\'t made some initial assumptions as you were doing
that work -- and, that those assumptions will ALWAYS be true,
regardless of what you finally decide needs to be done? Are
you "self-aware" enough to know what these are and to remember to
go back and revisit everything that relied on them, if they are
challenged in later design decisions?

[Agile encourages you to iterate and refactor. In theory, always
remembering to tear down those things that are now invalid. But,
for anything but trivial problems, can you keep track of all that
detail -- those assumptions -- as you\'re trying to evolve the
design forwards?]

depend. The project / product I took over for was unfortunately mired
in the BS "silo" methodology of "just tell us what you need, we'll
handle how it's done".

There's *some* merit in that. Marketing folks shouldn't need to
even *understand* how it's done. Aside from "professional
courtesy", once you've settled on a design goal, they needn't
speak with you again.

There can be (and I love being able to go "not my problem, talk to
$OTHERTEAM"); but in this case the goal was moving from an existing
data routing solution to whatever the 201x "new paradigm" was. Y'know,
get the customers off the crusty 30-year-old stack that ran on COBOL and
could take anything we threw at it, and would happily keep running
without a care in the world, onto whatever that year's FOTM framework
was that had more bugs than the entomology department at the local
community college.

So, after 6 months of "don't worry about a thing! we're right on
track!"; we had our first real look at it...

The conversation was basically "Here's what's in release 1: end-to-end for
partners connecting with [whatever]."

- Hey guys, this looks great, but where's the support for
[variants]{b,c,d}? (oh, those are on the roadmap for releases 2, 4, 9)
- How do we add security certificates? (oh, those are in release 5)
- What about validating the request is complete and correct? (Oh,
that's on the user's shoulders)
- heh, that'll be the day .. OH WAIT, YOU'RE SERIOUS!?

I have seen, firsthand, million-dollar development projects undertaken
and, on completion, simply discarded. Because they realized something
afterwards that was apparent at the start -- had they but THOUGHT
about it instead of incrementally discovering it!

"Yay! Model 3 is done. Let's scrap it and get started on Model 4!"

Unfortunately, I don't think the professors of said "local" universities
had ever heard of Knuth. They could barely instruct the classes
(although there were a few of the mostly-math-but-has-one-CS-course
professors who possibly might have ... but they were near retirement age
20 years ago, so ...)

Oh. I was fortunate in that many of my professors were leaders in their
field(s). So, the text bore the name of the guy at the front of the
lecture hall...

I had a few classes like that -- but mostly in the arts and business
classes I took. I heard other hard sciences had published professors as
well, it was just less so in the CS sub-department of mathematics.

Several of my classes had mimeographed (!) sheets that were effectively
draft editions of textbooks that would be later published. One of my
classmates described his first AI class at Northwestern wherein another
student entered carrying the text for the class. My friend asked to
have a look at it. And promptly announced, "I already *had* this class,
taught by the author, as an undergraduate freshman..."

All the remainder of the snipped stuff sounds kind of like where my
"background thinking" on this moisture sensor project is going.
A logging sensor is great and all, but it doesn't prevent either
under- or over-watering the garden. So maybe the next iteration has
some way of controlling an irrigation zone valve in addition to
logging ... maybe at some point, use a network to handle the logging
instead of "just" writing to an SD card / EEPROM.

Eventually, everything will have to talk to other things.
Devices that are islands are of limited use. And the communication
schemes of the past (e.g., serial ports) and the *content* of
those comms move from "input and output" to also include "control"
(something that older designs haven't previously considered).

Pff, I've got 32 perfectly usable "control" characters right here in
ASCII :D

But, does your design/product know how to change its behavior
based on those external inputs/commands? Or, does it think that
*it* is the authoritative agent in its own use?

On top of that, it will soon be a common requirement to support
encryption and authentication in those actions (you don't want
someone to be able to control your furnace without your
consent. You likely also wouldn't want someone on the other end
of your network to twiddle with the settings on your 'scope,
power supply, etc.)

Note to self, tinfoil hat for the new furnace ...

"Vandals" encrypt your hard disk purely on the *assumption*
that there is something of value there that you would PAY
to recover. Is there?

What would a commercial establishment pay to regain access to
their physical facility? Or, get the heat back on?

What's the goal of shooting at power substations?

Who would ever want to fly a plane (that they were on) into a building?

You can't count on the network layer to be operational.
So, no "messages".

Right, it's why there's a series of resistances in the specification to
define power classes. As I recall, one end of the power class range is ~22k
ohms. The reference to LLDP is just an optional feature of 802.3at, as I
recall. But honestly, it's been the better part of a decade since I've
read the specs.

I don't have to conform to published standards any more than you
have to orient *your* diodes in a particular way on a PCB. So,
I embellish existing standards to get the functionality that I want,
keeping in mind that someone may interact with my *hardware*
in ways that the standards allow -- how should *I* react? As
long as his interaction doesn't compromise my system and my
system doesn't unexpectedly (for him) damage his device, then
I am free to make the changes I want.

Fair enough. I do like to stick to the standards where I can though.

A lot of thought goes into standards. It's not just some agency
trying to act important. So, it's wise to "borrow" heavily
from them -- but only to the extent that it remains consistent
with your own goals.

E.g., one of my digitizer tablets has RJ11s for the power and
stylus/puck connections (or maybe they are RJ45s). What happens
if someone plugs a telephone into these connectors? Or if I
plug the power supply into a telephone "outlet"? Should those
connectors ONLY be usable by folks designing telephone equipment??

No, but one would hope the designers handled those cases. Users will
always find ways to do things you hadn't intended.

Exactly. But, when it comes to misuse of HARDWARE, we blame it on
the user. The same sorts of misuse applied to software are blamed
on the software.

There are tables that can give you ballparks. Ideally, you'd
know the characteristics of the ferrite.

I have the part number of the ferrites -- got them for some common-mode
chokes at one point, and I have a few left over. The datasheet won't be too
hard to pull from Digikey / Mouser (just hope it actually _has_ the
info; I recall some that I looked at were nothing more than
engineering drawings, with no real indication of the composition.)

Play.

Playing is all well and good; but I have to start with some working
details of the parts I have.
 
On 1/7/2023 2:00 PM, Dan Purgert wrote:
On 2023-01-07, Don Y wrote:
On 1/7/2023 6:59 AM, Dan Purgert wrote:
There are several steps to designing a product/solution.
First is figuring out what it needs to be/do.
Next, how to approach it.
Actually *doing* it is often just busywork.

Quite so. I was quite adept at getting \"what\" pinned down. \"How\" would

IME, \"what\" is the biggest problem. For ad hoc design processes,
if you don\'t know your goal before you start, how can you expect
to hit it, accurately?

It certainly is. Few (if any) people really know \"what\" they want in
concrete terms (me included!); and it takes forever and a day to get to
a point where you can take their ephemeral ideas and distill them into
something even approaching the \"what\".

The amount of time it takes isn\'t a problem. What *is* a problem
is the work done before the destination is known. Are you 100.00%
sure that you haven\'t made some initial assumptions as you were doing
that work -- and, that those assumptions will ALWAYS be true,
regardless of what you finally decide needs to be done? Are
you \"self-aware\" enough to know what these are and to remember to
go back and revisit everything that relied on them, if they are
challenged in later design decisions?

[Agile encourages you to iterate and refactor. In theory, always
remembering to tear down those things that are now invalid. But,
for anything but trivial problems, can you keep track of all that
detail -- those assumptions -- as you\'re trying to evolve the
design forwards?]

depend. The project / product I took over for was unfortunately mired
in the BS \"silo\" methodology of \"just tell us what you need, we\'ll
handle how it\'s done\".

There\'s *some* merit in that. Marketing folks shouldn\'t need to
even *understand* how its done. Aside from \"professional
courtesy\", once you\'ve settled on a design goal, they needn\'t
need to speak with you, again.

There can be (and I love being able to go \"not my problem, talk to
$OTHERTEAM\") ; but in this case the goal was moving from an existing
data routing solution to whatever the 201x \"new paradigm\" was. Y\'know,
get the customers off the crusty 30 year old stack that ran on COBOL and
could take anything we threw at it, and would happily keep running
without a care in the world, onto whatever that year\'s FOTM framework
was that had more bugs than the entomology department at the local
community college.

So, after 6 months of \"don\'t worry about a thing! we\'re right on
track!\"; we had our first real look at it...

The conversation was basically \"Here\'s what\'s in release 1: end-to-end for
partners connecting with [whatever].\"

- Hey guys, this looks great, but where\'s the support for
[variants]{b,c,d}? (oh, those are on the roadmap for release 2,4,9)
- How do we add security certificates? (oh, those are in release 5)
- What about validating the request is complete and correct? (Oh,
that\'s on the user\'s shoulders)
- heh, that\'ll be the day .. OH WAIT YOU\'RE SERIOUS!?

I have seen, firsthand, million-dollar development projects undertaken
and, on completion, simply discarded. Because they realized something
afterwards that was apparent at the start -- had they but THOUGHT
about it instead of incrementally discovering it!

\"Yay! Model 3 is done. Let\'s scrap it and get started on Model 4!\"

Unfortunately, I don\'t think the professors of said \"local\" universities
had ever heard of Knuth. They could barely instruct the classes
(although there were a few of the mostly-math-but-has-one-CS-course
professors who possibly might have ... but they were near retirement age
20 years ago, so ...)

Oh. I was fortunate in that many of my professors were leaders in their
field(s). So, the text bore the name of the guy at the front of the
lecture hall...

I had a few classes like that -- but mostly in the arts and business
classes I took. I heard other hard sciences had published professors as
well, it was just less so in the CS sub-department of mathematics.

Several of my classes had mimeographed (!) sheets that were effectively
draft editions of textbooks that would be later published. One of my
classmates described his first AI class at Northwestern wherein another
student entered carrying the text for the class. My friend asked to
have a look at it. And, promptly announced \"I already *had* this class,
taught by the author, as an undergraduate freshman...\"

All the remainder of the snipped stuff sounds kind of like where my
\"background thinking\" of this moisture sensor project is kind of going
to. A logging sensor is great and all; but well, doesn\'t prevent either
under- or over-watering the garden. So maybe the next iteration has
some methodology of controlling an irrigation zone valve in addition to
logging ... maybe at some point, utilize a network to handle the logging
instead of \"just\" writing to an SD card / EEPROM.

Eventually, everything will have to talk to other things.
Devices that are islands are of limited use. And, the communication
schemes of the past (e.g., serial ports) and the *content* of
those comms moves from \"input and output\" to also include \"control\"
(something that older designs haven\'t previously considered).

Pff, I\'ve got 32 perfectly usable \"control\" characters right here in
ASCII :D

But, does your design/product know how to change its behavior
based on those external inputs/commands? Or, does it think that
*it* is the authoritative agent in its own use?

On top of that, it will soon be a common process to support
encryption and authentication in those actions (you don\'t want
someone to be able to control your furnace without your
consent. You likely also wouldn\'t want someone on the other end
of your network to twiddle with the settings on your \'scope,
power supply, etc.)

Note to self, tinfoil hat for the new furnace ...

\"Vandals\" encrypt your hard disk. Purely on the *assumption*
that there is something of value, there, that you would PAY
to recover. Is there?

What would a commercial establishment pay to regain access to
their physical facility? Or, get the heat back on?

What\'s the goal of shooting at power substations?

Who would ever want to fly a plane (that they were on) into a building?

You can\'t count on the network layer to be operational.
So, no \"messages\".

Right, it\'s why there\'s a series of resistances in the specification to
define power classes. As I recall, one end of the power class is ~22k
ohm. The reference to LLDP is just an optional feature of 802.3at; as I
recall. But honestly it\'s been the better part of a decade since I\'ve
read the specs.

I don\'t have to conform to published standards any more than you
have to orient *your* diodes in a particular way on a PCB. So,
I embelish existing standards to get the functionality that I want,
keeping in mind that someone may interact with my *hardware*
in ways that the standards allow -- how should *I* react? As
long as his interaction doesn\'t compromise my system and my
system doesn\'t unexpectedly (for him) damage his device, then
I am free to make the changes I want.

Fair enough. I do like to stick to the standards where I can though.

A lot of thought goes into standards. It\'s not just some agency
trying to act important. So, it\'s wise to \"borrow\" heavily
from them -- but, only to the extent that it remains consistent
with your own goals.

E.g., one of my digitizer tablets has RJ11\'s for the power and
stylus/puck connections (or, maybe they are RJ45\'s). What happens
if someone plugs a telephone into these connectors? Or, if I
plug the power supply into a telephone \"outlet\"? Should those
connectors ONLY be usable by folks designing telephone equipment??

No, but one would hope the designers handled those cases. Users will
always find a way to do things in ways you hadn\'t intended.

Exactly. But, when it comes to misuse of HARDWARE, we blame it on
the user. The same sorts of misuse applied to software are blamed
on the software.

There are tables that can give you ballparks. Ideally, you\'d
know the characteristics of the ferrite.

I have the part number of the ferrites -- got them for some common-mode
chokes at one point; and I have a few left over. Datasheet won\'t be too
hard to pull from digikey / mouser (just hope it actually _has_ the
info; I recall some that I looked at were nothing more than the
engineering drawings, with no real indication of the composition.)

Play.

Playing is all well and good; but I have to start with some working
details of the parts I have.
 
On 1/7/2023 2:00 PM, Dan Purgert wrote:
On 2023-01-07, Don Y wrote:
On 1/7/2023 6:59 AM, Dan Purgert wrote:
There are several steps to designing a product/solution.
First is figuring out what it needs to be/do.
Next, how to approach it.
Actually *doing* it is often just busywork.

Quite so. I was quite adept at getting \"what\" pinned down. \"How\" would

IME, \"what\" is the biggest problem. For ad hoc design processes,
if you don\'t know your goal before you start, how can you expect
to hit it, accurately?

It certainly is. Few (if any) people really know \"what\" they want in
concrete terms (me included!); and it takes forever and a day to get to
a point where you can take their ephemeral ideas and distill them into
something even approaching the \"what\".

The amount of time it takes isn\'t a problem. What *is* a problem
is the work done before the destination is known. Are you 100.00%
sure that you haven\'t made some initial assumptions as you were doing
that work -- and, that those assumptions will ALWAYS be true,
regardless of what you finally decide needs to be done? Are
you \"self-aware\" enough to know what these are and to remember to
go back and revisit everything that relied on them, if they are
challenged in later design decisions?

[Agile encourages you to iterate and refactor. In theory, always
remembering to tear down those things that are now invalid. But,
for anything but trivial problems, can you keep track of all that
detail -- those assumptions -- as you\'re trying to evolve the
design forwards?]

depend. The project / product I took over for was unfortunately mired
in the BS \"silo\" methodology of \"just tell us what you need, we\'ll
handle how it\'s done\".

There\'s *some* merit in that. Marketing folks shouldn\'t need to
even *understand* how its done. Aside from \"professional
courtesy\", once you\'ve settled on a design goal, they needn\'t
need to speak with you, again.

There can be (and I love being able to go \"not my problem, talk to
$OTHERTEAM\") ; but in this case the goal was moving from an existing
data routing solution to whatever the 201x \"new paradigm\" was. Y\'know,
get the customers off the crusty 30 year old stack that ran on COBOL and
could take anything we threw at it, and would happily keep running
without a care in the world, onto whatever that year\'s FOTM framework
was that had more bugs than the entomology department at the local
community college.

So, after 6 months of \"don\'t worry about a thing! we\'re right on
track!\"; we had our first real look at it...

The conversation was basically \"Here\'s what\'s in release 1: end-to-end for
partners connecting with [whatever].\"

- Hey guys, this looks great, but where\'s the support for
[variants]{b,c,d}? (oh, those are on the roadmap for release 2,4,9)
- How do we add security certificates? (oh, those are in release 5)
- What about validating the request is complete and correct? (Oh,
that\'s on the user\'s shoulders)
- heh, that\'ll be the day .. OH WAIT YOU\'RE SERIOUS!?

I have seen, firsthand, million-dollar development projects undertaken
and, on completion, simply discarded. Because they realized something
afterwards that was apparent at the start -- had they but THOUGHT
about it instead of incrementally discovering it!

\"Yay! Model 3 is done. Let\'s scrap it and get started on Model 4!\"

Unfortunately, I don\'t think the professors of said \"local\" universities
had ever heard of Knuth. They could barely instruct the classes
(although there were a few of the mostly-math-but-has-one-CS-course
professors who possibly might have ... but they were near retirement age
20 years ago, so ...)

Oh. I was fortunate in that many of my professors were leaders in their
field(s). So, the text bore the name of the guy at the front of the
lecture hall...

I had a few classes like that -- but mostly in the arts and business
classes I took. I heard other hard sciences had published professors as
well, it was just less so in the CS sub-department of mathematics.

Several of my classes had mimeographed (!) sheets that were effectively
draft editions of textbooks that would be later published. One of my
classmates described his first AI class at Northwestern wherein another
student entered carrying the text for the class. My friend asked to
have a look at it. And, promptly announced \"I already *had* this class,
taught by the author, as an undergraduate freshman...\"

All the remainder of the snipped stuff sounds kind of like where my
\"background thinking\" of this moisture sensor project is kind of going
to. A logging sensor is great and all; but well, doesn\'t prevent either
under- or over-watering the garden. So maybe the next iteration has
some methodology of controlling an irrigation zone valve in addition to
logging ... maybe at some point, utilize a network to handle the logging
instead of \"just\" writing to an SD card / EEPROM.

Eventually, everything will have to talk to other things.
Devices that are islands are of limited use. And, the communication
schemes of the past (e.g., serial ports) and the *content* of
those comms moves from \"input and output\" to also include \"control\"
(something that older designs haven\'t previously considered).

Pff, I\'ve got 32 perfectly usable \"control\" characters right here in
ASCII :D

But, does your design/product know how to change its behavior
based on those external inputs/commands? Or, does it think that
*it* is the authoritative agent in its own use?

On top of that, it will soon be a common process to support
encryption and authentication in those actions (you don\'t want
someone to be able to control your furnace without your
consent. You likely also wouldn\'t want someone on the other end
of your network to twiddle with the settings on your \'scope,
power supply, etc.)

Note to self, tinfoil hat for the new furnace ...

\"Vandals\" encrypt your hard disk. Purely on the *assumption*
that there is something of value, there, that you would PAY
to recover. Is there?

What would a commercial establishment pay to regain access to
their physical facility? Or, get the heat back on?

What\'s the goal of shooting at power substations?

Who would ever want to fly a plane (that they were on) into a building?

You can\'t count on the network layer to be operational.
So, no \"messages\".

Right, it\'s why there\'s a series of resistances in the specification to
define power classes. As I recall, one end of the power-class range is
~22k ohm, and the reference to LLDP is just an optional feature of
802.3at. But honestly it\'s been the better part of a decade since I\'ve
read the specs.

I don\'t have to conform to published standards any more than you
have to orient *your* diodes in a particular way on a PCB. So,
I embellish existing standards to get the functionality that I want,
keeping in mind that someone may interact with my *hardware*
in ways that the standards allow -- how should *I* react? As
long as his interaction doesn\'t compromise my system and my
system doesn\'t unexpectedly (for him) damage his device, then
I am free to make the changes I want.

Fair enough. I do like to stick to the standards where I can though.

A lot of thought goes into standards. It\'s not just some agency
trying to act important. So, it\'s wise to \"borrow\" heavily
from them -- but, only to the extent that it remains consistent
with your own goals.

E.g., one of my digitizer tablets has RJ11\'s for the power and
stylus/puck connections (or, maybe they are RJ45\'s). What happens
if someone plugs a telephone into these connectors? Or, if I
plug the power supply into a telephone \"outlet\"? Should those
connectors ONLY be usable by folks designing telephone equipment??

No, but one would hope the designers handled those cases. Users will
always find a way to do things in ways you hadn\'t intended.

Exactly. But, when it comes to misuse of HARDWARE, we blame it on
the user. The same sorts of misuse applied to software are blamed
on the software.

There are tables that can give you ballparks. Ideally, you\'d
know the characteristics of the ferrite.

I have the part number of the ferrites -- got them for some common-mode
chokes at one point; and I have a few left over. Datasheet won\'t be too
hard to pull from digikey / mouser (just hope it actually _has_ the
info; I recall some that I looked at were nothing more than the
engineering drawings, with no real indication of the composition.)

Play.

Playing is all well and good; but I have to start with some working
details of the parts I have.
 
On 1/5/2023 10:30 AM, Phil Hobbs wrote:
bitrex wrote:
On 1/4/2023 12:04 PM, Phil Hobbs wrote:
bitrex wrote:
On 1/4/2023 9:52 AM, bitrex wrote:
On 1/3/2023 7:30 PM, Phil Hobbs wrote:
RichD wrote:
On January 1,  John Larkin wrote:
https://www.theregister.com/2022/07/18/electrical_engineers_extinction/?td=rt-9cp
I\'ve been thinking for some time now that EE schools don\'t turn out
people who like electricity, but maker culture might.

I advise younguns against an engineering degree, it\'s
over-specialized,
and obsolete in 5 years.

Only if you get sucked into spending all your time on the flavor
of the month.  People who spend their time in school learning
fundamental things that are hard to master on your own (math,
mostly) and then pick up the other stuff as they go along don\'t
get obsolete.  That\'s not difficult to do in your average EE
program even today, AFAICT. Signals and systems, electrodynamics,
solid state theory, and a bit of quantum are all good things to know.

Spending all your time in school programming in Javascript or VHDL
or memorizing compliance requirements is not a good career move
for an EE.

I tell them to get a physics education.  Study hard.  Then you
have the
tools to do anything you want.

Physicists turn up everywhere, it\'s true.  Folks with bachelor\'s
degrees in physics can do most kinds of engineering, provided
they\'re willing to bone up on the specifics.  Of course there are
some who assume they know everything and just bull ahead till they
fail, but, well, human beings are everyplace. ;)  Thing is, the
basic professional qualification for a physicist is a doctorate,
whereas in engineering it\'s a BSEE.

That is, first the academics, then the vocational training.

I agree that knowing the fundamentals cold is very important.
However, (a) physics isn\'t for everyone, by a long chalk; and (b)
there\'s a glorious intellectual heritage in engineering, so
calling it \'vocational training\' is pejorative.

Cheers

Phil \"Intermediate energy state\" Hobbs


Advanced engineering mathematics:

https://www.ebay.com/itm/194964206310

Which is pretty advanced, I don\'t know how many BS-type EEs know
about the orthogonality of Bessel functions, or regularly use
contour integration for anything.
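For reference, the orthogonality relation in question -- for two positive zeros α_m, α_n of the Bessel function J_ν on [0, 1]:

```latex
\int_0^1 x \, J_\nu(\alpha_m x) \, J_\nu(\alpha_n x) \, dx
  = \frac{\delta_{mn}}{2} \left[ J_{\nu+1}(\alpha_m) \right]^2
```

This is what lets you expand a function on a disk in a Fourier-Bessel series, the same way sine orthogonality underlies an ordinary Fourier series.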

But not as advanced as \"Advanced Mathematical Methods for
Scientists & Engineers\", which is largely about perturbation
methods, boundary layer theory, and WKB approximations. Sounds fun
I guess, I just got a used copy from Amazon for $8

I would expect stuff like the WKB approximation is regularly used
more in optics design than in circuit design, though.

WKB is common in approximate quantum theory, e.g. solid state.

Cheers

Phil Hobbs


I see, in a \"we can solve the hydrogen atom exactly & that\'s it\" sense.

No, the hydrogen atom is analytically solvable in the nonrelativistic
picture. You don\'t need asymptotic methods for that.  (I expect that
they don\'t put art majors through all the higher math classes.)

Cheers

Phil Hobbs

That\'s what I mean, it\'s one of the few that is.

The math for introductory QM isn\'t horrible, I took AP calculus and was
exposed to at least some amount of differential equations in high school
y\'know. Compared to classical EM it seems easier really - does anyone
really _enjoy_ vector calculus?
 
In article <fcb22892-2d93-4046-bcdc-1d9ed5f410adn@googlegroups.com>,
bill.sloman@ieee.org says...
A Chinese multimeter might well not conform to an American Underwriters Laboratories standard, but will probably conform to the relevant IEC standard, which isn\'t going to be much different.

A cheap Chinese meter might be truly cheap and nasty, and correspondingly dangerous, but anybody who sold it to you would risk being sued if it was.

It\'s more likely to be cheap because it was produced in high volume, rather than because the manufacturer cut any corners. I\'ve run into one American instrument that didn\'t meet its published specifications, which is a slightly different kind of problem - and it certainly wasn\'t cheap.

Meters are rated for certain usages. That is where the CAT number comes
into play. The cheap meters may be rated only for low-voltage/current
home usage; others are rated for higher voltage/current.

I have a couple of $300+ meters and some of the \'free\' Harbor Freight
meters. None of the $10 to $50 meters. I worked with some 480 volt
circuits that had fuses up to 300 amps. No way would I use any meter
that I did not know was CAT rated for the service. I would use the
China meter on my home electronics that may have 500 volts in them but
an amp or less of current, but no way on a 240 volt+ circuit carrying
more than a few amps, especially one fed by a 100 amp or larger supply.

If you had ever seen any of the Fluke (or other) safety films where
less expensive meters were set on the ohms or current ranges and put
across high-amp circuits, you would never think about using a meter
that is not CAT rated (or rated to some comparable standard) for the
application.

As far as the suing goes, all they have to say is that the meter was
used outside of its rating.
 
On Thu, 5 Jan 2023 18:32:20 -0600, Les Cargill <lcargil99@gmail.com>
wrote:

John Larkin wrote:
On Wed, 4 Jan 2023 10:30:35 -0800, Joerg <news@analogconsultants.com
wrote:

On 1/2/23 2:34 PM, Joe Gwinn wrote:
On Mon, 2 Jan 2023 12:59:00 -0800, Joerg <news@analogconsultants.com
wrote:

On 1/2/23 12:20 PM, whit3rd wrote:
On Monday, January 2, 2023 at 10:25:28 AM UTC-8, John Larkin wrote:

\"QUESTION: which upper level math courses did you find most applicable
to your major or masters courses. Are there any other free/cheap
courses that can set me up for success in Power Electronics and/or
Embedded systems?\"

I don\'t think that higher-level math courses set people up for success
in any EE field except academics.

Sensing, measuring, and filtering get a lot of utility from Fourier transform
techniques; it\'s hard to imagine success in phase-shift measurement without
using a F-transform. Absence of high-level math courses sets people
up for failure, but they won\'t ever know that.
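As a concrete (if toy) illustration of FFT-based phase-shift measurement -- a Python/numpy sketch, with the sample rate, tone frequency, and window all assumed for the example:

```python
import numpy as np

fs = 10_000.0            # assumed sample rate, Hz
f0 = 50.0                # assumed test tone, Hz
n = 2048
t = np.arange(n) / fs
true_shift = 0.3         # radians, the quantity to recover

ref = np.sin(2 * np.pi * f0 * t)
sig = np.sin(2 * np.pi * f0 * t - true_shift)   # lags ref by 0.3 rad

# Window both channels to tame leakage, then compare phases at the tone
# bin.  The leakage phase is common to both, so it cancels in the ratio.
w = np.hanning(n)
R = np.fft.rfft(ref * w)
S = np.fft.rfft(sig * w)
k = np.argmax(np.abs(R))                 # bin nearest the test tone
phase_shift = np.angle(R[k] / S[k])      # ~0.3 rad
```

Note the tone doesn\'t have to land exactly on a bin: since both channels share the same frequency, the leakage phase error divides out.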


Actually during one of my consulting projects in the mid-90\'s I reversed
that trend at a client. They had a big DSP do lots of Fourier transforms
and the auto-calibration routine for that board took forever. Tens of
seconds. I reverted all that to time-domain and it was finished after a
few hundred msec, every single time.

In the 80\'s we often did it with zero-crossers. Less math but blazingly
fast.
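The zero-crosser approach can be sketched the same way -- a toy Python example (signal parameters made up), recovering phase from interpolated zero-crossing times:

```python
import numpy as np

def zero_cross_time(x, t):
    """Time of the first rising zero crossing, linearly interpolated."""
    i = np.where((x[:-1] < 0) & (x[1:] >= 0))[0][0]
    frac = -x[i] / (x[i + 1] - x[i])     # fraction of a sample past index i
    return t[i] + frac * (t[i + 1] - t[i])

fs, f0 = 100_000.0, 1_000.0              # assumed sample rate and tone (Hz)
phi = 0.4                                # radians of lag to recover
t = np.arange(1000) / fs
a = np.sin(2 * np.pi * f0 * t)
b = np.sin(2 * np.pi * f0 * t - phi)

# The two detected crossings may come from different cycles,
# so fold the result into [0, 2*pi)
dt = zero_cross_time(b, t) - zero_cross_time(a, t)
phase = (2 * np.pi * f0 * dt) % (2 * np.pi)   # ~0.4 rad
```

No transforms at all -- just a comparator-plus-timer in software, which is why the hardware version was so fast.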

What was this big DSP doing? This story rings a bell.

In radar, the initial calibration involves multiple alternating
conversions between time and frequency domains, because the desired
result is a clean pulse in the time domain, achieved by adjusting
phase and amplitude settings as a function of frequency.


I am not at liberty to go into great detail but in a nutshell the DSP
was there to calibrate a multi-channel RF system via FFT with respect to
amplitude and phase. High precision was required. Theoretically it
could, of course, be done with the FFT but it took way too long and it
didn\'t always converge to the precision they needed. The software also
was, let\'s say, a bit temperamental.


Once the correct settings have been found iteratively, subsequent
calibration is by adjusting the various settings back to those golden
numbers - the file containing those golden numbers is of course called
a golden database.

Antenna pattern is first calibrated by a like process.


My time-domain routine didn\'t need any golden numbers and converged
every single time within less than half a second. We let the uC handle
that because the computational load dropped to peanuts. The big DSP
became unemployed.

The project start was the usual, everyone saying that FFT was the name
of the game and there wasn\'t any other decent way. If it didn\'t work in
time domain I\'d have to buy everyone a beer at night. If it did,
everyone had to buy me a beer. I needed a designated driver that night ...

Given an actual waveform a(t) and a desired waveform d(t), we can fix
a to make d with an equalizer having impulse response e(t)

d(t) = a(t) ** e(t)        (where ** denotes convolution)

Finding e is the reverse convolution problem.

The classic way to find e(t) is to do complex FFTs on a and d and
complex divide to get the FFT of e, then reverse FFT. That usually
makes a bunch of divide-by-0 or divide-by-almost-0 points, which sort
of blows up.


Depends on the signals. Using C/C++ and a modern toolchain,
setting all failures of isnormal(i) and isnormal (q) to zero
in a complex division routine takes care of it.

For audio, you ain\'t gonna need that even. The tone signal
and end signal will be too similar. You\'re just after
the basis of a long FIR filter anyway.

It\'s different if there\'s noise but the filtering to get around
that is pretty straight up. It\'s usually an LPF.
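For what it\'s worth, a toy numpy sketch of that FFT-division equalizer, using a Wiener-style regularized division rather than zeroing non-finite quotients (the smearing response and epsilon are made up for the example):

```python
import numpy as np

n = 256
d = np.zeros(n); d[10] = 1.0                    # desired: clean impulse
h = np.exp(-np.arange(n) / 8.0); h /= h.sum()   # made-up smearing response
a = np.fft.ifft(np.fft.fft(d) * np.fft.fft(h)).real   # actual pulse a = d ** h

A, D = np.fft.fft(a), np.fft.fft(d)

# Naive E = D/A blows up wherever |A| ~ 0.  Regularizing the division
# keeps every quotient finite, at the cost of giving up on bins where
# the actual pulse carries (almost) no energy:
eps = 1e-6 * np.max(np.abs(A)) ** 2
E = D * np.conj(A) / (np.abs(A) ** 2 + eps)     # Wiener-style division
e = np.fft.ifft(E).real                         # equalizer impulse response

recovered = np.fft.ifft(A * E).real             # a ** e, should be ~d
```

Cranking eps up trades fidelity for noise tolerance; with eps at zero you are back to the naive division and its divide-by-almost-0 points.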

I do it in time domain.


Recording a single sample pulse thru the device is cheating :) These
guys call it \"Singular Value Decomposition\"

https://ieeexplore.ieee.org/abstract/document/52264

( I keed; there\'s more to it than that but the single pulse
is a good start ).

Keed?

Very interesting. I would be tempted to use the Penrose Pseudoinverse
(which has an SVD within), for its error tolerance.

..<https://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_inverse>

I can see this working for reasonably small data sets, as the SVD
scales as O(n^3) or so. But the DFT (using FFT) scales as O(N log N).

..<https://en.wikipedia.org/wiki/Computational_complexity_of_mathematical_operations>
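A sketch of that pseudoinverse-via-SVD route in numpy (the rank-deficient system is contrived for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Contrived rank-deficient system: 3rd column = sum of the first two
A = rng.standard_normal((8, 3))
A[:, 2] = A[:, 0] + A[:, 1]
b = rng.standard_normal(8)

# Moore-Penrose pseudoinverse via SVD: invert only the singular values
# that stand above a tolerance; near-zero ones are treated as exact
# zeros, which is where the error tolerance comes from.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]
s_inv = np.where(s > tol, 1.0 / s, 0.0)
x = Vt.T @ (s_inv * (U.T @ b))   # minimum-norm least-squares solution
```

This gives the same answer as np.linalg.pinv(A) @ b; the explicit version just makes the singular-value cutoff visible.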


Joe Gwinn
 
On Tue, 10 Jan 2023 20:05:13 -0800 (PST), Three Jeeps
<jjhudak4@gmail.com> wrote:

On Tuesday, January 10, 2023 at 5:57:10 PM UTC-5, Joerg wrote:
On 1/10/23 8:22 AM, Phil Hobbs wrote:
Phil Hobbs wrote:
Joerg wrote:
On 1/2/23 5:57 PM, Phil Hobbs wrote:
John Larkin wrote:
On Mon, 2 Jan 2023 11:00:52 -0800, Joerg <ne...@analogconsultants.com
wrote:

On 1/1/23 11:08 PM, Jan Panteltje wrote:
[...]
In the EE school I was in it was known that only 'hobbyists'
would pass the final exams. The dropout in the first year was
very, very high.


At my university the drop-out rate (start to degree) was at times
83%.

Too many kids selected an EE degree based on some high school
counselor's advice, or dreams of a tidy income. Too late.

I dunno. Washing out of a hard program isn't the worst thing that can
happen to a young person. It's not nearly as bad as hanging on by the
skin of your teeth and then failing over a decade or so in the
industry.

The old saying, "C's get degrees", has caused a lot of misery of that
sort.


I had pretty bad grades because I worked a lot on the side, did
"pre-degree consulting" and stuff like that. Bad grades are ok.

In an honest system, bad grades mean that the student either didn't do
the work, or was unable or unwilling to do it well. There can be lots
of reasons for that, such as being unavoidably too busy, but that's
not the usual case.

The result is wasted time and money, and usually a skill set that's
full of holes and harder to build on later. It sounds like you were
sort of making up your own enrichment curriculum as you went on, which
is a bit different, of course.

I really lost interest in attending university lectures after a few
things were taught by professors that were profoundly wrong. The first
one was that RF transmitters must have an output impedance equal to the
impedance of the connected load or cable. The week after, I brought in
the schematic of a then-modern transistorized ham radio transceiver and
pointed out the final amplifier. The professor didn't really know what
to say.

Number two: The same guy said that grounded gate circuits in RF stages
make no sense at all. Huh? I did one of those during my very first job
assignment when the ink on my degree was barely dry. And lots before as
a hobbyist.

Number three: Another professor said that we only needed to learn all this
transistor-level stuff for the exam. Once we graduated this would all be
obsoleted by integrated circuits. That one took the cake. Still, it
seemed I was the only one who didn't believe such nonsense. However, it
provided me with the epiphany "Ha! This is my niche!" And that's what
it became. Never looked back.

This was at a European ivy league place which made it even more
disappointing.
I knew some very smart folks whose grades were poor, but they were
mostly unmotivated or undisciplined. One guy (a math genius) was in
my grad school study group for a while, but was way too handsome for
his own good--he spent his time playing soccer and chasing women, and
tried to skate by on talent as he'd always done. Eventually it
stopped working. If you go far enough, it always does.
My dad hinted that I was a bread scholar who'd only learn something if
it could be put to profitable use, and prontissimo. For the most part he
was right.

That's the real benefit of weed-out courses--not that many people
flunk, but that the ones who succeed have to learn mental
discipline in the process. That'll stand you in good stead for a
lifetime. (Flunking isn't the worst thing that can happen to you. I
got fired from my first job, which was very beneficial overall.)

Agree, it makes the students tough. Just like military service does.
When I was at boot camp I really resented being in the Army, life was
hard, sergeants screaming in our faces, and so on. Later in life I
realized that it had taught me a lot that I use to this day.
Students sometimes ask me for advice, and I always tell them three
things: first, in every field, make sure you have the fundamentals
down cold; second, concentrate your course work on things that are
hard to pick up on your own, especially math; and third, join a
research group where you can do a lot of stuff on your own. (The
ideal is to have an interesting smallish project, where you have to do
everything, and a bunch of smart and supportive colleagues.)

That's the most direct path to wizardhood that I know about.

I think a job is very educational. In Germany we had to do a minimum of
six months of "relevant industrial practice" for a masters degree--sort
of internships during our studies. Three of those months had to be
completed by the 4th semester. It could not all be at one place but
AFAIR at four companies. The jobs had to be meticulously documented.
These documents had to be turned in and the university had to approve
them or it wouldn't count. Not always easy. Two of mine were in a
foreign language (to them) and they gave me some grief about that.

They did away with that requirement, which I think was a major mistake.

I did some other bigger jobs also and at some point was a taxpayer in
three different countries. That alone is a teachable situation.

Another upside of this is that you don't finish university with a chunk
of student debt but with savings in the bank.

[...]
I should add that "good grades" don't always mean A/A+. The US
educational system has long had this tendency to reward letter-perfect
regurgitation over understanding and independent thought. That's very
prevalent in K-12 but less so in university. Still, one might be better
off taking one more course per semester and not being letter-perfect.

Stanford was/is on the quarter system, so you have more choices.

I have never attended any American schools other than some snippets of
open learning (without credits) but I have interviewed lots of
candidates. US and Canadian universities seem to be pretty good and IME
it doesn't make much of a difference whether it was a small local one
or ivy league. US K-12 schools OTOH often seem to be the pits. Many kids
can't even spell correctly or understand math. Except kids from
non-public schools like home-schoolers, charter or parochial. A
surprising number of good job candidates had a Jesuit High background.
They must be doing something right.
--
Regards, Joerg

http://www.analogconsultants.com/

umm \"The first one was that RF transmitters must have an output impedance equal to the
impedance of the connected load or cable. \"
I am not an \'RF\' guy but have dabbled with ham radio designs, and did do audio amp designs. I clearly remember circuit analysis being done to ensure that impedance matching was done because it is essential for maximum power transfer. So how is that wrong?

At the classic maximum power transfer point, the generator dissipates
as much power as the load, so efficiency is at best 50%. You can do
that in small systems, like an RF MMIC driving a cable or something.

One can synthesize a fake source impedance in a switching amp, at high
efficiency. Carefully.
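The 50% figure falls straight out of the voltage-divider algebra; a quick numerical check (the source voltage and resistance values are arbitrary):

```python
import numpy as np

def efficiency(rs, rl):
    """Fraction of total dissipated power that reaches the load resistor."""
    return rl / (rs + rl)

print(efficiency(50.0, 50.0))   # matched: 0.5 -- half the power heats the source
print(efficiency(1.0, 50.0))    # low-impedance PA output: ~0.98

# Load power vs. load resistance peaks at Rl = Rs (maximum power transfer)...
vs, rs = 10.0, 50.0
rl = np.linspace(1.0, 500.0, 5000)
p_load = (vs / (rs + rl))**2 * rl
print(round(rl[np.argmax(p_load)]))   # ...but efficiency there is only 50%
```

That is why transmitter finals are designed for a low effective source impedance and a matched *load* network, not a matched source: maximum power transfer and maximum efficiency are different optima.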


The fact that you had a counterexample doesn't make the theory wrong, just the counterexample.

On the third point, I don't think he was wrong, just very narrow-minded. In one of my digital logic design courses various methods of 'gate minimization' were beaten into us (K-maps, prime implicants, etc). Thought it was foolish; after all, IC gates were cheap, fast, plentiful. Twenty years later I remember doing gate minimization for PALs....

Does anybody use K-maps any more? I never have.
 
 
On 2023-01-11 05:57, Jan Panteltje wrote:
On a sunny day (Tue, 10 Jan 2023 14:57:00 -0800) it happened Joerg
<news@analogconsultants.com> wrote in <k268puFi1h6U1@mid.individual.net>:


Number two: The same guy said that grounded gate circuits in RF stages
make no sense at all. Huh? I did one of those during my very first job
assignment when the ink on my degree was barely dry. And lots before as
a hobbyist.

Sure, used those in some projects.

[...]

Grounded gate --or grounded base-- amplifier stages are very useful!

A cascode is a grounded source stage followed by a grounded gate
stage. That greatly reduces its input capacitance, and so helps
to extend its useful bandwidth. I use cascodes extensively in beam
instrumentation amplifiers in particle accelerators.
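The bandwidth win comes mostly from suppressing the Miller effect; a back-of-envelope sketch (the capacitance and gain numbers are made up for illustration, not from the post):

```python
# Miller approximation: C_in ~ Cgs + (1 + |Av|) * Cgd for a common-source
# stage. A cascode pins the first drain near a fixed potential, so that
# device sees |Av| ~ 1 and the Miller multiplication all but disappears.
cgs, cgd = 2.0e-12, 0.5e-12          # hypothetical FET capacitances, farads

def c_in(av):
    """Approximate input capacitance for a given magnitude of stage gain."""
    return cgs + (1.0 + av) * cgd

print(c_in(40.0) * 1e12, "pF")   # plain common-source, |Av| = 40: 22.5 pF
print(c_in(1.0) * 1e12, "pF")    # cascoded first device: about 3 pF
```

An order-of-magnitude drop in input capacitance, for the price of one more transistor and a bit of headroom.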

I also like Norton amplifiers, which are a grounded base stage with
transformer feedback. Those can be designed to have a real 50 Ohm
input resistance, excellent 3rd order intercept, low noise and a
flat gain over a very large bandwidth. High quality radio receivers
invariably have some version of this concept in the first IF stage.
There are examples in the ARRL handbook.

Jeroen Belleman
 
 
