Conical inductors--still $10!...

On 19/07/20 15:18, Bill Sloman wrote:
On Sunday, July 19, 2020 at 1:09:57 PM UTC+10, jla...@highlandsniptechnology.com wrote:
On Sun, 19 Jul 2020 12:39:35 +1000, david eather
<eathDELETEer@tpg.com.au> wrote:
Is Trump more important than the Republican party? More important than
the economy? The country?

Why do you blame Trump for a Chinese virus? He's been ahead of the WHO
on this.

Really?

Haven't you heard? Black is the new white, and up is the new down.

It is easier to get people to believe a Big Lie than a little lie.
All you have to do is repeat it frequently enough. Sounding as if you
are convinced yourself helps that process.
 
On Saturday, July 18, 2020 at 7:29:52 AM UTC+10, Joe Gwinn wrote:
On Thu, 16 Jul 2020 21:05:12 -0700 (PDT), Bill Sloman
<bill.sloman@ieee.org> wrote:

On Friday, July 17, 2020 at 6:43:02 AM UTC+10, Joe Gwinn wrote:
On Thu, 16 Jul 2020 15:54:54 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 16/07/20 15:32, jlarkin@highlandsniptechnology.com wrote:
On Thu, 16 Jul 2020 14:42:56 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 16/07/20 14:26, jlarkin@highlandsniptechnology.com wrote:
On Thu, 16 Jul 2020 07:28:17 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 16/07/20 06:32, dagmargoodboat@yahoo.com wrote:

<snip>

I was an embedded realtime programmer, writing in assembly code on the
metal in those days. All the embedded realtime programmers at that
company had hardware degrees, which was necessary to do much of
anything. Computer science had not yet been invented.

Not exactly true. I did Theory of Computation Part 1 in 1967 as a graduate student. It didn't go all that far into computer science, but Turing's name did crop up from time to time, and it was taught by professional computer scientists, rather than mathematicians who specialised in numerical analysis (though that was where they mostly came from).

The proposition that LISP would save the world

https://en.wikipedia.org/wiki/Lisp_(programming_language)

might have been around - LISP was invented in 1958 - but you needed a lot more mass memory than was economically feasible at the time to have long enough lists to process to any effect.

I remember those days. And the LISP crowd, and programs that can be
written, but cannot be read.

What I meant was that the academic area called Computer Science, for which
one could get a degree, had not been invented.

The University of Melbourne seems to have got in early.

\" The original “Computation Laboratory” became the “Computation Department” as these courses flourished, and, by the late 1960s, a further name change gave birth to the Department of Information Science, as a regular academic department in the Faculty of Science.\"

They might not have had a chair at that stage - "by 1975, the Department of Information Science had reached the point where a Chair appointment was warranted, and Professor Peter Poole was recruited. Poole changed the name of the Department to Computer Science" - but what they were doing was computer science.

> Computers were of course far older. I got a Masters in CS in 1981.

Not exactly primary evidence.

--
Bill Sloman, Sydney
 
On Saturday, July 18, 2020 at 6:36:11 AM UTC+10, jla...@highlandsniptechnology.com wrote:
On Fri, 17 Jul 2020 12:23:48 -0500, Joe Chisolm
<jchisolm6@earthlink.net> wrote:

On Fri, 17 Jul 2020 07:30:52 -0700, jlarkin wrote:

On Fri, 17 Jul 2020 01:42:10 -0700 (PDT), whit3rd <whit3rd@gmail.com>
wrote:

On Thursday, July 16, 2020 at 6:42:34 AM UTC-7,
jla...@highlandsniptechnology.com wrote:

<snip>

Yeah, the data is a mess, and the maximally alarmist numbers make the
headlines.

But nowhere near as much of a mess as John Larkin likes to think. It's clear that he can't make any kind of sense of the data he looks at, but that seems to be because he doesn't want to recognise what the data would tell him if he could make sense of it.

It's much like his approach to climate change.

Fear, hysteria, politicking, opportunism, fame and fortune, expertise,
we have them all. People get stupid as the square of the number of
cameras aimed at them. Or book advances.

Some reporters may. Critical thinking is the skill that lets you tease real meaning out of clouds of hype. John Larkin can't do it.

--
Bill Sloman, Sydney
 
On Saturday, July 18, 2020 at 8:35:04 PM UTC-7, jla...@highlandsniptechnology.com wrote:
On Sat, 18 Jul 2020 20:30:39 -0700 (PDT), whit3rd <whit3rd@gmail.com>
wrote:

On Saturday, July 18, 2020 at 8:09:57 PM UTC-7, jla...@highlandsniptechnology.com wrote:

Why do you blame Trump for a Chinese virus? He's been ahead of the WHO
on this.

That's clearly irrational. Neither China nor Trump is blamed for the virus.

Trump is sometimes blamed for incoherent messaging and other responses
to the situation, as is proper.

And, 'ahead of the WHO' is an odd claim; we, as a nation,
are MEMBERS of WHO.

We haven't left it. Trump simply doesn't act as a member, but as an antagonist.
And, claiming to be 'ahead' is a classic cheap shot. I'm ahead of you, by the way!

Design something and we\'ll see.

So, I can add 'cheap shot' to 'analysis' as another concept that John Larkin doesn't get?
 
On Sun, 12 Jul 2020 22:08:07 +0100, <edward.ming.lee@gmail.com> wrote:

On Sunday, July 12, 2020 at 1:13:37 PM UTC-7, Commander Kinsey wrote:
On Sun, 12 Jul 2020 13:48:47 +0100, <edward.ming.lee@gmail.com> wrote:

On Sunday, July 12, 2020 at 5:35:47 AM UTC-7, John Doe wrote:
Some CPUs have integrated graphics. Applications that require
massive video cards need no more than the latest CPU.

or the latest GPU!

Power consumption is really a function of the fab process. Latest CPUs are always using the latest process (around 5nm for TSMC). GPUs are usually 14nm to 20nm. GPUs are not updated much lately. Apple and AMD are going all out with latest TSMC CPUs.

Er.... wrong. AMD Radeon VII = 7nm:
https://www.amd.com/en/products/graphics/amd-radeon-vii

That was just an experiment.

And I read somewhere that, like everything else nowadays, that number is not to be trusted; there are different things it can measure.

Radeon VII Allegedly Reaches End of Life Status, AMD Neither Confirms Nor Denies. Update, August 24, 2019 3:30pm PT: Matt Bach from Puget Systems reports that an AMD representative told the custom PC builder that the Radeon VII is indeed now end of life (EOL), meaning it is no longer being manufactured.

Funny how it's still advertised here: https://www.amd.com/en/products/graphics/amd-radeon-vii

And oh look Nvidia are using 7nm too: https://www.tweaktown.com/news/64340/nvidias-next-gen-ampere-gpus-arrive-2020-7nm/index.html
 
On Sun, 12 Jul 2020 22:53:35 +0100, John Doe <always.look@message.header> wrote:

edward.ming.lee@gmail.com wrote:

John Doe wrote:

Some CPUs have integrated graphics. Applications that require
massive video cards need no more than the latest CPU.

or the latest GPU!

That's what I meant by "massive video cards".

What on earth did you mean? You talk about "massive video cards" then refer to the required CPU.
 
On Sun, 12 Jul 2020 22:57:22 +0100, John Doe <always.look@message.header> wrote:

\"Commander Kinsey\" <CFKinsey@military.org.jp> wrote:

John Doe <always.look@message.header> wrote:

Some CPUs have integrated graphics. Applications that require
massive video cards need no more than the latest CPU.

Necessity is the mother of invention.

I can max out any CPU and/or GPU easily. Folding at home. Rosetta
at home. Einstein at home.....

Few people care about those applications. Most people who run those
applications are wild-eyed third worlders who can't afford "any CPU
or GPU" anyway. Whatever gets you through the day...

You continue to make no sense at all. Millions run those applications, some on a little smartphone, some (like a guy I know who works for NASA) with SEVEN graphics cards, watercooled, stacked onto one motherboard. It's cooled by a radiator. A central heating radiator. Why would a 3rd worlder spend money running those programs?
 
On Thursday, July 16, 2020 at 3:18:00 AM UTC-4, Cydrome Leader wrote:
Commander Kinsey <CFKinsey@military.org.jp> wrote:
Why are CPUs only about 80W TDP? Can't they make ones with three times as many cores that have 250W TDP like graphics cards?

they do and have for years. Here's a current one, hope you have some cash

https://ark.intel.com/content/www/us/en/ark/products/205684/intel-xeon-platinum-8380hl-processor-38-5m-cache-2-90-ghz.html

I remember back in the day AMD was trying to compete with Intel and developed a chip that used so much electricity it had a small gasoline generator built in. It worked pretty well, but the users kept complaining about the emissions and it didn't meet the CAFE requirements.

Go figure, wussies!

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209
 