Peak Silicon?

On Mon, 10 Jun 2019 09:11:55 +0100, Martin Brown
<'''newspam'''@nezumi.demon.co.uk> wrote:

On 09/06/2019 16:52, John Larkin wrote:
On Sun, 9 Jun 2019 16:18:50 +0100, Martin Brown
'''newspam'''@nezumi.demon.co.uk> wrote:

On 08/06/2019 09:50, tabbypurr@gmail.com wrote:
On Friday, 7 June 2019 16:49:35 UTC+1, John Larkin wrote:

https://wolfstreet.com/2019/06/04/global-semiconductor-sales-plunge-but-why/




There may be another kind of Moore's Law: we just don't need all those
transistors.

We could use more. But we're in the timezone now where computers over
10 years old can still be perfectly capable. The urgency that used to
exist has largely gone.

Until the next great must have application comes along that requires an
order of magnitude performance increase to work there will be a hiatus.
Existing designs are plenty fast enough for all office and consumer uses
which means there is no compelling reason for upgrading any more.

I'd like Spice to run 1000x faster, and have parts value sliders, so I
can tune things on a screen like I can on my bench. But that would be
used rarely, and I have no other need for more compute power. Not many
people do.

How much more would you be willing to pay for that? Super computers are
available although the user interface is more batch oriented.

An Nvidia gaming card has a lot of compute power. That, and a
reasonable fee for software, might be affordable.

Renting time on a faster remote CPU cluster might be one way out.

We tried running Spice on an Amazon cluster. It wasn't any better than
using a fast local computer.

But really, I can live with my Dell PC, Windows 7, and LT Spice as-is.
I don't really need more CPU power, more DRAM, more hard drive storage
or speed. 300 mbits seems like plenty of internet speed.

I inherited a hammer that is at least 60 years old. That works fine
too. An HP35 is maybe the best scientific calculator.





--

John Larkin Highland Technology, Inc

lunatic fringe electronics
 
On Monday, June 10, 2019 at 6:59:12 AM UTC-7, John Larkin wrote:
On Mon, 10 Jun 2019 09:11:55 +0100, Martin Brown
'''newspam'''@nezumi.demon.co.uk> wrote:

Renting time on a faster remote CPU cluster might be one way out.

We tried running Spice on an Amazon cluster. It wasn't any better than
using a fast local computer.

The source code for SPICE (old versions that I recall) doesn't
really need parallelism. There are multiprocessor variants,
though, like Xyce, and for the right problem, it might be worth
exploring in a multicore environment.

A Mac Pro was just announced, with up to 28 cores; multiprocessing
seems to be in a growth spurt.
 
On Monday, 10 June 2019 20:12:18 UTC+1, whit3rd wrote:
On Monday, June 10, 2019 at 6:59:12 AM UTC-7, John Larkin wrote:
On Mon, 10 Jun 2019 09:11:55 +0100, Martin Brown
'''newspam'''@nezumi.demon.co.uk> wrote:

Renting time on a faster remote CPU cluster might be one way out.

We tried running Spice on an Amazon cluster. It wasn't any better than
using a fast local computer.

The source code for SPICE (old versions that I recall) doesn't
really need parallelism. There are multiprocessor variants,
though, like Xyce, and for the right problem, it might be worth
exploring in a multicore environment.

A Mac Pro was just announced, with up to 28 cores; multiprocessing
seems to be in a growth spurt.

I think that's very much the way forward, though utilising many cores well remains a challenge, and there's room for growth there. Amdahl was certainly right, but his law is often interpreted too pessimistically.


NT
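NT's point about Amdahl can be made concrete with a few lines of Python (a sketch only; the 90% parallel fraction is an arbitrary illustration, not a measured figure for any real workload):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup from Amdahl's law:
    S(n) = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# A workload that is 90% parallelisable can never exceed 10x speedup,
# but 28 cores still deliver a very useful gain.
for cores in (2, 4, 8, 28):
    print(f"{cores:3d} cores -> {amdahl_speedup(0.9, cores):5.2f}x")
```

The pessimistic reading fixates on the 10x ceiling; the optimistic one notes that 28 cores still buy roughly 7.6x on that same workload.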
 
On Sunday, 9 June 2019 20:54:07 UTC+1, Rick C wrote:
On Sunday, June 9, 2019 at 11:18:54 AM UTC-4, Martin Brown wrote:
On 08/06/2019 09:50, tabbypurr wrote:
On Friday, 7 June 2019 16:49:35 UTC+1, John Larkin wrote:

https://wolfstreet.com/2019/06/04/global-semiconductor-sales-plunge-but-why/


There may be another kind of Moore's Law: we just don't need all those
transistors.

We could use more. But we're in the timezone now where computers over
10 years old can still be perfectly capable. The urgency that used to
exist has largely gone.

Until the next great must have application comes along that requires an
order of magnitude performance increase to work there will be a hiatus.
Existing designs are plenty fast enough for all office and consumer uses
which means there is no compelling reason for upgrading any more.

One reason I don't bother getting faster Internet access is that the browsers are so slow. There are also video playback issues that sometimes suck up the available CPU resources and freeze my cursor. Perhaps that is an issue with Netflix, but I expect a faster CPU would resolve it.

Faster spread sheet calculations would also be welcome. I have plenty of filter design files that recalculate very slowly when anything is changed.

So, no, PCs are not fast enough for all purposes.

I doubt they'll ever be fast enough for all purposes. But they're fast enough for the great majority of users, and that removes most of the urgency to upgrade.


NT
 
On Sun, 9 Jun 2019 21:16:04 -0700 (PDT), whit3rd <whit3rd@gmail.com>
wrote:

On Sunday, June 9, 2019 at 6:23:37 PM UTC-7, k...@notreal.com wrote:

"The world will only need seven computers."

And, that was probably true when 'computer' was a suite of rooms filled
with vacuum tubes.

It wasn't any more true then than it is now.

With respect to peak silicon, the acreage of silicon per kilobuck is high
for photovoltaics, low for CPUs. Silicon isn't mainly for computers, unless
one measures in Yuan Renminbi, instead of kg.

Bullshit. The main use of silicon is still beaches. (just as relevant
as the above crap)
 
On Monday, June 10, 2019 at 3:46:09 PM UTC-4, tabb...@gmail.com wrote:
On Sunday, 9 June 2019 20:54:07 UTC+1, Rick C wrote:
On Sunday, June 9, 2019 at 11:18:54 AM UTC-4, Martin Brown wrote:
On 08/06/2019 09:50, tabbypurr wrote:
On Friday, 7 June 2019 16:49:35 UTC+1, John Larkin wrote:

https://wolfstreet.com/2019/06/04/global-semiconductor-sales-plunge-but-why/


There may be another kind of Moore's Law: we just don't need all those
transistors.

We could use more. But we're in the timezone now where computers over
10 years old can still be perfectly capable. The urgency that used to
exist has largely gone.

Until the next great must have application comes along that requires an
order of magnitude performance increase to work there will be a hiatus.
Existing designs are plenty fast enough for all office and consumer uses
which means there is no compelling reason for upgrading any more.

One reason I don't bother getting faster Internet access is that the browsers are so slow. There are also video playback issues that sometimes suck up the available CPU resources and freeze my cursor. Perhaps that is an issue with Netflix, but I expect a faster CPU would resolve it.

Faster spread sheet calculations would also be welcome. I have plenty of filter design files that recalculate very slowly when anything is changed.

So, no, PCs are not fast enough for all purposes.

I doubt they'll ever be fast enough for all purposes. But they're fast enough for the great majority of users, and that removes most of the urgency to upgrade.

They've been fast enough for most purposes for the last 10 years and yet people don't keep a PC for even 4 years very often. If nothing else the "urgency" to upgrade comes from the increased demands of advancing software including simple memory growth. I won't have a PC without 32 GB of RAM now. I'm waiting for the industry to catch up with me and offer 32 GB standard on a $1000 machine with decent, non-shrunk keys and a 17 inch display.

My present machine is ok, but it is heavy and hot. That's another feature that is getting more popular as CPUs advance, lower power. So there will always be reasons to get a new PC as process technology advances.

--

Rick C.

-+ Get 1,000 miles of free Supercharging
-+ Tesla referral code - https://ts.la/richard11209
 
On Tuesday, 11 June 2019 04:07:10 UTC+1, Rick C wrote:
On Monday, June 10, 2019 at 3:46:09 PM UTC-4, tabby wrote:
On Sunday, 9 June 2019 20:54:07 UTC+1, Rick C wrote:
On Sunday, June 9, 2019 at 11:18:54 AM UTC-4, Martin Brown wrote:
On 08/06/2019 09:50, tabbypurr wrote:
On Friday, 7 June 2019 16:49:35 UTC+1, John Larkin wrote:

https://wolfstreet.com/2019/06/04/global-semiconductor-sales-plunge-but-why/


There may be another kind of Moore's Law: we just don't need all those
transistors.

We could use more. But we're in the timezone now where computers over
10 years old can still be perfectly capable. The urgency that used to
exist has largely gone.

Until the next great must have application comes along that requires an
order of magnitude performance increase to work there will be a hiatus.
Existing designs are plenty fast enough for all office and consumer uses
which means there is no compelling reason for upgrading any more.

One reason I don't bother getting faster Internet access is that the browsers are so slow. There are also video playback issues that sometimes suck up the available CPU resources and freeze my cursor. Perhaps that is an issue with Netflix, but I expect a faster CPU would resolve it.

Faster spread sheet calculations would also be welcome. I have plenty of filter design files that recalculate very slowly when anything is changed.

So, no, PCs are not fast enough for all purposes.

I doubt they'll ever be fast enough for all purposes. But they're fast enough for the great majority of users, and that removes most of the urgency to upgrade.

They've been fast enough for most purposes for the last 10 years and yet people don't keep a PC for even 4 years very often. If nothing else the "urgency" to upgrade comes from the increased demands of advancing software including simple memory growth. I won't have a PC without 32 GB of RAM now. I'm waiting for the industry to catch up with me and offer 32 GB standard on a $1000 machine with decent, non-shrunk keys and a 17 inch display.

My present machine is ok, but it is heavy and hot. That's another feature that is getting more popular as CPUs advance, lower power. So there will always be reasons to get a new PC as process technology advances.

Of course there will, but no urgency for most people, since existing machines already do most jobs.


NT
 
On Saturday, June 8, 2019 at 3:59:16 PM UTC+2, John Larkin wrote:
On Sat, 8 Jun 2019 11:54:52 +0000 (UTC),
DecadentLinuxUserNumeroUno@decadence.org wrote:

"Tim Williams" <tiwill@seventransistorlabs.com> wrote in
news:qdfhr4$ebc$1@dont-email.me:

"Robert Baer" <robertbaer@localnet.com> wrote in message
news:ERCKE.54$wz.14@fx41.iad...
Yes,,that dip near 2009 is unusual, but the growth trend popped
back on
track afterwards.


Not really. Semi mfg is expensive. Its fate is closely tied with
the availability of financing. 2008 crash, 2009 parts shortages.
I remember 26-52+ week lead times, from most mfgs, back in 2010.

Tim


Trump tariffs... What effect will stupid crap like that have?

Note: rhetorical.

Sounds like the Mexico thing may work out. That argument was "If you
stop pushing immigrants across our border, we won't destroy your
economy."

China next.

Trump hasn't noticed that destroying other people's economies doesn't do anything good for your own. If there were US sources for the goods on which he is raising tariffs, the US might get more domestic turnover, but those articles would cost more than the ones the US used to import, so even the US loses.

In reality, the goods just come from a different country where wages are almost as cheap, at a slightly higher price.

--
Bill Sloman, Sydney
 
On 10/06/2019 12:56, Rick C wrote:
On Monday, June 10, 2019 at 4:13:00 AM UTC-4, Martin Brown wrote:
On 09/06/2019 20:54, Rick C wrote:

Faster spread sheet calculations would also be welcome. I have
plenty of filter design files that recalculate very slowly when
anything is changed.

You need to redesign the spreadsheet algorithm then or set it to
manual recalculation. I have done some very large scale things in
spreadsheets and have never really had that much bother with the
speed. Disabling screen updating and automatic calculation makes a
very big difference.

I think you are missing the point. The spreadsheet is doing what it
is supposed to do, but it is slow to do all the required
calculations. A faster CPU would be useful. The point of using
computers is to save human effort. To say the solution to a slow CPU
is to add back human effort to "optimize" the algorithm is admitting
the CPU is too slow and compensating for it by tossing it back in the
human's lap. In these cases, unless the method is completely
revamped, I don't think there is a more efficient approach. The
approach I picked gave me insight into the problem so I could more
effectively think about it. So it would be hard to optimize it
before I wrote the initial spreadsheet.

If you want a quick and dirty answer then a spreadsheet is great for a
quick model mock-up, but if you want production code then, like any other
way of doing things, it has to be optimised for the chosen platform.

A word of warning: the random number generator in Excel is NOT as
documented, and some of its more esoteric statistical functions and
numerical algorithms have dubious stability.

You should learn to use the tool that you have chosen to attack the
problem with rather than demanding that it must run quicker to cope with
your slow running implementation.

So, no, PCs are not fast enough for all purposes.

Not for all purposes but when used correctly they are well capable
of doing everything that a typical home or office user requires.

And I suppose "correctly" means in a way that "the CPU is fast
enough"? That's a rather circular argument. I also don't see a
point in limiting the user group to "home or office" use. PCs are
much less often used in the home these days. Cell phones are much
more common. Office is the main market and that includes what I do
with PCs.

PCs are perhaps less used at home now but tablets and phones are also
fast enough for the sorts of things they are likely to get used for.

Gaming is the main driver for ever faster CPU/graphics rendering now.

--
Regards,
Martin Brown
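Martin's manual-recalculation advice above — only recompute what actually changed — can be sketched as a toy dependency-tracked cell model. This is purely illustrative: the `Sheet` class, the cell names, and the RC-filter formula are invented for the example, not how Excel is implemented, though real spreadsheets use the same dirty-flag idea.

```python
import math

class Sheet:
    """Toy dependency-tracked spreadsheet: a formula cell is
    recomputed only when one of its inputs has changed, which is
    the essence of manual/lazy recalculation."""

    def __init__(self):
        self.values = {}    # cell name -> constant value
        self.formulas = {}  # cell name -> (function, input cell names)
        self.cache = {}     # cell name -> last computed value
        self.dirty = set()  # formula cells that need recomputing

    def set(self, cell, value):
        self.values[cell] = value
        self._invalidate(cell)

    def define(self, cell, func, inputs):
        self.formulas[cell] = (func, inputs)
        self.dirty.add(cell)

    def _invalidate(self, cell):
        # Mark every formula that (transitively) reads this cell.
        for name, (_, inputs) in self.formulas.items():
            if cell in inputs and name not in self.dirty:
                self.dirty.add(name)
                self._invalidate(name)

    def get(self, cell):
        if cell in self.values:
            return self.values[cell]
        if cell in self.dirty:
            func, inputs = self.formulas[cell]
            self.cache[cell] = func(*(self.get(i) for i in inputs))
            self.dirty.discard(cell)
        return self.cache[cell]

# Corner frequency of a simple RC low-pass, as a formula cell.
s = Sheet()
s.set("R", 1000.0)
s.set("C", 1e-6)
s.define("fc", lambda r, c: 1.0 / (math.tau * r * c), ["R", "C"])
print(s.get("fc"))  # ~159.15 Hz
s.set("R", 2000.0)  # only "fc" is marked dirty and recomputed
print(s.get("fc"))  # ~79.58 Hz
```

Changing one input invalidates only the formulas downstream of it, so a large sheet with a small edit does a small amount of work — which is why switching Excel to manual calculation helps so much.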
 
On Tuesday, June 11, 2019 at 11:31:35 AM UTC-4, Martin Brown wrote:
On 10/06/2019 12:56, Rick C wrote:
On Monday, June 10, 2019 at 4:13:00 AM UTC-4, Martin Brown wrote:
On 09/06/2019 20:54, Rick C wrote:

Faster spread sheet calculations would also be welcome. I have
plenty of filter design files that recalculate very slowly when
anything is changed.

You need to redesign the spreadsheet algorithm then or set it to
manual recalculation. I have done some very large scale things in
spreadsheets and have never really had that much bother with the
speed. Disabling screen updating and automatic calculation makes a
very big difference.

I think you are missing the point. The spreadsheet is doing what it
is supposed to do, but it is slow to do all the required
calculations. A faster CPU would be useful. The point of using
computers is to save human effort. To say the solution to a slow CPU
is to add back human effort to "optimize" the algorithm is admitting
the CPU is too slow and compensating for it by tossing it back in the
human's lap. In these cases, unless the method is completely
revamped, I don't think there is a more efficient approach. The
approach I picked gave me insight into the problem so I could more
effectively think about it. So it would be hard to optimize it
before I wrote the initial spreadsheet.

If you want a quick and dirty answer then a spreadsheet is great for a
quick model mock-up, but if you want production code then, like any other
way of doing things, it has to be optimised for the chosen platform.

A word of warning: the random number generator in Excel is NOT as
documented, and some of its more esoteric statistical functions and
numerical algorithms have dubious stability.

You should learn to use the tool that you have chosen to attack the
problem with rather than demanding that it must run quicker to cope with
your slow running implementation.

Oh? Is this Martin's law? I think you don't get to make rules. I can and will buy any PC I wish.


So, no, PCs are not fast enough for all purposes.

Not for all purposes but when used correctly they are well capable
of doing everything that a typical home or office user requires.

And I suppose "correctly" means in a way that "the CPU is fast
enough"? That's a rather circular argument. I also don't see a
point in limiting the user group to "home or office" use. PCs are
much less often used in the home these days. Cell phones are much
more common. Office is the main market and that includes what I do
with PCs.

PCs are perhaps less used at home now but tablets and phones are also
fast enough for the sorts of things they are likely to get used for.

Gaming is the main driver for ever faster CPU/graphics rendering now.

The graphics for sure. I suppose a faster CPU is also good. I don't know how many "gamers" there are out there, but since vendors mark up prices by 50% when selling nearly identical PCs whose only observable differences are the multicolored lights on the keyboard and the full-size cursor keys, I would say that indicates there is a significant market for "gaming" PCs.

That's actually what I plan to buy once they include 32 GB of RAM and the price comes down to $1,000. I've seen the $1,000 price point a number of times, but with 16 GB of RAM. That seems to be the current max RAM on off-the-shelf machines unless they are insanely expensive. I guess we will need to wait for the next generation of RAM chips to get 16 GB on a single module, as most laptops only have two slots.
 
On 2019-06-11 17:31, Martin Brown wrote:
On 10/06/2019 12:56, Rick C wrote:
On Monday, June 10, 2019 at 4:13:00 AM UTC-4, Martin Brown wrote:
On 09/06/2019 20:54, Rick C wrote:

Faster spread sheet calculations would also be welcome. I have
plenty of filter design files that recalculate very slowly when
anything is changed.

You need to redesign the spreadsheet algorithm then or set it to
manual recalculation. I have done some very large scale things in
spreadsheets and have never really had that much bother with the
speed. Disabling screen updating and automatic calculation makes a
very big difference.

I think you are missing the point. The spreadsheet is doing what it
is supposed to do, but it is slow to do all the required
calculations. A faster CPU would be useful. The point of using
computers is to save human effort. To say the solution to a slow CPU
is to add back human effort to "optimize" the algorithm is admitting
the CPU is too slow and compensating for it by tossing it back in the
human's lap. In these cases, unless the method is completely
revamped, I don't think there is a more efficient approach. The
approach I picked gave me insight into the problem so I could more
effectively think about it. So it would be hard to optimize it
before I wrote the initial spreadsheet.

If you want a quick and dirty answer then a spreadsheet is great for a
quick model mock-up, but if you want production code then, like any other
way of doing things, it has to be optimised for the chosen platform.

A word of warning: the random number generator in Excel is NOT as
documented, and some of its more esoteric statistical functions and
numerical algorithms have dubious stability.

You should learn to use the tool that you have chosen to attack the
problem with rather than demanding that it must run quicker to cope with
your slow running implementation.


So, no, PCs are not fast enough for all purposes.

Not for all purposes but when used correctly they are well capable
of doing everything that a typical home or office user requires.

And I suppose "correctly" means in a way that "the CPU is fast
enough"? That's a rather circular argument. I also don't see a
point in limiting the user group to "home or office" use. PCs are
much less often used in the home these days. Cell phones are much
more common. Office is the main market and that includes what I do
with PCs.

PCs are perhaps less used at home now but tablets and phones are also
fast enough for the sorts of things they are likely to get used for.

Gaming is the main driver for ever faster CPU/graphics rendering now.

Which, come to think of it, is sort-of sad.

Jeroen Belleman
 
On Tuesday, 11 June 2019 18:48:00 UTC+1, Jeroen Belleman wrote:
On 2019-06-11 17:31, Martin Brown wrote:

Gaming is the main driver for ever faster CPU/graphics rendering now.


Which, come to think of it, is sort-of sad.

Jeroen Belleman

Yup. But it does mean they pay for a lot of the tech development for us.


NT
 
On 10/06/2019 14:59, John Larkin wrote:
On Mon, 10 Jun 2019 09:11:55 +0100, Martin Brown
'''newspam'''@nezumi.demon.co.uk> wrote:

On 09/06/2019 16:52, John Larkin wrote:
On Sun, 9 Jun 2019 16:18:50 +0100, Martin Brown
'''newspam'''@nezumi.demon.co.uk> wrote:

On 08/06/2019 09:50, tabbypurr@gmail.com wrote:
On Friday, 7 June 2019 16:49:35 UTC+1, John Larkin wrote:

https://wolfstreet.com/2019/06/04/global-semiconductor-sales-plunge-but-why/

There may be another kind of Moore's Law: we just don't need all those
transistors.

We could use more. But we're in the timezone now where computers over
10 years old can still be perfectly capable. The urgency that used to
exist has largely gone.

Until the next great must have application comes along that requires an
order of magnitude performance increase to work there will be a hiatus.
Existing designs are plenty fast enough for all office and consumer uses
which means there is no compelling reason for upgrading any more.

I'd like Spice to run 1000x faster, and have parts value sliders, so I
can tune things on a screen like I can on my bench. But that would be
used rarely, and I have no other need for more compute power. Not many
people do.

How much more would you be willing to pay for that? Super computers are
available although the user interface is more batch oriented.

An Nvidia gaming card has a lot of compute power. That, and a
reasonable fee for software, might be affordable.

Your problem will be finding a suitably good parallel implementation of
Spice that works well on your simulations for a large number of threads.

BTW have you tried restricting the number of permitted threads in Spice
to around 75% of those nominally available. On hard chess programs I
have found that going beyond that point results in more heat but not
more speed (and sometimes a slight slowdown as it band limits memory).
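Martin's 75% rule of thumb is easy to apply when farming out independent runs, such as the points of a parameter sweep. A minimal sketch, assuming an embarrassingly parallel job; `simulate_one` and the sweep values are stand-ins, not anything from LTspice itself:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_one(value):
    # Stand-in for one independent simulation run (e.g. one point
    # of a parameter sweep); the real per-point work would go here.
    return value * value

def capped_workers(fraction=0.75):
    """Cap the worker count at ~75% of the nominal cores: beyond
    that, extra workers often just contend for memory bandwidth
    and add heat rather than speed."""
    return max(1, int((os.cpu_count() or 1) * fraction))

if __name__ == "__main__":
    sweep = [1.0, 2.2, 4.7, 10.0]
    with ProcessPoolExecutor(max_workers=capped_workers()) as pool:
        print(list(pool.map(simulate_one, sweep)))
```

The same cap applies whether the parallelism is processes, threads, or a simulator's own thread-count setting.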

Renting time on a faster remote CPU cluster might be one way out.

We tried running Spice on an Amazon cluster. It wasn't any better than
using a fast local computer.

But really, I can live with my Dell PC, Windows 7, and LT Spice as-is.
I don't really need more CPU power, more DRAM, more hard drive storage
or speed. 300 mbits seems like plenty of internet speed.

On this we are pretty much agreed. I could do with a bit more than the
poxy rural ADSL 5Mbps I can get but I can live within my means and use a
20Mbps mobile data stream when I need anything more.

I inherited a hammer that is at least 60 years old. That works fine
too. An HP35 is maybe the best scientific calculator.

Hammers haven't changed all that much with time. Since the 1990s, CPU
performance has on average tripled every 5 years. Clock speed is pretty
much maxed out now, and feature size is hitting very tough limits too.

Power per MIP is the new must have feature as voltages get lowered.

I first played with a friend's HP35 but I preferred the TI SR59 myself. I
got away with using a slide rule and mental arithmetic until I ran into
crystallography where answers to 3 sig fig were not acceptable.

--
Regards,
Martin Brown
 
On 12/6/19 1:31 am, Martin Brown wrote:
> Gaming is the main driver for ever faster CPU/graphics rendering now.

GPU development is now also getting a significant boost from the machine-learning
market. Any developer who actually wants to be serious about it
will drop $10K on their workstation GPU, and that's just for
development; the server clusters are f'ing huge.
 
