The future of the PC and especially processors and software.

Whoey Louie <trader4@optonline.net> wrote in news:c13a5975-c2e7-4e57-
be62-5e91a7f1afe4@googlegroups.com:

The End of Moore's Law?

Did not say END. It is slowing, so by definition it no longer matches
the declaration of "the law". Look that up on Google.

Oh and while we examine things and growth, or as in this case,
shrinkage... At the start it was easy to see. As we approached the
end of our optical capabilities we advanced yet again, with water
immersion lithography, deep UV, and now EUV.

It does not matter what you think is next, because we are now able to
define near-atom-scale elements with ease.

So the end of the law is already here for that reason. We cannot go
any smaller. Atoms do not shrink that way. This is one reason why
Intel is stacking now. They see the future.
 
Whoey Louie <trader4@optonline.net> wrote in news:c13a5975-c2e7-4e57-
be62-5e91a7f1afe4@googlegroups.com:

They said that in the 80s at 3 microns too. Small isn't a tech term.

Was not then. IS now, since we are down at the atomic level. There
is no further shrinkage. Electron hand offs require mated elements.
Logic element feature size is near its limits.

Small is very much a tech term. So is no smaller.
 
On Friday, February 14, 2020 at 4:45:32 AM UTC-5, DecadentLinux...@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in news:c13a5975-c2e7-4e57-
be62-5e91a7f1afe4@googlegroups.com:

They said that in the 80s at 3 microns too. Small isn't a tech term.



Was not then. IS now, since we are down at the atomic level. There
is no further shrinkage. Electron hand offs require mated elements.
Logic element feature size is near its limits.

Small is very much a tech term. So is no smaller.

There are also issues with the photolithography at such small feature sizes.

I think I read something recently that with ever decreasing sizes but ever increasing costs, the cost per device is no longer dropping as feature sizes shrink. The gains are all in power consumption.

Until now, reduced feature sizes have been exploited by putting more on a chip for the same cost. Now it is more likely that devices of the same complexity will be made at the same price, but with decreasing power requirements.
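The arithmetic behind that is easy to sketch. A toy Python model with made-up illustrative numbers (not real foundry data), just to show the shape of the argument: if wafer cost climbs nearly as fast as transistor density, cost per transistor goes flat.

  # Toy model: relative transistor density and relative wafer cost
  # per node. All numbers are invented for illustration only.
  nodes = [
      ("28nm", 1.0, 1.0),   # (label, density, wafer cost)
      ("14nm", 2.0, 1.6),
      ("7nm",  4.0, 3.2),
      ("5nm",  6.0, 5.5),
  ]
  for label, density, wafer_cost in nodes:
      print(label, "relative cost/transistor:",
            round(wafer_cost / density, 2))
  # Prints 1.0, 0.8, 0.8, 0.92 -- the per-device saving stalls even
  # though features keep shrinking; the remaining win is power.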

If you think about it, it is pretty amazing the aggregate performance of the chips made today and the aggregate power consumption. All so I can ask my cell phone to turn off airplane mode and it can tell me that it can't do that because it is in airplane mode.

The driving force in ever increasing computing requirements is, "if it feels good, do it". I'm just sayin'...

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209
 
On Friday, February 14, 2020 at 4:45:32 AM UTC-5, DecadentLinux...@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in news:c13a5975-c2e7-4e57-
be62-5e91a7f1afe4@googlegroups.com:

They said that in the 80s at 3 microns too. Small isn't a tech term.



Was not then. IS now, since we are down at the atomic level. There
is no further shrinkage.

This again from the village idiot, even after I showed you Intel
and ASML's roadmap all the way to 1.5 nm. The same village idiot that
didn't know 7nm was already shipping, with 5 nm coming later this year.
 
On Friday, February 14, 2020 at 4:43:19 AM UTC-5, DecadentLinux...@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in news:c13a5975-c2e7-4e57-
be62-5e91a7f1afe4@googlegroups.com:

The End of Moore's Law?

[snip]

So the end of the law is already here for that reason. We cannot go
any smaller. Atoms do not shrink that way. This is one reason why
Intel is stacking now. They see the future.

The above from the dummy that didn't even know that 7nm, which you claimed
was going to be the end of the line, is already in volume production, and
that 5nm will start shipping later this year.

You go girl!
 
On Friday, February 14, 2020 at 4:09:59 AM UTC-5, Michael Terrell wrote:
On Thursday, February 13, 2020 at 3:54:03 PM UTC-5, Whoey Louie wrote:
On Thursday, February 13, 2020 at 2:57:14 PM UTC-5, Michael Terrell wrote:
On Thursday, February 13, 2020 at 1:59:37 PM UTC-5, Whoey Louie wrote:
On Thursday, February 13, 2020 at 12:30:24 PM UTC-5, DecadentLinux....@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in
news:93205fbb-6f00-46b8-9ad1-44c8797a82f4@googlegroups.com:

On Thursday, February 13, 2020 at 12:52:37 AM UTC-5,
skybu...@hotmail.com wrote:
On Monday, February 10, 2020 at 3:02:55 AM UTC+1,
DecadentLinux...@decadence.org wrote:
skybuck2000@hotmail.com wrote in news:46638391-8249-4ab6-b0a6-
b56ec485cdbc@googlegroups.com:

At the very least it should go from:
7 nanometers now, 64 to:
3.5 nanometers, 128
1.75 nanometers, 256
0.97 nanometers, 512
And maybe even 0.5 nanometers, 1024.

5 x 1.5 = 7 years.
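(A quick sanity check of that schedule; a minimal Python sketch assuming, as the post does, a halving of feature size per step:

  size_nm, bits = 7.0, 64
  for step in range(1, 6):
      size_nm /= 2
      bits *= 2
      print(f"step {step}: {size_nm:.3f} nm, {bits}-bit")
  # 3.5, 1.75, 0.875, 0.438, 0.219 nm -- note halving 1.75 nm gives
  # 0.875 nm, not 0.97 nm, and 5 steps x 1.5 years = 7.5 years, not 7.

Whether bus width has anything to do with feature size is another matter entirely.)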


You are a true idiot.

For one thing it was hard enough to get to 64 bit, and it is the
rest of the gear, not just the CPU. MAYBE go to 128, but there is not
much need to go further, and by then we will have optical computers
and quantum computers kicking their asses. A 4 bit optical computer
can kick any silicon switch's ass. Once we get there, transistor
based switching will get surpassed.

We are down near atom sized features now. Elements 50 atoms wide
and less. We will not likely be going much smaller than, say,
3.5nm. There are limits to how small a switch can be made, and
I do not simply refer to our optical limitations.

Fucking idiot, I read the news. TECH NEWS.

Researchers did 0.5 nanometer transistors.

If you don't believe me, them or google that's fine. JUST FUCK
OFF.

It's not that he doesn't believe you. It's that, like he said, they
built one experimental transistor in a lab that's 0.5nm. That is a
glimpse into the future of what may be possible. Today 7nm is state
of the art for actual PRODUCTION of chips. If Moore's Law continues
to hold true, we may see production of 0.5nm in about 8 years. But it
could take much longer; it gets increasingly difficult to continue to
decrease size. We've already had to substantially modify fab
processes to get where we are, and the further we push, the more
complex it gets. I'd say we've been extremely lucky that we've been
able to overcome all the obstacles so far. But the chances of that
continuing diminish the further we go.
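The "about 8 years" is straightforward to check. A minimal Python sketch, assuming one halving of the headline feature size every ~2 years (a rough reading of Moore's Law; the classical statement is about transistor count, not feature size):

  import math
  doublings = math.log2(7.0 / 0.5)   # halvings to get from 7nm to 0.5nm
  print(doublings)                   # ~3.8
  print(doublings * 2)               # ~7.6 years, i.e. "about 8"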

If you had been paying attention, the general feeling is that
Moore's Law is done. We are very near the limit of what can be
fashioned into switches.


Well, that depends on who you listen to. For example from over a
year ago:

https://www.cnet.com/news/moores-law-is-dead-nvidias-ceo-jensen-huang-says-at-ces-2019/

"At least that's what Nvidia CEO Jensen Huang believes. The executive, who co-founded graphics-chip maker Nvidia, on Wednesday declared that "Moore's Law isn't possible anymore.

Moore's Law used to grow at 10x every five years [and] 100x every 10 years," Huang said during a Q&A panel with a small group of reporters and analysts at CES 2019. "Right now Moore's Law is growing a few percent every year. Every 10 years maybe only 2s. ... So Moore's Law has finished."


What the hell was he talking about? We just went from 10nm being state
of the art to 7nm and 5 nm will roll out later this year. That's
a few percent? ROFL He sure can't do math.


And what does Intel say?

"Intel, for its part, doesn't think Moore's Law is dead. Companies are just finding new ways to keep it going, like Intel's new 3D chip stacking. The manufacturing technology it calls Foveros stacks different chip elements directly on top of each other, a move that should dramatically increase performance and the range of chips Intel can profitably sell.

"Elements of this debate have been going on since the early 2000s," Intel Chief Technology Officer Michael Mayberry said in an EETimes post in August. "Meanwhile, technologists ignore the debate and keep making progress."



And he's right. From MIT Technology Review, 2000, here is where they were
saying it was over back then:

https://www.technologyreview.com/s/400710/the-end-of-moores-law/

The End of Moore's Law?

Maybe we will get atom scale transistor switches laid out on
graphene lattice layers.

Till then, we are pretty darn close to about as far as Silicon and
Germanium can get. Remember, it is not just the optical node we are
at, but the interconnects and such. SkyTard Chihuahua barked about
how Intel is making a mistake by stacking chips, yet memory chip
makers have been doing it for years.

7 is small.

They said that in the 80s at 3 microns too. Small isn't a tech term.



5 is likely the limit where cost per chip outweighs the gain from size.


Funny thing then that the largest chip manufacturer in the world is
rolling production on 5 nm out later this year. By 2021 it will be
in cell phones and computers.


3.5 is
dreamland.

Who should we believe? You or Intel? From a recent presentation of
the roadmap from Intel/ASML. One builds em, the other makes the fab
gear:

https://www.anandtech.com/show/15217/intels-manufacturing-roadmap-from-2019-to-2029

1.4nm in 2029
Intel expects to be on a 2-year cadence with its manufacturing process node technology, starting with 10nm in 2019 and moving to 7nm EUV in 2021, then a fundamentally new node in each of 2023, 2025, 2027, and 2029. This final node is what ASML has dubbed '1.4nm'. This is the first mention of 1.4nm in the context of Intel on any Intel-related slide. For context, if that 1.4nm is indicative of any actual feature, it would be the equivalent of 12 silicon atoms across.
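That "12 silicon atoms across" is a back-of-envelope figure you can reproduce; a minimal Python sketch, taking silicon's covalent radius (~0.117 nm) as the per-atom pitch, which is itself a crude assumption:

  feature_nm = 1.4
  atom_pitch_nm = 0.117                      # approx. Si covalent radius
  print(round(feature_nm / atom_pitch_nm))   # ~12 atoms across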




And his 0.5 will end up as a specialty seen in quantum dot making or
such.

Maybe so, but Intel has a roadmap to 1.4, which is getting close.


It's interesting what is on the market. I bought one of these a couple months ago, to get a newer version of Android.

10-1-Tablet-8G-128G-Android-8-0-bluetooth-3G-WiFi-PC-Dual-Camera-GPS-Phablet
https://www.ebay.com/itm/352837784631

Ten core tablet, 8GB of RAM, 128GB of Flash. Will take a 64GB MicroSD card and two SIM cards for cellular service, for $58.


That's an example of why the PC market is declining. And it doesn't have
an X86 CPU in it either. Bad news for Intel and AMD. I guess the
question is how fast the business PC market continues to grow as the
consumer market declines.

You are an example of people wearing blinders. I can't carry my desktop system with me, and my cell phone or tablet gets charged about every three months because they are rarely used. Let me see a cell phone or tablet with six TB of storage, dual 24" monitors, and a place on a hard wired home network with five other computers. Show me either with a large flatbed scanner, or an IEEE-488 interface. How about creating and maintaining websites on either? They'd cover the needs of many non-tech types. If the sales of home grade PCs drop, it won't affect the need for business computers or high end laptops.

ROFL

You don't understand that PC shipments peaked 7 years ago, and I'm the
one with blinders on? Good grief. Open your eyes. Where a family
might have had three PCs ten years ago, now they may have one PC,
a couple tablets, and everyone has a smartphone. Good grief.
 
On Friday, February 14, 2020 at 12:41:03 AM UTC-5, upsid...@downunder.com wrote:
On Thu, 13 Feb 2020 18:46:30 -0800 (PST), Whoey Louie
<trader4@optonline.net> wrote:

On Thursday, February 13, 2020 at 9:26:13 PM UTC-5, DecadentLinux...@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in
news:28d3ad0c-cf5d-46b1-8d47-a5ed3f35e897@googlegroups.com:


clip

It's interesting what is on the market. I bought one of these a
couple months ago, to get a newer version of Android. [snip tablet
listing]


That's an example of why the PC market is declining. And it
doesn't have an X86 CPU in it either. Bad news for Intel and AMD.
I guess the question is how fast the business PC market continues
to grow as the consumer market declines.

I think you are fucking lost. Have you ever even seen a server
farm or data center?

Servers aren't PCs either, stupid.

Home folks still buy and use PCs too, just not dweebtards like you.
That doesn't make you the expert on the market... By far.

I know the PC market peaked 7 years ago and it's been declining because
people are using smartphones and tablets.




Your phone and that tablet ARE PCs.

Now they aren't, stupid. Industry terms have meaning. Even kids know
that a cell phone isn't a PC.

We have seen similar development before.

In the old days, there was one big mainframe and a huge number of
users connected with dumb terminals (e.g. VT100) or similar systems
using IBM style transaction processing (CICS). All high level
processing was done in the mainframe, and the user terminals provided
only very simple screen editing functionality to reduce the loading
of the slow serial connections.

It was quite easy to manage a large number of users. Changes needed to
be done only in one place, and all users had the new version of a
program the next time they logged in.

Then came the PCs with a lot of local storage and processing power.
Managing the user applications is a headache of updating the local
nodes when each machine is online, etc.

Some manufacturers tried to create a "thin client" architecture, in
which diskless nodes were booted from a server over the LAN. This thin
client system never became very popular.

A lot of server applications are now HTTP based, with only limited
functionality in the user clients. The connection to the server is
over LAN or various forms of WLAN (including 4G/5G).



And even without a hard coded x86, they can still emulate it in a
VDM.


And cite for us the smartphones that do emulate it and that will
run X86 apps for a PC.

A server node (cloud) can run x86 code, the user interface is moved
to the smart phone.

Of course none of that has anything to do with the fact that PC sales
peaked 7 years ago and have been declining, which is what I stated.
And no one is running PC apps on servers for smartphones either. This
is like arguing that mainframe sales didn't peak and decline, because
they were REPLACED with other alternatives.
 
Whoey Louie <trader4@optonline.net> wrote in
news:77011f42-ac3c-4be2-b4fc-92ab6571742f@googlegroups.com:

The above from the dummy that didn't even know that 7nm, which you
claimed was going to be the end of the line, is already in volume
production, and that 5nm will start shipping later this year.

You go girl!

You are a goddamned retard.

I never said 7nm was the end of the line, you fucking retarded
fuck. In fact my first response to the skytard states 3.5 as the
end. So, you, motherfucker, need to shut the fuck up, you retarded
little piece of shit.

I posted links about IBM at 7nm days ago, before your little
retard tirade intruded into the thread.

You were the one saying that Intel is going down because AMD is on
a smaller node. You were agreeing with the SkyTard.

You are both wrong.

All because Intel kept their 14nm node going and robust.

And little retarded punks like you spouting stupid shit like 'you
go girl'.

Tell me your address and I'll show you how a fast moving hunk of
lead 'goes', girl.
 
Whoey Louie <trader4@optonline.net> wrote in
news:b2efb466-390f-42c4-9e5f-fa1a28003908@googlegroups.com:

[snip]

This again from the village idiot, even after I showed you Intel
and ASML's roadmap all the way to 1.5 nm.

You didn't "show me" anything, you fucking putz fuckhead. I do NOT
visit ANY inane link a fucktard like you posts. And your little post
here with dates... yeah, I ignored that too, dumbfuck. I do not get
info like that from jackasses like you.


The same village idiot
that didn't know 7nm was already shipping,

You know NOTHING about what I know. Again, you retarded putz, I
posted links on 7nm IBM chips days ago. And THEN you started
mumbling about Intel failing, which I refute.


with 5 nm coming later
this year.

AMD, not Intel. Each node costs millions as they build entire
facilities surrounding a node move. Intel, OTOH, builds facilities
that can fab multiple nodes. And their architectures are such that
they can be fabbed on each node.

Something the gamer race AMD is on does not do. Do not discount
Intel quite yet. They will emerge on top... just wait.

But no... you reading up on it on the web does not mean you know a
fucking thing. And your attitude here is further proof. You are
nothing but a lard assed pussy boy, boy. You know even less about it
than SkyTard Mumbling does.
 
On 2020-02-13, Whoey Louie <trader4@optonline.net> wrote:
On Thursday, February 13, 2020 at 2:57:14 PM UTC-5, Michael Terrell wrote:

[snip]

It's interesting what is on the market. I bought one of these a couple months ago, to get a newer version of Android.

10-1-Tablet-8G-128G-Android-8-0-bluetooth-3G-WiFi-PC-Dual-Camera-GPS-Phablet
https://www.ebay.com/itm/352837784631

Ten core tablet, 8GB of RAM, 128GB of Flash. Will take a 64GB MicroSD card and two SIM cards for cellular service, for $58.

That's an example of why the PC market is declining. And it doesn't have
an X86 CPU in it either.

How can you tell?

--
Jasen.
 
On Friday, February 14, 2020 at 9:02:40 PM UTC-5, Jasen Betts wrote:
On 2020-02-13, Whoey Louie wrote:
On Thursday, February 13, 2020 Michael Terrell wrote:

It's interesting what is on the market. I bought one of these a couple months ago, to get a newer version of Android. [snip tablet listing]

That's an example of why the PC market is declining. And it doesn't have
an X86 CPU in it either.

How can you tell?

The same way that he can prove that you can't run anything else on a server. He can't.

The three Dell servers that I have can be used as industrial grade PCs, with built-in RAID controllers.
 
Whoey Louie <trader4@optonline.net> wrote in
news:57bccf52-a0b6-4da2-b194-70d00aa2ddcc@googlegroups.com:

On Friday, February 14, 2020 at 4:24:47 PM UTC-5,
DecadentLinux...@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in
news:b2efb466-390f-42c4-9e5f-fa1a28003908@googlegroups.com:

[snip]

You didn't "show me" anything, you fucking putz fuckhead. I do NOT
visit ANY inane link a fucktard like you posts.

Well that helps explain why you are wrong, always wrong.

Learn to read. I visit info sites. Just not the retarded spews
you post.

There
are none so ignorant as those that will not look.

Like you and your lack of reading ability.
You said I did not know something which I posted about already before
you even said it. You are extremely stupid.

If you did,
maybe you wouldn't be so wrong, angry and the village idiot.

I garner real information. I do not need additional references
from the Trumpesque village twerp.

You seem to lack the capacity to have visited any of the sites I
posted, so fuck you, hypocritical retard boy.

And what is it with you and posting blank lines after your spew?
 
On Saturday, February 15, 2020 at 12:21:42 AM UTC-5, Michael Terrell wrote:
On Friday, February 14, 2020 at 9:02:40 PM UTC-5, Jasen Betts wrote:
On 2020-02-13, Whoey Louie wrote:
On Thursday, February 13, 2020 Michael Terrell wrote:

It's interesting what is on the market. I bought one of these a couple months ago, to get a newer version of Android. [snip tablet listing]

That's an example of why the PC market is declining. And it doesn't have
an X86 CPU in it either.

How can you tell?

Because x86 has only a very small share of the tablet market. Because
tablet makers wanted to go to Android and Arm to get lower cost, away
from Intel/MSFT. Because a $60 unbranded, no name tablet from China
is the last place an X86 CPU would be used. In other words, I understand
the market.

Oh, and because I can read a datasheet. The link you supplied, to
that tablet, says it has an MTK6797 CPU. That's ARM.



The same way that he can prove that you can't run anything else on a server. He can't.

I never said any such thing.
 
On Friday, February 14, 2020 at 4:24:47 PM UTC-5, DecadentLinux...@decadence.org wrote:
Whoey Louie <trader4@optonline.net> wrote in
news:b2efb466-390f-42c4-9e5f-fa1a28003908@googlegroups.com:

[snip]

You didn't "show me" anything, you fucking putz fuckhead. I do NOT
visit ANY inane link a fucktard like you posts.

Well that helps explain why you are wrong, always wrong. There are
none so ignorant as those that will not look. If you did, maybe you
wouldn't be so wrong, angry and the village idiot.
 
On 2020-02-15, Whoey Louie <trader4@optonline.net> wrote:
On Saturday, February 15, 2020 at 12:21:42 AM UTC-5, Michael Terrell wrote:
On Friday, February 14, 2020 at 9:02:40 PM UTC-5, Jasen Betts wrote:
On 2020-02-13, Whoey Louie wrote:
That's an example of why the PC market is declining. And it doesn't have
an X86 CPU in it either.

How can you tell?


Because x86 has only a very small share of the tablet market. Because
tablet makers wanted to go to Android and Arm to get lower cost, away
from Intel/MSFT. Because a $60 unbranded, no name tablet from China
is the last place an X86 CPU would be used. In other words, I understand
the market.

Oh, and because I can read a datasheet. The link you supplied, to
that tablet, says it has an MTK6797 CPU. That's ARM.

Thanks, that web site has too many scroll bars - I missed one.

--
Jasen.
 
There are phones with micro-HDMI output, and nearly all can stream
OUT. That means they can be put on a display and then used no
differently than a PC. One is already able to add things like
keyboards, POS attachments.

With bluetooth keyboards and mice it should be doable now. Same with printers and who knows what else. Even if they can do a hotspot, regular wifi should do it, no? (if they try, they might need some sort of router/switch) The HDMI takes care of the painfully small screen, and there is almost always a way to get sound out of it.

And about the transistors. Well over a decade ago my Fairchild newsletter said they had developed a one-molecule transistor. Of course they didn't say just how big of a molecule... Still, it can't be all that big if it sticks together, right?

Optical is technically no faster than electronics, but it does eliminate capacitance. That would only be produced by something phosphorescent or some shit, I dunno. With the density of these ICs now, inductance is surely mostly a thing of the past.

And higher impedances, which are more doable in a small, sealed environment, lower power drain and therefore heat. Then the main inefficiency becomes the capacitance when the devices switch states. With no capacitance, using FETs with near-infinite input resistance, and push-pull driving instead of driving against a resistor or current source, where the hell would the power go?
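For what it's worth, the switching loss being described is the classic CMOS dynamic power term, P = a * C * V^2 * f, where a is the activity factor. A minimal Python sketch with illustrative numbers (not measurements from any real chip):

  def dynamic_power_w(c_farads, v_volts, f_hz, activity=0.1):
      # Average power spent charging/discharging node capacitance.
      return activity * c_farads * v_volts**2 * f_hz

  # e.g. 1 nF of total switched capacitance at 1 V and 3 GHz:
  print(dynamic_power_w(1e-9, 1.0, 3e9))   # 0.3 W; scales as C*V^2*f

That is where the power goes: into charging the capacitance every cycle, which is why killing C (or V) matters so much.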

Someone pointed out to me: look at cellphones. They have enough computing power to be PCs right now. Do you hear a fan whizzing in there? And they don't get all that hot. The key is efficiency. I don't use mine for much computing; the closest really is Uber. When it gets warm, that is from the RF obviously; where else, just to talk on the phone? I know there is digital encoding and all that, but that is not a new thing.

Now I see optical as being able to take a lot more heat, but from where? You would have to impede the light, and you don't have much room for that.

I do believe optical is going to happen. It really is the next step. We are already at diminishing returns.

And then we will all have 500TB/sec internet, 3,000THz processors, 10 TB RAM and webpages will still load slow because them fucking web designers have to put everything in there. On my page here on Google Groups, I can drag where it says "sci.electronics.design" anywhere on the screen. What in the fuck would I possibly need that for ?

I finally see the use for the "bounce" when you reach the end of the scrolling limit, so you know you are at the limit. But if one were to watch the scroll bar at the right...

Well maybe that will be a thing of the past, it is on phones...

Oh, and there is a debate on 5G, which I frequent. The main question is - WHY ? They do not have commercially available true 3D TV yet, and that is probably the only use for the holographic disks. There is nothing to put on them, just like 5G.

Maybe you can explain WHY 5G, nobody else can come up with a real reason except to make mo money.

Maybe I am a dweeb, an asshole, a fuddy duddy but what I got works. Windows 10 doesn't. How many fucking years is it going to take to finish installing it and having more and more shit not work ? I think there is a good FUCKING reason they gave it away.
 
On Tuesday, February 18, 2020 at 6:09:35 PM UTC-5, jurb...@gmail.com wrote:
[snip]

And about the transistors. Well over a decade ago my Fairchild newsletter said they had developed a one molecule transistor. Of course they didn't say just how big of a molecule... Still it can't be all that big if it sticks together, right ?

There are molecules bigger than today's transistors... well, as big as a transistor channel width anyway. Biological molecules can be very, very large chains like DNA, RNA and proteins.


[snip]

Someone pointed out to me, look at cellphones. They have enough computing power to be PCs right now.

Not quite. There are no cell phones that process at the level of my laptop, 3 GHz with 8 processors.


[snip]

I do believe optical is going to happen. It really is the next step. We are already at diminishing returns.

Is anyone actually testing any sort of optical computing? Or is this always 10 years out?


[snip]

Oh, and there is a debate on 5G, which I frequent. The main question is - WHY ? They do not have commercially available true 3D TV yet, and that is probably the only use for the holographic disks. There is nothing to put on them, just like 5G.

Maybe you can explain WHY 5G, nobody else can come up with a real reason except to make mo money.

5G is to allow the phone companies to provide wirelessly what you currently get over cable, i.e. high speed streaming and regular TV. While you can currently stream video on your phone, it is pricey and if many users try to do it at once the airwaves will clog. 5G brings enough bandwidth to allow all users to dump their cable. Well, all users who live downtown in major cities. The added bandwidth requires smaller cells so that it won't be profitable other than in the dense cities.

Cell companies want to displace cable companies.
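The spectrum argument can be put in numbers. A minimal Python sketch using the Shannon capacity C = B * log2(1 + SNR); the bandwidths and SNR below are illustrative assumptions, not measured figures:

  import math

  def capacity_mbps(bandwidth_hz, snr_linear):
      return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

  snr = 10 ** (20 / 10)             # assume ~20 dB SNR
  print(capacity_mbps(20e6, snr))   # ~133 Mbps, 20 MHz LTE-like channel
  print(capacity_mbps(400e6, snr))  # ~2663 Mbps, 400 MHz mmWave channel

Same physics, twenty times the spectrum; that is essentially the 5G pitch, before cell size and propagation eat into it.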


> Maybe I am a dweeb, an asshole, a fuddy duddy but what I got works. Windows 10 doesn't. How many fucking years is it going to take to finish installing it and having more and more shit not work ? I think there is a good FUCKING reason they gave it away.

Windows 10 works for the vast majority of PC users. Sure, something this complex will always have issues that you won't like. But it works and works well, better than any version of Windows to date. People don't switch to Linux because it is a tool rather than a consumer item. Unless you are a geek it is hard to deal with at times. I've started down that road before and had other users tell me that if I don't want to spend my time reading man pages, then I should not be using a computer.

Windows is for the rest of us even if we love to hate it.

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209
 
On Tuesday, 18 February 2020 23:09:35 UTC, jurb...@gmail.com wrote:

> Optical is technically no faster than electronics but it does eliminate capacitance. That would only be produced by something phosphorescent or some shit, I dunno.

When the book taught me about adding impurities to silicon I didn't realise that's what they meant :)
 
jurb6006@gmail.com wrote:
[snip]

Oh, and there is a debate on 5G, which I frequent. The main question is - WHY ? They do not have commercially available true 3D TV yet, and that is probably the only use for the holographic disks. There is nothing to put on them, just like 5G.

Maybe you can explain WHY 5G, nobody else can come up with a real reason except to make mo money.
* YUP! NO other reason.
Illegally force a change to make more money, and in a direction not
compatible with previous technology.
For example, analog TV to digital TV.
"Better resolution"? Could have been compatible - was done going from
B&W to color...
Oh yeah they forgot 3D TV which could have been done over 20 years
ago, at least 3 different ways (and be compatible).

 
On Friday, February 21, 2020 at 1:54:57 AM UTC-5, Robert Baer wrote:
jurb6006@gmail.com wrote:
[snip]

Oh, and there is a debate on 5G, which I frequent. The main question is - WHY ? They do not have commercially available true 3D TV yet, and that is probably the only use for the holographic disks. There is nothing to put on them, just like 5G.

Maybe you can explain WHY 5G, nobody else can come up with a real reason except to make mo money.
* YUP! NO other reason.
Illegally force a change to make more money, and in a direction not
compatible with previous technology.
For example, analog TV to digital TV.
"Better resolution"? Could have been compatible - was done going from
B&W to color...

Color was a bandaid and it didn't require 10X the number of pixels.
You couldn't stuff 10x into the existing analog spectrum. Also, it's
not clear how exactly this makes the manufacturers, the industry, more
money. For HDTV, regardless of how it was done, you'd still need a
new TV, the broadcasters still needed new gear, cable companies needed
new boxes. The only additional revenue was from the short term suppliers
of the adapters to allow old NTSC TVs to be used with digital for people
receiving OTA. And they were cheap, not a lot to be made there.
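The pixel arithmetic is easy to check; a minimal Python sketch (the frame sizes are standard, the comparison is the only assumption):

  ntsc = 640 * 480        # digitized NTSC frame, ~0.31 MP
  hdtv = 1920 * 1080      # 1080-line HDTV frame, ~2.07 MP
  print(hdtv / ntsc)      # ~6.75x -- closer to 7X than 10X, but the
                          # point stands: far too many pixels for the
                          # old analog channel allocation.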

Oh yeah they forgot 3D TV which could have been done over 20 years
ago, at least 3 different ways (and be compatible).

Consumers apparently are not much interested in 3D. There are 3D TVs
and some media, but it never caught on.
 
