Is microprocessor an integrated circuit???

On Sat, 29 Jan 2005 13:27:05 -0500, keith <krw@att.bizzzz> wrote:


Sure, but if a processor can exist without a program counter, a
micro-programmed machine can exist without a micro-program counter. A
counter is a means to an end, not an end in itself.
Some of the low-end COPS machines used a pseudo-random shift register
instead of a program counter. That reduced propagation delays and hardware
complexity, but sequential instructions hopped all over the place. I
know a guy who actually programmed one of these monsters (used it in a
commercial ignition-timing strobe, with all sorts of tricky timing
loops) and he's still fairly coherent.
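
For the curious, here is a minimal sketch of the idea in C. The mask,
width, and reset value are my own picks for illustration (a maximal
8-bit Galois LFSR), not National's actual sequencer; the point is that
"incrementing" is a single shift with no carry chain, and consecutive
fetches land at scattered addresses, so the assembler has to place
instruction n+1 wherever the register goes next.

  #include <stdio.h>
  #include <stdint.h>

  /* 8-bit Galois LFSR, polynomial x^8 + x^6 + x^5 + x^4 + 1 (mask 0xB8);
     steps through all 255 nonzero states. */
  static uint8_t lfsr_next(uint8_t s)
  {
      return (uint8_t)((s >> 1) ^ (-(s & 1u) & 0xB8u));
  }

  int main(void)
  {
      uint8_t pc = 0x01;              /* reset vector; 0 is the lock-up state */
      for (int i = 0; i < 8; i++) {
          printf("step %d: fetch from 0x%02X\n", i, pc);
          pc = lfsr_next(pc);         /* the "increment": shift, no adder */
      }
      return 0;
  }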

There are a few web sites devoted to designing computer languages and
architectures with the worst possible structures, apparently unaware
that National beat them to it.

John
 
On Fri, 28 Jan 2005 15:51:30 -0500, Keith Williams wrote:
richgrise@example.net says...
Microcode can be listed. Logic can only be drawn.
I have spoken! (so I'm probably wrong.) ;-P

I can list this "PLA" microcode (I have the power ;), but it never
exists in hardware. It's converted by the synthesis tools from VHDL
into gates. What does that make it? Is it microprogrammed? The
source sure looks like a PLA. It's even called a PLA in the source.
It's random logic on the chip, though.
Ah-HA! The 'P' is for 'programmable' or 'programmed', right? It's _not_
an(a?) MPLA! ;-) ;-)

I'm telling you that "thar be dragons" if you insist on defining things
with nice black lines.
Boy, ain't _that_ the truth!

Thanks!
Rich
 
On Thu, 27 Jan 2005 18:24:12 +0100, Lasse Langwadt Christensen wrote:

Looking at the 8008 manual, it had an accumulator, six 8-bit registers, a
flag register, a 14-bit program counter, and seven 14-bit registers for
the PC stack. I'm sure that is what the two arrays are.
But, does it have a microprogram counter? Is there anything like a
detailed block diagram of it anywhere on the web that you know of?[0] I
find myself fascinated by this discussion. :)
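
For reference, here is the programmer-visible state Lasse describes,
sketched as a C struct. The names are my own (following Intel's
A,B,C,D,E,H,L convention), and note it says nothing about whether a
microprogram counter hides underneath; that's all in the control logic
below this level.

  #include <stdint.h>

  /* Programmer-visible state of the 8008, per the description above. */
  typedef struct {
      uint8_t  a;                /* accumulator */
      uint8_t  b, c, d, e, h, l; /* six 8-bit registers; H:L addresses memory */
      uint8_t  flags;            /* carry, zero, sign, parity */
      uint16_t stack[8];         /* eight 14-bit slots: stack[sp] is the PC,
                                    the other seven hold return addresses */
      unsigned sp;               /* 3-bit stack pointer */
  } i8008_state;

  /* CALL just bumps sp, so nesting deeper than seven levels silently
     wraps around and clobbers the oldest return address. */
  static void call(i8008_state *s, uint16_t target)
  {
      s->sp = (s->sp + 1) & 7;
      s->stack[s->sp] = target & 0x3FFF;  /* addresses are 14 bits */
  }

  int main(void)
  {
      i8008_state s = {0};
      call(&s, 0x2ABC);            /* one subroutine call */
      return (int)(s.stack[s.sp] >> 8);   /* new PC is 0x2ABC */
  }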

Thanks,
Rich

[0] Or Anyone? :)
 
On Fri, 28 Jan 2005 22:33:18 +0000, Bradley1234 wrote:
Or maybe I'm so smart, so amazingly qualified and so experienced that I'm not
afraid to be misinterpreted or misunderestimated

No worries, Bradley. I'm sure you're not being misunderestimated around
here. ;-P

Cheers!
Rich
 
In article <ZuPKd.631$wM.456@trnddc05>, Bradley1234 <someone@yahoo.com> wrote:
Not that I would defend IBM, it's funny you reference them that way. I must
conclude you know a great deal about IBM.

It appears in the article that the 8-bit byte is enclosed in a 10-bit
wrapper, probably a parity bit for every 4 bits?

Go read more carefully. That isn't what they are doing.


--
kensmith@rahul.net forging knowledge
 
In article <pan.2005.01.29.22.24.40.304340@example.net>,
Rich Grise <richgrise@example.net> wrote:
[...]
BTW, that IBM article was a real foot-shoot. It translates its 8-bit bytes
to some 10-bit code for data transfer.
Yes, it is sending 10-bit codes to represent what are, at the input, 8-bit
values, but it is also sending "special control characters", i.e. values
outside the usual 8-bit range. This makes the "10-bit byte" contain more
than 8 bits of information, thus disproving Bradley1234's claims.


But a "byte" is _NOT_ automatically eight bits.

Cheers!
Rich

--
kensmith@rahul.net forging knowledge
 
On Sun, 30 Jan 2005 04:32:09 GMT, Craig Bergren
<cbergren@tvbox.bergren.us> wrote:

On Wed, 26 Jan 2005 21:30:02 +0000, Bradley1234 wrote:



There are many examples of microprocessors that didn't use microcode.

Hey cool, I'm going to learn something new, I like to learn. Even though
we have an arbitrary definition going, where microprocessor might also
mean "purple monkey dishwasher".

PLEASE show an example of a microprocessor that doesn't use microcode.

If I'm wrong, I'll take back what I said.


Intel 4004, 8008
Motorola 6800, 6801, 6805, 6809

Now take back what you said. You can check your favorite source,
Wikipedia, if you don't believe me.
ColdFire executes most of the 68K instruction set, and it's not
microcoded.

John
 
On Sat, 29 Jan 2005 20:47:22 -0800, John Larkin wrote:

On Sun, 30 Jan 2005 04:32:09 GMT, Craig Bergren
<cbergren@tvbox.bergren.us> wrote:

On Wed, 26 Jan 2005 21:30:02 +0000, Bradley1234 wrote:



There are many examples of microprocessors that didn't use microcode.

Hey cool, I'm going to learn something new, I like to learn. Even
though we have an arbitrary definition going, where microprocessor
might also mean "purple monkey dishwasher".

PLEASE show an example of a microprocessor that doesn't use microcode.

If I'm wrong, I'll take back what I said.


Intel 4004, 8008
Motorola 6800, 6801, 6805, 6809

Now take back what you said. You can check your favorite source,
Wikipedia, if you don't believe me.

ColdFire executes most of the 68K instruction set, and it's not
microcoded.

John
While the 68000 through 68040 are microcoded, you are right, the 68060 and
its descendants are hard-wired. The MC68300 is a descendant of the 68020 and
therefore microcoded. While the core CPU is hard-wired on the ColdFire,
it has an on-chip TPU (time processing unit) that is microcoded.
 
In article <bl2lv0prac5jqju2knmk9e1a19vf277tmb@4ax.com>,
Spehro Pefhany <speffSNIP@interlogDOTyou.knowwhat> wrote:
On Fri, 28 Jan 2005 01:21:29 GMT, the renowned mzenier@eskimo.com
(Mark Zenier) wrote:

In article <uZTJd.187$Eh5.115@trnddc04>, Bradley1234
<someone@yahoo.com> wrote:
PLEASE show an example of a microprocessor that doesnt use microcode

Most of them have their control-unit logic implemented in a PLA
(Programmable Logic Array), which directly implements a two-level logic
equation. With microcode, there would be an address that was decoded
to provide a word (or row) of the ROM's contents. There's no such thing
as an address in a PLA, just inputs and outputs.

If you have a PLA and a ROM in a black box, and are allowed to observe
the outputs only after they have settled, what difference is there
between the two?
Hey, it was a trick to see how deep "Bradley"'s knowledge goes.
The real difference is in how the PLA contents are described to
the software that creates it. It could be a table, or it could be
equations.

For most of the 1970s-vintage microprocessors, given their behavior
(especially on invalid instructions), I'd expect equations.
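
Spehro's black-box point is easy to make concrete. Here is a sketch in
C of one made-up 3-input, 2-output control function realized both ways:
once as a "ROM" (the input is an address selecting a stored word) and
once as "PLA"-style two-level AND-OR equations. The settled outputs are
identical; only the description handed to the tools differs.

  #include <assert.h>
  #include <stdio.h>

  /* "ROM" view: every input combination decoded to a stored word. */
  static const unsigned rom[8] = {0, 1, 0, 1, 0, 0, 2, 3};

  /* "PLA" view: only the product terms actually needed, OR-ed together. */
  static unsigned pla(unsigned in)
  {
      unsigned a = (in >> 2) & 1, b = (in >> 1) & 1, c = in & 1;
      unsigned o1 = a & b;                     /* one product term  */
      unsigned o0 = ((a ^ 1) & c) | (b & c);   /* two product terms */
      return (o1 << 1) | o0;
  }

  int main(void)
  {
      /* From outside the box, the two are indistinguishable. */
      for (unsigned in = 0; in < 8; in++)
          assert(rom[in] == pla(in));
      puts("ROM and PLA agree on all 8 inputs");
      return 0;
  }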

Mark Zenier mzenier@eskimo.com Washington State resident
 
On Mon, 31 Jan 2005 03:06:19 GMT, Bradley1234 <someone@yahoo.com> wrote:
Translates to some 10-bit code thing? So you now admit it's never, ever
claiming to be a 10-bit byte?

The data remains an 8-bit byte, since universally the obvious definition for
byte is an undisputed 8-bit width.

Any concept of how data is transferred serially? There are zillions of
schemes that use start bits, stop bits, parity, control, bamboo bits,
endianness bits; it becomes extra bits to help establish framing to
deliver the 8-bit bytes.
Do you ever tire of being spanked?



"WordNet (r) 2.0"
byte
n : a sequence of 8 bits (enough to represent one character of
alphanumeric data) processed as a single unit of
information

"Jargon File (4.3.0, 30 APR 2001)"
byte /bi:t/ n. techspeak A unit of memory or data equal to the
amount used to represent one character; on modern architectures this is
usually 8 bits, but may be 9 on 36-bit machines. Some older
architectures used `byte' for quantities of 6 or 7 bits, and the PDP-10
supported `bytes' that were actually bitfields of 1 to 36 bits! These
usages are now obsolete, and even 9-bit bytes have become rare in the
general trend toward power-of-2 word sizes.

Historical note: The term was coined by Werner Buchholz in 1956 during
the early design phase for the IBM Stretch computer; originally it was
described as 1 to 6 bits (typical I/O equipment of the period used 6-bit
chunks of information). The move to an 8-bit byte happened in late 1956,
and this size was later adopted and promulgated as a standard by the
System/360. The word was coined by mutating the word `bite' so it would
not be accidentally misspelled as bit. See also nybble.



"The Free On-line Dictionary of Computing (27 SEP 03)"
Byte
<publication> A popular computing magazine.
Home (http://www.byte.com).
(1997-03-27)



"The Free On-line Dictionary of Computing (27 SEP 03)"
byte
<unit> /bi:t/ (B) A component in the machine data hierarchy
usually larger than a bit and smaller than a word; now
most often eight bits and the smallest addressable unit of
storage. A byte typically holds one character.

A byte may be 9 bits on 36-bit computers. Some older
architectures used "byte" for quantities of 6 or 7 bits, and
the PDP-10 and IBM 7030 supported "bytes" that were actually
bit-fields of 1 to 36 (or 64) bits! These usages are now
obsolete, and even 9-bit bytes have become rare in the general
trend toward power-of-2 word sizes.

The term was coined by Werner Buchholz in 1956 during the
early design phase for the IBM Stretch computer. It was a
mutation of the word "bite" intended to avoid confusion with
"bit". In 1962 he described it as "a group of bits used to
encode a character, or the number of bits transmitted in
parallel to and from input-output units". The move to an
8-bit byte happened in late 1956, and this size was later
adopted and promulgated as a standard by the System/360
operating system (announced April 1964).

James S. Jones <jsjones@graceland.edu> adds:

I am sure I read in a mid-1970's brochure by IBM that outlined
the history of computers that BYTE was an acronym that stood
for "Bit asYnchronous Transmission E__?__" which related to
width of the bus between the Stretch CPU and its CRT-memory
(prior to Core).

Terry Carr <bear@mich.com> says:

In the early days IBM taught that a series of bits transferred
together (like so many yoked oxen) formed a Binary Yoked
Transfer Element (BYTE).

True origin? First 8-bit byte architecture?

See also nibble, octet.

Jargon File

(2003-09-21)
 
On Sun, 30 Jan 2005 22:24:24 -0500, keith <krw@att.bizzzz> wrote:

On Mon, 31 Jan 2005 03:15:40 +0000, Bradley1234 wrote:

Your link didn't work, wise guy. Besides, that's copyrighted material.

You've been shown excerpts from the 'C' spec. Why don't you hang it up
and admit you're wrong?
Because he's a pain slut, a person who needs continuous public
humiliation. Usenet seems to attract this type.

John
 
In article <LghLd.1081$B64.810@trnddc07>,
Bradley1234 <someone@yahoo.com> wrote:
Translates to some 10-bit code thing? So you now admit it's never, ever
claiming to be a 10-bit byte?

You haven't actually read and understood the document, have you?

In my post, I was showing specifically how the document disproves your
claim. The 8-bit input values can take on 256 values. There are
additional codes for "special control characters"; this means the 10-bit
byte can take on more than 256 values in total. Since log2(256+N) is
greater than 8 whenever N is positive, you have lost the argument.
That is all there is to it.
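
To put a number on it, here's the arithmetic in C. The N = 12 case
matches the dozen special control characters usually cited for
IBM-style 8b/10b coding, but any positive N makes the point:

  #include <math.h>
  #include <stdio.h>

  /* log2(256 + N) > 8 for every positive N: extra control codes mean
     more than 8 bits of information per transmitted symbol. */
  int main(void)
  {
      for (int n = 1; n <= 12; n++)
          printf("256 data codes + %2d control codes -> %.4f bits\n",
                 n, log2(256.0 + n));
      return 0;
  }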


I'll leave my text here so you can read it again.


Yes, it is sending 10-bit codes to represent what are, at the input, 8-bit
values, but it is also sending "special control characters", i.e. values
outside the usual 8-bit range. This makes the "10-bit byte" contain more
than 8 bits of information, thus disproving Bradley1234's claims.
--
kensmith@rahul.net forging knowledge
 
In article <ctk9ft$17l$1@theodyn.ncf.ca>,
Michael Black <et472@FreeNet.Carleton.CA> wrote:
"Bradley1234" (someone@yahoo.com) writes:
Translates to some 10-bit code thing? So you now admit it's never, ever
claiming to be a 10-bit byte?

The data remains an 8-bit byte, since universally the obvious definition for
byte is an undisputed 8-bit width.

The trick is to use a source that is old enough not to be corrupted by
more recent and common usage of the word.

The document I referenced is fairly new, is from IBM, and refers to 10-bit
bytes that take on more than 256 values. Bradley1234 was referring to
"accepted in the industry" as his test for the true meaning. IBM is a big
enough portion of the industry that this qualifies as disproof of his
position. No further disproof is needed, IMHO.



--
kensmith@rahul.net forging knowledge
 
In article <StrLd.9453$8Z1.1523@newssvr14.news.prodigy.com>, no@No.com
says...
"Ken Smith" <kensmith@green.rahul.net> wrote in message
news:ctka31$1hm$2@blue.rahul.net...
In article <ctk9ft$17l$1@theodyn.ncf.ca>,
Michael Black <et472@FreeNet.Carleton.CA> wrote:

"Bradley1234" (someone@yahoo.com) writes:
Translates to some 10 bit code thing? So you now admit its
never, ever
claiming to be a 10 bit byte?

The data remains an 8 bit byte, since universally the obvious
definition for
byte is an undisputed 8 bit width.

The trick is to use a source that is old enough not to be
corrupted by
more recent and common useage of the word.

The document I referenced is fairly new and is from IBM and
refers to 10
bit bytes that take on more than 256 values. Bradly1234 was
refering to
"accepted in the industry" as his test for the true meaning.
IBM is big
enough that they are a big enough portion of the industry to
qualify with
this as disproof of his position. No further disproof is needed
IMHO.

(Like for instance EBIDIC???))
No, like EBCDIC. ;-) (Extended Binary Coded Decimal Interchange Code)
However, IBM is not a major force in establishing a standard or
even a definition, so your opine is irrelevant!
Really? They don't have people sitting on standards committees?

--
Keith
 
On Mon, 31 Jan 2005 14:42:58 GMT, Clarence_A <no@No.com> wrote:
However, IBM is not a major force in establishing a standard or
even a definition, so your opine is irrelevant!

They were, and your dislike of the company is irrelevant.
 
I read in sci.electronics.design that Keith Williams <krw@att.bizzzz>
wrote (in <MPG.1c68116d7f4de7659898c2@news.individual.net>) about 'Is
microprocessor an integrated circuit???', on Mon, 31 Jan 2005:

Really? They don't have people sitting on standards committees?
Wall-to-wall! (;-)
--
Regards, John Woodgate, OOO - Own Opinions Only.
The good news is that nothing is compulsory.
The bad news is that everything is prohibited.
http://www.jmwa.demon.co.uk Also see http://www.isce.org.uk
 
On Mon, 31 Jan 2005 14:42:58 GMT, "Clarence_A" <no@No.com> wrote:

However, IBM is not a major force in establishing a standard or
even a definition, so your opine is irrelevant!
Please stop using "opine" as a noun. It is wrong, pompous, pretentious,
and grating, in my humble opinion. :)

--
Thaas
 
On Mon, 31 Jan 2005 17:40:01 GMT, Thaas wrote:

On Mon, 31 Jan 2005 14:42:58 GMT, "Clarence_A" <no@No.com> wrote:


However, IBM is not a major force in establishing a standard or
even a definition, so your opine is irrelevant!


Please stop using "opine" as a noun. It is wrong, pompous, pretentious,
and grating, in my humble opinion. :)
Thank you. That was bugging me too.

Bob
 
On Mon, 31 Jan 2005 15:08:42 +0000, John Woodgate wrote:

I read in sci.electronics.design that Keith Williams <krw@att.bizzzz>
wrote (in <MPG.1c68116d7f4de7659898c2@news.individual.net>) about 'Is
microprocessor an integrated circuit???', on Mon, 31 Jan 2005:

Really? They don't have people sitting on standards committees?

Wall-to-wall! (;-)
...but no major force.

--
Keith
 
On Mon, 31 Jan 2005 03:49:10 +0000, Bradley1234 wrote:

This is a very contentious subject, severely lacking in an apparent
interest in facts; only bar-room-style babble, like drunk talk.

If I'm wrong, I'll take back what I said.


Intel 4004, 8008
Motorola 6800, 6801, 6805, 6809
Plonk!
 
