Embedded Microcontroller Question

Dave VanHorn wrote:
{snip}

Large systems let artless programmers get "something that runs"
quickly. Small systems weed out artless programmers. :)
Ahmmm. This assumes that only small things can be equated to art. What
about the Eiffel Tower? The Taj Mahal?

All this bloody elitism.

Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
Ahmmm. This assumes that only small things can be equated to art. What
about the Eiffel Tower? The Taj Mahal?
Not at all. Large systems can be artful, but small systems generally must be
artful.
 
Dave VanHorn wrote:
Ahmmm. This assumes that only small things can be equated to art.
What about the Eiffel Tower? The Taj Mahal?

Not at all. Large systems can be artful, but small systems generally
must be artful.
Nonsense. Most things in engineering are just that, engineering. Same
old shit different day.

Why is it that so many want to think that what *they* do is so
special/difficult/wonderful/unique/etc.? Full of it comes to mind.

Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
Kevin Aylward wrote:
David Brown wrote:

"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:dCdjb.2249$HM4.1584@newsfep3-gui.server.ntli.net...

Chuck Harris wrote:

....
If you are working with a microcontroller with 4k flash and
256 bytes ram, then that's all you've got, and if some half-wit C
programmer uses "printf" instead of writing their own specialised
conversion routines, you end up needing a bigger and more expensive
processor.

I am not really addressing that tiny 4k embedded stuff. I'm talking
about *real* projects. Like for example, I use a hardware MIDI sound
player/file player. It comes with 8MB of flash just to save the midi
files on. It would be bloody daft to write this product in asm.
8MB boards are not *real* embedded any more than sticking a pentium m/b
in a non-standard box that doesn't look like a pc.
 
Russell Shaw wrote:
Kevin Aylward wrote:
David Brown wrote:

"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:dCdjb.2249$HM4.1584@newsfep3-gui.server.ntli.net...

Chuck Harris wrote:

...
If you are working with a microcontroller with 4k flash and
256 bytes ram, then that's all you've got, and if some half-wit C
programmer uses "printf" instead of writing their own specialised
conversion routines, you end up needing a bigger and more expensive
processor.

I am not really addressing that tiny 4k embedded stuff. I'm talking
about *real* projects. Like for example, I use a hardware MIDI sound
player/file player. It comes with 8MB of flash just to save the midi
files on. It would be bloody daft to write this product in asm.

8MB boards are not *real* embedded any more than sticking a pentium m/b
in a non-standard box that doesn't look like a pc.
Rubbish. Embedded covers a *lot* of ground. "Embedded" and "memory size"
have absolutely no correlation at all. The proof is that if you take our
4k uC washing machine board and stuff 128M on it, nothing changes. In this
MIDI case, the flash is for a virtual file system. The main point being
that this part dominates the cost budget, making the cost of using, say,
a few hundred k of main ram a non-issue. For me, embedded is really geared
around computer-controlled hardware in dedicated applications.

Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
Kevin Aylward wrote:
Dave VanHorn wrote:

Ahmmm. This assumes that only small things can be equated to art.
What about the Eiffel Tower? The Taj Mahal?

Not at all. Large systems can be artful, but small systems generally
must be artful.

Nonsense. Most things in engineering are just that, engineering. Same
old shit different day.

Why is it that so many want to think that what *they* do is so
special/difficult/wonderful/unique/etc.? Full of it comes to mind.
It is only as repetitive as the job you apply for. I'm doing things so
different/interesting/challenging every day that I'd go broke if I didn't
switch to paying projects once in a while. The boring/easy/repetitive
projects are the ones for making a buck.
 
Kevin Aylward wrote:
Russell Shaw wrote:

Kevin Aylward wrote:

David Brown wrote:
....

If you are working with a microcontroller with 4k flash and
256 bytes ram, then that's all you've got, and if some half-wit C
programmer uses "printf" instead of writing their own specialised
conversion routines, you end up needing a bigger and more expensive
processor.

I am not really addressing that tiny 4k embedded stuff. I'm talking
about *real* projects. Like for example, I use a hardware MIDI sound
player/file player. It comes with 8MB of flash just to save the midi
files on. It would be bloody daft to write this product in asm.

8MB boards are not *real* embedded any more than sticking a pentium m/b
in a non-standard box that doesn't look like a pc.

Rubbish. Embedded covers a *lot* of ground. "Embedded" and "memory size"
have absolutely no correlation at all. The proof is that if you take our
4k uC washing machine board and stuff 128M on it, nothing changes.
Nonsense. *Everything* changes. All restrictions on price and
size for the micro are removed.

In this
MIDI case, the flash is for a virtual file system. The main point being
that this part dominates the cost budget, making the cost of using, say,
a few hundred k of main ram a non-issue. For me, embedded is really
geared around computer-controlled hardware in dedicated applications.
Well, I have a pc just used for an eprom programmer. It must be
embedded (it's under the table ;).
 
Russell Shaw wrote:
Kevin Aylward wrote:
Russell Shaw wrote:

Kevin Aylward wrote:

David Brown wrote:
...

If you are working with a microcontroller with 4k flash and
256 bytes ram, then that's all you've got, and if some half-wit C
programmer uses "printf" instead of writing their own specialised
conversion routines, you end up needing a bigger and more
expensive processor.

I am not really addressing that tiny 4k embedded stuff. I'm
talking about *real* projects. Like for example, I use a hardware
MIDI sound player/file player. It comes with 8MB of flash just to
save the midi files on. It would be bloody daft to write this
product in asm.

8MB boards are not *real* embedded any more than sticking a pentium
m/b in a non-standard box that doesn't look like a pc.

Rubbish. Embedded covers a *lot* of ground. "Embedded" and "memory
size" have absolutely no correlation at all. The proof is that if you
take our 4k uC washing machine board and stuff 128M on it, nothing
changes.

Nonsense. *Everything* changes. All restrictions on price and
size for the micro are removed.
Nonsense. That is not my point. Take any system that you consider a
"real" embedded system. Place on it 128MB of ram and use it "as is".
Nothing has changed topologically. Therefore, the claim that there is
any relation between memory size and the notion of embedded is false.

In this
MIDI case, the flash is for a virtual file system. The main point
being that this part dominates the cost budget, making the cost of
using, say, a few hundred k of main ram a non-issue. For me, embedded
is really geared around computer-controlled hardware in dedicated
applications.

Well, I have a pc just used for an eprom programmer. It must be
embedded (it's under the table ;).
Almost.

Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
Russell Shaw wrote:
Kevin Aylward wrote:
Dave VanHorn wrote:

Ahmmm. This assumes that only small things can be equated to art.
What about the Eiffel Tower? The Taj Mahal?

Not at all. Large systems can be artful, but small systems generally
must be artful.

Nonsense. Most things in engineering are just that, engineering. Same
old shit different day.

Why is it that so many want to think that what *they* do is so
special/difficult/wonderful/unique/etc.? Full of it comes to mind.

It is only as repetitive as the job you apply for. I'm doing things so
different/interesting/challenging every day that I'd go broke if I
didn't switch to paying projects once in a while. The
boring/easy/repetitive projects are the ones for making a buck.
But you're still assuming that what you are doing has some special,
intrinsically greater worth, to make you more respected, or better than
someone else. Sorry mate, just about *everything* we *all* do is trivial
in the big picture. In the last century we have had Einstein's
Relativity, Planck/Heisenberg/etc. Quantum Mechanics, and a couple of
others. Everything else is all plain old engineering, capable of being
done by 90% of all engineers, including the,
err..."different/interesting/challenging" ones.

My point was that the poster was claiming he was "better" than someone
else, by virtue of what he was doing. My view is that this is absolute
crap. There is very little engineering that cannot be done by most, if
not all, competent engineers.

Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
I know this is top-posting, but it might make other comments further down a
bit clearer...

I work with small systems - mostly 8-bit micros, occasionally 16-bit, and a
few 32-bit microcontrollers. So when I think of "embedded", that's what I
mean. There are, I think, three fairly distinct types of embedded processor
systems. There are the low-end, mask-programmed systems (using ROM'ed 4-bit
or 8-bit chips, dedicated ASICs, etc.) which are very high volume and very
low cost, the medium level systems (using mostly 8-bit or 16-bit
microcontrollers, program running from internal flash or otp-rom, all/most
ram built-in), and the high level systems (32-bit, typically with a
reasonable OS, plenty of external memory, effectively a "normal" computer
with limited resources and a dedicated task).

When you are talking about the high-end systems - something that can run
Linux or wince or even eCos, I agree with most of what you say. You'd be
daft to write anything but the most specialised parts of the system in
assembly. You work with a similar sort of development environment as on a
desktop PC - you can use dynamic memory and file I/O, you connect your
system by ethernet for remote debugging, you worry about things like memory
leaks and security. A good desktop programmer can easily be a good high-end
embedded programmer.

But for medium level systems, it's a different world. You work with a much
smaller system, and with a different set of problems. You don't worry about
memory leaks, because you don't use dynamic memory - you have to *know* that
you have the space you need. You frequently don't worry about security -
crackers would need an X-ray machine to see into the system. But you do
worry about size, and about performance. You worry about bugs in your
tools - the user base is so much smaller than for high-end processor tools.
You worry about getting your interrupt handling and peripheral handling
correct. You have to ensure that your code runs from flash or rom, and uses
minimal ram space. You use an emulator or jtag debugger to test the system.
You have the datasheets for every component on the board, and understand how
they work and interface with the processor. You are good enough at digital
electronics to be able to do most of a design yourself. A good desktop
programmer has some overlap in skills with a good medium-level embedded
programmer, but there is still a big separation.
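
The "no dynamic memory" point above can be made concrete. Here is a minimal
sketch of my own (not code from the thread) of the fixed-pool style commonly
used in place of malloc on such systems; the pool dimensions and the names
msg_alloc/msg_free are illustrative:

```c
#include <stddef.h>
#include <stdint.h>

/* All storage is sized at compile time, so the worst case is
   visible in the linker map and there is nothing to leak.
   MSG_COUNT and MSG_LEN are illustrative values. */
#define MSG_COUNT 4
#define MSG_LEN   16

static uint8_t pool[MSG_COUNT][MSG_LEN];
static uint8_t in_use[MSG_COUNT];

/* Hand out a free slot, or NULL if the pool is exhausted
   (a design error to fix at build time, not a leak). */
static void *msg_alloc(void)
{
    for (size_t i = 0; i < MSG_COUNT; i++) {
        if (!in_use[i]) {
            in_use[i] = 1;
            return pool[i];
        }
    }
    return NULL;
}

static void msg_free(void *m)
{
    for (size_t i = 0; i < MSG_COUNT; i++) {
        if ((void *)pool[i] == m) {
            in_use[i] = 0;
            return;
        }
    }
}
```

Because the pool is static, "do I have the space I need" is answered once, at
design time, instead of at every allocation call.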

There are many reasons why such microcontrollers are used. Cost of the chip
is one of them, although it is only the most important factor in big
volumes. Cost of the design is often more relevant - it is far easier, and
therefore cheaper, to design, prototype and test a board with an 8-bit
microcontroller than with an embedded high-end processor. The board is
smaller and uses less power, which are also often important. They require
smaller and cheaper production equipment. They frequently have built-in
peripherals that don't exist in high-end chips. For many types of
application, they can be faster than high-end processors, simply because you
have complete control. And they can be far more reliable (an embedded Linux
system might *achieve* years of uptime - an embedded control system might
*demand* decades of uptime).

Things are definitely changing, however. Ten years ago, I wrote mostly
assembly with only the occasional program in C, whereas now I write mostly
C, and normally only write pure assembly programs as updates for old
systems. In the past few years, there have been far more reasonably priced
and reasonably reliable C compilers for smaller micros - I think the days
when you had to pay many thousands of dollars for a "C" compiler that didn't
support structures or pointers are thankfully gone. The chips are also
getting bigger memories - 16k or 32k program space buys you a lot of freedom
compared to 4k. But there are still plenty of uses for pure assembly
programming - a tiny 8-pin micro with 2k flash will often demand assembly
programming, as will a fast software uart on a slow chip. And when you want
top performance (often because you want to reduce the oscillator speed and
power consumption) on small micros, then assembly can make a vast difference
since you are not constrained by C thinking.
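
As a sketch of the arithmetic behind that last point (my own illustration;
the 4 MHz clock and 9600 baud figures are assumptions, not from the post),
the whole software-uart timing problem reduces to one number, the cycle
budget per bit, which the hand-counted assembly loop must hit exactly:

```c
/* Cycles available per bit for a bit-banged uart.  Rounding to
   nearest matters: the timing error accumulates over the ten
   bits of an 8N1 frame, so a budget that is off by even a cycle
   per bit can push the sample point past the bit edge. */
#define F_CPU 4000000UL   /* illustrative: 4 MHz part */
#define BAUD     9600UL   /* illustrative: 9600 baud  */

#define CYCLES_PER_BIT ((unsigned)((F_CPU + BAUD / 2UL) / BAUD))
```

At these example values the budget is 417 cycles per bit; at higher baud
rates or lower clocks it shrinks to the point where only counted assembly
instructions can meet it.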

I've put a few comments in further down - I guess that your comments would
have been different, however, if I'd made it clearer the sort of systems I
was talking about.


"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in message
news:YKTjb.163$NB6.84@newsfep3-gui.server.ntli.net...
David Brown wrote:
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:dCdjb.2249$HM4.1584@newsfep3-gui.server.ntli.net...
Chuck Harris wrote:
Hi Kevin,

I always tend towards "C" for any embedded work I do,
even on the tiny little PIC processors. But my projects
usually have production runs that number in the tens to
hundreds of units. Programmer time is much more important
than silicon cost in these scales.


I have done a fair bit of embedded, its was all in C.

However, there are gobs and gobs of very small embedded
processors that do simple mundane tasks like: make a quartz
watch go, or run a blender, or a washing machine, or a TV
remote, or a garage door opener, or make a greeting card hum
Happy Birthday ... that MIGHT be programmed in native
assembler code. Products such as these are produced in such
high quantities that even a slight increase in die size would
cost the manufacturer millions of dollars.


It's not that bad, imo, for most products. Time to market is probably
the most important factor. C is going to be an order of magnitude
faster to

That depends on the complexity of the project, and on the type of
chip. On small projects and small architectures, assembly can be
faster to write and debug (assuming you are familiar with the chip).

Can't agree with this at all. I don't see that size makes any difference
whatsoever. C is a higher level language. It's simpler. End of story.
Size of resources makes a big difference, but not the size of the program.
A "hello world" program is going to be easier to write in C in most systems,
but if you have 2k program space and 128 bytes ram and have to write a real,
functioning program, then it's a different matter. It is often claimed that
assembly programs are smaller and faster than C programs - for big programs
and big processors, this is seldom true since the effort required to write
such smaller and faster programs is beyond all but the most obsessive
writers. On bigger systems, C "overheads" such as the run-time library and
call overheads become proportionally less important. But conversely, on
small systems they are much more important. Even a tiny C library that fits
into 1.5k is huge if all you have is 2k program space. Similarly, overheads
in general code space and ram space (especially stack space) become more
relevant.
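
To illustrate the "specialised conversion routines" alternative to printf
quoted earlier (a sketch of mine, not code from the thread), a dedicated
unsigned-16-bit decimal conversion costs a few dozen bytes of code and no
ram beyond the caller's buffer:

```c
#include <stdint.h>

/* Convert an unsigned 16-bit value to decimal ASCII, writing
   backwards from the end of a 6-byte buffer ("65535" plus NUL).
   Returns a pointer to the first digit. */
static char *u16_to_dec(uint16_t v, char buf[6])
{
    char *p = &buf[5];
    *p = '\0';
    do {
        *--p = (char)('0' + (v % 10u));
        v = (uint16_t)(v / 10u);
    } while (v != 0u);
    return p;
}
```

Where even a trimmed printf drags in format parsing and every conversion it
supports, a routine like this links in only what the product actually prints.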

There is no ideal language for all systems - when programming for desktops,
I pick python over C - it is a higher level language, so it's simpler. End
of story, apparently.


For some small micros, the C compilers available are so limited
and/or buggy and/or expensive that they are just not worth the effort
(and there are some small micros for which C compilers don't exist).

This is a valid point, if that is indeed the case. I am sceptical of
this actual assertion though.
Ask in comp.arch.embedded for some of the horror stories. As I mentioned at
the top of the post, it is not as bad as it used to be, but there are still
some terrible tools out there.

There is also the consideration of the cost of the tools - for small
budget projects, that can be very relevant.

Oh, come on now. Don't agree here at all. What's a week's engineering
salary? The cost cannot be an issue, except for schoolboys.
The cost can definitely be an issue - some tools are absurdly expensive.
That's fine if you are going to be working with them over a long time, but
if you are using a number of different architectures and need tools for them
all, it can add up pretty quickly.

There is also the
reliability and support for the compiler - assemblers tend to be very
simple and therefore reliable, while with a C compiler you may end up
fighting compiler bugs or waiting for supplier support.


Well, I can't really comment on how often this is the case. But, again,
I'm sceptical.
Again, there are plenty of horror stories, with suppliers refusing to fix
glaring bugs but insisting that you upgrade to (and pay for) newer versions
of the software, or refusing to admit to the problem in the first place. By
no means all suppliers are like this, but there have been published cases.
Even with the best of suppliers, bug fixes may take weeks, which could be a
problem.

For larger chips (especially 16-bit and 32-bit), it's a different
matter - C or another HLL is normally the only sensible choice.

Yes, real projects.
See my comments at the top. Or look at the statistics for the number of
embedded x86 chips sold compared to the number of 8051's sold.

write than asm. Secondly, there are millions of C programmers out
there, so getting a programmer is pretty easy. Thirdly, unless you're
a *good*

And of these millions of C programmers, less than 1% have the experience
or understanding to work well in embedded systems.

Crap. Smacks of elitism to me. Your view here is simply not credible. A
good programmer can transfer to embedded in about a week. There is no
magic in embedded whatsoever. It's plain old engineering. If you were
talking about programmers becoming analogue designers, you might have a
point.
The sort of embedded systems I am talking about require different skills
than desktop programming or high-end embedded systems programming. There
are also plenty of differences between desktop and high-end embedded
programming, but the gap is not quite as big. Note that a good embedded
programmer is not necessarily a good desktop programmer - the skills are
different.

When looking at
the CV of a prospective embedded programmer employee, I would not
consider experience with C as a particular benefit unless it was
specifically in embedded systems.

Well, it only goes to show that there are a lot of unqualified people
vetting CVs.


asm programmer, I doubt if you will do much better in speed/memory,
than

Again, that depends on the chips you are working with - I would not
expect to be able to "beat" a PowerPC compiler at code generation,
but I would expect to beat a Pic or COP8 compiler with my eyes closed.

a good C programmer, of which there are loads of them.

It is difficult to determine a good C programmer from a bad one when
looking at potential employees,

Agreed.

but the vast majority are not going
to be good embedded C programmers.

In my opinion, this opinion of yours is total crap. I think your idea of
embedded is rather restricted. Essentially, it's any computer in a box
controlling hardware.
Yes, I am talking about a restricted set of "embedded", and I apologise for
not making that clear earlier. However, this "restricted" set far outweighs
the "real computer in a box" type of embedded system.

Of course, that doesn't make it
easier to find good assembly programmers...


I wonder if there are any accurate numbers out there?

I don't know how many asm programmers there are, but I bet
it's of the order of <1% of C programmers.


In the embedded world, there are many more assembly programmers than
that (although I have no figures to base that on).

Old habits die hard.
Sometimes with good reason - although sometimes *without* good reason.

It also depends
on what you mean by an "asm programmer" and a "C programmer". I
write more code in C than assembly these days, but I would definitely
say I am an experienced assembly programmer as well as a C
programmer. I would also say that the ability to write and
understand assembly programs for a chip is essential to doing good
embedded C programming on the chip.

I wouldn't, unless you're restricted to, like, 1k of memory. There's a lot
more to life than washing machines. There are 1000's of embedded
products where low-level knowledge is simply not relevant. You need to
read/write to a hardware port and that's it.
You're skipping over a huge proportion of embedded systems, which lie
between washing machines and mp3 players.

I would
certainly be interested in seeing the results of a very large
"volume of production" vs "programming language" poll.


There is a knee-jerk thought of asm=embedded, i.e. memory and speed,
but realistically, most products, imo, don't have this problem
today. Memory is dirt cheap for starters.

Memory is dirt cheap on large systems - it is not cheap on small
systems.

Valid point, but grasping a bit I think.

If you are working with a microcontroller with 4k flash and
256 bytes ram, then that's all you've got, and if some half-wit C
programmer uses "printf" instead of writing their own specialised
conversion routines, you end up needing a bigger and more expensive
processor.


I am not really addressing that tiny 4k embedded stuff. I'm talking
about *real* projects. Like for example, I use a hardware MIDI sound
player/file player. It comes with 8MB of flash just to save the midi
files on. It would be bloody daft to write this product in asm.
I'm not talking about gameboys and mp3 players, but *real* projects for
*real* systems, for *real* customers. I don't have any numbers, but I would
guess that the majority of embedded development is in the middle sector, for
customers looking for specialised, small, reliable control systems. There
are only a few people involved in making mp3 players or washing machines -
there are many that make the control systems in all sorts of industrial and
commercial equipment.

Best regards,

David



Best Regards,

Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
David Brown wrote:
I know this is top-posting, but it might make other comments further
down a bit clearer...

I work with small systems - mostly 8-bit micros, occasionally 16-bit,
and a few 32-bit microcontrollers. So when I think of "embedded",
that's what I mean. There are, I think, three fairly distinct types
of embedded processor systems. There are the low-end,
mask-programmed systems (using ROM'ed 4-bit or 8-bit chips, dedicated
ASICs, etc.) which are very high volume and very low cost, the medium
level systems (using mostly 8-bit or 16-bit microcontrollers, program
running from internal flash or otp-rom, all/most ram built-in), and
the high level systems (32-bit, typically with a reasonable OS,
plenty of external memory, effectively a "normal" computer with
limited resources and a dedicated task).

When you are talking about the high-end systems - something that can
run Linux or wince or even eCos, I agree with most of what you say.
You'd be daft to write anything but the most specialised parts of the
system in assembly. You work with a similar sort of development
environment as on a desktop PC - you can use dynamic memory and file
I/O, you connect your system by ethernet for remote debugging, you
worry about things like memory leaks and security. A good desktop
programmer can easily be a good high-end embedded programmer.

But for medium level systems, it's a different world.
Yes.

You work with
a much smaller system, and with a different set of problems.
Yes.

You
don't worry about memory leaks, because you don't use dynamic memory
- you have to *know* that you have the space you need.
Yes.

You
frequently don't worry about security - crackers would need an X-ray
machine to see into the system. But you do worry about size, and
about performance. You worry about bugs in your tools - the user
base is so much smaller than for high-end processor tools. You worry
about getting your interrupt handling and peripheral handling
correct.
Yes.

You have to ensure that your code runs from flash or rom,
and uses minimal ram space. You use an emulator or jtag debugger to
test the system. You have the datasheets for every component on the
board, and understand how they work and interface with the processor.
Yes.

You are good enough at digital electronics to be able to do most of a
design yourself. A good desktop programmer has some overlap in
skills with a good medium-level embedded programmer, but there is
still a big separation.
Oh dear, oh dear. I don't seem to be getting through here. I am not
disputing any difference, only the relevance of the differences. The
main gist of this is that a good desktop programmer *can't* be a
good mainstream embedded programmer. Look, I don't care a toss about
whether a desktop programmer has some specific skill set at some
specific *instant* in time. My argument is that, say, 90% of *competent*
good desktop programmers can, with a trivially minimal amount of
study/refresh/work, be competent and good mainstream embedded
programmers.

Any concept that the above is some Masons-only reserved knowledge,
specifically reserved for those who have never done desktop
programming before, is simply ludicrous. As I have already said, none of
this is rocket science. It's all plain old engineering. Any software
engineer who can tie his own shoelaces is basically capable of doing
the job. Sure, it can take a few years to get to grips with programming
in general, but once that knowledge is gained, redirecting it to another
variation is a no-brainer. There is simply nothing in the above that
requires more than a few weeks of research.

write than asm. Secondly, there are millions of C programmers out
there, so getting a programmer is pretty easy. Thirdly, unless you're
a *good*

And of these millions of C programmers, less than 1% have the
experience or understanding to work well in embedded systems.

Crap. Smacks of elitism to me. Your view here is simply not
credible. A good programmer can transfer to embedded in about a
week. There is no magic in embedded whatsoever. It's plain old
engineering. If you were talking about programmers becoming analogue
designers, you might have a point.

The sort of embedded systems I am talking about require different
skills than desktop programming or high-end embedded systems
programming.
Yes. But they are not reserved exclusively for Masons.

There are also plenty of differences between desktop
and high-end embedded programming, but the gap is not quite as big.
Note that a good embedded programmer is not necessarily a good
desktop programmer - the skills are different.
Again, you missed my point. Of course some of the skills are different.
The point is that any already experienced, competent programmer can
update with minimum effort. The skill set is simply not that difficult
to learn.


Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in message
news:r5Pkb.118$B75.84@newsfep3-gui.server.ntli.net...
David Brown wrote:
I know this is top-posting, but it might make other comments further
down a bit clearer...

I work with small systems - mostly 8-bit micros, occasionally 16-bit,
and a few 32-bit microcontrollers. So when I think of "embedded",
that's what I mean. There are, I think, three fairly distinct types
of embedded processor systems. There are the low-end,
mask-programmed systems (using ROM'ed 4-bit or 8-bit chips, dedicated
ASICs, etc.) which are very high volume and very low cost, the medium
level systems (using mostly 8-bit or 16-bit microcontrollers, program
running from internal flash or otp-rom, all/most ram built-in), and
the high level systems (32-bit, typically with a reasonable OS,
plenty of external memory, effectively a "normal" computer with
limited resources and a dedicated task).

When you are talking about the high-end systems - something that can
run Linux or wince or even eCos, I agree with most of what you say.
You'd be daft to write anything but the most specialised parts of the
system in assembly. You work with a similar sort of development
environment as on a desktop PC - you can use dynamic memory and file
I/O, you connect your system by ethernet for remote debugging, you
worry about things like memory leaks and security. A good desktop
programmer can easily be a good high-end embedded programmer.

But for medium level systems, it's a different world.

Yes.

You work with
a much smaller system, and with a different set of problems.

Yes.

You
don't worry about memory leaks, because you don't use dynamic memory
- you have to *know* that you have the space you need.

Yes.

You
frequently don't worry about security - crackers would need an X-ray
machine to see into the system. But you do worry about size, and
about performance. You worry about bugs in your tools - the user
base is so much smaller than for high-end processor tools. You worry
about getting your interrupt handling and peripheral handling
correct.

Yes.

You have to ensure that your code runs from flash or rom,
and uses minimal ram space. You use an emulator or jtag debugger to
test the system. You have the datasheets for every component on the
board, and understand how they work and interface with the processor.

Yes.

You are good enough at digital electronics to be able to do most of a
design yourself. A good desktop programmer has some overlap in
skills with a good medium-level embedded programmer, but there is
still a big separation.

Oh dear, oh dear. I don't seem to be getting through here. I am not
disputing any difference, only the relevance of the differences. The
main gist of this is that a good desktop programmer *can't* be a
good mainstream embedded programmer. Look, I don't care a toss about
whether a desktop programmer has some specific skill set at some
specific *instant* in time. My argument is that, say, 90% of *competent*
good desktop programmers can, with a trivially minimal amount of
study/refresh/work, be competent and good mainstream embedded
programmers.
I'd disagree that 90% of good desktop programmers could become good embedded
programmers with a trivial amount of training, although that depends on what
you mean by "mainstream". If you mean "8MB ram, 32-bit processors, full OS,
etc." systems, then that may be close to realistic - but that is not what I
mean by "mainstream". For what I think of as "mainstream" embedded systems
(and I think it corresponds well to what many others think of as
"mainstream" - again, I'd give comp.arch.embedded as a reasonable
reference), I think either the percentage, the "goodness" of the programmer,
or the triviality of the training, is wrong.


Any concept that the above is some Masons-only reserved knowledge,
specifically reserved for those who have never done desktop
programming before, is simply ludicrous.
I'd agree with that - no one has suggested that an inability to program on a
desktop is an advantage in programming embedded systems. All I have said is
that good desktop programming knowledge and ability does not translate to
good embedded systems programming.

As I have already said, none of
this is rocket science. It's all plain old engineering. Any software
engineer who can tie his own shoelaces is basically capable of doing
the job. Sure, it can take a few years to get to grips with programming
in general, but once that knowledge is gained, redirecting it to another
variation is a no-brainer. There is simply nothing in the above that
requires more than a few weeks of research.
I think you underestimate what is involved in embedded systems programming
for real embedded systems rather than just boxed pc's - in particular, you
seem to have an odd idea about what being a "good" embedded programmer
means. To me, the adjective "good" implies something beyond merely "capable",
and it can't be achieved (in any field) without a lot of knowledge and
experience. Being a "good" desktop programmer means you have some of the
skills it takes to be a "good" embedded programmer, but you are by no means
ready. Sure, you'll get there with time and effort - but you're not there
yet.


write than asm. Secondly, there are millions of C programmers out
there, so getting a programmer is pretty easy. Thirdly, unless you're
a *good*

And of these millions of C programmers, less than 1% have the
experience or understanding to work well in embedded systems.

Crap. Smacks of elitism to me. Your view here is simply not
credible. A good programmer can transfer to embedded in about a
week. There is no magic in embedded whatsoever. It's plain old
engineering. If you were talking about programmers becoming analogue
designers, you might have a point.

The sort of embedded systems I am talking about require different
skills than desktop programming or high-end embedded systems
programming.

Yes. But they are not reserved exclusively for Masons.

There are also plenty of differences between desktop
and high-end embedded programming, but the gap is not quite as big.
Note that a good embedded programmer is not necessarily a good
desktop programmer - the skills are different.


Again, you missed my point. Of course some of the skills are different.
The point is that any already-experienced, competent programmer can
get up to speed with minimal effort. The skill set is simply not that
difficult to learn.
I think I did miss your point - I was under the impression that you felt
desktop programmers can jump right in and do embedded programming. But I
think you also underestimate the difference between desktop programming and
small microcontroller programming. I've seen the results of competent
desktop programmers writing stuff for small microcontrollers, and they were
not pretty - in each case, I ended up writing the code from scratch. I'm
sure the programmers in question could have learned to do it correctly, and
there is no doubt that it would have paid off for me to give them some basic
training - that might have ended up with programs that were good enough for
the job. But turning them into "good" embedded programmers would not have
been a trivial task.


Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
David Brown wrote:
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:r5Pkb.118$B75.84@newsfep3-gui.server.ntli.net...
David Brown wrote:


Oh dear, oh dear. I don't seem to be getting through here. I am not
disputing any difference, only the relevance of the differences. The
main gist of this thread is the claim that a good desktop programmer
*can't* be a good mainstream embedded programmer. Look, I don't care
a toss about whether a desktop programmer has some specific skill set
at some specific *instant* in time. My argument is that, say, 90% of
*competent*, good desktop programmers can, with a trivially minimal
amount of study/refresh/work, be competent and good mainstream
embedded programmers.


I'd disagree that 90% of good desktop programmers could become good
embedded programmers with a trivial amount of training, although that
depends on what you mean by "mainstream". If you mean "8Mb ram,
32-bit processors, full OS, etc" systems, then that may be close to
realistic - but that is not what I mean by "mainstream". For what I
think of as "mainstream" embedded systems (and I think it corresponds
well to what many others think of as "mainstream" - again, I'd give
comp.arch.embedded as a reasonable reference), I think either the
percentage, the "goodness" of the programmer, or the triviality of
the training, is wrong.
Let's say my mainstream is where one is not limited to 1k ram, i.e.
where there is a reasonable amount of memory. For example, the
multitasking HDSL system I mentioned had around 100k.

Well, so far, the only details actually presented are all very
simple things. Again, I am only addressing C, not asm.

For example, it only takes 5 secs to say, don't do a printf(), or don't
use a[n], use *a++ in a loop instead.

Let's see some real examples where you believe it will take months for
someone, already very competent in C on desktops, to duplicate
effectively on embedded.

Any concept that the above is some Masons-only reserved knowledge,
specifically reserved for those who have never done desktop
programming before, is simply ludicrous.

I'd agree with that - no one has suggested that an inability to
program on a desktop is an advantage in programming embedded systems.
All I have said is that good desktop programming knowledge and
ability does not translate to good embedded systems programming.
I agree with this, given the assumption of direct transfer of exact
knowledge. However, the original vein of this thread was strongly worded
towards the competence of the person, rather than the skill set needed.

As I have already said, none of
this is rocket science. It's all plain old engineering. Any software
engineer who can tie his own shoelaces is basically capable of
doing the job. Sure, it can take a few years to get to grips with
programming in general, but once that knowledge is gained,
redirecting it to another variation is a no-brainer. There is simply
nothing in the above that requires more than a few weeks of
research.


I think you underestimate what is involved in embedded systems
programming for real embedded systems rather than just boxed pc's -
in particular, you seem to have an odd idea about what being a "good"
embedded programmer means. To me, the adjective "good" implies
something beyond merely "capable", and it can't be achieved (in any
field) without a lot of knowledge and experience.
I agree, however, C programming is not so specialised that 90% of it
goes out of the window when switching from desks to beds.

Being a "good"
desktop programmer means you have some of the skills it takes to be a
"good" embedded programmer, but you are by no means ready. Sure,
you'll get there with time and effort - but you're not there yet.
Yes, but we're only debating how much time and effort is required.

Yes. But they are not reserved exclusively for Masons.

There are also plenty of differences between desktop
and high-end embedded programming, but the gap is not quite as big.
Note that a good embedded programmer is not necessarily a good
desktop programmer - the skills are different.


Again, you missed my point. Of course some of the skills are
different. The point is that any already-experienced, competent
programmer can get up to speed with minimal effort. The skill set is
simply not that difficult to learn.


I think I did miss your point - I was under the impression that you
felt desktop programmers can jump right in and do embedded
programming.
Depends what part of the system they are dealing with. Suppose they had
just written an FFT in C on the desk, and now had to do the same, in C,
on the bed. Or an Ethernet stack. There's loads of stuff that would be
quite transparent.

But I think you also underestimate the difference
between desktop programming and small microcontroller programming.
Not at all. I did some asm in 1981 on a very small machine!

I've seen the results of competent desktop programmers writing stuff
for small microcontrollers, and they were not pretty - in each case,
I ended up writing the code from scratch.
Someone's first project is always less than perfect.

I'm sure the programmers
in question could have learned to do it correctly, and there is no
doubt that it would have paid off for me to give them some basic
training -
Probably just a good kick up the backside would have been enough. Often,
just pointing out something can make a dramatic improvement.

that might have ended up with programs that were good
enough for the job. But turning them into "good" embedded
programmers would not have been a trivial task.
This makes no logical sense. For example, let's suppose that it takes,
say, 5 years for an EE/CS graduate to become reasonably good at either
embedded or desktop programming. Just how much extra time do you really
think it will take to convert one to the other?

Again, give me some real examples of embedded specific issues, and how
long you think it will take an already competent desk programmer to
assimilate the details of those issues.

 
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in message
news:0OQkb.329$B75.154@newsfep3-gui.server.ntli.net...
David Brown wrote:
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:r5Pkb.118$B75.84@newsfep3-gui.server.ntli.net...
David Brown wrote:


Oh dear, oh dear. I don't seem to be getting through here. I am not
disputing any difference, only the relevance of the differences. The
main gist of this thread is the claim that a good desktop programmer
*can't* be a good mainstream embedded programmer. Look, I don't care
a toss about whether a desktop programmer has some specific skill set
at some specific *instant* in time. My argument is that, say, 90% of
*competent*, good desktop programmers can, with a trivially minimal
amount of study/refresh/work, be competent and good mainstream
embedded programmers.


I'd disagree that 90% of good desktop programmers could become good
embedded programmers with a trivial amount of training, although that
depends on what you mean by "mainstream". If you mean "8Mb ram,
32-bit processors, full OS, etc" systems, then that may be close to
realistic - but that is not what I mean by "mainstream". For what I
think of as "mainstream" embedded systems (and I think it corresponds
well to what many others think of as "mainstream" - again, I'd give
comp.arch.embedded as a reasonable reference), I think either the
percentage, the "goodness" of the programmer, or the triviality of
the training, is wrong.

Let's say my mainstream is where one is not limited to 1k ram, i.e.
where there is a reasonable amount of memory. For example, the
multitasking HDSL system I mentioned had around 100k.
Then we are talking about different "mainstreams". I would say that the
embedded mainstream is exactly those systems limited to around 1k ram, or
often less. There are plenty of systems that have more - I've also
programmed for larger systems. But I'm thinking mainly of smaller systems
here.

Well, so far, the only details actually presented are all very
simple things. Again, I am only addressing C, not asm.

For example, it only takes 5 secs to say, don't do a printf(), or don't
use a[n], use *a++ in a loop instead.

Let's see some real examples where you believe it will take months for
someone, already very competent in C on desktops, to duplicate
effectively on embedded.
It doesn't take months to learn to use certain different constructs, but it
does take time to get the experience to know when to use them. I've seen
compilers that will generate better code given "a[n++]" and compilers that
generate better code given "*p++". When programming for a large system, the
prime importance is readability of the code - when programming for a small
system, readability is vital but it might not be your only concern. To be a
good embedded programmer, you need to be able to see when something like
that is important, and to know how to interpret the generated assembly to
see what your final code looks like - and to know how to improve on that
when necessary.

How long does it take to teach someone to think small? Or to think about
how the code they write translates to the processor they are using? Or to
think about minimal memory usage? Or about the interaction between main
threads and interrupt routines? Or about the hardware details of the
microcontroller's peripherals and other peripherals on the board? Or about
understanding the board's schematic diagrams? Or about distinguishing
software problems from hardware problems? Or defensive programming for
dealing with hardware problems that occur during use? Or about designing a
usable interface based on two 7-seg displays and three buttons? I don't
know how long that takes - I don't have enough experience of teaching other
programmers to give you an answer. All I can say is that I don't think it
is trivial. If you don't agree, then that's fine - all I can give you is my
opinion.

I agree, however, C programming is not so specialised that 90% of it
goes out of the window when switching from desks to beds.
I tend to look at it from a different view. I don't think the language is
very relevant - I would expect a competent desktop C programmer to be able
to switch to desktop Pascal programming quickly and easily (and I have seen
that happen successfully), because the language is just a detail. I would
also expect a good embedded programmer to pick up a new assembly language
pretty quickly. I don't know how to quantify the difference in the
philosophy, and the way of approaching problems, that differentiates small
embedded programming from large system programming, but it's not the
language that is the key difference.


Depends what part of the system they are dealing with. Suppose they had
just written an FFT in C on the desk, and now had to do the same, in C,
on the bed. Or an Ethernet stack. There's loads of stuff that would be
quite transparent.
These are likely to be completely different. If you have written an
ethernet stack for a PC and expect to run it on a small micro, you will end
up over-stretching your resources by a large factor. Look at some websites
for implementations of tcp/ip and embedded web servers - they have all sorts
of interesting tricks to be able to work well on tiny systems, and the
implementations bear little resemblance to stacks on large systems or
desktops. Similarly with an fft - on a PC you happily use large arrays of
doubles, while on an embedded system without even a hardware integer
multiplier you might be looking for all sorts of tricks using lookup tables,
fixed point or integer arithmetic, etc. These two are in fact excellent
examples of how desktop experience differs wildly from embedded systems.
They also demonstrate where there is overlap - an understanding of ethernet
protocols and fft theory is important on each platform.


I've seen the results of competent desktop programmers writing stuff
for small microcontrollers, and they were not pretty - in each case,
I ended up writing the code from scratch.

Someone's first project is always less than perfect.
Oh yes, and I can also give you plenty of examples of my own code which is
way less than perfect. As I say, it takes a long time to be a good embedded
programmer.

I'm sure the programmers
in question could have learned to do it correctly, and there is no
doubt that it would have paid off for me to give them some basic
training -

Probably just a good kick up the backside would have been enough. Often,
just pointing out something can make a dramatic improvement.
Indeed, there are often small things that can make a big difference.

that might have ended up with programs that were good
enough for the job. But turning them into "good" embedded
programmers would not have been a trivial task.

This makes no logical sense. For example, let's suppose that it takes,
say, 5 years for an EE/CS graduate to become reasonably good at either
embedded or desktop programming. Just how much extra time do you really
think it will take to convert one to the other?

Again, give me some real examples of embedded specific issues, and how
long you think it will take an already competent desk programmer to
assimilate the details of those issues.
I can't give real examples - I can only give opinions. And like so many
questions, the answer is always "it depends". A graduate who has been
studying general theoretical programming will be able to move into different
disciplines quickly, while one that has been concentrating solidly on, say,
MS VC++ programming will not, even though both are good desktop programmers.



 
David Brown wrote:
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:0OQkb.329$B75.154@newsfep3-gui.server.ntli.net...

Let's say my mainstream is where one is not limited to 1k ram, i.e.
where there is a reasonable amount of memory. For example, the
multitasking HDSL system I mentioned had around 100k.

Then we are talking about different "mainstreams". I would say that
the embedded mainstream is exactly those systems limited to around 1k
ram, or often less. There are plenty of systems that have more -
I've also programmed for larger systems. But I'm thinking mainly of
smaller systems here.
Certainly, 1k is having one hand behind one's back.

Well, so far, the only details actually presented are all very
simple things. Again, I am only addressing C, not asm.

For example, it only takes 5 secs to say, don't do a printf(), or
don't use a[n], use *a++ in a loop instead.

Let's see some real examples where you believe it will take months
for someone, already very competent in C on desktops, to duplicate
effectively on embedded.


It doesn't take months to learn to use certain different constructs,
but it does take time to get the experiance to know when to use them.
I've seen compilers that will generate better code given "a[n++]" and
compilers that generate better code given "*p++".
I was really addressing the issue that a[n] requires a multiplication
to get the pointer value - a speed issue, not a code size issue. A
multiplication in software takes lots of cycles; a PC, nowadays, does
it in one or two clock cycles.

When programming
for a large system, the prime importance is readability of the code -
when programming for a small system, readability is vital but it
might not be your only concern. To be a good embedded programmer,
you need to be able to see when something like that is important, and
to know how to interpret the generated assembly to see what your
final code looks like - and to know how to improve on that when
necessary.
Well, by default, again I am assuming no asm. I have already agreed that
asm takes an order of magnitude more effort, and a significant learning
curve; that's why I suggested C in the first place.

How long does it take to teach someone to think small?
Imo, for a competent programmer, it's instantaneous.

Or to think
about how the code they write translates to the processor they are
using?
Again, I am restricting this to such systems where asm is not relevant.

Or to think about minimal memory usage?
Again, it's instantaneous. One might still do this with 128M. Depends on
the project.

Or about the
interaction between main threads and interrupt routines?
Or about
the hardware details of the microcontroller's peripherals and other
peripherals on the board? Or about understanding the board's
schematic diagrams? Or about distinguishing software problems from
hardware problems? Or defensive programming for dealing with
hardware problems that occur during use? Or about designing a usable
interface based on two 7-seg displays and three buttons? I don't
know how long that takes - I don't have enough experience of teaching
other programmers to give you an answer. All I can say is that I
don't think it is trivial. If you don't agree, then that's fine -
all I can give you is my opinion.
It's all just plain engineering to me. Maybe some of it might be a bit
tricky for some, but overall, I just can't see any major issues here for
most. I suppose we will have to agree to differ.

I agree, however, C programming is not so specialised that 90% of it
goes out of the window when switching from desks to beds.


I tend to look at it from a different view. I don't think the
language is very relevant
- I would expect a compentent desktop C
programmer to be able to switch to desktop Pascal programming quickly
and easily (and I have seen that happen successfully), because the
language is just a detail. I would also expect a good embedded
programmer to pick up a new assembly language pretty quickly. I
don't know how to quantify the difference in the philosophy, and the
way of approaching problems, that differentiates small embedded
programming from large system programming, but it's not the language
that is the key difference.
I agree with this. The point is that C has, say, 30 statements; the
general knowledge in using these statements is directly transferable.

Depends what part of the system they are dealing with. Suppose they
had just written an FFT in C on the desk, and now had to do the
same, in C, on the bed. Or an Ethernet stack. There's loads of stuff
that would be quite transparent.


These are likely to be completely different. If you have written an
ethernet stack for a PC and expect to run it on a small micro, you
will end up over-stretching your resources by a large factor. Look
at some websites for implementations of tcp/ip and embedded web
servers - they have all sorts of interesting tricks to be able to
work well on tiny systems, and the implementations bear little
resemblance to stacks on large systems or desktops.
But once you have written it once, you can write a second one much more
easily. Consider the difference between a first-time stack writer who is
experienced in embedded, versus an experienced desktop stack writer doing
embedded for the first time.

Similarly with
an fft - on a PC you happily use large arrays of doubles, while on an
embedded system without even a hardware integer multiplier you might
be looking for all sorts of tricks using lookup tables, fixed point
or integer arithmetic, etc. These two are in fact excellent examples
of how desktop experience differs wildly from embedded systems.
I chose the FFT because I can't imagine any practical application that
would require FFTs, or be doable with FFTs, running on 8-bit, 4k systems.
It just ain't gonna happen, imo.

There is an argument for washing machines/fridges to be connected to the
internet, but then, by default, one has to bite the bullet and go for
the bigger systems.

They
also demonstrate where there is overlap - an understanding of
ethernet protocols and fft theory is important on each platform.


I've seen the results of competent desktop programmers writing stuff
for small microcontrollers, and they were not pretty - in each case,
I ended up writing the code from scratch.

Someone's first project is always less than perfect.

Oh yes, and I can also give you plenty of examples of my own code
which is way less than perfect. As I say, it takes a long time to be
a good embedded programmer.
Having said that, I "invented" C++ in my first C program around 13
years ago. I had no C++ knowledge at all, but developed the code with
an array of structures with function pointers in them, such that the
final syntax was identical to that used in C++, i.e. name->function().
I was absolutely stunned when I saw my first C++ code. After a few
attempts, it appeared to be the natural way to solve the problem.

My point here is that we often think that what we know is special, but
in reality it ain't. Just about anyone can do the right thing when the
situation occurs.

that might have ended up with programs that were good
enough for the job. But turning them into "good" embedded
programmers would not have been a trivial task.

This makes no logical sense. For example, let's suppose that it takes,
say, 5 years for an EE/CS graduate to become reasonably good at
either embedded or desktop programming. Just how much extra time do
you really think it will take to convert one to the other?

Again, give me some real examples of embedded specific issues, and
how long you think it will take an already competent desk programmer
to assimilate the details of those issues.


I can't give real examples - I can only give opinions. And like so
many questions, the answer is always "it depends". A graduate who
has been studying general theoretical programming
As all do, in general.

will be able to
move into different disciplines quickly, while one that has been
concentrating solidly on, say, MS VC++ programming will not, even
though both are good desktop programmers.
But that doesn't happen. EE/CS courses are, by their nature, very general.

 
Hi Kevin,

I think "agree to differ" sums things up quite well. But it's always good
to hear other opinions every now and again, and I don't think we are quite
as far apart as we originally thought.

"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in message
news:xBSkb.350$B75.97@newsfep3-gui.server.ntli.net...
David Brown wrote:
"Kevin Aylward" <kevindotaylwardEXTRACT@anasoft.co.uk> wrote in
message news:0OQkb.329$B75.154@newsfep3-gui.server.ntli.net...

Lets say my mainstream, is where one is not limited to 1k ram, i.e.
where there is a reasonable amount of memory. For example, the
multitasking HDSL system I mentioned had around 100k.

Then we are talking about different "mainstreams". I would say that
the embedded mainstream is exactly those systems limited to around 1k
ram, or often less. There are plenty of systems that have more -
I've also programm ed for larger systems. But I'm thinking mainly of
smaller systems here.

Certainly, 1k is having one hand behind one's back.
Note that the 1k is ram space, not program space - I often work with systems
with less ram than that, but there are few flash/otp chips with less than 2k
program space.

Well, so far, the only details actually presented are all very
simple things. Again, I am only addressing C, not asm.

For example, it only takes 5 secs to say, don't do a printf(), or
don't use a[n], use *a++ in a loop instead.

Let's see some real examples where you believe it will take months
for someone, already very competent in C on desktops, to duplicate
effectively on embedded.


It doesn't take months to learn to use certain different constructs,
but it does take time to get the experience to know when to use them.
I've seen compilers that will generate better code given "a[n++]" and
compilers that generate better code given "*p++".

I was really addressing the issue that a[n] requires a multiplication
to get the pointer value - a speed issue, not a code size issue. A
multiplication in software takes lots of cycles; a PC, nowadays, does
it in one or two clock cycles.
Exactly the sort of thing an embedded programmer needs to know - some chips
(or variants of them) have hardware multipliers, making this fast. Others
don't. Some compilers will optimise loops with a[n] into pointers and
addition; others will call multiply routines for each access. And different
sizes of array elements can make a big difference - a multiply by 2 is
seldom much overhead, but if the elements are 15-byte structures, maybe you'd
be better off with an extra byte of padding (trading memory space for
performance). Perhaps the compiler can generate nice inline code for a
multiply-by-15 (I know gcc can do that sort of thing). Perhaps the chip has
a hardware multiplier (and do you know how that will interact with
interrupts?). A good embedded programmer needs to understand these
sorts of things.


When programming
for a large system, the prime importance is readability of the code -
when programming for a small system, readability is vital but it
might not be your only concern. To be a good embedded programmer,
you need to be able to see when something like that is important, and
to know how to interpret the generated assembly to see what your
final code looks like - and to know how to improve on that when
necessary.

Well, by default, again I am assuming no asm. I have already agreed that
asm takes an order of magnitude more effort, and a significant learning
curve; that's why I suggested C in the first place.
You can't be a good embedded programmer without at least an understanding
of assembly. I think it is essential to have read the datasheets for the
chip you are using, and to have a feel for the processor architecture (I
once read the i486 manual, but I don't think it helps my PC programming much
:). You don't have to become fluent in writing a chip's assembly, but you
should be able to understand it.

How long does it take to teach someone to think small?

Imo, for a competent programmer, it's instantaneous.

Or to think
about how the code they write translates to the processor they are
using?

Again, I am restricting this to such systems where asm is not relevant.
Again, I am not. I only write asm for the very smallest systems, or for
special applications, but understanding asm is important to writing good C
programs for medium systems.

Or to think about minimal memory usage?

Again, it's instantaneous. One might still do this with 128M. Depends on
the project.

Or about the
interaction between main threads and interrupt routines?
Or about
the hardware details of the microcontroller's peripherals and other
peripherals on the board? Or about understanding the board's
schematic diagrams? Or about distinguishing software problems from
hardware problems? Or defensive programming for dealing with
hardware problems that occur during use? Or about designing a usable
interface based on two 7-seg displays and three buttons? I don't
know how long that takes - I don't have enough experience of teaching
other programmers to give you an answer. All I can say is that I
don't think it is trivial. If you don't agree, then that's fine -
all I can give you is my opinion.

It's all just plain engineering to me. Maybe some of it might be a bit
tricky for some, but overall, I just can't see any major issues here for
most. I suppose we will have to agree to differ.


I agree, however, C programming is not so specialised that 90% of it
goes out of the window when switching from desks to beds.


I tend to look at it from a different view. I don't think the
language is very relevant
- I would expect a competent desktop C
programmer to be able to switch to desktop Pascal programming quickly
and easily (and I have seen that happen successfully), because the
language is just a detail. I would also expect a good embedded
programmer to pick up a new assembly language pretty quickly. I
don't know how to quantify the difference in the philosophy, and the
way of approaching problems, that differentiates small embedded
programming from large system programming, but it's not the language
that is the key difference.

I agree with this. The point is that C has, say, 30 statements; the
general knowledge of using these statements is directly transferable.



Depends what part of the system they are dealing with. Suppose they
had just written an fft in c on the desk, and now had to do the
same, in c on the bed. Or an Ethernet stack. There's loads of stuff
that would be quite transparent.


These are likely to be completely different. If you have written an
ethernet stack for a PC and expect to run it on a small micro, you
will end up over-stretching your resources by a large factor. Look
at some websites for implementations of tcp/ip and embedded web
servers - they have all sorts of interesting tricks to be able to
work well on tiny systems, and the implementations bear little
resemblance to stacks on large systems or desktops.

But once you have written it once, you can write a second one much more
easily. Consider the difference between a first-time stack writer with
embedded experience, versus an experienced desktop stack writer doing
embedded for the first time.

Similarly with
an fft - on a PC you happily use large arrays of doubles, while on an
embedded system without even a hardware integer multiplier you might
be looking for all sorts of tricks using lookup tables, fixed point
or integer arithmetic, etc. These two are in fact excellent examples
of how desktop experience differs wildly from embedded systems.

I chose the FFT because I can't imagine any practical application that
would require FFTs, or be doable with FFTs, running on 8-bit, 4k systems.
It just ain't gonna happen, imo.
I agree that an 8-bit, 4k system is too small, but people *do* use FFTs on
16-bit integer-only processors with a little bit bigger memory. They are
certainly used on systems that are small enough that you have to think hard
about the implementation, not just copy a PC implementation.
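The fixed-point tricks mentioned above look roughly like this. A Q1.15
multiply is the basic building block an integer-only FFT butterfly would
use in place of doubles; the format choice and the helper name `q15_mul`
are my own illustration, not anything from the thread.

```c
#include <stdint.h>

/* Q1.15: a value x in [-1, 1) is stored as the integer x * 32768 */
typedef int16_t q15_t;

/* Multiply two Q15 values: widen to 32 bits so the product doesn't
   overflow, round at the half-bit, and shift back down to Q15. */
static q15_t q15_mul(q15_t a, q15_t b)
{
    int32_t p = (int32_t)a * (int32_t)b;   /* Q30 product */
    return (q15_t)((p + (1 << 14)) >> 15); /* round and renormalise */
}
```

On a part without a hardware multiplier even this 16x16 multiply becomes
a routine worth hand-tuning, which is where the lookup-table tricks come in.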

There is an argument for washing machines/fridges to be connected to the
internet, but then, by default, one has to bite the bullet and go for
the bigger systems.
And I thought my current washing machine was complicated...

They
also demonstrate where there is overlap - an understanding of
ethernet protocols and fft theory is important on each platform.


I've seen the results of competent desktop programmers writing stuff
for small microcontrollers, and they were not pretty - in each case,
I ended up writing the code from scratch.

Someone's first project is always less than perfect.

Oh yes, and I can also give you plenty of examples of my own code
which is way less than perfect. As I say, it takes a long time to be
a good embedded programmer.

Having said that, I invented C++ in my first C program around 13
years ago. I had no C++ knowledge at all, but developed the code with
an array of structures with function pointers in them, such that the
final syntax was identical to that used in C++, i.e. name->function().
I was absolutely stunned when I saw my first C++ code. After a few
attempts, it had appeared to be the natural way to solve the problem.

My point here is that we often think that what we know is special, but
in reality it ain't. Just about anyone can do the right thing when the
situation occurs.
You have more faith in the abilities of the average programmer than I do.
Perhaps it is more that I think a great many desktop programmers are pretty
poor rather than that embedded programming is so hard.

that might have ended up with programs that were good
enough for the job. But turning them into "good" embedded
programmers would not have been a trivial task.

This makes no logical sense. For example, let's suppose that it takes,
say, 5 years for an EE/CS graduate to become reasonably good at
either embedded or desktop programming. Just how much extra time do
you really think it will take to convert one to the other?

Again, give me some real examples of embedded specific issues, and
how long you think it will take an already competent desk programmer
to assimilate the details of those issues.


I can't give real examples - I can only give opinions. And like so
many questions, the answer is always "it depends". A graduate who
has been studying general theoretical programming

As all do, in general.

will be able to
move into different disciplines quickly, while one that has been
concentrating solidly on, say, MS VC++ programming will not, even
though they are good desktop programmers.


But that doesn't happen. EE/CS courses are, by their nature, very general.
I suppose that is the case with EE/CS graduates - but I think that many
desktop programmers, even the good ones, don't have such general training,
and have only done more specialised courses.


Kevin Aylward
salesEXTRACT@anasoft.co.uk
http://www.anasoft.co.uk
SuperSpice, a very affordable Mixed-Mode
Windows Simulator with Schematic Capture,
Waveform Display, FFT's and Filter Design.
 
