Test driven development?

Mike Harding
I'm curious: what do people think of test driven development
when applied to embedded (read small microprocessors and
micro controller) systems?

Mike Harding
 
"Mike Harding" <mike_harding1@nixspam-hotmail.com> wrote in message
news:4fgoh0p08a2ujvb5i8jrqljae2tm60gr55@4ax.com...
> I'm curious: what do people think of test driven development
> when applied to embedded (read small microprocessors and
> micro controller) systems?
>
> Mike Harding
I guess that anything that helps you find problems before you start is a
good thing. But at the end of the day, who really has time to implement such
a pattern?
 
Depends on the job.
If it's a big high profile mission critical job and you have the time
to do it then it's a serious option, or more often just an essential
requirement dictated from higher up.
But more often than not it's a waste of time and most people don't
bother; development times are too short these days.
It's also a popular way for bad or just plain slow and inefficient
programmers to justify a job that takes 10 times longer than it would
take a good, talented programmer who knows what they are doing :->
Common with application level software written by "software
departments", but in practice rarely used on the small micro level.
Dave :)
 
"Andrew" <a.pearson@no.spam.aasiascales.com.au> wrote in message
news:411bd6c6$1@dnews.tpgi.com.au...
> I guess that anything that helps you find problems before you start is a
> good thing. But at the end of the day, who really has time to implement
> such a pattern?
The customer ; )
 
"Peter Parker" <parkerp@NOSPAMalphalink.com.au> wrote in message
news:41208f0a@news.alphalink.com.au...
"Andrew" <a.pearson@no.spam.aasiascales.com.au> wrote in message
news:411bd6c6$1@dnews.tpgi.com.au...
I guess that anything that helps you find problems before you start is a
good thing. But at the end of the day, who really has time to implement
such
a pattern?

The customer ; )
Not when they receive the bill.
 
"David L. Jones" <tronnort@yahoo.com> wrote in message
news:cfgvg0$506@odak26.prod.google.com...
> Depends on the job.
> If it's a big high profile mission critical job and you have the time
> to do it then it's a serious option, or more often just an essential
> requirement dictated from higher up.
And rightly so. Not testing (or developing using a credible life cycle
development methodology) for mission critical systems is not appropriate. If
it is mission critical then it has to be robust.

> But more often than not it's a waste of time and most people don't
> bother, development times are too short these days.
If quality doesn't matter. Speed to market with a dud product that ends up
having buckets of money and time poured in to fix it up is not uncommon -
but it reflects on the lack of professionalism of the organisation.

> It's also a popular way for a bad or just plain slow and inefficient
> programmers to justify a job that takes 10 times longer than a good
> talented programmer who knows what they are doing :->
As opposed to the graduates and cowboys who can deliver even less efficient
code 10 times faster but with a total lack of respect for performance,
processing costs or future maintenance capability ;-|

> Common with application level software written by "software
> departments", but in practice rarely used on the small micro level.
> Dave :)
Where quality doesn't matter.
 
"Wayne Reid" <REMOVEgokangas@hotmail.com> wrote in message
news:pRaVc.26033$%r.287266@nasal.pacific.net.au...
"David L. Jones" <tronnort@yahoo.com> wrote in message
news:cfgvg0$506@odak26.prod.google.com...
Depends on the job.
If it's a big high profile mission critical job and you have the time
to do it then it's a serious option, or more often just an essential
requirement dictated from higher up.

And rightly so. Not testing (or developing using a credible life cycle
development methodology) for mission critical systems is not appropriate.
If
it is mission critical then it has to be robust.

But more often than not it's a waste of time and most people don't
bother, development times are too short these days.

If quality doesn't matter. Speed to market with a dud product that ends up
having buckets of money and time poured in to fix it up is not uncommon -
but it reflects on the lack of professionalism of the organisation.

It's also a popular way for a bad or just plain slow and inefficient
programmers to justify a job that takes 10 times longer than a good
talented programmer who knows what they are doing :-

As opposed to the gradutes and cowboys who can deliver even less efficient
code 10 times faster but with a total lack of respect for performance,
processing costs or future maintenance capability ;-|

Common with application level software written by "software
departments", but in practice rarely used on the small micro level.
Dave :)

> Where quality doesn't matter.


It's not really all that hard to build test functions into your code; it
just takes a mind-set. Which, I admit, I don't always have - it takes a bit
more effort to do. :) As you say though, it produces a better quality
product.

Ken
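
To make Ken's point concrete, a built-in test function might look something
like the sketch below. It is only an illustration, not code from this thread:
the FIFO module, the CHECK macro and the SELF_TEST switch are all invented for
the example. The same file compiles for the target as-is; building it on a PC
with -DSELF_TEST runs the checks, and the production build carries none of the
test code.

/* Sketch only: a small fixed-size byte FIFO with its own built-in self test.
   Build on a PC with -DSELF_TEST to run the checks; the production build
   leaves SELF_TEST undefined and gets no test code at all. */
#include <stdint.h>

#define FIFO_SIZE 8u               /* power of two keeps the index maths cheap */

typedef struct {
    uint8_t buf[FIFO_SIZE];
    uint8_t head;                  /* next free slot  */
    uint8_t tail;                  /* oldest element  */
    uint8_t count;                 /* elements stored */
} fifo_t;

void fifo_init(fifo_t *f) { f->head = f->tail = f->count = 0; }

int fifo_put(fifo_t *f, uint8_t b)           /* 0 on success, -1 if full  */
{
    if (f->count == FIFO_SIZE) return -1;
    f->buf[f->head] = b;
    f->head = (uint8_t)((f->head + 1u) % FIFO_SIZE);
    f->count++;
    return 0;
}

int fifo_get(fifo_t *f, uint8_t *b)          /* 0 on success, -1 if empty */
{
    if (f->count == 0) return -1;
    *b = f->buf[f->tail];
    f->tail = (uint8_t)((f->tail + 1u) % FIFO_SIZE);
    f->count--;
    return 0;
}

#ifdef SELF_TEST
#include <stdio.h>

static int failures;
#define CHECK(cond) do { if (!(cond)) { failures++; \
    printf("FAIL %s:%d  %s\n", __FILE__, __LINE__, #cond); } } while (0)

static void fifo_selftest(void)
{
    fifo_t f;
    uint8_t b;
    unsigned i;

    fifo_init(&f);
    CHECK(fifo_get(&f, &b) == -1);             /* empty FIFO refuses a read */

    for (i = 0; i < FIFO_SIZE; i++)
        CHECK(fifo_put(&f, (uint8_t)i) == 0);  /* fills exactly FIFO_SIZE   */
    CHECK(fifo_put(&f, 0xFF) == -1);           /* ...and then reports full  */

    for (i = 0; i < FIFO_SIZE; i++) {
        CHECK(fifo_get(&f, &b) == 0);
        CHECK(b == (uint8_t)i);                /* data comes out in order   */
    }
}

int main(void)
{
    fifo_selftest();
    if (failures)
        printf("%d check(s) failed\n", failures);
    else
        printf("all checks passed\n");
    return failures ? 1 : 0;
}
#endif /* SELF_TEST */

Run on the desktop it takes seconds, and the same fifo_selftest() call can be
wired into a debug build on the target if there is flash to spare.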
 
"Ken Taylor" <ken123@xtra.co.nz> wrote in message news:<2ol6jtFbb1gjU1@uni-berlin.de>...
"Wayne Reid" <REMOVEgokangas@hotmail.com> wrote in message
news:pRaVc.26033$%r.287266@nasal.pacific.net.au...

"David L. Jones" <tronnort@yahoo.com> wrote in message
news:cfgvg0$506@odak26.prod.google.com...
Depends on the job.
If it's a big high profile mission critical job and you have the time
to do it then it's a serious option, or more often just an essential
requirement dictated from higher up.

And rightly so. Not testing (or developing using a credible life cycle
development methodology) for mission critical systems is not appropriate.
If
it is mission critical then it has to be robust.

But more often than not it's a waste of time and most people don't
bother, development times are too short these days.

If quality doesn't matter. Speed to market with a dud product that ends up
having buckets of money and time poured in to fix it up is not uncommon -
but it reflects on the lack of professionalism of the organisation.

It's also a popular way for a bad or just plain slow and inefficient
programmers to justify a job that takes 10 times longer than a good
talented programmer who knows what they are doing :-

As opposed to the gradutes and cowboys who can deliver even less efficient
code 10 times faster but with a total lack of respect for performance,
processing costs or future maintenance capability ;-|

Common with application level software written by "software
departments", but in practice rarely used on the small micro level.
Dave :)

Where quality doesn't matter.


> It's not really all that hard to build test functions into your code, it
> just takes a mind-set. Which, I admit, I don't always have - it takes a bit
> more effort to do. :) As you say though, it produces better quality
> product.
>
> Ken
It can also be a complete waste of time if your code is only small
and/or simple and fully testable within your actual application. It
will also mean extra code that can force you into a more expensive
micro solution which can mean big $$$$$$ if you are into high volume.
Test driven development is not a universal solution, it will always
depend on the circumstances.

We had a classic situation a few months back. We needed something
within a day or two to meet a critical deadline. We were quoted 3
weeks+ by the software department who wanted to implement the proper
software development cycle etc etc. I said bugger that and wrote it in
two hours, fully tested in the actual application in another hour or
two. Deadline met, customer happy, and a quality product 100% proven
by full application testing. Yes, we documented it later. Some markets
demand ridiculous deadlines.

Dave :)
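
One common way around the code-size cost Dave mentions is to keep the tests
off the target altogether: compile and check the module on a PC, and ship only
the bare module to the micro. The sketch below is purely illustrative and not
from anyone in this thread - the ADC register address, the HOST_TEST switch and
the scaling function are all made up - but it shows the shape of the idea: hide
the hardware access behind one tiny function so everything above it can be
exercised on the desktop.

/* Sketch only (invented names and addresses): the hardware read lives behind
   adc_read() so the scaling logic can be tested on a PC.
   Unit test build:  cc -DHOST_TEST adc_scale.c -o adc_test && ./adc_test
   The target build leaves HOST_TEST undefined and gets no test code at all. */
#include <stdint.h>

#ifdef HOST_TEST
static uint16_t fake_adc_value;                        /* the test sets this   */
static uint16_t adc_read(void) { return fake_adc_value; }
#else
#define ADC_DATA_REG (*(volatile uint16_t *)0x4000u)   /* invented address     */
static uint16_t adc_read(void) { return ADC_DATA_REG; }
#endif

/* 10-bit ADC against a 5.00 V reference: raw count -> millivolts. */
static uint16_t scale_millivolts(uint16_t raw)
{
    return (uint16_t)(((uint32_t)raw * 5000u) / 1023u);
}

uint16_t read_millivolts(void)
{
    return scale_millivolts(adc_read());
}

#ifdef HOST_TEST
#include <assert.h>
#include <stdio.h>

int main(void)
{
    fake_adc_value = 0;    assert(read_millivolts() == 0);
    fake_adc_value = 1023; assert(read_millivolts() == 5000);
    fake_adc_value = 512;  assert(read_millivolts() == 2502);  /* 512*5000/1023 */
    puts("ADC scaling checks passed");
    return 0;
}
#endif /* HOST_TEST */

Whether the handful of bytes an on-target harness would cost actually pushes
the part up a size grade is, as Dave says, a judgement call for the specific
job.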
 
"David L. Jones" <tronnort@yahoo.com> wrote in message
news:894aa0b3.0408212326.3bc32bd7@posting.google.com...
"Ken Taylor" <ken123@xtra.co.nz> wrote in message
news:<2ol6jtFbb1gjU1@uni-berlin.de>...
"Wayne Reid" <REMOVEgokangas@hotmail.com> wrote in message
news:pRaVc.26033$%r.287266@nasal.pacific.net.au...

"David L. Jones" <tronnort@yahoo.com> wrote in message
news:cfgvg0$506@odak26.prod.google.com...
Depends on the job.
If it's a big high profile mission critical job and you have the
time
to do it then it's a serious option, or more often just an essential
requirement dictated from higher up.

And rightly so. Not testing (or developing using a credible life cycle
development methodology) for mission critical systems is not
appropriate.
If
it is mission critical then it has to be robust.

But more often than not it's a waste of time and most people don't
bother, development times are too short these days.

If quality doesn't matter. Speed to market with a dud product that
ends up
having buckets of money and time poured in to fix it up is not
uncommon -
but it reflects on the lack of professionalism of the organisation.

It's also a popular way for a bad or just plain slow and inefficient
programmers to justify a job that takes 10 times longer than a good
talented programmer who knows what they are doing :-

As opposed to the gradutes and cowboys who can deliver even less
efficient
code 10 times faster but with a total lack of respect for performance,
processing costs or future maintenance capability ;-|

Common with application level software written by "software
departments", but in practice rarely used on the small micro level.
Dave :)

Where quality doesn't matter.


It's not really all that hard to build test functions into your code, it
just takes a mind-set. Which, I admit, I don't always have - it takes a
bit
more effort to do. :) As you say though, it produces better quality
product.

Ken

It can also be a complete waste of time if your code is only small
and/or simple and fully testable within your actual application. It
will also mean extra code that can force you into a more expensive
micro solution which can mean big $$$$$$ if you are into high volume.
> Test driven development is not a universal solution, it will always
> depend on the circumstances.

The quite valid point being along the lines of the right tool for the job...

Cheers.

ken
 
