Clifford Heath
Guest
On 5/10/19 1:32 am, jlarkin@highlandsniptechnology.com wrote:
On Fri, 4 Oct 2019 17:42:35 +1000, Clifford Heath <no.spam@please.net> wrote:
On 3/10/19 10:09 am, Clifford Heath wrote:
On 3/10/19 8:55 am, John Larkin wrote:
On Wed, 2 Oct 2019 17:22:50 -0400, bitrex <user@example.net> wrote:
software engineering and electrical engineering have diverged very far
over time, to the point that, outside a basic circuits class some students
take, there's little overlap between them.
They should teach EE students the fundamentals, as always. Physics,
electromagnetics, circuit theory, EDA (engineering unit concepts),
signals+systems, control theory. Without that background, anything
more "advanced" is useless.
I have no idea what software engineering is. I should buy a textbook.
I did buy a sociology textbook; it was hilarious.
Can anybody recommend a good (or at least popular) undergrad intro
textbook in software engineering?
Most software is not "engineered" in any real sense (and for that matter,
most "engineering" is not engineered either). Most "software engineers"
cannot even adequately define what they mean by "engineering".
My definition: engineering is solving problems by applying known
(measured) properties of available resources in a way that provides a
guarantee that certain outcomes will always be met, despite any natural
variation in the properties of those resources. So when a structural
engineer specifies a certain steel beam to support some load, they take
into account the specified load, the way the beam will be supported, and
the known properties of steel and of the type of beam specified.
In software, the "properties of resources" means the properties of
storage devices, operating systems, databases, data structures and
the algorithms that implement operations on them. These properties are
almost always specified only for the "happy path"; the worst-case
behaviour is seldom if ever specified. Instead, subsystems and libraries
are designed to avoid catastrophic worst cases (quadratic or worse
polynomial, or exponential runtimes, for example); O(N log N) is roughly
the worst that's allowed without prior warning.
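
To make that concrete, here's a toy Haskell sketch of my own (not from
any real library): two ways of asking "does this list contain a
duplicate?". Both give the right answer on the happy path; only the
worst-case cost differs, and that cost is exactly the property that
rarely gets written down.

import Data.List (sort)

-- Quadratic: each element is compared against everything after it,
-- so an N-element list with no duplicates costs ~N^2/2 comparisons.
hasDupNaive :: Eq a => [a] -> Bool
hasDupNaive []     = False
hasDupNaive (x:xs) = x `elem` xs || hasDupNaive xs

-- O(N log N): sort first, then any duplicate must be an equal neighbour.
hasDupSorted :: Ord a => [a] -> Bool
hasDupSorted xs = or (zipWith (==) ys (drop 1 ys))
  where ys = sort xs

main :: IO ()
main = do
  let worst = [1 .. 20000] :: [Int]   -- no duplicates at all is the worst case
  print (hasDupSorted worst)          -- returns immediately
  print (hasDupNaive worst)           -- visibly slower

Same observable behaviour, very different guarantee.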
In any case, software writers seldom know or consider these properties;
they just throw stuff at the wall and see what sticks. In the same way,
the structural engineer doesn't actually calculate the steel beam; they
just look it up in a table.
A former colleague trained first as a mech eng because he loved solving
the mathematical problems. During his first year working, he realised
that never in his career would he get to actually *use* the mathematical
analyses that he loved, so he went back to school and qualified in CS.
That's the kind of guy you want writing software for you (but they're
rare)!
So CS courses do actually teach the required analysis, and in my
experience software writers do have some awareness of it, but only
enough to avoid using pathological algorithms. For the most part the
difficult algorithms are implemented in infrastructure code, and very
few people are competent to write that (though many others try, and
they succeed only because spectacular computing power hides the
fundamental weakness of their code).
And then there's *systems* engineering, which is what happens when more
than one organisation works together to produce some system larger than
any of them could build by themselves. This brings in a raft of
human factors like clear communication, change management, etc. This too
can be engineered to some extent, but the vast majority of software
people never really find themselves in such a situation - and the Agile
movement is all about avoiding it. The one exceptional case is embedded
software, where you have a hardware team and a software team, and
neither can do what the other can. Our immaturity at systems engineering
explains the parlous state of embedded software.
I asked the dean of a big CS department what sort of programming they
teach nowadays. She handed me my head. "We don't teach programming!"
Oh. Sorry.
I *think* I understand that response. Modern CS is about *derivation*,
not about construction of step-by-step algorithms that perform some
derivation. By "derivation" I mean that for every current-state and
event there is an algebraic expression that derives next-state. This is
the basic idea behind functional programming. Every derivation is a
"function" that takes one parameter (the current state) and, when
applied, produces the next state. The compiler is responsible for
finding an algorithm to efficiently compute each function.
"Programming" then becomes the job of the compiler. The coder must find
a way to express their computation algebraically, and the compiler does
the rest. Haskell is perhaps the best exemplar of this.
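
As a sketch of what I mean (a toy of my own invention, not anyone's
production code), here's a toaster controller written as one pure
Haskell function from current state and event to next state:

-- Next state is an algebraic function of (current state, event).
data State = Idle | Toasting Int | Burnt     -- Int = seconds remaining
  deriving (Show)

data Event = LeverDown Int | Tick | LeverUp
  deriving (Show)

-- The whole behaviour is one total function.
next :: State -> Event -> State
next Idle         (LeverDown secs) = Toasting secs
next (Toasting 0) Tick             = Burnt           -- left in too long
next (Toasting n) Tick             = Toasting (n - 1)
next (Toasting _) LeverUp          = Idle
next s            _                = s               -- all other events ignored

main :: IO ()
main = print (foldl next Idle [LeverDown 2, Tick, Tick, LeverUp])
-- Toasting 2 -> Toasting 1 -> Toasting 0 -> Idle

No mutable variables, no step-by-step control flow to get wrong; how the
derivation is actually computed is the compiler's problem.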
Clifford Heath.
John: You asked, I answered. It would be nice to get some feedback.
CH
What do you mean by "guarantee that certain outcomes will always be
met"? There is always risk, always bugs. We always take chances, and
if we try too hard to eliminate risk we'll never get stuff done.
I mean that the behaviour falls inside the acceptable range of outcomes
with high enough probability, and outside the unacceptable range with
low probability. For a bridge, or a commercial airplane, the cost of
failure is very high. A toaster, not so much.
The business defines the acceptable outcomes and the risks, and the
engineer tries to ensure that the product meets expectations, i.e. falls
inside the acceptable ranges.
I avoid designing stuff that, if it fails, will blow expensive things
up or kill people.
Glad to hear that.
Programming is a lot more than algebra and algorithms. I don't think
the average programmer even knows how to do algebra. The real problem
in programming is to control procedural flow and synchronize events
and avoid idiotic pointer and buffer crashes.
That's why modern languages are moving away from the need for such things.
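
For instance (again a toy sketch of my own), in Haskell an out-of-range
lookup can't scribble over memory; the "not found" case is a value the
code has to deal with:

import qualified Data.Map as Map

-- Lookup returns Maybe, so "not there" is just a value to handle,
-- not a stray pointer waiting to crash or corrupt something.
priceOf :: String -> Map.Map String Double -> Maybe Double
priceOf = Map.lookup

main :: IO ()
main = do
  let prices = Map.fromList [("toaster", 29.95), ("bridge", 1.0e9)]
  case priceOf "airplane" prices of
    Nothing -> putStrLn "no such item"    -- the failure path is explicit
    Just p  -> print p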
> Computer Science doesn't seem to have much interest in computers.
Any computer that is Turing complete can simulate any other computer,
and can compute any computable function. If it weren't for CS, we
wouldn't even know that. CS folk work in the area of the theoretically
possible (like pure math), not the realisable. That's important, but it
doesn't try to solve business problems.
I think the average programmer has never heard of a state machine,
either.
I think you'd be totally wrong there, at least for any university graduate.
> Python looks OK. I should get into that one day.
If you don't like white-space as syntax, look at Ruby. It's similar, but
has a more conventional (and in fact quite beautiful) syntax. You'll
find fewer scientists use it, however, and more web wannabes.
CH