dead programming languages...

On 24-Feb-23 9:10 am, John Robertson wrote:
On 2023/02/22 11:05 a.m., John Larkin wrote:
https://en.wikipedia.org/wiki/Timeline_of_programming_languages


Now I'm told that we should be coding hard embedded products in C++ or
Rust.


If you know COBOL, then the US IRS may have work for you.
Apparently that is the language for their tax system...

Is 2036 going to be a problem? They want to phase COBOL out by 2030.

John :-#)#

I had to learn COBOL for my computer science degree (yes, I know that
that dates me). What a waste of time that was.

Though I may have written the world's only data-driven COBOL program.

Sylvia.
 
On 2/23/2023 12:26 PM, Joe Gwinn wrote:
Filtering the TIOBE list to remove languages unsuited to embedded
hardware uses, we end up with C and C++, and assembler.

That seems an awfully narrow view of what is appropriate for
embedded systems.

I use an "enhanced syntax" C, SQL, ASM and a perverted Limbo
(for user-written "scripts") in my current system.

I could have chosen other languages for the UI: I'd considered
python, lua, ml and rexx but opted for Limbo as it had inherent
support for RPC, was close to the C dialect, produced tiny code
and *seemed* like it would be most expressive for a user (who
doesn't want to be a "coder").

[It's also very portable as it runs on a VM]

Note my system is RT -- but RT doesn't have to mean "real fast".
Nor does HRT have to mean "must meet deadline".

Turned around, any language that is not decades old and now in wide
use is unlikely to endure. The first C version was released in 1972,
and the first C++ in 1983.

C is the modern BASIC. Anyone who has learned (or is using/playing
with) one of the other languages likely already knows (or is familiar
with) C. Enough so that he can understand what a *running* C program does
(but may not be able to make a NONrunning program work!)

The most important issue for development is the general availability
of the entire needed ecosystem, including toolchain (compilers,
debuggers, et al), operating-system interfaces, tracers, kernel
debuggers, et al, a community of interest, and customer support.

If available at time t < now, then you can always preserve the
development environment going forward. Bugs in the toolchain can be
noted and worked around (if the vendor goes away -- as happens often).

The bigger, long term problem with toolchains is ensuring that some
*new* platform that you intend to target will be supported! There's
nothing you can do in preserving an existing/old toolchain that
will magically make it work for the new target!

The next issue is availability of programmers for the chosen language
and ecosystem, covering not just the initial team but also the usual
rates of employee turnover, to maintain full staffing over time.
Which brings us to the next issue:

As a language wanes in popularity, the level of expertise also tends
to diminish. Someone may be able to "get a feel for" the code, yet
miss important details or consequences of the implementation. I
was reviewing a bit of SNOBOL and was quick to spot a bug that the
original implementer had likely overlooked. Someone with only a
cursory understanding of the language wouldn't have noticed it
until a particular test case irritated it!

How widely is this toolchain supported? Is it just one company, so
that there will be a forced redesign, a recode in a different language,
and a reimplementation of hardware when that company triples its prices
and ultimately fails, or decides to leave this business for greener
pastures, or whatever? Which happens all the time.

So, there must be multiple entities supporting the chosen ecosystem.
Historically, open-source ecosystems have fared better here.

The advantage being that, push-comes-to-shove, *you* can fix the
tools.

If one does need to change ecosystems, how hard will it be? If the
programming language is widely supported then, while a lot of work,
adapting existing code to the new toolchain is practical, whereas
rewriting an entire codebase in a different language is usually
totally impractical - so that existing code is a dead loss.

Exactly. The folks who wrote it *may* have learned something
from the effort. But, the folks intending to benefit from the effort
($$) are SoL.

For embedded realtime uses, the list of suitable languages and
ecosystems is relatively short. Assembly code is excluded, because
going from one processor type to another to another is basically a
full rewrite.

So we are basically left with C and C++ in their various dialects and
forms.

The basic difference is that C is smaller, simpler, and faster than
C++, and far more suited to direct control of hardware.

Again, this seems an overly narrow selection. Remember, RT just means
"time is important"; it says nothing about time SCALES. IIRC, AT&T
uses (used?) it in some of their infrastructure.

If you can predict and *bound* the performance of operations temporally,
then it can address RT applications. It (and most other languages)
can only address HRT applications "open loop"; the designer must ensure
that deadlines are met cuz the language can't inherently do that
(it has no notion of deadlines). Or, you have to wrap the application(s)
in an environment that can enforce them. E.g., kill off any applications
that miss their *hard* deadlines.
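
A minimal sketch of that "enforcing environment" idea, assuming a POSIX
host; do_work() and the 50 ms deadline are invented for illustration,
not taken from anyone's actual system:

#include <chrono>
#include <cstdlib>
#include <signal.h>
#include <sys/wait.h>
#include <unistd.h>

static void do_work() { /* hypothetical HRT task */ }

int main() {
    const auto deadline = std::chrono::milliseconds(50);

    pid_t child = fork();
    if (child == 0) {                        // child: attempt the work
        do_work();
        _exit(EXIT_SUCCESS);
    }

    const auto start = std::chrono::steady_clock::now();
    for (;;) {
        int status;
        if (waitpid(child, &status, WNOHANG) == child)
            return EXIT_SUCCESS;             // met the deadline
        if (std::chrono::steady_clock::now() - start >= deadline) {
            kill(child, SIGKILL);            // past the deadline, further
            waitpid(child, &status, 0);      // work has no value: stop it
            return EXIT_FAILURE;             // "Ooops" -- move on
        }
        usleep(1000);                        // poll; don't spin flat out
    }
}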

In the large radars of my experience, we use both. Stuff close to
hardware is in C (in turn controlling hardware using VHDL of some
kind), and the millions of lines of application code are in C++.

So, my vote would be plain ANSI C. Most C++ compilers can handle C as
well, and there are also C-specific compilers and toolchains.

The risk with C++ compilers is failing to constrain them to interpret
the code as strictly C. You'd likely not recognize if one had made some
C++ interpretations, as the functionality would likely be unchanged
(but performance might be!)
 
On 2/23/2023 3:57 PM, bitrex wrote:
On 2/23/2023 5:28 PM, Don Y wrote:
On 2/23/2023 3:08 PM, Clifford Heath wrote:
On 23/02/23 15:14, Don Y wrote:
On 2/22/2023 9:00 PM, Clifford Heath wrote:
On 23/02/23 14:02, bitrex wrote:
On 2/22/2023 2:05 PM, John Larkin wrote:
https://en.wikipedia.org/wiki/Timeline_of_programming_languages

Now I'm told that we should be coding hard embedded products in C++ or
Rust.

User-defined strong types that enforce their own usage are probably worth
the price of admission alone; e.g. quantities in newton/meters should be
of type NewtonMeters and foot/pounds should be FootPounds, and casually
performing operations with the two causes a compile error.

On the contrary, the language should automatically provide the appropriate
conversions.

I disagree, esp for anything beyond simple types.  (i.e., promote a
char to an int and hope the developer truly understands how signedness
is handled, etc.)

Pascal?

Requiring an explicit cast reassures me (the NEXT guy looking at
the code) that the developer actually intended to do what
he's doing in the way he's doing it -- even if it "makes sense".
This is what I liked most about C++ (overloading operators
so the *syntax* was clearer by hiding the machinery -- but not
eliminating it!)

How many times do you see an int being used as a pointer?
Is it *really* intended to be a pointer in THIS context?
I get tired of having to chase down compiler warnings
of this sort of thing in inherited code:  "If you WANT
it to be a pointer, then explicitly cast it as such!
Don't just count on the compiler to *use* it as one!"

Why on earth are you answering my statement about *units conversion* with a
rant about integer representations and pointer casts.

Because it's not *units* that bitrex was addressing,
rather, *types*.  Did you miss:

------------------------vvvvv
    "User-defined strong types that enforce their own usage
    are probably worth the price of admission alone;"

He could, perhaps, have come up with a different example.

Sure, there are all sorts of good reasons to use strong types beyond just
enforcing unit conversions; they also make for self-documenting code. A
contrived example is:

class Rectangle
{
public:
    Rectangle(float width, float height);
    ....
};

But then at the call site one coder writes:

auto r = Rectangle(4, 5);

Someone else looking at that will have to go back to the .h file to see
what order the parameters are in. You could instead have

class Rectangle
{
public:
    Rectangle(Width width, Height height);
    ....
};

and then at the call site is written:

auto rectangle = Rectangle{Width{4}, Height{5}};

So it's clearer what's going on.

More importantly, referencing r.radius will be flagged as an error
whereas r.area and c.area (e.g., Circle c) will work as expected.
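
A minimal sketch of such wrapper types, assuming C++11 or later
(Width/Height here are invented to match the example above):

struct Width  { float value; explicit Width(float v)  : value(v) {} };
struct Height { float value; explicit Height(float v) : value(v) {} };

class Rectangle
{
public:
    Rectangle(Width w, Height h) : width_(w.value), height_(h.value) {}
    float area() const { return width_ * height_; }
private:
    float width_, height_;
};

Rectangle r{Width{4}, Height{5}};    // intent is explicit at the call site
// Rectangle s{Height{5}, Width{4}}; // swapped arguments: compile error
// Rectangle t{4, 5};                // bare numbers: compile error (explicit)

The explicit constructors are what keep bare numbers from sneaking in.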
 
On 2/23/2023 12:42 PM, three_jeeps wrote:
In my world (safety critical sw systems, e.g. flight control, medical), Ada
is still used - probably on life support tho. More usage in Europe than
USA. C is out and out dangerous in this environment even when standards such
as MISRA-C are used.

Every language is dangerous when the practitioners don't understand the
tools of their trade sufficiently. Would you hand a soldering iron to
an accountant and expect him to make a good joint?

What do you mean by 'coding hard embedded products'? Do you mean 'hard
real-time embedded systems', i.e., systems where timing and thread
scheduling must be completely deterministic?

HRT only means that there is no value to continuing work on the
problem once the deadline has passed.

You could write your HRT code in LISP, COBOL, <whatever>. If it
meets the deadline (even if only occasionally), then so be it.
If not, pull the plug AT the deadline cuz there's nothing more
that can be done TOWARDS ACHIEVING THE GOAL.

The designer has to decide how important the deadlines are.
Missile defense likely assigns a high cost to missing a deadline.
Yet, I'm sure there is some accepted level of "missed deadlines"
formally specified in the design documents for those systems.
You don't shut down the defense system when *a* deadline is
missed (i.e., when the incoming armament has passed beyond the
point where you can counteract it). Rather, you say "Ooops"
and move your resources on to addressing other deadlines
(incoming armaments) -- instead of wasting effort on a lost
cause!

HRT is an excuse people use to avoid thinking about what to
do WHEN they miss a deadline (how many MORE resources would
you devote to being 100.00% sure you don't miss *any*? Is
this a reasonable cost to assume for the benefit afforded?)
 
On 2/23/2023 4:32 PM, Don Y wrote:
The basic difference is that C is smaller, simpler, and faster than
C++, and far more suited to direct control of hardware.

Again, this seems an overly narrow selection.  Remember, RT just means
"time is important"; it says nothing about time SCALES.  IIRC, AT&T
uses (used?) it in some of their infrastructure.

Sorry: "it" being Limbo/Inferno
 
On Wednesday, February 22, 2023 at 7:02:56 PM UTC-8, bitrex wrote:
On 2/22/2023 2:05 PM, John Larkin wrote:
https://en.wikipedia.org/wiki/Timeline_of_programming_languages


Now I'm told that we should be coding hard embedded products in C++ or
Rust.

User-defined strong types that enforce their own usage are probably
worth the price of admission alone; e.g. quantities in newton/meters
should be of type NewtonMeters and foot/pounds should be FootPounds, and
casually performing operations with the two causes a compile error.

You can get 98% of the way there by forcing yourself to name variables
with their units.

float delay_time = 123.456; // No, just no
float delay_time_ms = 123.456; // OK

-- john, KE5FX
 
On 2/23/2023 4:18 PM, Sylvia Else wrote:
I had to learn COBOL for my computer science degree (yes, I know that that
dates me). What a waste of time that was.

More so than Pascal? Algol? LISP? PL/1?

Do they still teach lambda calculus? Petri nets?

At least you can take solace in knowing that your paycheck was likely
printed as a result of the language's existence!

Though I may have written the world's only data-driven COBOL program.

Sylvia.
 
On 2/23/2023 4:55 PM, John Miles, KE5FX wrote:
[snip]

You can get 98% of the way there by forcing yourself to name variables
with their units.

float delay_time = 123.456; // No, just no
float delay_time_ms = 123.456; // OK

But that relies solely on discipline.

And, makes for cumbersome names, john_man.

What if you are referencing something "generic" or that carries
its own type information?

Do I really care if IP_address is v4 or v6 -- if I'm checking
to see if it matches some other IP_address?
 
On 2/22/23 19:42, Don Y wrote:
On 2/22/2023 8:25 PM, Don Y wrote:
On 2/22/2023 8:02 PM, bitrex wrote:
User-defined strong types that enforce their own usage are probably
worth the price of admission alone; e.g. quantities in newton/meters
should be of type NewtonMeters and foot/pounds should be FootPounds,
and casually performing operations with the two causes a compile error.

Unless you'd overloaded a cast operator to do that for you.

[I tried building a "units" library that allowed arbitrary
combinations of units (types) via overloaded operators.
It was wholly impractical.]

Note that my "Calculation" (utility) does this -- but with explicit
code to examine the types of each argument in each calculation,
accumulating a string of units (numerator, denominator) for the
results:
    5 feet x 14 inches = XXX area  (length*length)
    XXX area x 3 yards = YYY volume (length*length*length)
    YYY volume / 99 seconds = ZZZ flow rate (length*length*length/time)
etc.

The advantage this has is that you can mix units that aren't
strictly related.
That's also pretty handy for dimensional analysis...
 
On 24/02/23 09:28, Don Y wrote:
[snip]

Because it's not *units* that bitrex was addressing,
rather, *types*.  Did you miss:

------------------------vvvvv
   "User-defined strong types that enforce their own usage
   are probably worth the price of admission alone;"

He could, perhaps, have come up with a different example.

You clipped the example that I was responding to, namely:

"quantities in newton/meters should be of type NewtonMeters and
foot/pounds should be FootPounds, and casually performing operations
with the two causes a compile error"

That's about dimensional analysis, not data representations.
Strong types are a good thing too, but dimensional analysis might have
stopped some stoopid Americans from crash-landing on Mars.

CH
 
On 2/23/2023 5:34 PM, wmartin wrote:
[snip]

That's also pretty handy for dimensional analysis...

It was intended for use by "average joes" so they didn't have to deal
with conversion factors between similar units (e.g., feet vs. meters)
as well as keeping track of compound units (coulombs / second).
The user can request the result in whatever (compound) unit he wants
and the conversion will be automatic -- iff compatible (converting
square feet to amperes ain't gonna happen!)

Computations are done with "BigRationals" so the user doesn't have
to worry about overflow, underflow, etc.

Of course, it's not very speedy! :>
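
A toy sketch of that bookkeeping, assuming C++; a real version would
also carry conversion factors and arbitrary-precision ("BigRational")
values, which are omitted here:

#include <map>
#include <string>

// A quantity is a magnitude plus a map of dimension -> exponent,
// e.g. {length: 3, time: -1} for volumetric flow.
struct Quantity {
    double value;
    std::map<std::string, int> dims;
};

Quantity operator*(const Quantity &a, const Quantity &b) {
    Quantity r{a.value * b.value, a.dims};
    for (const auto &d : b.dims) r.dims[d.first] += d.second; // add exponents
    return r;
}

Quantity operator/(const Quantity &a, const Quantity &b) {
    Quantity r{a.value / b.value, a.dims};
    for (const auto &d : b.dims) r.dims[d.first] -= d.second; // subtract them
    return r;
}

// 5 ft x 14 in x 3 yd / 99 s ends up with dims {length: 3, time: -1},
// assuming each input was first converted to a common length unit.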
 
On 2/23/2023 6:02 PM, Clifford Heath wrote:
[snip]

You clipped the example that I was responding to, namely:

"quantities in newton/meters should be of type NewtonMeters and foot/pounds
should be FootPounds, and casually performing operations with the two causes a
compile error"

That's about dimensional analysis, not data representations.

No, you missed the point he was making. Note the reference to "compile error".

Strong types are a good thing too, but dimensional analysis might have stopped
some stoopid Americans from crash-landing on Mars.

CH
 
On 24/02/23 12:17, Don Y wrote:
[snip]

No, you missed the point he was making.  Note the reference to "compile
error".

Dimensional errors should also be compiler errors.

That's what he said, and that's what he meant.

Go read it again, see if your comprehension is any better this time.

CH
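
For what it's worth, C++ templates can indeed make dimensional errors
compile errors by carrying unit exponents in the type; a bare-bones
sketch (SI only, no conversion factors, all names invented here):

#include <iostream>

// Exponents of metres, kilograms, seconds carried in the type.
template<int M, int KG, int S>
struct Qty { double v; };

template<int M1, int KG1, int S1, int M2, int KG2, int S2>
Qty<M1+M2, KG1+KG2, S1+S2> operator*(Qty<M1,KG1,S1> a, Qty<M2,KG2,S2> b) {
    return {a.v * b.v};    // multiplying quantities adds exponents
}

template<int M, int KG, int S>
Qty<M,KG,S> operator+(Qty<M,KG,S> a, Qty<M,KG,S> b) {
    return {a.v + b.v};    // addition only compiles for identical dimensions
}

using Metres  = Qty<1,0,0>;
using Newtons = Qty<1,1,-2>;   // kg*m/s^2
using Joules  = Qty<2,1,-2>;   // N*m

int main() {
    Newtons f{3.0};
    Metres  d{2.0};
    Joules  work = f * d;      // OK: exponents sum to those of Joules
    // Joules bad = f + d;     // compile error: dimensions differ
    std::cout << work.v << "\n";
}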
 
On 24-Feb-23 11:00 am, Don Y wrote:
On 2/23/2023 4:18 PM, Sylvia Else wrote:
I had to learn COBOL for my computer science degree (yes, I know that
that dates me). What a waste of time that was.

More so than Pascal?  Algol?  LISP?  PL/1?

Well, other than Algol we didn't have to learn those.

I do remember writing something in a weird dialect of Algol released by
Xerox, where all the reserved words had to be quoted. But that must have
been while I was working for a year before I went to University.

Do they still teach lambda calculus?  Petri nets?

No idea. I was at uni in the late 70s.

At least you can take solace in knowing that your paycheck was likely
printed as a result of the language's existence!

Could probably have been done just as well in Fortran.

Sylvia.
 
On 2/23/2023 6:42 PM, Clifford Heath wrote:
[snip]

Dimensional errors should also be compiler errors.

That's what he said, and that's what he meant.

Go read it again, see if your comprehension is any better this time.

Read his followup.
 
On 2/23/2023 7:29 PM, Sylvia Else wrote:
On 24-Feb-23 11:00 am, Don Y wrote:
On 2/23/2023 4:18 PM, Sylvia Else wrote:
I had to learn COBOL for my computer science degree (yes, I know that that
dates me). What a waste of time that was.

More so than Pascal?  Algol?  LISP?  PL/1?

Well, other than Algol we didn't have to learn those.

It seems like every CS course I took used a different language and
different OS. The notion was that the language wasn't important;
the *concepts* being taught were!

E.g., whether a language used call-by-value or call-by-reference
semantics might be significant -- and why. Or, how organizing
things in *lists* differs from representing data in other
structures. Typed vs. untyped data, etc. Why you would design
a language (or a machine architecture!) to support these things...

I do remember writing something in a weird dialect of Algol released by Xerox,
where all the reserved words had to be quoted. But that must have been while I
was working for a year before I went to University.

Yeah, I wrote more code in ASM (work) than I ever wrote in school but
*while* I was in school. Big difference between the tools and
development/execution environments! Hard to switch between them
from morning (school) to evening (work) -- and then back again!

And, a significant mindset adjustment has to be made.
I recall writing a driver for a PROM programmer in Pascal.
I would build functions to output a nybble. Another to
invoke that, twice, to output a *byte*. Invoke THAT twice
to output a *word*, etc.

In ASM, this seems only natural. And, it's incredibly terse. But,
in a HLL, it looks inconveniently bloated and inefficient!
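
In C/C++ the composition he describes would look something like this
(a guess at the shape; the original was Pascal and ASM driving a PROM
programmer, so out_nybble()'s body here is just a stand-in):

#include <cstdint>
#include <cstdio>

static void out_nybble(std::uint8_t n) {
    std::printf("%X", unsigned{n});  // stand-in: a real driver strobes hardware
}

static void out_byte(std::uint8_t b) {
    out_nybble(b >> 4);              // high nybble first
    out_nybble(b & 0x0F);
}

static void out_word(std::uint16_t w) {
    out_byte(w >> 8);                // high byte first
    out_byte(w & 0xFF);
}

int main() { out_word(0xBEEF); }     // prints BEEF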

Do they still teach lambda calculus?  Petri nets?

No idea. I was at uni in the late 70s.

So, your experience is as stale as mine. :< While I don't
use either technology, formally, any more, I know they both had
a definite influence on the way I *think* about algorithms and
implementations.

At least you can take solace in knowing that your paycheck was likely
printed as a result of the language's existence!

Could probably have been done just as well in Fortran.

That was my first (actually, second) experience writing code.
On Hollerith cards, of course. Amusing to think your proficiency in
operating the punch was as much a factor in your "productivity"
as were your programming skills!

[For the longest time, I had a box of blank cards stashed in a
closet -- they had a stylized /Mens et Manus/ logo (seal) imprinted
on their fronts. They were handy as scrap paper -- small and
disposable!]
 
On 2/23/2023 8:00 PM, Don Y wrote:
On 2/23/2023 7:29 PM, Sylvia Else wrote:
On 24-Feb-23 11:00 am, Don Y wrote:
On 2/23/2023 4:18 PM, Sylvia Else wrote:
I had to learn COBOL for my computer science degree (yes, I know that that
dates me). What a waste of time that was.

More so than Pascal?  Algol?  LISP?  PL/1?

Well, other than Algol we didn't have to learn those.

I actually liked Algol. I often use the ":=" notation
in my commentaries (and "::=" for BNFs).

Limbo is a bastardization of this. The ":=" operator instantiates
and defines a variable to have the type of its RHS as well as making
the assignment. The "=" operator just makes the assignment (to an
already-existing variable instance).
 
On 2/23/2023 6:55 PM, John Miles, KE5FX wrote:
[snip]

You can get 98% of the way there by forcing yourself to name variables
with their units.

float delay_time = 123.456; // No, just no
float delay_time_ms = 123.456; // OK

-- john, KE5FX

That's cool and you should do that anyway, but user-defined types can
also contain their own logic and enforce their own usage, like a type
called Age or ElapsedTime or suchlike won't let you do something like
Age age = -5 (or some non-contrived operation where the result is
materially equivalent.)
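
A sketch of that sort of self-validating type, assuming C++ and
exceptions as the failure policy (an embedded version might clamp,
assert, or return an error code instead):

#include <stdexcept>

class Age {
public:
    explicit Age(int years) : years_(years) {
        if (years < 0)
            throw std::invalid_argument("Age cannot be negative");
    }
    int years() const { return years_; }
private:
    int years_;
};

Age ok{30};       // fine
// Age a = -5;    // compile error: constructor is explicit
// Age b{-5};     // compiles, but refuses at run time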
 
On 24-Feb-23 2:00 pm, Don Y wrote:

That was my first (actually, second) experience writing code.
On Hollerith cards, of course.  Amusing to think your proficiency in
operating the punch was as much a factor in your "productivity"
as were your programming skills!

Using a punch? Sheer luxury. We were using coding sheets that managed to
be garbled by punch operators.

Fortunately, this was before the time when most people could type, and
the few machines available to students were not much used, other than by me.

Also, the place I was working during the holidays was a time-sharing
service (remember those?), so I did most of the COBOL work there.

Sylvia.
 
On 2/23/2023 8:21 PM, bitrex wrote:
That's cool and you should do that anyway, but user-defined types can also
contain their own logic and enforce their own usage, like a type called Age or
ElapsedTime or suchlike won't let you do something like Age age = -5 (or some
non-contrived operation where the result is materially equivalent.)

The compiler will enforce rules about illegal mixing of types.
You shouldn't, for example, malloc(3.14159) -- the implied
type of "3.14159" is not compatible with a size_t, and the compiler
will at least warn about the implicit conversion.

Languages that support user-defined types, casts, operator
overloading, etc. give you much more leeway in what they will
"quietly" accept as legit.

So, if you had a:
malloc(size_t)
and
malloc(float)
then for malloc(3.14159f) it would choose the second of these without
complaining (and leave it up to you to sort out what a float means in
that context)
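
A quick check of that, assuming C++ (allocate() is a stand-in name,
since the real malloc can't be overloaded): with a float literal the
float overload is an exact match, while a plain double literal would
actually make the call ambiguous:

#include <cstddef>
#include <iostream>

void allocate(std::size_t n) { std::cout << "size_t: " << n << "\n"; }
void allocate(float f)       { std::cout << "float: "  << f << "\n"; }

int main() {
    allocate(std::size_t{42}); // picks the size_t overload
    allocate(3.14159f);        // picks the float overload: exact match
    // allocate(3.14159);      // double: ambiguous -- compile error
}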

Some languages silently perform implicit type conversions and/or
casts. This can mislead the developer, depending on the characteristics
of the particular compiler/target in use.

E.g., you can compare an int to a float. But, the float may
erroneously appear to be identical to -- or different from -- the
"same valued" (or different-valued) int, as the representation
of the int (or float) may not neatly survive the conversion (cast)
imposed on it.
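
A concrete instance of that trap, assuming a 32-bit int and an
IEEE-754 float (24-bit significand):

#include <iostream>

int main() {
    int   i = 16777217;        // 2^24 + 1: not representable in a float
    float f = 16777216.0f;     // 2^24

    // For the comparison, i is converted to float and rounds to 2^24,
    // so two clearly different values compare "equal".
    std::cout << (i == f ? "equal!?" : "different") << "\n";
}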

C tends to be less pedantic. So,
typedef int feet;
typedef int inches;
just says "'feet' and 'inches' are *pseudonyms* for 'int'".

As such:
feet length;
inches width;
...
area = length * width;
is perfectly legitimate. And, likely NOT what you really wanted.

Or:
typedef int apple;
typedef int carrots;
apple bushels;
carrots bunches;
...
quantity = bushels + bunches;
This *might* make sense. But, likely doesn't (unless you are interested
in "items" -- in which case, why didn't you define an "items" type??)

OTOH, if you can create true ("first class") types, you can create
casts to convert Feet to Inches (uppercase to show their promotion
to "actual" types) and let the compiler automatically invoke the
appropriate cast to save you the explicit invocation of a
"conversion function".

But, many of these operators aren't trivial. It's not like sign-
extending a signed byte to a signed long. As a result, someone
looking at the code will have a harder time:
- remembering that this is happening "behind the scenes"
- understanding the cost (time/storage) of the operation
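
A sketch of the Feet/Inches idea, assuming C++, with an implicit
conversion operator standing in for the "automatic cast" (all names
invented here):

struct Inches {
    double value;
    explicit Inches(double v) : value(v) {}
};

struct Feet {
    double value;
    explicit Feet(double v) : value(v) {}
    operator Inches() const { return Inches{value * 12.0}; } // auto-convert
};

double frame_width(Inches i) { return i.value; }

int main() {
    // Feet quietly becomes Inches{24.0} at the call; convenient, but
    // the conversion (and its cost) is invisible at the call site --
    // exactly the trade-off described above.
    return frame_width(Feet{2.0}) == 24.0 ? 0 : 1;
}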
 
