We would not be here were it not for DTSS

On Wednesday, February 5, 2020 at 2:18:16 AM UTC+11, DecadentLinux...@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in
news:fbd9cf64-ba5b-4b1d-b804-2289b671907d@googlegroups.com:

On Tuesday, February 4, 2020 at 9:21:25 PM UTC+11,
DecadentLinux...@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in
news:3ec9c3c4-2f49-4603-aaa5-be43254a90cc@googlegroups.com:

The physical size of the computer didn't come into it.

It was just a reference to the timeline ya dumbfuck. Those were the only machines around then. You are thick.

You are thicker. The PDP-8 wasn't remotely room-sized, and your
grasp of what was going on is - to put it kindly -
over-simplified.


AGAIN, you RETARDED FUCK!!! It did not even come out until the
middle of 1965.

Your grasp of the timeline is munged because of your retarded
attitude.

Whoever taught you computer history seems to have cut his material back to keep it within the grasp of a rather dumb class.

I was writing in assembler as a kid, back when you were watching
others use computers.

I wrote 900 lines of assembler for the PDP8 I used back in 1967. The program forms appendix 2 of my Ph.D. thesis (pages 226-251). The description of what it did (and why it did it that way) is on pages 55 to 71.

It wasn't a kiddy project, and it ran on interrupts from three different sources, which made life interesting.

> You're a goddamned idiot. You == classless.

The idiocy is all yours.

--
Bill Sloman, Sydney
 
On Wednesday, February 5, 2020 at 12:51:24 AM UTC+11, Martin Brown wrote:
On 04/02/2020 09:26, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 8:04:04 PM UTC+11,
DecadentLinux...@decadence.org wrote:
upsidedown@downunder.com wrote in
news:5d8i3ftv14mcvhvlpu40spj5u2tt1tek3n@4ax.com:

BASIC was not the first attempt to give computing to the masses.

FORTRAN (FORmula TRANslation) was a 1950s attempt.

Wrong. FORTRAN was for the original building-sized computers, not "the masses". The "masses" it "was for" were college computer science and engineering students, and staff engineers at the companies using the computers that were in the field. There were no computers in the hands of "the masses" back then.

This misses the point. The original computer programmers programmed
in machine language. Assembly languages made the process easier by
letting you write easily remembered words which could be directly
translated to the appropriate binary string.

FORTRAN was the first higher-level language which was compiled to
create the appropriate sequences of binary strings.

No it wasn't. Alick Glennie's Autocode for the Manchester Mark 1 computer in 1952 was arguably the first ever compiled language.

Other computer labs may disagree about exactly whose was the first.

FORTRAN went on to rule the world for quite a while. I don't recall ever even hearing about Autocode.

Autocode Mk1 went on to be used on the Ferranti Mercury line of computers. Only the very largest businesses and national laboratories could afford to run one.

He's another of those unsung heroes that nobody has ever heard of.
https://en.wikipedia.org/wiki/Alick_Glennie
Even his Wiki entry is a bit sparse.

The physical size of the computer didn't come into it. The fact that
there weren't many of them initially limited the number of people who
needed to write computer programs, but when integrated circuits made
it a lot cheaper to build computers, the demand went up, and places
like Dartmouth started churning out programmers. Their existence
reflected the fact that semiconductor technology had changed the world.
They were an effect, not a cause.

The physical size and sheer difficulty of keeping them running kept
computers out of all but a few enthusiast hands until the advent of the
home computer. A few dedicated chips to play ping pong arrived first.

The PDP-11 mini-computer had a shoe-box sized variant.

https://en.wikipedia.org/wiki/PDP-11

DEC sold about 600,000 of them, and did well with the VAX too.

They weren't home computers, but anybody who needed a computer seems to have been able to get one and find room for it.

The Z80-based Nascom, from £200, was the first half-decent home computer kit that I can recall, but it required fairly deep pockets at the time.
https://en.wikipedia.org/wiki/Nascom_(computer_kit)
(and a fair degree of skill with a soldering iron).

My first home computer was one of Alan Sugar's Amstrad versions of the IBM PC (which I picked up second hand). My wife got hers some months earlier.

If your computer history major didn't teach you that, you were being
sold a line of goods, probably by somebody who put a high value on
their own contribution.

Until the Sinclair MK14 there really were no affordable home computers
of any sort in the UK. Affordable and capable mass market things like
the TI99/4, BBC Micro and ZX80 didn't appear until the 1980s. Not long
after that most homes had one if only for teenagers to play games on.

Not really relevant data. The process of getting computers cheap enough to make them office furniture had happened some years earlier. At work I'd had a glass teletype hooked up to a VAX years before I got a home computer.

--
Bill Sloman, Sydney
 
On Wednesday, February 5, 2020 at 4:55:50 AM UTC+11, dca...@krl.org wrote:
On Tuesday, February 4, 2020 at 1:16:19 AM UTC-5, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 3:58:53 PM UTC+11, tabb...@gmail.com wrote:
On Tuesday, 4 February 2020 01:40:00 UTC, John Robertson wrote:
On 2020/02/03 5:13 p.m., Bill Sloman wrote:
On Tuesday, February 4, 2020 at 6:57:19 AM UTC+11, John S wrote:
On 2/2/2020 5:38 AM, DecadentLinuxUserNumeroUno@decadence.org wrote:

This is one of the major groups that brought us forward.

https://www.youtube.com/watch?v=WYPNjSoDrqw


Yes, I think so. That was a very interesting video.

I remember time-sharing and BASIC as well as Super BASIC. My program
allowed me to design very small high voltage transformers in a few
seconds rather than hours. And other designs as well. That was in the
early 60s.

Note that BASIC still exists in many forms and is very useful for
solving problems quickly. Power Basic comes to mind. I think JL uses it
for several purposes, like for his stock parts inventory.

And people think that climate change denial propaganda is worth propagating.

The fact that you can use BASIC and its variants to eventually do what you could do faster in a more appropriate language doesn't make it a good choice.

DLUNU was claiming that the invention of BASIC represented some kind of advance. Teaching lots of undergraduates to program was a good idea, but that's it.


That's not unlike saying that teaching kids to read and write was a good
idea, but that's it. They'd probably just use reading and writing to record
cooking recipes - a waste of time!

It seems to me that allowing the average kid access to computers sparked
a major interest in these machines that would not have developed
anywhere near as fast.

In this matter I might fairly claim that you are the denier...(ducking)

Funny that he doesn't get the entire point of basic & how it changed the world.

The entire point of Basic was that it would run on tiny computers, and it didn't change the world.

Actually, Basic originated at Dartmouth as part of a time-sharing system run on a mainframe.

A time-shared system offers each user a tiny slice of the mainframe.

> And the article presents it as the first time-sharing system. Time sharing did change the world to some extent.

But Dartmouth didn't invent it. The idea of switching a processor between tasks goes back a long way. The PDP8 I did my thesis work on was set up for interrupt control, and I used three levels of interrupt in the program I wrote for it (which did involve checking where each interrupt had come from) - so clearly the idea had been around when the PDP-8 was being developed, and it was introduced in 1965.

--
Bill Sloman, Sydney
 
On Wed, 5 Feb 2020 09:20:09 +1100, Clifford Heath <no.spam@please.net>
wrote:

On 5/2/20 2:34 am, DecadentLinuxUserNumeroUno@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in news:f64588f4-cc6c-4b12-
9d88-a142dffaeb6e@googlegroups.com:
BASIC was essentially FORTRAN, cut down to shrink the compiler.

BASIC did not have FORMAT, COMMON, EQUIVALENCE statements and other
oddities (such as spaces in names), but it had character strings which
FORTRAN II/IV did not have.


>> BASIC was NOT a compiled language, you stupid fuck.

What prevents compiling BASIC programs? An interpreter is required
only for languages that support self-modifying source programs. That
is a different thing from self-modifying instructions, which only need
program storage in R/W memory.

There have been BASIC source interpreters, converters to some
intermediate format, as well as fully compiled implementations.

And it did not TRANslate FORMulae, just used a stack to evaluate the
results as it stepped through an expression.

For evaluating arithmetic expressions you do not need recursion; one
can convert the expression to RPN and execute it on a software stack.
Early Fortrans also had limits on expression complexity, e.g. an array
index could only be of the form constant1 * indexVar + constant2, in
which constant1 and/or constant2 could be omitted.
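
A minimal sketch of that stack-based approach (in Python rather than anything period-accurate; the token list and variable table below are invented purely for illustration):

def eval_rpn(tokens, variables):
    # Evaluate an expression already converted to reverse Polish notation,
    # using an explicit stack instead of recursion.
    stack = []
    for tok in tokens:
        if tok in ("+", "-", "*", "/"):
            b = stack.pop()
            a = stack.pop()
            stack.append({"+": a + b, "-": a - b,
                          "*": a * b, "/": a / b}[tok])
        elif tok in variables:
            stack.append(variables[tok])
        else:
            stack.append(float(tok))
    return stack.pop()

# 3 * I + 2 in RPN, i.e. the constant1 * indexVar + constant2 form, with I = 4:
print(eval_rpn(["3", "I", "*", "2", "+"], {"I": 4}))   # prints 14.0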

But I think Bill was referring to lexical similarity, and I think I can
see that. It certainly wasn't pretending to be an Algol derivative.

The control structures in Basic and Fortran II/IV were primitive.
Fortran-77 introduced block structures.
 
On Wednesday, February 5, 2020 at 5:31:20 AM UTC+11, dagmarg...@yahoo.com wrote:
On Monday, February 3, 2020 at 11:58:53 PM UTC-5, tabb...@gmail.com wrote:
On Tuesday, 4 February 2020 01:40:00 UTC, John Robertson wrote:
On 2020/02/03 5:13 p.m., Bill Sloman wrote:
On Tuesday, February 4, 2020 at 6:57:19 AM UTC+11, John S wrote:
On 2/2/2020 5:38 AM, DecadentLinuxUserNumeroUno@decadence.org wrote:

<snip>

Funny that he doesn't get the entire point of basic & how it changed the world.

Indeed. BASIC was great, FORTRAN's mindset was inflexible, fossilized
& sclerotic; fat-headed.

There was never a FORTRAN mindset. It was just a programming language - a rather clunky one, but complete. There was a lot of good stuff written in FORTRAN which wasn't worth re-writing in later languages, so people have kept on using it.

--
Bill Sloman, Sydney
 
On 5/2/20 9:38 am, Bill Sloman wrote:
On Wednesday, February 5, 2020 at 2:34:37 AM UTC+11, DecadentLinux...@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in news:f64588f4-cc6c-4b12-
9d88-a142dffaeb6e@googlegroups.com:
BASIC was essentially FORTRAN, cut down to shrink the compiler.
BASIC was NOT a compiled language, you stupid fuck.

An interpreter is just another compiler.

Having built and worked on many compilers and interpreters in my career,
I can confidently assert that this is utterly false.

A compiler builds a symbolic version of some code in order to emit
equivalent code in another language.

An interpreter merely executes the instructions in the source code.

A tokenising interpreter builds a symbolic representation of the input
syntax (and executes that), but does not produce a symbolic model of the
intent (semantics).

Pascal's P-code was "another language" produced by a real compiler, but
to make the job easier, the target language was for a VM, not the target
hardware - so the job of producing equivalent machine code was not
necessary.

The original Basic was a pure source-code interpreter, but there are
later implementations of similar languages that use each of the above
approaches.
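
A toy sketch of the distinction (in Python, with a one-line "PRINT a + b" source language invented purely for illustration; it has nothing to do with any real BASIC implementation):

def interpret(line):
    # An interpreter executes the instruction straight from the source text.
    _, expr = line.split(" ", 1)
    a, b = (float(x) for x in expr.split("+"))
    print(a + b)

def compile_to_c(line):
    # A compiler builds a representation of the source and emits equivalent
    # code in another language (here a fragment of C); nothing is run yet.
    _, expr = line.split(" ", 1)
    a, b = (x.strip() for x in expr.split("+"))
    return 'printf("%g\\n", {} + {});'.format(a, b)

interpret("PRINT 2 + 3")             # runs now and prints 5.0
print(compile_to_c("PRINT 2 + 3"))   # emits: printf("%g\n", 2 + 3);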

Please stop this pissing contest.

CH
 
onsdag den 5. februar 2020 kl. 04.00.08 UTC+1 skrev whit3rd:
On Tuesday, February 4, 2020 at 4:51:35 AM UTC-8, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 9:33:20 PM UTC+11, tabb...@gmail.com wrote:

Computing today without Basic would be significantly less advanced.

Really? BASIC was a very primitive language, and it only survives because some people never got around to learning anything better.

Yeah, it's not the language, it's the widespread AVAILABILITY of a language that matters;
there weren't a lot of FORTRAN or APL or ALGOL machines on the consumer market, but Commodore,
Apple, TI 99/4, Atari, and IBM delivered (early 1980s) machines that, out-of-the-box, could run BASIC.
That was an improvement on the CP/M machines offered in the seventies.

yeh, how many people got into a career of programming from their first taste of programming basic on one of the 17 million C64s sold
 
On Tuesday, February 4, 2020 at 4:51:35 AM UTC-8, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 9:33:20 PM UTC+11, tabb...@gmail.com wrote:

Computing today without Basic would be significantly less advanced.

Really? BASIC was a very primitive language, and it only survives because some people never got around to learning anything better.

Yeah, it's not the language, it's the widespread AVAILABILITY of a language that matters;
there weren't a lot of FORTRAN or APL or ALGOL machines on the consumer market, but Commodore,
Apple, TI 99/4, Atari, and IBM delivered (early 1980s) machines that, out-of-the-box, could run BASIC.
That was an improvement on the CP/M machines offered in the seventies.

The Macintosh broke that mold (you could get BASIC, but that wasn't standard equipment).
The GUI options favored more modern environments (like, Excel - which was Macintosh-only
at introduction).

For my Mac, there was a BASIC but also FORTRAN and APL. And did array-processor things
for work, using the graphic-terminal capability of the good old toaster.
 
Bill Sloman <bill.sloman@ieee.org> wrote in
news:cf28cebc-9f11-4392-8ba6-6d041110e74a@googlegroups.com:

My first home computer was one of Alan Sugar's Amstrad versions of
the IBM PC (which I picked up second hand). My wife got hers some
months earlier.

You left out what it ran on. That is how one dates their "first
computer".

I used actual IBM XTs at my workplace in 1986 to make 4X PCB
layouts with AutoCAD 2.

My first PC was a NextStep 286 though. That easily pinpoints the
timeline as being after the 286 came out.

Your thing could be a 286 or a 386, etc. but just mentioning the
line does not pinpoint the model or the year.

I guess it could even be an XT clone, but the clone makers started
with the 286 in most cases. There were not too many that started
with the 8088. Save for like Heathkit or such.
 
On 2020-02-04, upsidedown@downunder.com <upsidedown@downunder.com> wrote:

What prevents compiling BASIC programs? An interpreter is required
only for languages that support self-modifying source programs. That
is a different thing from self-modifying instructions, which only need
program storage in R/W memory.

Many BASIC dialects (e.g. BASICA, QBASIC) support some form of eval(); these dialects don't compile.
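
A rough sketch of why (Python standing in for such a dialect; the formula prompt is just an assumed example): the text handed to eval only exists at run time, so an ahead-of-time translator has nothing to work on and the implementation ends up carrying an interpreter anyway.

formula = input("Formula in x, e.g. 3*x + 2: ")   # only known at run time
for x in range(5):
    # eval() has to parse and execute the string here and now; an
    # ahead-of-time compiler had nothing to translate when it ran.
    print(x, eval(formula, {"__builtins__": {}}, {"x": x}))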

--
Jasen.
 
On Wednesday, February 5, 2020 at 2:00:08 PM UTC+11, whit3rd wrote:
On Tuesday, February 4, 2020 at 4:51:35 AM UTC-8, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 9:33:20 PM UTC+11, tabb...@gmail.com wrote:

Computing today without Basic would be significantly less advanced.

Really? BASIC was a very primitive language, and it only survives because some people never got around to learning anything better.

Yeah, it's not the language, it's the widespread AVAILABILITY of a language that matters; there weren't a lot of FORTRAN or APL or ALGOL machines on the consumer market, but Commodore, Apple, TI 99/4, Atari, and IBM delivered (early 1980s) machines that, out-of-the-box, could run BASIC.

The machines don't run a particular programming language.

They run whatever software they are loaded with.

Quite often you have to install the Linux operating system to get access to the compiler you want.

> That was an improvement on the CP/M machines offered in the seventies.

You are confusing the operating system with the compilers that came with it.

> The Macintosh broke that mold (you could get BASIC, but that wasn't standard equipment). The GUI options favored more modern environments (like, Excel - which was Macintosh-only at introduction).

Apple did try harder than most to stop you running non-Apple software. It wasn't a virtue.

> For my Mac, there was a BASIC but also FORTRAN and APL. And did array-processor things for work, using the graphic-terminal capability of the good old toaster.

So you didn't know what was going on under the bonnet.

--
Bill Sloman, Sydney
 
Lasse Langwadt Christensen <langwadt@fonz.dk> wrote in
news:a72ad596-cd45-42cb-bd51-d8ae90bd4441@googlegroups.com:

onsdag den 5. februar 2020 kl. 04.00.08 UTC+1 skrev whit3rd:
On Tuesday, February 4, 2020 at 4:51:35 AM UTC-8, Bill Sloman
wrote:
On Tuesday, February 4, 2020 at 9:33:20 PM UTC+11,
tabb...@gmail.com wrote:

Computing today without Basic would be significantly less
advanced.

Really? BASIC was a very primitive language, and it only
survives because some people never got around to learning
anything better.

Yeah, it's not the language, it's the widespread AVAILABILITY of
a language that matters; there weren't a lot of FORTRAN or APL or
ALGOL machines on the consumer market, but Commodore, Apple, TI
99/4, Atari, and IBM delivered (early 1980s) machines that,
out-of-the-box, could run BASIC. That was an improvement on the
CP/M machines offered in the seventies.


yeh, how many people got into a career of programming from their
first taste of programming basic on one of the 17 million C64s
sold

It is amazing how many C64 rescue videos are out there.
 
On Wednesday, February 5, 2020 at 4:41:39 PM UTC+11, DecadentLinux...@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in
news:cf28cebc-9f11-4392-8ba6-6d041110e74a@googlegroups.com:

My first home computer was one of Alan Sugar's Amstrad versions of
the IBM PC (which I picked up second hand). My wife got hers some
months earlier.


You left out what it ran on. That is how one dates their "first
computer".

MS-DOS was the operating system, if I remember rightly.

I used actual IBM XTs at my workplace in 1986 to make 4X PCB
layouts with AutoCAD 2.

You have my deepest sympathy.

My first PC was a NextStep 286 though. That easily pinpoints the
timeline as being after the 286 came out.

Your thing could be a 286 or a 386, etc. but just mentioning the
line does not pinpoint the model or the year.

It would have done if you'd lived in the UK at the time - the late 1980s.

https://en.wikipedia.org/wiki/Amstrad

It was 8086-based.

I guess it could even be an XT clone, but the clone makers started
with the 286 in most cases. There were not too many that started
with the 8088. Save for like Heathkit or such.

Amstrad was a lot more mass market than that.

--
Bill Sloman, Sydney
 
Bill Sloman <bill.sloman@ieee.org> wrote in news:9bdd6032-3d58-4935-
9086-f003ac12c3db@googlegroups.com:

You left out what it ran on. That is how one dates their "first
computer".

MS-DOS was the operating system, if I remember rightly.

I was referring to the processor that line had.
 
On Tuesday, February 4, 2020 at 10:06:08 PM UTC-8, Bill Sloman wrote:
On Wednesday, February 5, 2020 at 2:00:08 PM UTC+11, whit3rd wrote:
On Tuesday, February 4, 2020 at 4:51:35 AM UTC-8, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 9:33:20 PM UTC+11, tabb...@gmail.com wrote:

Computing today without Basic would be significantly less advanced.

Really? BASIC was a very primitive language, and it only survives because some people never got around to learning anything better.

Yeah, it's not the language, it's the widespread AVAILABILITY ...Commodore, Apple, TI 99/4, Atari, and IBM delivered (early 1980s) machines that, out-of-the-box, could run BASIC.

The machines don't run a particular programming language.

They do if it's in ROM.
The Macintosh broke that mold (you could get BASIC, but that wasn't standard equipment). The GUI options favored more modern environments (like, Excel - which was Macintosh-only at introduction).

Apple did try harder than most to stop you running non-Apple software. It wasn't a virtue.

For my Mac, there was a BASIC but also FORTRAN and APL. And did array-processor things for work, using the graphic-terminal capability of the good old toaster.

No secrecy problem from Apple; the APL was from Portable Software, the Fortran came from Absoft,
and Excel from Microsoft... and the terminal emulator was written by a user.

> So you didn't know what was going on under the bonnet.
Oh, I made and used a few long T15 screwdrivers back in the day. And volumes one through
six of Inside Macintosh still sit on my shelf.
 
On Tuesday, 4 February 2020 15:39:55 UTC, DecadentLinux...@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in
news:f64588f4-cc6c-4b12-9d88-a142dffaeb6e@googlegroups.com:

It's got no other obvious advantage as a teaching tool, and if it
persists in computer science classes, it is for the same reason
that elementary electronics classes still use the 741 and the 555
- nobody has bothered to rewrite the course notes around anything
better.

You are a true idiot.

pretty much, albeit not the classic kind of idiot. The huge narcissistic ego does him no favours.
 
On Wednesday, February 5, 2020 at 6:26:03 PM UTC+11, DecadentLinux...@decadence.org wrote:
Bill Sloman <bill.sloman@ieee.org> wrote in news:c7e201a5-ed2d-47a9-
84b1-0ab89736ca56@googlegroups.com:

So you didn't know what was going on under the bonnet.

Billy Sloman, the perpetual jackass thinking he is the only one that
was there or saw what did what on what and by what manner.

Scarcely the only one. The whole Linux community (of which I'm not a part) has a much more hands-on approach than I found attractive, though I was happy to exploit some of their work from time to time.

> Never thought I'd see it, but you are actually worse than Trump.

Bizarre comparison.

You experienced a social stack overflow, and it has rendered you a
child.

The childishness is all yours.

--
Bill Sloman, Sydney
 
Bill Sloman <bill.sloman@ieee.org> wrote in news:9bdd6032-3d58-4935-
9086-f003ac12c3db@googlegroups.com:

You have my deepest sympathy.

It was an improvement over 4X tape layouts.

You should deserve sympathy for your psychopathy, but you do not.
 
On Wednesday, February 5, 2020 at 5:50:25 PM UTC+11, whit3rd wrote:
On Tuesday, February 4, 2020 at 10:06:08 PM UTC-8, Bill Sloman wrote:
On Wednesday, February 5, 2020 at 2:00:08 PM UTC+11, whit3rd wrote:
On Tuesday, February 4, 2020 at 4:51:35 AM UTC-8, Bill Sloman wrote:
On Tuesday, February 4, 2020 at 9:33:20 PM UTC+11, tabb...@gmail.com wrote:

Computing today without Basic would be significantly less advanced.

Really? BASIC was a very primitive language, and it only survives because some people never got around to learning anything better.

Yeah, it's not the language, it's the widespread AVAILABILITY ...Commodore, Apple, TI 99/4, Atari, and IBM delivered (early 1980s) machines that, out-of-the-box, could run BASIC.

The machines don't run a particular programming language.

They do if it's in ROM.

The processor doesn't care whether it is reading data out of ROM or from some other kind of memory. If the manufacturer went to the trouble of building particular software into ROM, it's always there, but you can pretty much always get what you want from some other source.

The Macintosh broke that mold (you could get BASIC, but that wasn't standard equipment). The GUI options favored more modern environments (like, Excel - which was Macintosh-only at introduction).

Apple did try harder than most to stop you running non-Apple software. It wasn't a virtue.

For my Mac, there was a BASIC but also FORTRAN and APL. And did array-processor things for work, using the graphic-terminal capability of the good old toaster.

No secrecy problem from Apple; the APL was from Portable Software, the Fortran came from Absoft, and Excel from Microsoft... and the terminal emulator was written by a user.

It wasn't a secrecy problem, just a commercial decision.

So you didn't know what was going on under the bonnet.

Oh, I made and used a few long T15 screwdrivers back in the day. And volumes one through six of Inside Macintosh still sit on my shelf.

So you chose to pay half as much again for your processing hardware as you needed to. It keeps life simple, but it isn't an economical approach.

--
Bill Sloman, Sydney
 
Bill Sloman <bill.sloman@ieee.org> wrote in news:c7e201a5-ed2d-47a9-
84b1-0ab89736ca56@googlegroups.com:

So you didn't know what was going on under the bonnet.

Billy Sloman, the perpetual jackass thinking he is the only one that
was there or saw what did what on what and by what manner.

Never thought I'd see it, but you are actually worse than Trump.

You experienced a social stack overflow, and it has rendered you a
child.
 
