It's not that ALGOL-68 was that bad. It's that the description of it was terrible. There was an attempt to formalize the semantics, and it was a disaster. Nobody knew how to describe language semantics back then, so they invented a new notation. There was an ALGOL-68 report, which was incomprehensible.[1] Then there was "Informal Introduction to ALGOL-68"[2], which was difficult. Finally, there was a "Very Informal Introduction to ALGOL-68"[3], which was somewhat readable.
Some sample text from the report:
A "nest" is a 'NEST'. The nest "of" a construct is the "NEST" enveloped by the original of that construct, but not by any 'defining LAYER' contained in that original. (The nest of a construct carries a record of all the declarations forming the environment in which that construct is to be interpreted. Those constructs which are contained in a range R, but not in any smaller range contained within R, may be said to comprise a "reach". All constructs in a given reach have the same nest, which is that of the immediately surrounding reach with the addition of one extra "LAYER". The syntax ensures (3.2.1.b, 3.4.1.i,j,k, 3.5.1.e, 5.4.1.1.b) that each 'PROP' (4.8.1.E) or "property" in the extra 'LAYER' is matched by a defining. indicator (4.8.1.a) contained in a definition in that reach.)
They're trying to get hold of the concept of variable scope and lifetime, back when that was a new idea.
They were also trying to invent type theory, which they needed and which didn't exist yet. The first 50 pages of the report are all about the notation they invented to be used in the rest of the report. It's not a very good notation. It's like calculus before Leibniz, when people were trying to use Newton's fluxions.[4] Computer science just didn't have enough widely understood abstractions for people to even talk about this stuff clearly.
[1] https://www.softwarepreservation.org/projects/ALGOL/report/A...
[2] https://www.abebooks.com/servlet/BookDetailsPL?bi=3196204463...
[3] https://www.computinghistory.org.uk/det/8776/A-Very-Informal...
[4] https://en.wikipedia.org/wiki/Fluxion
Yes, I completely agree that the real problem of ALGOL-68 was the lack of suitable documentation for it.
Even though Wirth is the author of some important innovations in programming language design, all of his programming languages were significantly inferior to ALGOL-68. Nevertheless, most of them were more successful, precisely because Wirth wrote simple and clear manuals for them that any programmer could easily understand.
The formal description in the ALGOL-68 report could have been very useful for implementers, but because the language included so many novelties it never had a chance of adoption without also having, besides the formal description, an extensive tutorial and design rationale, accompanied by many programming examples.
This is very sad, because ALGOL-68 got a lot of things right that most later programming languages get wrong; yet, because of its reputation for being incomprehensible, it has seldom been the source of inspiration for language designers that it deserved to be.
Some popular languages, including C, the Bourne shell and C++, have taken various elements from ALGOL 68, but mostly superficial ones, such as keywords or details of syntax, rather than the more important ideas.
For instance, among the ALGOL 68 elements borrowed by C is the keyword "union", but the unions of C are barely better than the EQUIVALENCE of Fortran, and they are not comparable with the unions of ALGOL 68, which were based on a proposal of John McCarthy and correspond to what are now more frequently called sum types. The "variant records" that Wirth designed for Pascal were also much worse than the unions introduced earlier in the ALGOL 68 he had criticized.
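For readers who have never seen them, here is a minimal sketch of an ALGOL 68 united mode and the conformity clause used to inspect it (Algol 68 Genie syntax; the mode name ITEM and the procedure show are made up for illustration). The conformity clause forces each alternative to be handled, which is what makes this a sum type rather than a C-style storage overlay:
MODE ITEM = UNION (INT, REAL, STRING);
PROC show = (ITEM x) VOID:
  CASE x IN                         CO conformity clause: branches on the mode actually held CO
    (INT i):    print(("int: ", i, newline)),
    (REAL r):   print(("real: ", r, newline)),
    (STRING s): print(("string: ", s, newline))
  ESAC;
show(42);
show(3.14);
show("forty two")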
Actually, I used Algol 68 at uni, while reading EE. That was due to the lecturer tasked with giving us some computing instruction choosing that language. This would have been in around 87 or 88.
I actually quite liked it, finding it easy to translate to / from C and A68, but that was mainly because he only described a subset (incl. nested functions), and didn't even describe REF.
So we got exposed to all of the various FOR/WHILE/DO/OD constructs, IF's returning values (used like a C ternary op), etc. Some folks went off and acquired reference books for the language, and explored other areas. I never did, as at the time I was comfortable with the K&R C compiler on our dept. unix machines.
The real pain point with A68 there was that the compiler was a batch-submission one, took quite some time to run, and the machine doing the work was quite overloaded.
> IF's returning values (used like a C ternary op)
I assume this just refers to a conditional expression?
As I recall, it was something like:
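    x := IF y < 1 THEN 1 ELSE 2 FI;   CO used where C would write  x = y < 1 ? 1 : 2;  CO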
Possibly one could also place various statements inside the THEN and ELSE sections, as long as they ended with an expression. I've a vague impression about that, but can't properly recall.
There was also a short form available, but we were never actually told that, although some folks found it themselves from reference books.
A68 also allowed brackets interchangeably with the key words, you could say:
x := ( y < 1 | 1 | 2 );
Dude, this is an amazing comment. Really points out how it looks when someone is ahead of their time: Algol-68 was clearly so ahead of its time, it's hard to understand anything they're saying. They're in outer space.
Comes up all the time. Steam engine valve gear is a pulse-width modulation motor driver. Number 5 Crossbar, the best electromechanical telephone switch, is a distributed redundant microservices architecture. Western Union Plan-55A, for switching telegrams, is a mail server and forwarder. Each of those, in its day, was unique in its space, with its own terminology and theory. Today we see them as instances of common patterns.
Right now, large language models are in that state. They work, but why they work and why they fail is barely understood. As more systems are developed in that space and adjacent to it, more general understanding may emerge.
Glad to hear it might not be utter incompetence.
I’d love to find [3] as a PDF!
I assume that it has about the same content as the book "Informal introduction to algol 68", published by the same authors a few years later, and which had at least two editions, e.g. in 1973 and in 1977.
Searching the Internet finds some scanned copies of that book.
E.g.
https://www.softwarepreservation.org/projects/ALGOL/book/Lin...
I've just looked at this snippet, which the blog used back in 2017 to illustrate that Algol 68 was "utter madness": https://craftofcoding.wordpress.com/2017/03/06/a-brief-look-...
COMMENT
Algol 68 program to calculate the Sieve of Eratosthenes
for some upper limit N
COMMENT
PROC eratosthenes = (INT n) []INT:
(
  [n]INT sieve;
  FOR i TO UPB sieve DO
    sieve[i] := i
  OD;
  INT k = ENTIER sqrt(n);
  sieve[1] := 0;
  FOR i FROM 2 TO k DO
    IF sieve[i] NE 0 THEN
      FOR j FROM i*i BY i TO n DO
        sieve[j] := 0
      OD
    FI
  OD;
  sieve
);
INT n;
print("Upper limit to calculate sieve? ");
read(n);
print((eratosthenes(n), newline))
A few things: This code is very readable. The blogger didn't understand it because he didn't understand what the "Sieve of Eratosthenes" did: It's a means of calculating lists of prime numbers. He incorrectly stated that it was used for calculating Fibonacci numbers.
Syntax-wise, it looks like some mix of Go, C and Python. Nothing at all unusual. It would be pretty easy to learn to program in this, and comfortable too.
[Edit: The blogger later demonstrated a good understanding of the Sieve of Eratosthenes, but still expressed criticism of Algol68: https://craftofcoding.wordpress.com/2021/04/06/algorithm-35-...]
> This code is very readable
It really is.
> Syntax-wise, it looks like some mix of Go, C and Python. Nothing at all unusual. It would be pretty easy to learn to program in this, and comfortable too.
Or maybe a mix of Pascal, PL/I, and /bin/sh.
/bin/sh, i.e. the Bourne shell, was written by a programmer who was experienced in ALGOL 68, having implemented some significant projects in it.
In designing the language interpreted by the shell, he took several features directly from ALGOL 68.
However, he also made one improvement, renaming the bracket-pair keywords "do" and "od" to "do" and "done", responding to the criticism of many programmers that ALGOL 68's "od" sounds odd.
In particular, one of the projects that Steve Bourne had previously worked on was the Algol68C compiler at Cambridge. See: https://www.softwarepreservation.org/projects/ALGOL/algol68i...
Steve Bourne wanted to use od to end loops in his shell but was prevented because od was already the octal dump program.
And as other HN commenters have noted recently, Bourne used #define to make the shell source code look like Algol:
https://research.swtch.com/shmacro
I agree, very readable.
One interesting thing: the sieve array does not have static bounds; its size is an argument. Presumably that means it's stack-allocated dynamically a la "alloca" (I doubt it's heap-allocated, given there's no freeing of it). C didn't get that feature until C99, and adding it has almost universally been seen as a mistake.
I imagine that’s an example of one of the many, many things that makes Algol-68 a pain to implement. Does make it easier for the developer though.
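For what it's worth, a minimal sketch of the distinction in Algol 68 Genie syntax (the procedure demo and its identifiers are made up): a plain [n]INT declaration gets a local generator, so the row lives in the procedure's stack frame, while HEAP requests heap storage that can outlive the call:
PROC demo = (INT n) VOID:
BEGIN
  [n] INT a;            CO default local generator: reclaimed automatically when demo returns CO
  HEAP [n] INT b;       CO heap generator: this row can outlive the call CO
  FOR i TO n DO a[i] := i OD;
  b := a;               CO copy the values into the heap row CO
  print((b, newline))
END;
demo(5)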
I've looked up alloca() and it's considered about as safe to use as recursion, and potentially safer than stack-allocating one big fixed-size array. Its bad reputation might be because of poor implementations in certain compilers.
What does ENTIER do? I can read the code, but I have no idea what that means.
From context, it looks like casting to an INT
Makes sense. No idea what the derivation of “ENTIER” is, though.
"entier" means "integer" in French.
In the past it was customary in English mathematics texts, especially in the UK, to use the French word for the integer part of a number, because the use of this function had been borrowed from French mathematicians.
ALGOL 60 took many notations from the standard mathematical notation of the time, including operators and function names, so it resembled a mathematical text much more than the American programming languages did. In Fortran, for example, the main criterion for designing the syntax was the restricted character set of the IBM printers, which lacked most mathematical symbols, leading to notations like "*", "/" and "**" for multiplication, division and exponentiation, where ALGOL would use "×", "÷", and "↑".
https://www.softwarepreservation.org/projects/ALGOL/book/pam...
The operator ENTIER (French for “whole”) takes a REAL operand and likewise yields an INT result, but the yield is the largest integer equal to or less than the operand. Thus ENTIER 2.2 yields 2, ENTIER -2.2 yields -3
Isn't Algol the crazy language using special characters all over the place as operators? I could see that being undesirable, but I don't see it in that snippet.
edit: I'm thinking of APL. It looks like it's the same kind of functionality as what the Matlab, R or numpy cohort provides.
Yes, as in your own correction, you were thinking about APL.
ALGOL 68 was certainly more readable than C or Pascal, even if it was somewhat more verbose.
A very good feature of ALGOL 68 was that all the syntactic structures were enclosed in brackets, so the ambiguities and editing errors that are possible in C or ALGOL 60 could not occur.
Moreover, each syntactic structure had its own pair of brackets, so you did not have the reading difficulties caused by all bracket pairs being the same, like "()" in LISP, "begin" and "end" in Pascal or "{}" in C.
Especially useful is having different bracket pairs for iterations ("do" and "od" in ALGOL 68, renamed as "do" and "done" in the Bourne shell, which sounds better) in comparison with the bracket pairs used for conditional statements ("if" and "fi" in ALGOL 68).
In C and derived languages, even indentation is not enough to understand the code in many cases, especially when lines are restricted to 80 characters and a long loop body cannot be seen on a single page.
In ALGOL 68, you have distinct closing brackets for each syntactic structure, so it is easy to recognize their meaning.
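A small sketch of what that looks like in practice (Algol 68 Genie syntax; the loop and variable are made up): each construct is closed by its own reversed keyword, so even with the indentation stripped you can still tell which closer ends what:
INT total := 0;
FOR i TO 10 DO
  IF ODD i THEN
    total +:= i
  FI                    CO "fi" can only close the conditional CO
OD;                     CO "od" can only close the loop CO
print((total, newline))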
Using keywords for brackets is more verbose than in C, but much less verbose than using terminating comments. Moreover, now we can use Unicode for program texts, so instead of keywords one could use different pairs of Unicode brackets that are graphically distinct. For instance one can use angular brackets for conditional statements and S-shaped bag delimiter brackets for iterations.
ALGOL is the main forerunner of C. Anything vaguely C-like is also ALGOL-like.
As I understand it, B[1] was the immediate predecessor of C.
Pascal (and its descendants) and Ada (and VHDL) seem closer syntactically to Algol 60 than C is (begin/end as block delimiters, no parentheses needed for control structures, := for assignment, etc.). The example code at [2] (using bold keywords) should be understandable to most HN readers 64 years later.
[1] https://en.wikipedia.org/wiki/B_(programming_language)
[2] https://en.wikipedia.org/wiki/ALGOL_60
B came from BCPL which came from CPL which came from ALGOL 60. The main thing about ALGOL was structured programming as opposed to goto statements. Compared to that begin/end vs braces is a very minor issue.
For a look at how BCPL developed (by being easily ported to new machines), see: https://www.softwarepreservation.org/projects/BCPL
I always thought BCPL would have been a great language for 80s era micros. Fortunately, we had Turbo Pascal.
But C was better because it incorporated byte addressing (BCPL was based on word addresses).
Except the contemporary C compilers were uniformly terrible.
Indeed. And they're all algol descendants as you note.
But there is certainly a difference between c-like syntax (java, javascript...) vs. algol-like syntax (pascal, ada, ...)
And BCPL means Bootstrapping CPL; it was never intended for anything beyond bootstrapping the CPL compiler that never came to be, as the project folded.
No, BCPL was Basic CPL. BCPL was created in 1967 after the CPL project ground to a halt in 1966, based on the subset of CPL used in the CPL compiler.
Apparently everyone is wrong on Internet.
> BCPL has been rumored to have originally stood for "Bootstrap Cambridge Programming Language", but CPL was never created since development stopped at BCPL, and the acronym was later reinterpreted for the BCPL book.
https://en.m.wikipedia.org/wiki/BCPL
I have seen no evidence of this in papers by Martin Richards, including the earliest BCPL manual or the contemporary papers on CPL by Strachey et al.
The descriptions of how the CPL compiler worked (eg Strachey’s paper on GPM, Richards more recent retrospectives) all talk about writing the compiler in CPL and translating it by hand to macro assembly.
I have not yet managed to look at Richards PhD thesis which contains the first draft description of BCPL, which he sketched after working on the CPL compiler and shortly before moving to MIT where he first implemented BCPL.
There's a lot of hearsay in that article, and a lot of sentiment rooted in the particulars of that time.
Sure, it was a complex thing in the late 60s/early 70s. Sure, Wirth came up with something simpler. But I'm missing a deeper analysis, especially from a more modern viewpoint where basically any language is at least as complex as Algol 68[0].
> Arguably Wirth’s Algol-W was a better successor to Algol-60
I might not even disagree, but what were the arguments, and how are they holding up?
> and arguably did not have the same connections to industry as the likes of Fortran and Cobol
Sure. But neither did Algol-W or Pascal. And pretty much anything else in the 20th century.
[0]: http://cowlark.com/2009-11-15-go/
EWD comes up as a dissenter for Algol-68, and the longer my career as a software developer goes on, the more I disagree with him on anything that isn't pure math.
Having EWD as a dissenter also seems a rather low bar, to be fair. (One might say he was a bit of an Edsgerlord.)
EWD is apparently Edsger W. Dijkstra for those also unaccustomed to reading him cited by his initials.
The appellation is in some part due to his custom of writing monographs for himself titled EWD-n[1]. They are a fascinating mix of deep mathematical and philosophical insight and curmudgeonly reflective essaying.
[1] https://www.cs.utexas.edu/~EWD/
> writing monographs for himself
One might almost call it a "blog" if it weren't for the fact that EWD started ~35 years before Frontier NewsPage.
Yeah, if EWD had had his way, very little software would ever have been written. I think his insistence on proving the correctness of imperative programs is understandable but entirely wrong headed. The sheer amount of insight needed to get through the working day would be unattainable by most and unsustainable for all but a few.
Algol 68 was certainly too much language for the tiny machines of 1970, but on a 2020 box it might be fairly decent.
There is a modern implementation if you feel like checking it out firsthand
https://jmvdveer.home.xs4all.nl/en.algol-68-genie.html
Not really: I did an implementation in the late 70s that ran on a mainframe of the time (1 MHz, 6 MB). The language itself is not much more than modern C in scope, and in fact many ideas that were new in '68 are expected in modern languages.
The big problem was that the spec was essentially unreadable.
6MB was quite an amount back then.
Yup, we bought 1.5Mb of core for our B6700 for over a million dollars (I think we only had 3Mb on the machine I was working on)
>Algol-68 (or derivatives of) did rise to some prominence in one place – the USSR. The most prominent implementation of the language came from Leningrad State University. It was apparently used to design Russian telephone exchanges.
6-pass compiler for Algol-68 :) The telephone exchange was built on top of a microcomputer that was like a Tandem, only with 3 systems in parallel instead of 2 because of the low quality of the electronics. (While it was developed in the USSR/Russia, and that microcomputer was ultimately used mostly by the Russian military, the major customer for the exchange, who funded the work at some point, was Alcatel.) The CPU was the USSR-developed "Samson" [1], a kind of Elbrus offshoot. The exchange software was developed in SDL (Z.100, kind of like Erlang-by-European-committee-in-the-1980s) and compiled into Algol-68, which was compiled into code for that CPU.
[1] (in Russian) https://www.computer-museum.ru/articles/sistemi_kompleksi/90...
6 passes seems like a lot but likely that was just to fit it in the tiny computers of the time.
One problem with the language is that you can't parse declarations and expressions until after you have parsed the bracket structure of the program, because you can't tell whether a name is a type or an operator until you have processed its declaration (which does not have to appear before use). That more or less means you have to do at least 2 lexical/parse passes.
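A sketch of the kind of thing meant here (Algol 68 Genie syntax; the procedure f, the operator MAX and the constants are made up), assuming, as the parent says, that defining occurrences may follow applied occurrences in the same range. The parser cannot even group the formula in f until it has found the PRIO declaration further down, hence the extra pass:
BEGIN
  PROC f = (INT a, b, c) INT: a MAX b + c;  CO how this formula groups depends on the priority of MAX ... CO
  PRIO MAX = 9;                             CO ... which is declared only here, after the use CO
  OP MAX = (INT x, y) INT: (x > y | x | y);
  print((f(2, 3, 4), newline))
END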
I did some of the Advent of Code in Algol 68 a few years back: https://github.com/addrummond/aoc_2021_algol68
It was a surprisingly pleasant language to work with considering its age. The main annoyances were:
* Extensible arrays were fiddly to work with (which is understandable given that allocating arrays of arbitrary size barely makes sense on a 1968 computer); see the FLEX sketch after this list.
* Formatted IO is weird. I never really understood how it worked and just tweaked examples till I got what I needed.
* No hashtables in the stdlib.
* Available implementations on modern hardware are not production quality.
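A minimal sketch of the fiddliness (Algol 68 Genie syntax; the identifiers are made up): a FLEX row can be replaced wholesale by an assignment that changes its bounds, but appending a single element still means building a bigger row and copying:
FLEX [1:0] INT xs;                 CO starts empty CO
xs := (1, 2, 3);                   CO assignment may change the bounds CO
[1 : UPB xs + 1] INT tmp;          CO growing by hand: a new row, one slot larger CO
tmp[1 : UPB xs] := xs;
tmp[UPB tmp] := 4;
xs := tmp;
print((xs, newline))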
Great idea - what inspired you to try this?
I'm always inspired when someone (like the author of the Algol 68 compiler you used) decides to make a historically important – but obsolete and generally unavailable – software system usable and available on modern hardware.
A few people were doing Advent of Code at work and I guess I wanted the bragging rights of doing it in the most obscure language possible.
I think if I did something like this again I would do it with an old Advent of Code to remove the competitive pressure. It was fun to start with but it ended up stressing me out quite a bit towards the end (which is not ideal over the Christmas vacation).
Nicely done! The multi-word identifiers with spaces in them are really freaking me out :)
Algol-68 is sort of the reference case for the problem with prescriptive standards.
They sound like a good idea ("Let's get the best minds together and create the ideal standard"), but because they are not based on anything actually built, they tend to be over-engineered and hard to implement.
At the opposite end of the spectrum is POSIX, which is a sort of terrible "let's just glob together what everyone is doing" descriptive standard, but at least it gets everyone operating on the same page, so I think of it as a good place to start, not the end result.
There is a reason the IETF was so successful with its "rough consensus and running code", which is why it worries me that it is trying to redefine itself as a "standards" body, as opposed to just publishing standards.
It sounds like a classic case of the Second System effect, where the original product was functional but maybe a little too basic, so everybody has an idea of how to improve it. Many of the ideas are good on their own, but the committee ends up accepting far too many and the thing suffers from terminal feature creep.
Feature creep and Second System effect are all the rage now. Every recent language has them.
If you don't constantly have three dozen Requests for Implementation brewing, seven of which are going into the next release, you're a dead project.
Perhaps we should make a language with a hard cap on the number of “features” in the language syntax + standard library. For everything you propose to add, you would then also need to propose something to remove.
> For everything you propose to add, you would then also need to propose something to remove.
There are surely lots of features the proposer would be willing to remove (although one also has to define how to count the number of features, or else every proposal, no matter how long, will be for one giant feature, and will propose removing one tiny feature in return). The problem is that what I consider dispensable in return for my essential proposal will conflict with what the next programmer over considers essential ….
Though in principle I love the idea, it would kill backwards compatibility. But maybe each major version revision should have a vote on keeping or removing some older features.
Only those holding no stakes would play the vote game; anyone actually depending on compatibility would silently (or sometimes not so silently) depart.
Emergent property of humans, it turns out.
Industry did adopt ALGOL variants; it was, and still is, one of the languages in the Burroughs lineage.
https://public.support.unisys.com/framework/publicterms.aspx...
UK Navy also had a system programmed on it.
Also it was largely influential in PL/I and its dialects, which were used a bit everywhere, the most well known being PL.8, PL/S and PL/M.
And the competition that eventually led to the Pascal tree of programming languages.
History would have been much different if Algol 68 hadn't existed in first place.
of the Tiobe top 10, only 3* are not variants of warmed over algol (and of those 3, 1 is arguably not even a general purpose programming language).
* Fortran, Visual Basic, SQL — and even here, the way they've all evolved from their roots has been towards Algol 68.
You are talking about Algol 60, which is a very different language to Algol 68.
> The first implementation of the standard, based on the late-1968 draft Report, was introduced by the Royal Radar Establishment in the UK as ALGOL 68-R in July 1970. This was, however, a subset of the full language, and Barry Mailloux, the final editor of the Report, joked that "It is a question of morality. We have a Bible and you are sinning!"[31] This version nevertheless became very popular on the ICL machines, and became a widely-used language in military coding, especially in the UK
https://en.m.wikipedia.org/wiki/ALGOL_68
Burroughs Algol was based on Algol 60, Pascal was derived from Algol W which forked off from Algol 60 well before Algol 68, PL/I also predates Algol 68.
Apparently you missed the point,
"History would have been much different if Algol 68 hadn't existed in first place."
Without the way Algol 68 went down, Algol W and the related events that gave birth to the Pascal lineage would never have happened.
Likewise, PL/I's adoption and lineage would most likely have evolved differently.
And we lost Wirth this year sadly. I hope we can find a language with safety, simplicity, and just enough features to be practical someday.
I found that in Modula-2 and Oberon; however, times have changed.
For me the best version of the Oberon lineage is Active Oberon; while I appreciate Wirth, I think he went too far in his quest for language simplification. Even Go has more features than Oberon-07.
As for Modula-2, it is now a standard language on GCC, otherwise Zig and Odin are relatively similar for the curly bracket folks.
I found Modula-2 to be annoying to use, not for any deep reason, but because of its use of block capitals for keywords, and a confusing nomenclature for casting. It does actually matter that a language is comfortable to write.
Keywords were never an issue to me, because I love tools, and most proper Modula-2 IDEs supported automatic formatting, just like I write SQL in lower case and let my tools work for me.
People should stop designing languages for notepad and classical UNIX V6 vi as editor experience.
As for casts, well, each language has its own nomenclature for type conversions.
I think Wirth said something like "I am a programmer who is a professor, and a professor who is a programmer." Definitely a huge inspiration to me, both for the elegance and compactness of his system designs as well as the idea that research could and should affect practice, and vice-versa.
At least with -fbounds-safety clang is finally catching up with what Pascal (including UCSD, Apple, and Turbo Pascal) and Ada (a rather more complicated descendant of Pascal) compilers had in the 1980s. (Delphi and Free Pascal still have range checking today.)
But you could insert ASM instructions directly into PASCAL code, in case you needed something fast and without bounds checking.
I think that was added in Turbo Pascal, not in the versions he wrote. I could be mistaken.
The development and scope creep sounds a lot like what happened with Perl 6 ;P
Algol 68 was to Algol as C++ is to C: trying to do too much resulting in over-complexity.
The two page spread showing a graph of the implicit type conversions was a masterpiece.
But not really - lots of new stuff was added - structures, unions, pointers, function pointers, operator definitions, a heap - all stuff that hadn't appeared in a language spec before, things we all see and use every day.
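To make that list concrete, a small sketch in Algol 68 Genie syntax (the mode NODE and the identifiers are made up) showing a structure, pointers, heap allocation and a procedure-valued variable, i.e. a function pointer:
MODE NODE = STRUCT (INT val, REF NODE next);   CO a structure with a pointer field CO
REF NODE head := NIL;                          CO a pointer variable CO
head := HEAP NODE := (2, head);                CO heap allocation of a new node CO
head := HEAP NODE := (1, head);
PROC (INT) INT twice := (INT x) INT: 2 * x;    CO a procedure variable, i.e. a function pointer CO
print((twice(val OF head), newline))           CO field selection through the pointer CO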
The big problem was that the language spec was incomprehensible (I've done a language implementation): it tried to embed syntax and semantics in the one spec, and the maths guys went a bit overboard there. The other main problem was trying to solve the reserved-word problem by defining, effectively, multiple fonts/typefaces for different parts of the language ... at a time when even lower case wasn't really an option for most people.
Algol 68 ended up competing with Ada, and you can guess which one won.