There is one very BIG thing that Cobol pioneered: the requirement that not only the programs, but also the data, must be portable across machines. At a time when machines used different character codes, let alone different numeric formats, Cobol was designed to vastly reduce (though it did not completely eliminate) portability woes.
We take this for granted now, but at the time it was revolutionary. In part, we've done things like mandating Unicode and IEEE 754, but nowadays most of our languages also encourage portability. We think very little of moving an application from Windows on x86_64 to Linux on ARMv8 (apart from the GUI mess), but back when Cobol was being created, you normally threw your programs away (“reprogramming”) when you went to a new machine.
I haven't used Cobol in anger in 50 years (40 years since I even taught it), but for that emphasis on portability, I am very grateful.
Is Python's indentation at some level traced back to Cobol?
the other big cobol feature is high precision (i.e. many digest) fixed point arithmetic. not loosing pennies on large sums, and additionally with well defined arithmetics, portably so as you point out, is a killer feature in finance.
you need special custom numerical types to come even close in, say, java or C++ or any other language.
>the other big cobol feature is high precision (i.e. many digest) fixed point arithmetic. not loosing pennies on large sums, and additionally with well defined arithmetics, portably so as you point out, is a killer feature in finance.
I guess you mean:
>digest -> digits
>loosing -> losing
Is that the same as BCD (Binary Coded Decimal)? IIRC, Turbo Pascal had that as an option, or maybe I'm thinking of something else; sorry, it's been many years.
Binary Coded Decimal is something else.
1100 in “regular” binary is 12 in decimal.
0001 0010 in BCD is 12 in decimal.
i.e., BCD is an encoding.
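To make the encoding difference concrete, here is a minimal sketch in Python (the helper name and packing scheme are purely illustrative, not any particular COBOL or hardware format) of packed BCD next to plain binary:

    # Packed BCD stores one decimal digit per nibble (two digits per byte),
    # unlike plain binary positional encoding.
    def to_bcd(n: int) -> bytes:
        digits = str(n)
        if len(digits) % 2:            # pad to an even number of digits
            digits = "0" + digits
        return bytes((int(digits[i]) << 4) | int(digits[i + 1])
                     for i in range(0, len(digits), 2))

    print(format(12, "b"))     # '1100'  -> 12 in plain binary
    print(to_bcd(12).hex())    # '12'    -> 0001 0010 in packed BCD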
High precision numbers are more akin to the decimal data type in SQL or maybe bignum in some popular languages. It is different from (say) float in that you are not losing information in the least significant digits.
You could represent high precision numbers in BCD or regular binary… or little endian binary… or trinary, I suppose.
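To make the "not losing pennies" point concrete, here is a quick sketch using Python's standard decimal module (just one example of such a decimal type, not how COBOL itself stores numbers):

    from decimal import Decimal

    # Binary float: the sum picks up error in the least significant digits.
    print(0.10 + 0.20)                        # 0.30000000000000004

    # Decimal type: digits are kept exactly, so pennies survive large sums.
    print(Decimal("0.10") + Decimal("0.20"))  # 0.30
    print(sum([Decimal("0.01")] * 100000))    # 1000.00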
Okay, I'll bite. ML did not mostly die, it morphed into two main dialects, SML and OCaml. OCaml is still going strong, and it's debatable whether SML is mostly dead.
My main beef, however, is that the last sentence in the section seems to suggest that the birth of Haskell killed SML on the vine because suddenly everybody only wanted pure, lazy FP. That's just wrong. The reality is that these two branches of Functional Programming (strict/impure and lazy/pure) have continued to evolve together to the present day.
Serious question: Is Ada dead? I actually had to google Ada, and then "Ada language" to find out. It's not dead, and it has a niche.
When I was in grad school in the late 70s, there was a major competition to design a DoD-mandated language, to be used in all DoD projects. Safety and efficiency were major concerns, and the sponsors wanted to avoid the proliferation of languages that existed at the time.
Four (I think) languages were defined by different teams, DoD evaluated them, and a winner was chosen. It was a big thing in the PL community for a while. And then it wasn't. My impression was that it lost to C. Ada provided much better safety (memory overruns were probably impossible or close to it). It would be interesting to read a history of why Ada never took off the way that C did.
Ada isn't dead and it's superior to Rust in many ways, but it is less trendy. adacore.com is the main compiler developer (they do GNAT). adahome.com is an older site with a lot of links.
Wow, that was a trip down memory lane! I have used six of those languages: BASIC, APL, COBOL, Pascal, Algol-W (a derivative of Algol 60), PL/1. Mostly in school. My first dollars earned in software (brief consulting gig in grad school) had me debug a PL/1 program for a bank.
For some reason I remember an odd feature of PL/1: Areas and offsets. If I am remembering correctly, you could allocate structures in an area and reference them by offset within that area. That stuck in my mind for some reason, but I never found a reason to use it. It struck me as a neat way to persist pointer-based data structures. And I don't remember seeing the idea in other languages.
Maybe the reason it stayed with me is that I worked on Object Design's ObjectStore. We had a much more elegant and powerful way of persisting pointer-based structures, but an area/offset idea could have given users some of the capabilities we provided right in the language.
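For anyone who hasn't seen the idea, here is a rough sketch of offset-based references inside a single area (purely illustrative; the layout and names are made up, not PL/1's or ObjectStore's): because records refer to each other by offset rather than by address, the whole area can be written to disk and read back with no pointer fix-ups.

    import struct

    AREA = bytearray(1024)   # the "area": one contiguous buffer
    _free = 0

    def alloc_node(value: int, next_off: int) -> int:
        """Store a (value, next_offset) record in the area; return its offset."""
        global _free
        off = _free
        struct.pack_into("ii", AREA, off, value, next_off)
        _free += 8
        return off

    def read_node(off: int) -> tuple:
        return struct.unpack_from("ii", AREA, off)

    # A two-element linked list expressed entirely as offsets within the area.
    tail = alloc_node(2, -1)          # -1 means "no next node"
    head = alloc_node(1, tail)
    print(read_node(head))            # (1, 0): value 1, next record at offset 0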
"Significance: In terms of syntax and semantics we don’t see much of COBOL in modern computing."
Would I be wrong in saying that SQL has what feels to me like a very COBOL-y syntax? By which I mean, I know it is not directly related to COBOL, but someone definitely looked at COBOL's clunky attempt at natural language and said "that, I want that for my query language".
I agree completely. They were from the same era (COBOL is a few years older), and they do have that dweeby, earnest, natural language influence.
>An accurate analysis of the fall of Pascal would be longer than the rest of this essay.
I put the blame solely on the management of Borland. They had the world-leading language, and went off chasing C++ and "Enterprise" instead of just riding the wave.
When Anders gave the world C#, I knew it was game over for Pascal, and also Windows native code. We'd all have to get used to waiting for compiles again.
Modula-3 should be on that list as well. Unfortunately it's pretty dead (compiler support is rather abysmal), though pretty influential. Wikipedia lists a couple of languages that it influenced; I think it should also include Go (though Go is allegedly influenced by Modula-2, according to its Wikipedia article).
What other languages have been influenced by Go?
The (literal) first and foremost ASCII descendant of APL was MATLAB.
I feel that the article should have made this a lot clearer, as so many people code along the APL -> Matlab / R (via S) -> NumPy family tree.
R/S is also heavily influenced by Lisp. Haven’t written it in 10 years, but AFAIR it even has proper macros where argument expressions are passed without evaluation.
How can COBOL be a "dead" or "mostly dead" language if it still handles over 70% of global business transactions (with ~800 billion lines of code, and still growing)? See e.g. https://markets.businessinsider.com/news/stocks/cobol-market....
BASIC is the scripting language used by Microsoft Office. Saying that it powers millions of businesses is probably not an exaggeration.
Pascal, particularly the Delphi/Object Pascal flavor, is also still in widespread use today.
Also Smalltalk is still in wide use; ML is also used; there are even many PL/I applications in use today and IBM continues to give support.
I don't know, I heard somewhere that even the C language is in wide use, still ... ;)
No one's starting new projects in COBOL.
One of the most significant new COBOL projects in 2025 was the integration of a new COBOL front-end into the GNU Compiler Collection. There are indeed quite a few new projects being started in COBOL, though they primarily focus on modernization and integration with contemporary technologies rather than traditional greenfield development. Also, let's not forget that some cloud providers now offer "COBOL as a service" (see e.g. https://docs.aws.amazon.com/m2/latest/userguide/what-is-m2.h...).
By "new COBOL projects" I mean green-field development of entirely new projects written in that language - not the continued development of existing COBOL codebases, or development of tools which interact with COBOL code.
As an aside, the article you linked to is pretty obvious AI slop, even aside from the image ("blockchin infarsucture" and all). Some of the details, like claims that MIT is offering COBOL programming classes or that banks are using COBOL to automatically process blockchain loan agreements, appear to be entirely fabricated.
> There are indeed quite a few new projects being started in COBOL
No.
You have to put this relative to projects started in other languages, at which point new projects started in COBOL are less than a rounding error; it probably wouldn't come out as anything other than 0 with a float.
The claim was "No one's starting new projects in COBOL."
And everyone of good faith understood what the claim actually was.
And everyone with relevant fintech project experience knows that new projects on the existing core banking systems are started all the time and that COBOL continues to be a relevant language (whether we like it or not).
Maybe their definition uses recent popularity or how many new projects are started with it. Under that definition, I think it's pretty safe to call it "dead".
If you redefine language, anything is possible.
Yes. "Dead" normally means "to be devoid of life," but it's often extended to metaphorically cover things like computer languages.
edit: for ancient Greek to become a dead language, will we be required to burn all of the books that were written in it, or can we just settle for not writing any new ones?
For ancient Greek, all you need is no one speaking it (using it in real life).
Same with a programming language: if no one is writing code in it, it's dead.
I was almost sure that Prolog would be on the list, but apparently not.
Because it's dead or because it's influential?
Seeing Smalltalk on these lists and not Self always seems... lacking. Besides its direct influence on Smalltalk, and its impact on JIT research, its prototype-based object system led to JavaScript's object model as well.
Self was influenced by Smalltalk, not the other way around. Smalltalk was developed in the 1970s. Self in the 1980s.
Thanks for the correction.
Dang I wanted it to keep going
COBOL - “mostly dead” but still somehow the backbone of the global financial system
Interesting read, and it would have been good to see the author's definition of 'mostly dead'. Some are still used widely in niche areas, like COBOL for banking. If a language itself isn't receiving any updates, nor are new packages being developed by users, is it mostly dead?
In any case, the author claims that each of these languages is "dead". There is a "Cause of Death" section for each language, which doesn't allow for another conclusion. By listing languages like ALGOL, APL, CLU, or Simula, the author implies that by "dead" he means "no longer in practical use, or surviving only as an academic/historic curiosity". The article contradicts itself by listing languages like COBOL, BASIC, PL/I, Smalltalk, Pascal, or ML, for which there is still significant practical use, even with investment in new features and continuation of the language and its applications. The article actually disqualifies itself by listing COBOL or Pascal as "mostly dead", because there is still a large market and significant investment in these languages (companies such as Micro Focus and Embarcadero make good money from them). It is misleading and unscientific to equate “no longer mainstream” with “no longer in use.” This makes the article seem arbitrary and poorly researched, and the author not credible.
One day Perl will be on this list
If we assume peak Perl was in the 00s, say 2005, an impressionable teenager of ~15 learning it then will probably keep using it for the rest of their life, even in the absence of uptake by new people. Assuming a lifespan of 85, I estimate this day won't arrive before the 2070s.
I started using it in the mid-90s, and used it extensively at work as long as I could, but by 2012 I gave up the fight. I still break it out once in a great while for a quick text transformation, but it’s so rusty in my memory that I rely on an LLM to remind me of the syntax.
As will Python and many others.
Previously: https://news.ycombinator.com/item?id=22690229
(There are a few other threads with a smaller number of comments.)
Kinda surprised to not see Forth listed.
Forth was neat, but it was a bit of an evolutionary dead end. I'm not aware of any significant concepts from Forth which were adopted by other, later programming languages.
RPL (Reverse Polish Lisp, a high-level language for HP calculators) possibly drew on it a bit, though the main antecedents are RPN and Lisp, and possibly Poplog (a Poplog guru was at HP at the time, but I don't know if he contributed).
PostScript
Or Lisp. Lisp is definitely not dead, but was definitely very influential.
The article does touch on that:
"COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP."
Imho Lisp is deader than COBOL. Especially now that we've learned you can do the really hard and interesting bits of AI with high-performance number crunching in C++ and CUDA.
I wrote Lisp this morning to make Emacs do a thing. In other venues, people use Lisp to script AutoCAD.
Lisp isn't as widely used as, say, Python, but it's still something a lot of people touch every single day.
And Clojure