> CSCI 3300: Classical Software Studies
Alan Kay, my favorite curmudgeon, spent decades trying to remind us that we keep reinventing concepts that were worked out in the late 70s, and he's disappointed that we've been running in circles ever since. He's still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
When I was at RIT (2006ish?) there was an elective History of Computing course that started with the abacus and worked up to mainframes and networking. I think the professor retired years ago, but the course notes are still online.
https://www.cs.rit.edu/~swm/history/index.html
I recall seeing a project on github with a comment:
Q: "Sooo... what does this do that Ansible doesn't?"
A: "I've never heard of Ansible until now."
Lots of people think they are the first to come across some concept or need, like every generation when they listen to songs with references to drugs and sex.
Come on, music is very much cross-generational?
I think software engineering has social problems at a level that other fields just don't. Dogmatism, superstition, toxicity ... you name it.
The history of art or philosophy spans millennia.
The effective history of computing spans a lifetime or three.
There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.
Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
> The effective history of computing spans a lifetime or three.
That argument actually strengthens the original point: even though it's been that short, youngsters often still don't have a clue.
If Alan Kay doesn't respond directly to this comment, what is Hacker News even for? :)
You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.
(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)
Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.
1: https://www.youtube.com/watch?v=QQhVQ1UG6aM
https://news.ycombinator.com/user?id=alankay Has not been active on Hacker News for several years now.
At 85 he has earned the peace of staying away from anything and everything on the internet.
He's actually semi active on Quora!
https://www.quora.com/profile/Alan-Kay-11?share=1
> E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025
Why? What problem did it solve that we're suffering from in 2025?
> To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
True, they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.
It's because CS is not cared about as a true science for the most part. Nearly all of the field is focused on consolidating power and money dynamics. No one cares to make a comprehensive history since it might give your competitors an edge.
Art and Philosophy are hardly regarded as science, either. Actually, less so. Yet...
Just because a period of history is short doesn't make it _not history_.
Studying history is not just, or even often, a way to rediscover old ways of doing things.
Learning about the people, places, decisions, discussions, and other related context is of intrinsic value.
Also, what does "material substrate" have to do with history? It sounds here like you're using it literally, in which case you're thinking like an engineer and not like a historian. If you're using it metaphorically, well, art and philosophy are absolutely built on layers of what came before.
> art [has] very limited or zero dependence on a material substrate
This seems to fundamentally underestimate the nature of most artforms.
Your first sentences already suggest one comparison between the histories of computing and philosophy: the history of computing ought to be much easier. Most of it is still in living memory. Yet somehow, the philosophy people manage it while we computing people rarely bother.
I always think there is great value in having a whole range of history-of-X courses.
I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.
History of physics is another history where we have been extremely dependent on the "substrate". Better instruments and capacity to analyze results, obviously, but also advances in mathematics.
That's not that far off the standard physics course, is it? Certainly lots of labs I took were directly based on historical experiments.
> Art and philosophy have very limited or zero dependence on a material substrate
Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).
Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!
That gives even more reason to study the history of CS. Even artists study contemporary art from the last few decades.
Given the pace of CS (like you mentioned) 50 years might as well be centuries and so early computing devices and solutions are worth studying to understand how the technology has evolved and what lessons we can learn and what we can discard.
> Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate
That's absolutely false. Do you know why MCM furniture is characterized by bent plywood? It's because we developed the glues that enabled it during World War II. In fashion you had a lot more colors beginning in the mid-1800s because of the development of synthetic dyes. It's no accident that oil paint was perfected around Holland (a major source of flax and thus linseed oil), which is exactly what the Dutch masters worked in. Architectural McMansions began because of the development of pre-fab roof trusses in the 70s and 80s.
How about philosophy? Well, the industrial revolution and its consequences have been a disaster for the human race. I could go on.
The issue is that engineers think they're smart and can design things from first principles. The problem is that they're really not, and they design things from first principles anyway.
> Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution
This was clearly true in 01970, but it's mostly false today.
It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago or maybe even a dialup BBS 40 years ago have very little to do with the performance of the physical substrate. It has more in common with the People's Computer Company's Community Memory 55 years ago (?) using teletypes and an SDS 940 than with LLMs and GPU raytracing.
Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.
Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But, in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.
From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.
The same thing had happened with still 2-D monochrome graphics in the 01980s; that was the desktop publishing revolution. Before that, you had to learn to program to make graphics on a computer, and the graphics were strongly constrained by the physical substrate. But once the physical substrate was good enough, further improvements didn't open up any new possible expressions. You can print the same things on a LaserWriter from 01985 that you can print on the latest black-and-white laser printer. The dependence on the physical substrate has been severed.
For things you can do with ASCII text without an LLM, the cut happened even earlier. That's why we still format our mail with RFC-822, our equations with TeX, and in some cases our code with Emacs, all of whose original physical substrate was a PDP-10.
Most things people do with computers today, and in particular the most important things, are things that people (fewer of them, admittedly) have been doing with computers in nearly the same way for 30 years, when the physical substrate was very different: 300 times slower, 300 times smaller, on a much smaller network.
Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.
A different reason to study the history of computing, though, is the sense in which your claim is true.
Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. Although a friend of mine in 02005 or so explained to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.
Then ImageNet changed everything, and now we're writing production code with agentic LLMs.
Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.
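For what it's worth, the learning-rate trick in that USPS anecdote is easy to illustrate. A minimal Python sketch, purely illustrative (the data, schedule, and names are invented here, not taken from the actual USPS system):

    import random

    def train_perceptron(samples, epochs=50, lr0=1.0):
        """samples: list of (features, label) pairs with label in {-1, +1}."""
        n = len(samples[0][0])
        w = [0.0] * n
        b = 0.0
        for epoch in range(epochs):
            lr = lr0 / (1 + epoch)          # decrease the learning rate over time
            random.shuffle(samples)
            for x, y in samples:
                activation = sum(wi * xi for wi, xi in zip(w, x)) + b
                if y * activation <= 0:     # misclassified: nudge the boundary
                    w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                    b += lr * y
        return w, b

    # Toy linearly separable data: the label is the sign of x0 - x1.
    points = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
    data = [((x0, x1), 1 if x0 - x1 > 0 else -1) for x0, x1 in points]
    w, b = train_perceptron(data)
    print("weights:", w, "bias:", b)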
Usually it's a nod to an initiative by the Long Now Foundation that is meant, among other things, to raise awareness of their 10,000-year clock and what it stands for.
I reflect on university, and one of the most interesting projects I did was an "essay on the history of <operating system of your choice>" as part of an OS course. I chose OS X (Snow Leopard), and digging into the history gave me fantastic insights into software development, Unix, and software commercialisation. I echo Mr Kay's sentiments entirely.
Sadly this naturally happens in any field that ends up expanding due to its success. Suddenly the number of new practitioners outnumbers the number of competent educators. I think it is a fundamental human resources problem with no easy fix. Maybe LLMs will help with this, but they seem to reinforce the convergence to the mean in many cases, as those being educated are not in a position to ask the deeper questions.
> Sadly this naturally happens in any field that ends up expanding due to its success. Suddenly the number of new practitioners outnumbers the number of competent educators. I think it is a fundamental human resources problem with no easy fix.
In my observation the problem rather is that many of the people who want to "learn" computer science actually just want to get a certification to get a cushy job at some MAGNA company, and then they complain about the "academic ivory tower" stuff that they learned at the university.
So, the big problem is not the lack of competent educators, but practitioners actively sabotaging the teaching of topics that they don't consider to be relevant for the job at a MAGNA company. The same holds for the bigwigs at such companies.
I sometimes even entertain the conspiracy theory that if a lot of graduates saw that what their work at these MAGNA companies involves is often decades old and has been repeated multiple times over the history of computer science, it might demotivate employees who are supposed to believe that they work on the "most important, soon to be world-changing" thing.
Your experience with bad teachers seems more like an argument in favor of better education than against it. It's possible to develop better teaching material and methods if there is focus on a subject, even if it's time consuming.
At least for history of economics, I think it's harder to really grasp modern economic thinking without considering the layers it's built upon, the context ideas were developed within etc...
That's probably true for macro-economics. Alas that's also the part where people disagree about whether it made objective progress.
Micro-economics is much more approachable with experiments etc.
Btw, I didn't suggest to completely disregard history. Physics and civil engineering don't completely disregard their histories, either. But they also don't engage in constant navel gazing and re-hashing like a good chunk of the philosophers do.
I can't concur enough. We don't teach "how to design computers and better methods to interface with them"; we keep hashing over the same stuff over and over again. It gets worse over time, and the effect is that what Engelbart called "intelligence augmenters" become "super televisions that cause you political and social angst."
How far we have fallen, but how great the reward if we could "lift ourselves up again." I have hope in people like Bret Victor and Brenda Laurel.
> He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
Software development/programming is a field where the importance of planning and design lies somewhere between ignored and outright despised. The role of software architect is both ridiculed and vilified, whereas the role of the brave solo developer is elevated to the status of hero.
What you get from that cultural mix is a community that values ad-hoc solutions made up on the spot by inexperienced programmers who managed to get something up and running, and at the same time is hostile towards those who take the time to learn from history and evaluate tradeoffs.
See for example the cliche of clueless developers attacking even the most basic aspects of software architecture such as the existence of design patterns.
With that sort of community, how does anyone expect to build respect for prior work?
Maybe he managed to work them out and understand them in the '70s, if you believe him. But he has certainly never managed to convey that understanding to anyone else. Frankly I think if you fail to explain your discovery to even a fraction of the wider community, you haven't actually discovered it.
People in the tech industry seem to have no idea how the systems in the wild work. Enterprise Java runs the backbone of operations for all of large business organisations such as banks. It is just as grounded as MS Office is. It is object-oriented software that is running the bulk of production environments of the world. Who is going to maintain these systems for the next few decades?
And in reality, there is nothing wrong with Java or object orientation. It has the best battle-tested and rich ecosystem to build enterprise systems. It mirrors the business entities and a natural hierarchy and evolution of things. It has a vast pool of skilled resources and is easy to maintain. Python is still a baby when it comes to operational readiness and integrations. You might get excited about Jupyter cells and the REPL, but that is all dev-play, not production.
"CSCI 2100: Unlearning Object-Oriented Programming" immediately caused me to disagree this one.
When I code in C, in the end, I usually miss the syntax for defining "objects/classes" (structs with functions and access controls), the syntax/notation that encapsulates/binds/groups the related state and its functions/API to define some specific concept/model == custom data type.
Of course OOP can be taken to extreme complexity and then lose its usefulness.
Unlearning OOP does not necessarily involve forgetting abstraction and the concept of an object. "Unlearning OOP" involves freeing yourself from the notion that all programming should be designed as an object hierarchy.
There is/was a tendency in object-oriented programming to consider that it is the only way Real Software™ is made. You tend to focus more on the architecture of your system than its actual features.
Notice the prerequisite to unlearning something is learning it first. I don't think anyone proposes that the concept of an object is useless.
I don’t necessarily agree with a somewhat childish “unlearn OOP” idea, but… a lot of that enterprise software is of bad quality. Whether it’s OOP or something else’s fault, simply stating that a lot of backbone is written in Java does not prove Java is a good choice, nor does it prove that there is nothing wrong with Java.
If you think programmers should defer to bank managers on what the best way to design software is, do you also think that bank managers should defer to plumbers on what the best way to manage liquidity is?
It's not bank managers but IT managers who somewhat choose the tech stack; the rest is inertia and things working. Bank managers don't give a fuck, in the same way they couldn't care less what plumbing material is used in their buildings. They care about long-term stability, as they should. The same is true for almost any business outside SV.
The parent is correct; I've been doing this my entire (profitable to the absolute limit) career and will most probably retire doing the same. You clearly seem to lack any expertise in the field being discussed.
That happened not by choice, but by chance and due to "OOP consultants" running rampant in the 2000s. Source: I have to maintain Java slop in a bank, and used to maintain Java slop in manufacturing.
> Enterprise Java runs the backbone of operations for all of large business organisations such as banks.
This is rather an anti-recommendation. At this point all I expect from a bank is to reliably log in, view my balance and transaction history, and receive and send bank transfers... and they oftentimes fail at this basic feature set. I don't even need credit or interest rates from them.
Banks as an example of "getting things done" is laughable. Real industry gets things done: manufacturing, construction, healthcare etc. We could do without the massive leech that is the finance sector.
There are so many issues and costs that people buy stablecoins to exchange money with smart contracts that have no substantial guarantee other than the issuer's right to block funds...
And that needs skyscrapers and entire financial districts to achieve? This is a tiny fraction of the "work" done by the financial sector. Most of what they do is pointless trading between themselves and sapping the output of the real economy.
The banks' real "product" is trust. You will work an entire month for a "bank transfer" (something you can't even hold, let alone eat or burn) because you believe your landlord will similarly accept a "bank transfer" in exchange for your rent (or, if you have a mortgage, you work an entire month because you believe this means you will be able to continue living in your house unchallenged). This has absolutely nothing to do with what programming languages or paradigms they have in place.
Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.
Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.
Conventional university degrees already contain practical examples of the principles of both these courses in the form of cheating and the social dynamics around it.
"CSCI 4020: Writing Fast Code in Slow Languages" does exist, at least in the book form. Teach algorithmic complexity theory in slowest possible language like VB or Ruby. Then demonstrate how O(N) in Ruby trumps O(N^2) in C++.
One of my childhood books compared bubble sort implemented in FORTRAN and running on a Cray-1 and quicksort implemented in BASIC and running on TRS-80.
The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
To be fair, the standard bubble sort algorithm isn't vectorized, and so can only use about 5% of the power of a Cray-1. Which is good for another factor of about 5 in the array size.
We had this as a lab in a learning systems course: converting Python loops into NumPy vector manipulation (map/reduce), and then into TensorFlow operations, and measuring the speed.
It gave a good idea of how Python is even remotely useful for AI.
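Something like the following is what such a lab boils down to; a rough sketch (requires NumPy; the array size and the measured ratio are arbitrary):

    import time
    import numpy as np

    xs = np.random.rand(5_000_000)

    t0 = time.perf_counter()
    total_loop = 0.0
    for x in xs:                        # interpreted loop, element by element
        total_loop += x * x
    t1 = time.perf_counter()

    total_vec = float(np.dot(xs, xs))   # one call into compiled, vectorized code
    t2 = time.perf_counter()

    print(f"python loop: {t1 - t0:.3f}s  result {total_loop:.2f}")
    print(f"numpy dot:   {t2 - t1:.3f}s  result {total_vec:.2f}")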
Python has come a long way. It's never gonna win for something like high-frequency trading, but it will be super competitive in areas you wouldn't expect.
The Python interpreter and core library are mostly C code, right? Even a Python library can be coded in C. If you want to sort an array, for example, it will cost more in Python because it's sorting Python objects, but the sort itself is coded in C.
Many computer science programs today have basically turned into coding trade schools.
Students can use frameworks, but they don’t understand why languages are designed the way they are, or how systems evolved over time.
It’s important to remember that computing is also a field of ideas and thought, not just implementation.
I would add debugging as a course. Maybe they already teach this, but learning how to dive deep into finding the root cause of defects, and the various tools for doing so, would have been enormously helpful for me. Perhaps this already exists.
CSCI 0001: Functional programming and type theory (taught in English [0])
For decades, the academia mafia, through impenetrable jargon and intimidating equations, have successfully prevented the masses from adopting this beautiful paradigm of computation. That changes now. Join us to learn why monads really are monoids in the category of endofunctors (oh my! sorry about that).
Where’s the course on managing your reaction when the client starts moving the goal posts on a project that you didn’t specify well enough (or at all), because you’re a young eager developer without any scars yet?
Back in the 90s, this was actually a sneaky part of Dartmouth's CS23 Software Engineering course. At least half your grade came from a 5-person group software project which took half a semester. The groups were chosen for you, of course.
The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.
It was a surprisingly effective course.
(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)
In uni we had a semester long group project with the stronger coders as project leaders. Group leaders controlled pass/fail for the group members and vice versa. After the leaders put together project plans for their teams and everyone was supposed to start working WHOOPSIE reorg and all the leaders got put on random teams and new people got "promoted" into leader (of groups they didn't create the plan for). If the project didn't work at the end the entire group failed.
I've never been taught anything more clearly than the lessons from that class.
Shit, I was thinking about exactly the same thing: the professor deliberately changes requirements in the last week to mess with the students and give them a bit of a taste of real work.
My uni kind of had that course! They just didn't tell us what it was going to be ahead of time and it was horrendous. We all absolutely hated the professor but it was required to graduate so we all came up with various coping strategies and at the very end he said "congratulations this is what the real world is like!"
(I didn't believe him at the time, but in some ways he really didn't go far enough...)
PSYC 2230: Measuring Things - Gathering evidence, overcoming bias, and comparative analysis.
Most developers cannot measure things at any level, in any regard.
CSCI 5540: Transmission Control - Comparative analysis of existing data transfer protocols, to include practical application, as well as authoring new and original protocols
Most developers don’t know how any transmission protocols work internally, except possibly HTTP
CSCI 3204: Tree Traversal - Data structure analysis applied to data model hierarchies and taxonomies
I have heard from so many developers who say they spent most of their education on data structure analysis but in the real world cannot apply it to tree models in practical settings on any level. The people in library science figure this out in the real world, but educated software developers often don't.
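For what it's worth, the gap is not about exotic algorithms; a minimal sketch of the kind of thing meant here, with an invented taxonomy as nested dicts and a plain recursive walk:

    taxonomy = {
        "media": {
            "books": {"fiction": {}, "non-fiction": {}},
            "music": {"classical": {}, "jazz": {}},
        },
        "hardware": {"input devices": {}, "displays": {}},
    }

    def walk(node, path=()):
        """Depth-first traversal, yielding every path from the root."""
        for name, children in node.items():
            yield path + (name,)
            yield from walk(children, path + (name,))

    for path in walk(taxonomy):
        print(" / ".join(path))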
Not only encoding/decoding, but searching and sorting are also different. We could also cover font rendering, Unicode modifiers, and emoji. They are so common and fundamental, but very few understand them.
I am happy to sign up for all these classes. Tbh this is what coursera or whatever should be. Not yet another machine learning set of lectures with notebooks you click run on.
Systems Engineering 101/201/301/401: How to design a computer system to be reliable
Security Engineering 101/201/301/401: How security flaws happen and how to prevent them
Conway's Law 101/201: Why the quality of the software you write is less important than your org chart
The Real DevOps 101/201/301: Why and how to simultaneously deliver software faster, with higher quality, and fewer bugs
Old And Busted 101/201: The antiquated patterns developers still use, why they're crap, what to use instead
Thinking Outside the Box 101: Stupid modern designs and why older ones are better
New Technology 101: The new designs that are actually superior and why
Project Management 101/201/301: History of project management trends, and how to manage any kind of work
Managing for Engineers 101/201/301: Why and how to stop trying to do everything, empowering your staff, data-driven continuous improvement
Quality Control 101/201: Improving and maintaining quality
Legal Bullshit 101/201: When you are legally responsible and how not to step in it
In addition,
Team Dynamics 301: A course in Blame Management
Handling the traditional "innocent punished, guilty escape/promoted" issue, with an explanation of the meme "success has a hundred fathers/mothers while failure is an orphan."
Jm2c but people tend to conflate two overlapping but different fields.
CS is the study of computation, SE is the study of building computer programs.
Those overlap in the same way physics and chemistry do. Of course the two overlap, and chemists are also exposed to classical and quantum physics and know about Dirac spaces or the Born-Oppenheimer approximation. But the bulk and core of a chemist's curriculum will involve few of these courses, and with a focus on what's relevant to the chemist, e.g. understanding how quantum physics makes water appear transparent in a glass but blue in a lake or deep pool.
The same goes for CS and SE. Of course they are related, but CS is much more focused on the theoretical and mathematical parts of computing, not the practical side of building systems.
One wants to know what can be computed and how and with what properties. The other wants to know how to build computer programs, but does not need to understand and be intimate with the mathematics of type inference or Hoare logic.
"Software engineering" is the political ideology that the study of management practices can enable you to deliver a successful software project without learning how to program. It is very popular in fields where a software project can be considered successful without ever delivering usable software, such as cost-plus defense contracting, management consulting, and enterprise software.
If you want to know how to build computer programs, then learn the type system of your chosen language, and learn how to reason about the behavior of sequences, loops, and conditionals—even if you do it informally or with small-step operational semantics instead of Hoare logic, and even if your language doesn't have type inference. Don't listen to the comforting lies of "Software Engineering" promising easy shortcuts. There is no royal road to Geometry, and there is no royal road to Google. Git gud.
But it is also true that there is a great deal you could learn about computer science that you do not need in order to write working software, fast. Sequential search is often fast enough. Simple hash tables are usually better than fancy balanced trees. You will probably never use a computer that uses one's complement or the network stack the OSI model describes. If you have an array to sort, you should probably use a sorting function from the system library and definitely not implement bubble sort from scratch. Or even Quicksort. You can program in Erlang or Java for decades without having to understand how the garbage collector works.
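To make the "reason about loops and conditionals" advice above concrete, a small informal example (the function and the invariant wording are mine, just to show the style of reasoning):

    def max_so_far(xs):
        """Return the largest element of a non-empty list.

        Informal invariant: after processing xs[:i], `best` equals max(xs[:i]).
        It holds before the loop (i = 1), every iteration preserves it, so at
        the end `best` is the maximum of the whole list.
        """
        best = xs[0]
        for x in xs[1:]:
            if x > best:
                best = x
        return best

    assert max_so_far([3, -1, 7, 7, 2]) == 7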
> "Software engineering" is the political ideology that the study of management practices can enable you to deliver a successful software project without learning how to program.
Software engineering is not an ideology, but the application of engineering practices to building computer programs, the same way civil engineering is the application of engineering practices to building bridges.
Your statement is odd: software engineering curricula do include theoretical and computational courses, but ultimately those are a limited part and not the focus of the curriculum.
In the same way CS curricula do include few engineering and application-focused exams, but again, they are not the focus.
It's absolutely fine for the two curricula to be different and they are indeed different in most of Europe.
E.g. at the university of Pisa the CS curriculum (obviously speaking about masters, arguing about bachelors is partially irrelevant, you just can't get in enough depth of any topic) has exams like parallel computing, category theory, models of computation, compilers and interpreters.
But the software engineering curriculum has: mobile and physical systems, machine learning, distributed computing, business process modeling, IT risk assessment, IT infrastructures, peer to peer systems, etc.
Of course many exams are shared (albeit they have slightly different focuses) such as: randomized algorithms, competitive programming, advanced programming and you can likely choose one of the other courses as your optionals.
But the focus is ultimately different. One focuses on the theory behind computation, one focuses on the practical aspect.
- telling clients that the proof-of-concept is non-conclusive so it's either bag it or try something different
- spending innovation tokens in something else than a new frontend framework and/or backend language
- understanding that project management methods are tools (not rites) and if your daily standup is 45min then there's a problem
I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.
I wish more comp sci curricula would sprinkle in more general courses in logic and especially 20th century analytic philosophy. Analytic philosophy is insanely relevant to many computer science topics especially AI.
Functional programming exists in any reputable computer science curriculum. The standard is Haskell. For a true "unlearning" it might need to be a third- or fourth-year subject.
> CSCI 3300: Classical Software Studies
Discuss and dissect historically significant products, including VisiCalc, AppleWorks, Robot Odyssey, Zork, and MacPaint. Emphases are on user interface and creativity fostered by hardware limitations.
Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.
I think it would be a good idea, especially CSCI 3300. (Learning these things in a course is not the only way to learn about computers and other stuff, but it is (and should be) one way to do it.)
(However, CSCI 2100 shouldn't be necessary if you learn stuff other than OOP the first time around, even if you also learn OOP.)
I really don't understand the modern hate towards OOP. From my experience over the last few decades working with large C and C++ codebases, the former turns into a big ball of mud first.
Most hate of OOP comes from the definition that OOP = inheritance. Meanwhile, among people who consider themselves OO programmers, there is often the same revulsion towards inheritance and a preference for encapsulation, while still calling that OOP. Because each language is subtly different, these things tend to turn into flame wars.
Which of course people do and why of course you have:
I think that OOP can be good for some things, but that does not mean that all or most programs should use OOP for all or most things. I would say that for most things it is not helpful, even though sometimes it is helpful.
The Classical Software Studies would be quite useful. Go write a game in 64kb of RAM in BASIC. It would really stretch your creativity and coding skills.
I think working on that kind of system would be actively harmful for most programmers. It would give them a totally unbalanced intuition for what the appropriate tradeoff between memory consumption and other attributes (maintainability, defect rate, ...) is. If anything, programmers should learn on the kind of machine that will be typical for most of their career - which probably means starting with a giant supercomputing cluster to match what's going to be in everyone's pocket in 20 years' time.
Ha. You call it "history". I call it "childhood". I did that years before getting to Uni :)
Although, to be fair, while it was a helpful practice at coding, I'm not a game designer, so it was a game too terrible to play.
First year Uni though I spent too many hours in the lab, competing with friends, to recreate arcade games on the PC. Skipping the game design part was helpful. To be fair by then we had a glorious 640k of ram. Some Assembly required.
Agreed, it would be very interesting to see some of the care taken for resource management that is lost now because every machine has “enough” RAM and cycles…
I've long thought the Game Boy Advance would make a great educational platform. Literally every aspect of the hardware is memory mapped. Just stuff values into structs at hard-coded addresses and stuff happens. No need for any OS or any API at all.
I think OOP became popular because it feels profound when you first grasp it. There is that euphoric moment when all the abstractions suddenly interlock, when inheritance, polymorphism, and encapsulation seem to dance together in perfect logic. It feels like you have entered a secret order of thinkers who understand something hidden. Each design pattern becomes a small enlightenment, a moment of realization that the system is clever in ways that ordinary code is not.
But if you step back far enough, the brilliance starts to look like ornament. Many of these patterns exist only to patch over the cracks in the paradigm itself. OOP is not a natural way of thinking, but a habit of thinking that bends reality into classes and hierarchies whether or not they belong there. It is not that OOP is wrong, but that it makes you mistake complexity for depth.
Then you encounter functional programming, and the same transformation begins again. It feels mind expanding at first, with the purity of immutable data, the beauty of composability, and the comfort of mathematical certainty. You trade one set of rituals for another: monads instead of patterns, recursion instead of loops, composition instead of inheritance. You feel that familiar rush of clarity, the sense that you have seen through the surface and reached the essence.
But this time the shift cuts deeper. The difference between the two paradigms is not just structural but philosophical. OOP organizes the world by binding behavior to state. A method belongs to an object, and that object carries with it an evolving identity. Once a method mutates state, it becomes tied to that state and to everything else that mutates it. The entire program becomes a web of hidden dependencies where touching one corner ripples through the whole. Over time you code yourself into a wall. Refactoring stops being a creative act and turns into damage control.
Functional programming severs that chain. It refuses to bind behavior to mutable state. Statelessness is its quiet revolution. It means that a function’s meaning depends only on its inputs and outputs. Nothing else. Such a function is predictable, transparent, and portable. It can be lifted out of one context and placed into another without consequence. The function becomes the fundamental atom of computation, the smallest truly modular unit in existence.
That changes everything. In functional programming, you stop thinking in terms of objects with responsibilities and start thinking in terms of transformations that can be freely composed. The program stops feeling like a fortress of interlocking rooms and begins to feel like a box of Lego bricks. Each function is a block, self-contained, perfectly shaped, designed to fit with others in infinitely many ways. You do not construct monoliths; you compose arrangements. When you need to change something, you do not tear down the wall. You simply reassemble the bricks into new forms.
This is the heart of functional nirvana: the dream of a codebase that can be reorganized endlessly without decay. Where every part is both independent and harmonious, where change feels like play instead of repair. Most programmers spend their careers trying to reach that state, that perfect organization where everything fits together, but OOP leads them into walls that cannot move. Functional programming leads them into open space, where everything can move.
Reality will always be mutable, but the beauty of functional programming is that it isolates that mutability at the edges. The pure core remains untouched, composed of functions that never lie and never change. Inside that core, every function is both a truth and a tool, as interchangeable as Lego bricks and as stable as mathematics.
So when we ask which paradigm handles complexity better, the answer becomes clear. OOP hides complexity behind walls. Functional programming dissolves it into parts so small and transparent that complexity itself becomes optional. The goal is not purity for its own sake, but freedom; the freedom to recompose, reorganize, and rethink without fear of collapse. That is the real enlightenment: when your code stops feeling like a structure you maintain and starts feeling like a universe you can endlessly reshape.
The great achievement of OOP is that it inspires such passion.
In essence OOP is just, "hey, if you have a struct and a bunch of operations that operate on that struct, let's put the name of the struct and a dot in front of the names of those operations and you don't need to pass the struct itself as an argument"
It beats me how either the high priests or its detractors get so worked up about it, even with the add-ons like inheritance, polymorphism or patterns. (Which of course also exist in a more mathematically clean way in functional languages.)
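The parent's point fits in a few lines of Python (the Account example here is invented): the same operation written as a free function over a plain struct, and as a method where the struct rides along as self.

    from dataclasses import dataclass

    @dataclass
    class Account:
        balance: float = 0.0

    def deposit(account, amount):       # explicit struct argument
        account.balance += amount

    class AccountObj:
        def __init__(self, balance=0.0):
            self.balance = balance

        def deposit(self, amount):      # same operation, struct passed as self
            self.balance += amount

    a = Account()
    deposit(a, 10.0)

    b = AccountObj()
    b.deposit(10.0)

    assert a.balance == b.balance == 10.0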
These patterns have seen real use (not saying optimal) in the wild.
Of course we know today that composition is better than inheritance, plain data structs are enough for most cases, and "parse, don't validate". But did people know that in the 1990s?
There's a video from Gary Bernhardt called Functional Core, Imperative Shell which made sense to me. These paradigms can work well together. OOP can be great too, the Ruby language is deeply OO and the core library is rich and OO.
You can also use only hashes and arrays and type them with TypeScript, using functions, types, namespaces, and duck typing instead of classes and methods.
PS: closures are worth thinking about; we use them all the time without even noticing.
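A minimal sketch of the functional-core/imperative-shell shape mentioned above (the order-processing domain is invented): pure functions on plain data in the core, with all rebinding of state and I/O pushed to the edge.

    # Functional core: pure functions on plain data, trivially testable.
    def add_item(order, name, price):
        return {**order,
                "items": order["items"] + [name],
                "total": order["total"] + price}

    def apply_discount(order, percent):
        return {**order, "total": order["total"] * (1 - percent / 100)}

    # Imperative shell: the only place that holds state and does I/O.
    def main():
        order = {"items": [], "total": 0.0}
        order = add_item(order, "book", 12.0)
        order = add_item(order, "coffee", 3.0)
        order = apply_discount(order, 10)
        print(order)

    if __name__ == "__main__":
        main()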
- a basic computer science course, teaching how to be at home in a FLOSS desktop system
- an intermediate course teaching how to properly automate this environment, from scripting to classic development
- a basic course in networking and system management to reach the level of being able to be a dummy sysadmin at home
all of these must be preparatory to CS because without them it's like studying literature before knowing the basics of the language in which it's written. So far, it's assumed that students do it themselves, but the facts prove that this is not the case.
It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.
Jenkins, Docker, Kubernetes, none of these sorts of things - and I don't even mean these specific technologies; nothing even in their ballpark.
> It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.
It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!
I was a CS professor at a local college. My solution was to ignore CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second course of Java programming. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this stuff, but in the end I wrote my own learning materials and didn't use a book.
The course was a victim of its success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.
If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).
>Someone else posted here they had 10 pages of proofs per week, for one course.
Huh. As a professor, I would not be able to grade this kind of volume in any serious capacity. Especially since proofs need to be scrutinized carefully for completeness and soundness. I wonder how their instructor manages.
I'm doing this in the software engineering¹ course I teach.
However:
a) Practical CI/CD requires understanding (and some practical experience) of many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience in these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.
b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.
¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.
It would be easy for me to agree with you. I hold a graduate degree in computer science and I’m named for my contributions proofreading/correcting a graduate text about algorithms in computability theory.
I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?
Cryptography is all math; networking is largely math and algorithms (IMO yes, this should really be replaced with information theory: just understanding Shannon's paper would have been more valuable than learning about how routers work); AI is mostly statistics (and AI as a whole, I'd argue, is the essence of computer science); graphics is largely math and algorithms.
Yes, I very much think a computer science degree should be as close to the foundations of theory as possible. And learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.
There's so much computer science that isn't even covered that I'd include before courses on CI/CD.
Yeah, cryptography is mostly (but certainly not all) math, but it accounts for a negligible (pun intended) portion of interesting security work.
AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but the vast majority of work in that field cannot properly constitute "theory".
I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.
A lot of it is a push for practicality and catering to students' interests. IMO it's a result of a really archaic education system. Universities were originally small and meant for theoretical study, not a de facto path for everyone to enroll in in order to get a job.
If it were me, I'd get rid of statically defined 4-year programs, and/or fixed required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.
One of my favorite classes was a Python class that focused on building some simple games with Tkinter, making a chat client, and hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.
On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, the theory of programming languages, and the philosophy behind all of it that got us here.
Your point is well taken and to some extent I agree, but I think you have to recognize it's not just student interest, career preparation, and practicality.
The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.
I have never felt that way, and I’ve worked on a variety of projects at a variety of companies.
Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.
There are enough interesting things here that you wouldn’t even need to make a tool heavy project style software engineering course - you could legitimately make a real life computer science course that studies the algorithms and patterns and things used.
>CSCI 2100: Unlearning Object-Oriented Programming
Discover how to create and use variables that aren't inside of an object hierarchy. Learn about "functions," which are like methods but more generally useful. Prerequisite: Any course that used the term "abstract base class."
This is just a common meme that often comes from ignorance, or a strawman of what OOP is.
>CSCI 4020: Writing Fast Code in Slow Languages
Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.
I like this one, but see?
Python is heavily OOP; everything is an object in Python, for example.
I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
>I strongly disagree. How is everything being called an object in any way "heavily OOP
Do I need to spell it out? The O in OOP stands for object. Everything is an object, therefore it is object-oriented.
It's not much more complex than that, man.
And I don't mean that it supports users writing OOP code, I mean that the language, interpreter, and library are themselves written with OOP. Inheritance? Check. Classes? Check. Objects? Check. Even classes are objects, instances of their metaclass (which is type by default).
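That claim is easy to check from a REPL; a small sketch:

    # In CPython, everything you can name is an object, including functions
    # and classes; a class is itself an instance of its metaclass, which is
    # `type` unless you declare otherwise.
    assert isinstance(3, object)
    assert isinstance("hi", object)
    assert isinstance(len, object)

    class Foo:
        pass

    assert isinstance(Foo, type)      # the class is an object of type `type`
    assert type(type) is type         # even `type` is an instance of itself
    print(type(3), type(Foo), type(type))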
I had forgotten about prog21, and I'm impressed how he wrapped up his blog:
> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.
Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.
No, it's about creating something which does something useful and is easy to maintain. Plumbers have great ideas and approaches but you just want plumbing which works and can be fixed.
It's time developers realised they are plumbers not **** artists.
[HN reduced my expletive to just four asterisks which seems a bit reductionist]
People don’t get this up in arms when there is a math course about using math tools or math history or how math is used to solve actual problems, but for some reason they do about computer science.
They just label such people as Applied Mathematicians, or worse: Physicists and Engineers; and then get back to sensible business such as algebraic geometry, complex analysis and group theory.
If universities offered a major in which students learned to operate telescopes of various kinds and sizes, and then called it astrophysics, people would be mad too.
> CSCI 3300: Classical Software Studies
Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
When I was at RIT (2006ish?) there was an elective History of Computing course that started with the abacus and worked up to mainframes and networking. I think the professor retired years ago, but the course notes are still online.
https://www.cs.rit.edu/~swm/history/index.html
I recall seeing a project on github with a comment:
Q: "Sooo... what does this do that Ansible doesn't?"
A: "I've never heard of Ansible until now."
Lots of people think they are the first to come across some concept or need. Like every generation when they listen to songs with references to drugs and sex.
Come on music is very much cross generational?
I think software engineering have so many social problems to a level that other fields just don't have. Dogmatism, superstition, toxicity ... you name it.
The history of art or philosophy spans millenia.
The effective history of computing spans a lifetime or three.
There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.
Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessibly via digital to analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
> The effective history of computing spans a lifetime or three.
That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.
If Alan Kay doesn't respond directly to this comment, what is Hacker News even for? :)
You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.
(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)
Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.
1: https://www.youtube.com/watch?v=QQhVQ1UG6aM
https://news.ycombinator.com/user?id=alankay Has not been active on Hacker News for several years now.
At 85 he has earned the peace of staying away from anything and everything on the internet.
He's actually semi active on Quora!
https://www.quora.com/profile/Alan-Kay-11?share=1
> E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025
Why? What problem did it solve that we're suffering from in 2025?
> To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.
Its because CS is not cared about as a true science for the most part. Nearly all of the field is focused on consolidating power and money dynamics. No one cares to make a comprehensive history since it might give your competitors an edge.
Art and Philosophy are hardly regarded as science, either. Actually, less so. Yet...
Just because a period of history is short doesn't make it _not history_.
Studying history is not just, or even often, a way to rediscover old ways of doing things.
Learning about the people, places, decisions, discussions, and other related context is of intrinsic value.
Also, what does "material substrate" have to do with history? It sounds here like you're using it literally, in which case you're thinking like an engineer and not like a historian. If you're using it metaphorically, well, art and philosophy are absolutely built on layers of what came before.
> art [has] very limited or zero dependence on a material substrate
This seems to fundamentally underestimate the nature of most artforms.
You first sentences already suggest one comparison between the histories of computing and philosophy: history of computing ought to be much easier. Most of it is still in living memory. Yet somehow, the philosophy people manage it while we computing people rarely bother.
I always think there is great value in having a whole range of history-of-X courses.
I once thought about a series of PHYS classes that would focus on historical ideas and experiments. Students would be expected to replicate the experiments, and they would have to read the original book chapters and papers.
History of physics is another history where we have been extremely dependent on the "substrate". Better instruments and capacity to analyze results, obviously, but also advances in mathematics.
That's not that far off the standard physics course, is it? Certainly lots of labs I took were directly based on historical experiments.
> Art and philosophy have very limited or zero dependence on a material substrate
Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).
Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!
That gives even more reason to study the history of CS. Even artists study contemporary art from the last few decades.
Given the pace of CS (as you mentioned), 50 years might as well be centuries, so early computing devices and solutions are worth studying to understand how the technology has evolved, what lessons we can learn, and what we can discard.
> Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate
That's absolutely false. Do you know why MCM furniture is characterized by bent plywood? It's because we developed the glues that enabled it during World War II. In fashion you had a lot more colors beginning in the mid-1800s because of the development of synthetic dyes. It's no coincidence that oil painting was perfected around Holland (a major source of flax and thus linseed oil), which is exactly what the Dutch masters _did_. Architectural McMansions began because of the development of pre-fab roof trusses in the 70s and 80s.
How about philosophy? Well, the Industrial Revolution and its consequences have been a disaster for the human race. I could go on.
The issue is that engineers think they're smart and can design things from first principles. The problem is that they're really not, and they design things from first principles anyway.
> Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution
This was clearly true in 01970, but it's mostly false today.
It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago or maybe even a dialup BBS 40 years ago have very little to do with the performance of the physical substrate. It has more in common with the People's Computer Company's Community Memory 55 years ago (?) using teletypes and an SDS 940 than with LLMs and GPU raytracing.
Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.
Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.
From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.
The same thing had happened with still 2-D monochrome graphics in the 01980s; that was the desktop publishing revolution. Before that, you had to learn to program to make graphics on a computer, and the graphics were strongly constrained by the physical substrate. But once the physical substrate was good enough, further improvements didn't open up any new possible expressions. You can print the same things on a LaserWriter from 01985 that you can print on the latest black-and-white laser printer. The dependence on the physical substrate has been severed.
For things you can do with ASCII text without an LLM, the cut happened even earlier. That's why we still format our mail with RFC-822, our equations with TeX, and in some cases our code with Emacs, all of whose original physical substrate was a PDP-10.
Most things people do with computers today, and in particular the most important things, are things that people (though fewer of them) have been doing with computers in nearly the same way for 30 years, when the physical substrate was very different: 300 times slower, 300 times smaller, a much smaller network.
Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.
A different reason to study the history of computing, though, is the sense in which your claim is true.
Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. Although a friend of mine in 02005 or so explained to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.
Then ImageNet changed everything, and now we're writing production code with agentic LLMs.
Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.
One question: why are you prepending a zero to every single year?
Usually it's because of an initiative by the Long Now Foundation that is supposed, among other things, to raise awareness of their 10,000-year clock and what it stands for.
See https://longnow.org/ideas/long-now-years-five-digit-dates-an...
Reconsider this post after playing with an HDR10 480hz monitor.
> The history of art or philosophy spans millen[n]ia.
And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.
> Computation has overwhelming dependence on the performance of its physical substrate [...].
Computation theory does not.
I reflect on university, and one of the most interesting projects I did was an 'essay on the history of <operating system of your choice>' as part of an OS course. I chose OS X (Snow Leopard) and digging into the history gave me fantastic insights into software development, Unix, and software commercialisation. Echo your Mr Kay's sentiments entirely.
Sadly this naturally happens in any field that ends up expanding due to its success. Suddenly the number of new practitioners outnumbers the number of competent educators. I think it is a fundamental human resources problem with no easy fix. Maybe LLMs will help with this, but they seem to reinforce the convergence to the mean in many cases, as those to be educated are not in a position to ask the deeper questions.
> Sadly this naturally happens in any field that ends up expanding due to its success. Suddenly the number of new practitioners outnumbers the number of competent educators. I think it is a fundamental human resources problem with no easy fix.
In my observation the problem rather is that many of the people who want to "learn" computer science actually just want to get a certification to get a cushy job at some MAGNA company, and then they complain about the "academic ivory tower" stuff that they learned at the university.
So, the big problem is not the lack of competent educators, but practitioners actively sabotaging the teaching of topics that they don't consider to be relevant for the job at a MAGNA company. The same holds for the bigwigs at such companies.
I sometimes even entertain the conspiracy theory that if many graduates saw that what their work at these MAGNA companies involves is drawn from the history of computer science, often decades old and reinvented multiple times over the decades, it might demotivate employees who are supposed to believe that they work on the "most important, soon to be world-changing" thing.
>Alan Kay, my favorite curmudgeon, spent decades trying to remind us
Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon
>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.
it's Alan Kay running in circles
Every attempt at teaching IT history I was subjected to was so clouded with personal opinions and attitudes that it came across as anecdotal and preachy.
I think assessing a piece of hardware or software is much more difficult and time-consuming than assessing art, so there are no people with really broad experience.
Your experience with bad teachers seems more like an argument in favor of better education than against it. It's possible to develop better teaching material and methods if there is focus on a subject, even if it's time consuming.
If only one genius in California can do it well, it's as good as impossible.
Computer scientists should treat history like civil engineering or physics treat their histories. These are subjects that make objective progress.
Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.
"These are subjects that make objective progress."
As opposed to ours, where we're fond of subjective regression. ;-P
At least for history of economics, I think it's harder to really grasp modern economic thinking without considering the layers it's built upon, the context ideas were developed within etc...
That's probably true for macro-economics. Alas that's also the part where people disagree about whether it made objective progress.
Micro-economics is much more approachable with experiments etc.
Btw, I didn't suggest to completely disregard history. Physics and civil engineering don't completely disregard their histories, either. But they also don't engage in constant navel gazing and re-hashing like a good chunk of the philosophers do.
I can't concur enough. We don't teach "how to design computers and better methods to interface with them"; we keep hashing over the same stuff again and again. It gets worse over time, and the effect is that what Engelbart called "intelligence augmenters" become "super televisions that cause you political and social angst."
How far we have fallen, but how great the reward if we could "lift ourselves up again." I have hope in people like Bret Victor and Brenda Laurel.
> He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
Software development/programming is a field where the importance of planning and design lies somewhere between ignored and outright despised. The role of software architect is both ridiculed and vilified, whereas the role of the brave solo developer is elevated to the status of hero.
What you get from that cultural mix is a community that values ad-hoc solutions made up on the spot by inexperienced programmers who managed to get something up and running, and at the same time is hostile towards those who take the time to learn from history and evaluate tradeoffs.
See for example the cliche of clueless developers attacking even the most basic aspects of software architecture such as the existence of design patterns.
With that sort of community, how does anyone expect to build respect for prior work?
Depends on the community though.
I see that more from devs in startup culture, or devs shipping products at software-only companies.
It is a very different mindset when software is not the main business of a company, or in consulting.
Programmers are introduced to CS history; it's just mostly irrelevant or poorly contextualized, so there are no useful lessons.
This has been going on forever. That just means the people that "worked it out" were shitty teachers and/or bad at spreading their work.
Maybe he managed to work them out and understand them in the '70s, if you believe him. But he has certainly never managed to convey that understanding to anyone else. Frankly I think if you fail to explain your discovery to even a fraction of the wider community, you haven't actually discovered it.
> CSCI 2100: Unlearning Object-Oriented Programming??
People in the tech industry seem to have no idea how the systems in the wild work. Enterprise Java runs the backbone of operations for all large business organisations such as banks. It is just as grounded as MS Office is. It is object-oriented software that is running the bulk of the production environments of the world. Who is going to maintain these systems for the next few decades?
And in reality, there is nothing wrong with Java or object orientation. It has the best battle-tested and richest ecosystem for building enterprise systems. It mirrors the business entities and a natural hierarchy and evolution of things. It has a vast pool of skilled developers and is easy to maintain. Python is still a baby when it comes to operational readiness and integrations. You might get excited about Jupyter cells and the REPL, but that is all dev-play, not production.
"CSCI 2100: Unlearning Object-Oriented Programming" immediately caused me to disagree this one.
When I code in C, in the end, I usually miss the syntax for defining "objects/classes" (structs with functions and access controls), the syntax/notation that encapsulates/binds/groups the related state and its functions/API to define some specific concept/model == custom data type.
Of course OOP can be taken to extreme complexity and then lose its usefulness.
Unlearning OOP does not necessarily involve forgetting abstraction and the concept of an object. "Unlearning OOP" involves freeing yourself from the notion that all programming should be designed as an object hierarchy. There is/was a tendency in object-oriented programming to consider that it is the only way Real Software™ is made. You tend to focus more on the architecture of your system than its actual features.
Notice the prerequisite to unlearning something is learning it first. I don't think anyone proposes that the concept of an object is useless.
I don’t necessarily agree with a somewhat childish “unlearn OOP” idea, but… a lot of that enterprise software is of bad quality. Whether it’s OOP or something else’s fault, simply stating that a lot of backbone is written in Java does not prove Java is a good choice, nor does it prove that there is nothing wrong with Java.
If you think programmers should defer to bank managers on what the best way to design software is, do you also think that bank managers should defer to plumbers on what the best way to manage liquidity is?
It's not bank managers but IT managers who somewhat choose the tech stack; the rest is inertia and things working. Bank managers don't give a fuck, in the same way they couldn't care less what plumbing material is used in their buildings. They care about long-term stability, as they should. The same is true for almost any business outside SV.
Parent is correct. I've been doing this my entire (profitable to the absolute limit) career and will most probably retire doing the same. You clearly seem to lack any expertise in the field discussed.
That happened not by choice, but by chance and due to "OOP consultants" running rampant in the 2000s. Source: I have to maintain Java slop in a bank, and used to maintain Java slop in manufacturing.
> Enterprise Java runs the backbone of operations for all large business organisations such as banks.
This is rather an anti-recommendation. At this point all I expect from a bank is to reliably log in, view my balance and transaction history, and receive and send bank transfers... and they oftentimes fail at even this basic feature set. I don't even need credit or interest rates from them.
Banks as an example of "getting things done" is laughable. Real industry gets things done: manufacturing, construction, healthcare etc. We could do without the massive leech that is the finance sector.
I still get my monthly salary via wire transfers, no issues… banks do get things done.
Not thanks to Java or OOP, but despite it. You can thank my colleagues fighting Java in the bank each day.
A lot of wire transfer code and other foundational banking code is written in COBOL.
With so many issues and costs that people buy stablecoins to exchange money with smart-contracts that have no substantial guarantee other than the issuer's right to block funds...
And that needs skyscrapers and entire financial districts to achieve? This is a tiny fraction of the "work" done by the financial sector. Most of what they do is pointless trading between themselves and sapping the output of the real economy.
The banks' real "product" is trust. You will work an entire month for a "bank transfer" (something you can't even hold, let alone eat or burn) because you believe your landlord will similarly accept a "bank transfer" in exchange for your rent (or, if you have a mortgage, you work an entire month because you believe this means you will be able to continue living in your house unchallenged). This has absolutely nothing to do with what programming languages or paradigms they have in place.
For CSCI 2100: Unlearning Object-Oriented Programming I'd recommend Casey Muratori's Handmade Hero series.
Also check out https://www.youtube.com/@BetterSoftwareConference
CSCI 4810: The Refusal Lab
Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.
CSCI 4812: The Career Lab
Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.
Conventional university degrees already contain practical examples of the principles of both these courses in the form of cheating and the social dynamics around it.
Actually a lot of degrees have a relevant ethics class.
I don't disagree that such classes exist, but I have yet to see an ethics lab specifically with the intention of practicing refusal.
But universities want their graduates to be employable, is the thing.
"CSCI 4020: Writing Fast Code in Slow Languages" does exist, at least in the book form. Teach algorithmic complexity theory in slowest possible language like VB or Ruby. Then demonstrate how O(N) in Ruby trumps O(N^2) in C++.
One of my childhood books compared bubble sort implemented in FORTRAN and running on a Cray-1 and quicksort implemented in BASIC and running on TRS-80.
The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
To be fair, the standard bubble sort algorithm isn't vectorized, and so can only use about 5% of the power of a Cray-1. Which is good for another factor of about 5 in the array size.
We had this as a lab in a learning-systems course: converting Python loops into NumPy vector manipulation (map/reduce), and then into TensorFlow operations, and measuring the speed.
It gave a good idea of why Python is even remotely useful for AI.
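For anyone curious what such a lab looks like, here is a minimal sketch (assuming NumPy is installed; the TensorFlow step is omitted): the same dot product written as an interpreted loop and as a vectorized call.

    import time
    import numpy as np

    def dot_loop(a, b):
        # one interpreted Python iteration per element
        total = 0.0
        for x, y in zip(a, b):
            total += x * y
        return total

    def dot_vectorized(a, b):
        # the same reduction pushed down into NumPy's compiled kernels
        return float(np.dot(a, b))

    if __name__ == "__main__":
        n = 1_000_000
        a = np.random.rand(n)
        b = np.random.rand(n)
        for fn in (dot_loop, dot_vectorized):
            start = time.perf_counter()
            fn(a, b)
            print(fn.__name__, round(time.perf_counter() - start, 4), "seconds")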
The book in question:
https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...
VB has actually been quite fast since VB 6, but your point stands.
Python has come a long way. It's never gonna win for something like high-frequency trading, but it will be super competitive in areas you wouldn't expect.
The Python interpreter and core library are mostly C code, right? Even a Python library can be coded in C. If you want to sort an array, for example, it will cost more in Python because it's sorting Python objects, but the sort itself is coded in C.
It could be much better if most folks used PyPy instead of CPython as their preferred implementation.
Optimizing at the algorithmic and architectural level rather than relying on language speed
Many computer science programs today have basically turned into coding trade schools. Students can use frameworks, but they don’t understand why languages are designed the way they are, or how systems evolved over time. It’s important to remember that computing is also a field of ideas and thought, not just implementation.
I would add debugging as a course. Maybe they already teach this, but a course on how to dig down to the root cause of defects, and on the various tools for doing so, would have been enormously helpful for me. Perhaps this already exists.
Along with a course on how to read other people's code and how to resist the urge to tear down Chesterton's fence while you fix bugs
Yes please. Even senior engineers apply for jobs with their debugging abilities limited to sprinkling print-and-exit over the code.
Do you have a moment to talk about our saviour, Lord interactive debugging?
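A minimal sketch of what that looks like in practice, using Python's built-in pdb via breakpoint() (the functions and the failing input are invented for illustration):

    def parse_price(raw):
        # suspect function: why does "1,299.00" blow up?
        return float(raw)

    def total(rows):
        breakpoint()            # drops into pdb right here (Python 3.7+)
        return sum(parse_price(r) for r in rows)

    if __name__ == "__main__":
        total(["10.00", "1,299.00"])
        # At the (Pdb) prompt, instead of sprinkling prints:
        #   (Pdb) p rows        -> ['10.00', '1,299.00']
        #   (Pdb) step          -> walk into the next call instead of guessing
        #   (Pdb) continue      -> run on until the ValueError actually fires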
Previously:
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)
Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)
CSCI 0001: Functional programming and type theory (taught in English [0])
For decades, the academia mafia, through impenetrable jargon and intimidating equations, have successfully prevented the masses from adopting this beautiful paradigm of computation. That changes now. Join us to learn why monads really are monoids in the category of endofunctors (oh my! sorry about that).
[0] Insert your favourite natural language
CSCI 5000: Epistemological Foundations of AI
It's always frustrated me how little consideration is given to the existing study of knowledge in AI courses.
Where’s the course on managing your reaction when the client starts moving the goal posts on a project that you didn’t specify well enough (or at all), because you’re a young eager developer without any scars yet?
Back in the 90s, this was actually a sneaky part of Dartmouth's CS23 Software Engineering course. At least half your grade came from a 5-person group software project which took half a semester. The groups were chosen for you, of course.
The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.
It was a surprisingly effective course.
(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)
In uni we had a semester long group project with the stronger coders as project leaders. Group leaders controlled pass/fail for the group members and vice versa. After the leaders put together project plans for their teams and everyone was supposed to start working WHOOPSIE reorg and all the leaders got put on random teams and new people got "promoted" into leader (of groups they didn't create the plan for). If the project didn't work at the end the entire group failed.
I've never been taught anything more clearly than the lessons from that class.
Shit, I was thinking about exactly the same thing: a professor deliberately changing requirements in the last week to mess with the students and give them a bit of a taste of real work.
Glad that someone actually did this.
My uni kind of had that course! They just didn't tell us what it was going to be ahead of time and it was horrendous. We all absolutely hated the professor but it was required to graduate so we all came up with various coping strategies and at the very end he said "congratulations this is what the real world is like!"
(I didn't believe him at the time, but in some ways he really didn't go far enough...)
That's the job of a (good) manager.
some scars need to be earned
It's called test driven development.
I would add:
PSYC 2230: Measuring Things - Gathering evidence, overcoming bias, and comparative analysis.
Most developers cannot measure things at any level, in any regard.
CSCI 5540: Transmission Control - Comparative analysis of existing data transfer protocols, to include practical application, as well as authoring new and original protocols
Most developers don’t know how any transmission protocols work internally, except possibly HTTP
CSCI 3204: Tree Traversal - Data structure analysis applied to data model hierarchies and taxonomies
I have heard from so many developers who say they spend most of their education on data structure analysis but in the real world cannot apply it against tree models in practical application on any level. The people in library sciences figure this out in the real world but not educated software developers.
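As a rough illustration of the kind of exercise such a course might assign, here is a minimal Python sketch that walks an invented product taxonomy depth-first:

    # A tiny invented taxonomy: each node is (name, children)
    taxonomy = ("products", [
        ("electronics", [
            ("phones", []),
            ("laptops", []),
        ]),
        ("books", [
            ("fiction", []),
            ("nonfiction", []),
        ]),
    ])

    def walk(node, depth=0):
        """Depth-first traversal, yielding (depth, name) pairs."""
        name, children = node
        yield depth, name
        for child in children:
            yield from walk(child, depth + 1)

    for depth, name in walk(taxonomy):
        print("  " * depth + name)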
CSCI 3210: Modern text encoding and processing
Learn unicode and utf-8.
Unlearn the 1 char = 1 byte concept
Not only encoding/decoding but searching and sorting are also different. We may also cover font rendering, Unicode modifiers, and emoji. They are so common and fundamental, yet very few understand them.
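A few lines of Python make the point concrete (the string literal is spelled out with escapes so the example is unambiguous):

    import unicodedata

    s = "caf\u00e9 \U0001F469\u200D\U0001F4BB"   # "café " plus the woman-technologist emoji
    print(len(s))                  # 8 code points: the emoji alone is woman + ZWJ + computer
    print(len(s.encode("utf-8")))  # 17 bytes -- so much for "1 char = 1 byte"

    # "é" can be one code point or two (e + combining accent); they render identically
    nfc = unicodedata.normalize("NFC", "e\u0301")
    nfd = unicodedata.normalize("NFD", "\u00e9")
    print(nfc == nfd, len(nfc), len(nfd))   # False 1 2 -- normalize before searching or sorting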
I could add so much to this page.
- COBOL on punch cards
- RPG II/III
- PDP/Vax Shared Memory Modules
- Hierarchical Data File Storage
- Recursive Expression Evaluator
- Batch Processing large datasets
- Compressed dates and numbers
So many of these teach you structural complexity that you’d never learn in today’s world.
I am happy to sign up for all these classes. Tbh this is what coursera or whatever should be. Not yet another machine learning set of lectures with notebooks you click run on.
> User Experience of Command Line Tools
Is this a good place to complain about tools with long option names that only accept a single dash? I'm thinking of you, `java -jar`.
Every single Go program.
Jesus Christ, yes to all of it. Also needed:
In addition, Team Dynamics 301: A course in Blame Management, handling the traditional “innocent punished, guilty escape/promoted” issue. With an explanation of the meme “success has 100 fathers/mothers while failure is an orphan.”
Jm2c but people tend to conflate two overlapping but different fields.
CS is the study of computation, SE is the study of building computer programs.
Those overlap in the same way physics and chemistry do. Of course the two overlap, and chemists are also exposed to classical and quantum physics and know about Dirac spaces or the Born-Oppenheimer equations. But the bulk and core of a chemist's curriculum will involve few of these courses, and with a focus on what's relevant to the chemist, e.g. understanding how quantum physics makes water appear transparent in a glass but blue in a lake or deep pool.
The same goes for CS and SE. Of course they are related, but CS is much more focused on the theoretical and mathematical parts of computing, not the practical side of building systems.
One wants to know what can be computed and how and with what properties. The other wants to know how to build computer programs, but does not need to understand and be intimate with the mathematics of type inference or Hoare logic.
"Software engineering" is the political ideology that the study of management practices can enable you to deliver a successful software project without learning how to program. It is very popular in fields where a software project can be considered successful without ever delivering usable software, such as cost-plus defense contracting, management consulting, and enterprise software.
If you want to know how to build computer programs, then learn the type system of your chosen language, and learn how to reason about the behavior of sequences, loops, and conditionals—even if you do it informally or with small-step operational semantics instead of Hoare logic, and even if your language doesn't have type inference. Don't listen to the comforting lies of "Software Engineering" promising easy shortcuts. There is no royal road to Geometry, and there is no royal road to Google. Git gud.
But it is also true that there is a great deal you could learn about computer science that you do not need in order to write working software, fast. Sequential search is often fast enough. Simple hash tables are usually better than fancy balanced trees. You will probably never use a computer that uses one's complement, or the network stack the OSI model describes. If you have an array to sort, you should probably use a sorting function from the system library and definitely not implement bubble sort from scratch. Or even Quicksort. You can program in Erlang or Java for decades without having to understand how the garbage collector works.
There are some good posts on the blog that do a good job of explaining: https://prog21.dadgum.com/177.html https://prog21.dadgum.com/87.html
> "Software engineering" is the political ideology that the study of management practices can enable you to deliver a successful software project without learning how to program.
Software engineering is not an ideology, but the application of engineering practices to building computer programs, the same way civil engineering is the application of engineering practices to building bridges.
Your statement is odd: software engineering curricula do include theoretical and computational courses, but ultimately those are a limited part and not the focus of the curriculum.
In the same way, CS curricula do include a few engineering and application-focused exams, but again, they are not the focus.
It's absolutely fine for the two curricula to be different and they are indeed different in most of Europe.
E.g. at the university of Pisa the CS curriculum (obviously speaking about masters, arguing about bachelors is partially irrelevant, you just can't get in enough depth of any topic) has exams like parallel computing, category theory, models of computation, compilers and interpreters.
But the software engineering curriculum has: mobile and physical systems, machine learning, distributed computing, business process modeling, IT risk assessment, IT infrastructures, peer to peer systems, etc.
Of course many exams are shared (albeit they have slightly different focuses) such as: randomized algorithms, competitive programming, advanced programming and you can likely choose one of the other courses as your optionals.
But the focus is ultimately different. One focuses on the theory behind computation, one focuses on the practical aspect.
I'd add classes about:
> CSCI 2100: Unlearning Object-Oriented Programming
but 90s fashion is all the rage these days!
CSCI 4321: Unlearning Vibe Coding
> PSYC 4410: Obsessions of the Programmer Mind
I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.
One other course that all undergrads should take is Unix 101. Just the basics of Unix (bash, cat, sed, awk): how to grep logs and tail files.
Yes! Exactly that - basics of unix - was one of the first-semester courses in my comp sci degree. It has served me well ever since.
CS102 Big Balls of Mud: Data buckets, functions, modules and namespaces
CS103 Methodologies: Advanced Hack at it ‘till it Works
CS103 History: Fashion, Buzzwords and Reinvention
CS104 AI teaches Software Architecture (CS103 prerequisite)
Introduction to PhD study: "How hard can it be, I'm sure I could write that in a week"
I wish more comp sci curricula would sprinkle in more general courses in logic and especially 20th century analytic philosophy. Analytic philosophy is insanely relevant to many computer science topics especially AI.
Functional programming exists in any reputable computer science curriculum. The standard is Haskell. For a true "unlearning" it might need to be a third- or fourth-year subject.
In my day it was actually a mix of Miranda, Caml Light, and Lisp, with Tarski's World and Prolog, for FP.
> CSCI 3300: Classical Software Studies Discuss and dissect historically significant products, including VisiCalc, AppleWorks, Robot Odyssey, Zork, and MacPaint. Emphases are on user interface and creativity fostered by hardware limitations.
Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.
I think it would be a good idea, especially CSCI 3300. (Learning these things in a course is not the only way to learn computing and other stuff, but it is (and should be) one way to do so.)
(However, CSCI 2100 shouldn't be necessary if you learn stuff other than OOP the first time around, even if you also learn OOP.)
I really don't understand the modern hate towards OOP. From my experience over the last few decades working with large C and C++ codebases, the former turns into a big ball of mud first.
Most hate of OOP comes from the definition that OOP = inheritance. Meanwhile, among people who consider themselves OO programmers, there is often the same revulsion towards inheritance and a preference for encapsulation, while still calling that OOP. Because each language is subtly different, these discussions tend to turn into flame wars.
Which of course people do and why of course you have:
> PSYC 4410: Obsessions of the Programmer Mind
I think that OOP can be good for some things, but that does not mean that all or most programs should use OOP for all or most things. I would say that for most things it is not helpful, even though sometimes it is helpful.
I’m interested in the history of software development, and detailed case studies. What book(s) should I read?
what about a "Failed Concepts" course? Things that seemed like a good idea, but ultimately don't work. (fe, ORMs come to mind)
"when you need shiny tech (you don't)" (aka, when to use postgres/sqlite)
The Classical Software Studies would be quite useful. Go write a game in 64kb of RAM in BASIC. It would really stretch your creativity and coding skills.
I think working on that kind of system would be actively harmful for most programmers. It would give them a totally unbalanced intuition for what the appropriate tradeoff between memory consumption and other attributes (maintainability, defect rate, ...) is. If anything, programmers should learn on the kind of machine that will be typical for most of their career - which probably means starting with a giant supercomputing cluster to match what's going to be in everyone's pocket in 20 years' time.
Ha. You call it "history". I call it "childhood". I did that years before getting to Uni :)
Although, to be fair, while it was a helpful practice at coding, I'm not a game designer, so it was a game too terrible to play.
First year Uni though I spent too many hours in the lab, competing with friends, to recreate arcade games on the PC. Skipping the game design part was helpful. To be fair by then we had a glorious 640k of ram. Some Assembly required.
Agreed, it would be very interesting to see some of the care taken for resource management that is lost now because every machine has “enough” RAM and cycles…
There are generally courses on embedded systems.
I've long thought the Game Boy Advance would make a great educational platform. Literally every aspect of the hardware is memory-mapped. Just stuff values into structs at hard-coded addresses and stuff happens. No need for any OS or any API at all.
Bonus points if it fits in 38911 bytes.
CSCI 4083: Developing Software Alongside Other Human Beings
ls and not cat? Does the author have no memory of the cat -v shitstorm?
ls and not mpv or ffmpeg?
https://mpv.io/manual/stable/#options
https://ffmpeg.org/ffmpeg.html#Options
We can take a further step back and say that no one really knows how to do pagination, and those who do are too sick of it to teach it to others.
Unlearning object oriented programming
I think OOP became popular because it feels profound when you first grasp it. There is that euphoric moment when all the abstractions suddenly interlock, when inheritance, polymorphism, and encapsulation seem to dance together in perfect logic. It feels like you have entered a secret order of thinkers who understand something hidden. Each design pattern becomes a small enlightenment, a moment of realization that the system is clever in ways that ordinary code is not.
But if you step back far enough, the brilliance starts to look like ornament. Many of these patterns exist only to patch over the cracks in the paradigm itself. OOP is not a natural way of thinking, but a habit of thinking that bends reality into classes and hierarchies whether or not they belong there. It is not that OOP is wrong, but that it makes you mistake complexity for depth.
Then you encounter functional programming, and the same transformation begins again. It feels mind expanding at first, with the purity of immutable data, the beauty of composability, and the comfort of mathematical certainty. You trade one set of rituals for another: monads instead of patterns, recursion instead of loops, composition instead of inheritance. You feel that familiar rush of clarity, the sense that you have seen through the surface and reached the essence.
But this time the shift cuts deeper. The difference between the two paradigms is not just structural but philosophical. OOP organizes the world by binding behavior to state. A method belongs to an object, and that object carries with it an evolving identity. Once a method mutates state, it becomes tied to that state and to everything else that mutates it. The entire program becomes a web of hidden dependencies where touching one corner ripples through the whole. Over time you code yourself into a wall. Refactoring stops being a creative act and turns into damage control.
Functional programming severs that chain. It refuses to bind behavior to mutable state. Statelessness is its quiet revolution. It means that a function’s meaning depends only on its inputs and outputs. Nothing else. Such a function is predictable, transparent, and portable. It can be lifted out of one context and placed into another without consequence. The function becomes the fundamental atom of computation, the smallest truly modular unit in existence.
That changes everything. In functional programming, you stop thinking in terms of objects with responsibilities and start thinking in terms of transformations that can be freely composed. The program stops feeling like a fortress of interlocking rooms and begins to feel like a box of Lego bricks. Each function is a block, self-contained, perfectly shaped, designed to fit with others in infinitely many ways. You do not construct monoliths; you compose arrangements. When you need to change something, you do not tear down the wall. You simply reassemble the bricks into new forms.
This is the heart of functional nirvana: the dream of a codebase that can be reorganized endlessly without decay. Where every part is both independent and harmonious, where change feels like play instead of repair. Most programmers spend their careers trying to reach that state, that perfect organization where everything fits together, but OOP leads them into walls that cannot move. Functional programming leads them into open space, where everything can move.
Reality will always be mutable, but the beauty of functional programming is that it isolates that mutability at the edges. The pure core remains untouched, composed of functions that never lie and never change. Inside that core, every function is both a truth and a tool, as interchangeable as Lego bricks and as stable as mathematics.
So when we ask which paradigm handles complexity better, the answer becomes clear. OOP hides complexity behind walls. Functional programming dissolves it into parts so small and transparent that complexity itself becomes optional. The goal is not purity for its own sake, but freedom; the freedom to recompose, reorganize, and rethink without fear of collapse. That is the real enlightenment: when your code stops feeling like a structure you maintain and starts feeling like a universe you can endlessly reshape.
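For what it's worth, the contrast being described can be boiled down to a small Python sketch (the account domain is invented):

    from dataclasses import dataclass, replace

    # Behavior bound to mutable state: the meaning of `balance` depends on every past call
    class MutableAccount:
        def __init__(self, balance):
            self.balance = balance
        def deposit(self, amount):
            self.balance += amount

    # A pure transformation: depends only on its inputs, returns a new value, mutates nothing
    @dataclass(frozen=True)
    class Account:
        balance: int

    def deposit(account: Account, amount: int) -> Account:
        return replace(account, balance=account.balance + amount)

    if __name__ == "__main__":
        before = Account(balance=100)
        after = deposit(before, 50)
        print(before, after)   # `before` is untouched; both states can still be inspected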
The great achievement of OOP is that it inspires such passion.
In essence OOP is just, "hey, if you have a struct and a bunch of operations that operate on that struct, let's put the name of the struct and a dot in front of the names of those operations and you don't need to pass the struct itself as an argument"
It beats me how either the high priests or the detractors get so worked up about it, even with the add-ons like inheritance, polymorphism, or patterns (which of course also exist in a more mathematically clean way in functional languages).
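That reading of OOP fits in a few lines of Python (Point and norm are invented names, purely for illustration):

    import math
    from dataclasses import dataclass

    # "a struct and a bunch of operations that operate on that struct"
    @dataclass
    class Point:
        x: float
        y: float

    def norm(p: Point) -> float:      # the struct is an explicit argument
        return math.hypot(p.x, p.y)

    class PointObj:
        def __init__(self, x: float, y: float):
            self.x, self.y = x, y
        def norm(self) -> float:      # same operation; the struct is now spelled `self`
            return math.hypot(self.x, self.y)

    print(norm(Point(3, 4)), PointObj(3, 4).norm())   # 5.0 5.0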
These patterns have seen real use (not saying optimal) in the wild.
Of course we know today that composition is better than inheritance, plain data structs are enough for most cases, and "parse, don't validate". But did people know it in the 1990s?
There's a video from Gary Bernhardt called Functional Core, Imperative Shell which made sense to me. These paradigms can work well together. OOP can be great too, the Ruby language is deeply OO and the core library is rich and OO. You can also only use hashes and arrays and type them with Typescript, using functions, types and namespaces and duck-typing, instead of classes and methods.
PS: closures are worth thinking about; we use them without even thinking about it.
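A minimal sketch of the closure point, for readers who have never looked at one directly:

    def make_counter():
        count = 0
        def bump():
            nonlocal count      # the inner function closes over `count`
            count += 1
            return count
        return bump

    counter = make_counter()
    print(counter(), counter(), counter())   # 1 2 3 -- state survives between calls, no class needed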
OOP and FP are in theory orthogonal principles. Maybe I'm too much of a pragmatic and purists might scoff at me, but I use and appreciate both.
Did you use ChatGPT to write this?
What's really missing is:
- a basic computer science course, teaching how to be at home in a FLOSS desktop system
- an intermediate course teaching how to properly automate this environment, from scripting to classic development
- a basic course in networking and system management to reach the level of being able to be a dummy sysadmin at home
all of these must be preparatory to CS because without them it's like studying literature before knowing the basics of the language in which it's written. So far, it's assumed that students do it themselves, but the facts prove that this is not the case.
It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.
Jenkins, Docker, Kubernetes, none of these sorts of things - and I don't even mean these specific technologies; rather, nothing even in their ballpark.
> It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.
It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!
I was a CS professor at a local college. My solution was to ignore CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second course of Java programming. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this stuff, but in the end I wrote my own learning materials and didn't use a book.
The course was a victim of its success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.
If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).
>Someone else posted here they had 10 pages of proofs per week, for one course.
Huh. As a professor, I would not be able to grade this kind of volume in any serious capacity. Especially since proofs need to be scrutinized carefully for completeness and soundness. I wonder how their instructor manages.
I'm doing this in the software engineering¹ course I teach. However:
a) Practical CI/CD requires understanding (and some practical experience) of many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience in these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.
b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.
¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.
Well this has nothing to do with computer science so there's that
I could not disagree with you more.
It would be easy for me to agree with you. I hold a graduate degree in computer science and I’m named for my contributions proofreading/correcting a graduate text about algorithms in computability theory.
I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?
Cryptography is all math; networking is largely math and algorithms (IMO yes, this should really be replaced with information theory; just understanding Shannon's paper would have been more valuable than learning how routers work); AI is mostly statistics (and AI as a whole, I'd argue, is the essence of computer science); graphics is largely math and algorithms.
Yes, I very much think a computer science degree should be as close to the foundations of theory as possible. And learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.
There's so much computer science that isn't even covered that I'd include before adding courses on CI/CD.
Yeah, cryptography is mostly (but certainly not all) math, but it accounts for a negligible (pun intended) portion of interesting security work.
AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but the vast majority of work in that field cannot properly constitute "theory".
I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.
A lot of it is a push for practicality/catering to students' interests. IMO it's a result of a really archaic education system. Universities were originally small and meant for theoretical study, not a de facto path for everyone to enroll in in order to get a job.
If it were me, I'd get rid of statically defined 4-year programs, and/or fixed required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.
One of my favorite classes was a Python class that focused on building some simple games with tkinter, making a chat client, and hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.
On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, the theory of programming languages, and the philosophy behind all of it that got us here.
Your point is well taken, and to some extent I agree, but I think you have to recognize it's not just student interest, career preparation, and practicality.
The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.
You've just named most of the interesting stuff.
Personally it felt quite natural once you start to work on real software projects.
I have never felt that way, and I’ve worked on a variety of projects at a variety of companies.
Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.
There are enough interesting things here that you wouldn’t even need to make a tool heavy project style software engineering course - you could legitimately make a real life computer science course that studies the algorithms and patterns and things used.
Merging strategies, conflict resolution, bisect debugging, and version control in general are very computer sciencey.
Would make a great course.
That’s not even getting into the fact that you could basically teach multiple graduate level distributed computing courses with k8s as a case study
SQL
>CSCI 2100: Unlearning Object-Oriented Programming Discover how to create and use variables that aren't inside of an object hierarchy. Learn about "functions," which are like methods but more generally useful. Prerequisite: Any course that used the term "abstract base class."
This is just a common meme that often comes from ignorance, or a strawman of what OOP is.
>CSCI 4020: Writing Fast Code in Slow Languages Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.
I like this one, but see?
Python is heavily OOP, everything is an object in python for example.
I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
> Python is heavily OOP, everything is an object in python for example.
I strongly disagree. How is everything being called an object in any way "heavily OOP"? OOP is not just "I organize my stuff into objects".
You can write OOP code with python but most python code I've seen is not organized around OOP principles.
>I strongly disagree. How is everything being called an object in any way "heavily OOP
Do I need to spell it out? The O in OOP stands for object. Everything is an object, therefore it is object-oriented. It's not much more complex than that, man.
And I don't mean that it merely supports users writing OOP code; I mean that the language, interpreter, and standard library are themselves written with OOP. Inheritance? Check. Classes? Check. Objects? Check. Even classes are themselves objects, instances of a metaclass (type by default).
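For what it's worth, a few REPL-style lines illustrate that claim (nothing here is special-cased; it is just how CPython models things):

    # Everything you can name in Python is an object, including functions and classes
    print(isinstance(1, object), isinstance("x", object), isinstance(len, object))   # True True True

    class Foo:
        pass

    print(isinstance(Foo, object))   # True: the class itself is an object...
    print(type(Foo))                 # <class 'type'>: ...an instance of the metaclass `type`
    print(isinstance(type, object))  # True: even `type` is an object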
Also CSCI 3600: Making Optimal Compromises in Software Development
I propose CSCI 3500: Software Engineering is Not About Writing Code
CSCI 2170: User Experience of Command Line Tools.
This should exist and the class should study openssl.
I had forgotten about prog21, and I'm impressed how he wrapped up his blog:
> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.
Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.
ls should have just been for listing files; giving it 10 flags is one of the big mistakes of early UNIX design.
Not really "computer science" topics, but useful nonetheless.
> It's about being able to implement your ideas.
No, it's about creating something which does something useful and is easy to maintain. Plumbers have great ideas and approaches, but you just want plumbing which works and can be fixed.
It's time developers realised they are plumbers not **** artists.
[HN reduced my expletive to just four asterisks which seems a bit reductionist]
> It's time developers realised they are plumbers not artists.
If we're plumbers, why all the memory leaks?
Or... are we also bad plumbers?
Speak for yourself, please.
The TFA wants the following computer science courses:
Unlearning Object-Oriented Programming: a course on specific software engineering techniques
Classical Software Studies: a course on the history of software tools
Writing Fast Code in Slow Languages: a course on specific engineering techniques
User Experience of Command Line Tools: an engineering design course
Obsessions of the Programmer Mind: a course about engineering conventions and tools.
One day, the name of science will not be so besmirched.
People don’t get this up in arms when there is a math course about using math tools or math history or how math is used to solve actual problems, but for some reason they do about computer science.
They just label such people as Applied Mathematicians, or worse: Physicists and Engineers; and then get back to sensible business such as algebraic geometry, complex analysis and group theory.
If universities offered a major in which students learned to operate telescopes of various kinds and sizes, and then called it astrophysics, people would be mad too.
Astronomy isn’t about telescopes yet astronomers spend lots of time studying and doing research regarding telescopes.