I've been in CS professionally for 12 years. This is a perfect example of a normal distribution at work. By increasing the population, you simply increase the number of people across the entire curve, bumping up the number of high-, medium-, and low-paid folks, as well as unemployed folks with degrees. The real issue has always been greed: people disproportionately dove into CS because, like all hype movements, it promised significant income after just 2-4 years, and sometimes not even that but a month-long bootcamp. Once it was sold to the masses as a get-rich-quick scheme, the disappointment paired with the number of low-achieving grads tripling was unavoidable.
Gen Z has a very doomeristic view towards CS careers now due to social media influencers prematurely dismissing CS/software paths as a dead end or impossible to break into because of AI.
I know the job market can suck for graduates right now, but I do believe studying CS can still lead to decent paying careers. There's always going to be demand for people who understand code, who can break down complex problems and bring a problem solving mindset. LLMs don't solve everything.
The drop in CS students ironically may create a vacuum that allows us employed engineers to demand even higher compensation.
I feel like CS is just correcting back to what it was.
Even back when I was in college (graduated 2017), I noticed a clear bifurcation among the students. A lot of the students at that time did it because you could score a great job after college, but a smaller cohort just loved the game. And even back then, loads of students from the former group washed out, or graduated and then took other jobs.
It's no different today, except that the group that did it for money is washing out before they even get to college because they fear AI will take their jobs, while the latter group is still here and able to do more and more with AI.
It's a truly wild time to be alive in this industry. Half of us are seeing the doom and gloom of AI and the other half are seeing the "next age" happen right before our eyes.
And I'll be honest I kinda feel sad for the folks that take the negative view of AI right now. Cause I'm having more fun than I've ever had before in this industry.
Can you elaborate on your thesis as to why? It seems to me that, with raw code being less of a bottleneck, things like understanding the spec, polishing, and doing the fuzzy work around the edges become all the more important. These were never strengths of outsourcing. In fact, I think the importance of those parts is a big reason why the profession as a whole wasn't entirely outsourced, despite the compelling economic reasons for it.
Isn't it the other way around: AI replacing outsourcing? AI can do the implementation work, but you still need the human who specifies what has to be done, gives architecture guidance, and checks and accepts the resulting work (or rejects it, with notes on what to fix). AI coding is basically outsourcing to AI.
This is the paradox. But because AI makes outsourcing jobs easier, those workers need to compete, and so they will be able to do those specification and quality-control jobs as well.
I was rich even before I came into this field; my family owns a lot of agricultural land. I came to this field out of passion for it and was never really motivated by money.
Thing is, AI is taking outsourced jobs in India at a much faster rate than elsewhere.
The latest round of Oracle layoffs mostly hit workers in India.
It's not doomerism. I've seen this happen at companies.
I was talking to a guy who wanted uptime monitoring. He pitched the executive on buying UptimeRobot, but then another guy rolled out his own uptime monitor using AI in 60 minutes, deployed it along with centralized logging, and it costs the company only a $5 VPS.
And honestly, it works just as well. I've seen companies refuse to pay for external tools and build leaner versions using AI.
You can build a SaaS faster now, but the need for SaaS is in decline.
I've moved to deploying on bare metal from OVH and Hetzner. Why? Because devops is completely reduced to a few minutes' worth of work using agents.
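For what it's worth, the kind of monitor described above really can be small. Here is a minimal sketch in Python (hypothetical: the commenter's actual script, thresholds, and alerting channel aren't shown, so all names here are made up) that polls each URL and alerts after a few consecutive failures:

```python
# Hypothetical sketch of a tiny self-hosted uptime monitor; the commenter's
# actual script isn't shown, so the names and thresholds here are invented.
import time
import urllib.error
import urllib.request

FAILURE_THRESHOLD = 3  # consecutive failed checks before alerting


def check_url(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL answers with a 2xx/3xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, TimeoutError):
        # HTTPError (4xx/5xx) is a subclass of URLError, so it lands here too.
        return False


def should_alert(history: list) -> bool:
    """Alert only once FAILURE_THRESHOLD consecutive checks have failed."""
    if len(history) < FAILURE_THRESHOLD:
        return False
    return not any(history[-FAILURE_THRESHOLD:])


def monitor(urls, interval: float = 60.0) -> None:
    """Poll each URL forever, printing an alert on sustained downtime."""
    history = {u: [] for u in urls}
    while True:
        for u in urls:
            history[u].append(check_url(u))
            if should_alert(history[u]):
                print(f"ALERT: {u} looks down")  # swap in email/Slack/etc.
        time.sleep(interval)
```

A real deployment would still want persistence, a dashboard, and a proper alert channel, which is roughly the trade-off the rest of this thread argues about.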
You don't need AI for that. Just deploy Uptime Kuma or similar to a VPS and job done. I can do that in about 15 min vs. your 60 min.
Of course, this is not a production-grade deployment. To get there, I'd need to build images on pipelines, scan them, test them, publish artefacts, write up the IaC to manage the cloud resources, add monitoring around the solution, ...
Deploying a simple piece of software on a custom server was never difficult or slow to do.
Some things are better outsourced because other companies specialize in that problem set. Just like you won't roll your own cryptography, you won't and can't do everything in-house.
Isn't it too early to declare the vibe-coded uptime monitoring "good" from a business standpoint? NIH syndrome has always been a thing, and I don't see how the downsides of creating bespoke systems have changed in the LLM era. You're now stuck maintaining this "genius" solution, LLMs or not, and onboarding devs/users gets harder the more churn you have.
Okay, so AI probably wasn't critical to crafting 100 lines of what I'm assuming is Python or some other scripting language that pings a URL. Second, I don't think 100 lines is going to be as robust as some pre-packaged monitoring solution (open source or enterprise tier) which has a dashboard and auth story baked in, along with proper documentation.
If that script works for your use case then great, but I don't see how LLMs were a game changer here.
Big companies would have rolled their own anyway, but now 2-3 person companies are also not paying for tools like UptimeRobot. That's the development LLMs brought.
> the need for SaaS is in decline

For 5 minutes. The need for cheap SaaS that one person can build, with no uptime, security, legal, or ongoing maintenance requirements, is indeed declining.
Eh, that said, I think SaaS offerings had gotten overpriced. Couple that with vendors putting up stickers that say "We R havin S3curity" rather than actually having secure systems, plus constant price increases and lock-in tactics, and there was a need for some pushback in the market.
The same thing happened in dentistry, law, EE, some medical specialties, and veterinary medicine. A job path gets seen as a golden ticket to high pay for low-to-medium effort, people pile into the profession, and the market saturates. Then something happens (a recession, technological innovation, or a geopolitical/geo-economic shift) and demand changes dramatically. These things work in cycles; it happened before in the .com bust, and this will take 5-10 years to work its way out. The good news is that the new grads who manage to hang on will make a killing, as demand for mids and seniors will be insane given the lack of qualified candidates available.
Yes? Insofar as a career path. You go to a good school, get a good degree, and are almost guaranteed good pay with a known career progression. It's not like entrepreneurship, where you can't see the path ahead of you. Think video games: much easier to play when the goals are given to you than to make your own.
Our government failed its citizens and let outsourcing and wage suppression destroy the US tech industry, positioning the country far behind other countries in technology supremacy for at least a decade to come.
You don’t have a God-given right as an American to a mid-to-high six-figure income working in tech, and I’m not sure how you can watch Anthropic playing whack-a-mole trying to keep the Chinese from distilling their models and say that the US isn’t still on the inside track with this stuff.
Corporations don’t have a God-given right to manipulate the market by outsourcing the work to the detriment of the natives, yet they make good use of it!
And yet, CS was the fastest growing employment category with some of the highest and among the fastest growing salaries in the nation for the past 2 decades despite the fact that outsourcing has been around since the 90s.
The fact that this changed in the last 2 years right when AI became feasible is just a coincidence. It’s actually finally outsourcing destroying the grad job market nearly 3 decades later.
Nothing changed in the past 2 years except we had another recession. Just like every other recession, people are looking for something to blame. Outsourcing has long been a popular thing to blame; this time, blaming AI is the fad. There will be more recessions in the future, and people will blame some other fad next time (or perhaps the same one). In the end, though, it is the economy.
There was a golden age (2010-2020 or so?) when a CS degree was basically an easy ticket to the upper middle class. Unlike similar well-paying fields like law or medicine, a bachelor's degree was enough. Now it's not easy to get a job as a new grad, so there are fewer people getting into the discipline whose primary reason to study it was money.
CS started as a major for "computer nerds" who were very into computers. Computers were a hobby for virtually every one of my CS classmates in the 90's. Studying computer science was an extension of that interest, so majoring in CS made sense.
Fast forward a decade or two, and it's like you said: people who don't have a strong interest in computers started taking CS as a major as a path to jobs and income.
Now, as a manager of engineering teams, I'm constantly surprised by software engineers who don't even own their own computers and/or have very little knowledge of how they work.
Everyone I knew in CS thought the same way - a basic CS degree was enough, some were even audacious enough to drop out without a degree "because it's tech" and they thought they'd make it without one. Well, spoiler alert, that's not how things have been playing out. Companies are looking for reasons to fire some and give way to others, and the first ones to go have been people with the least credentials and education.
Anecdotally, over 10 years ago I recognized that, with the pace of technology and the advancement of machine learning, everybody would be getting that same bachelor's in CS, so I went for a quick PhD, which I believed was the ticket to success: a way to become much more sought after and valuable than my peers with a bachelor's.
I still believe that golden ticket to the upper middle class is there; it's simply hiding behind more credentials now. The high-paying jobs are all going to folks with credentials, or folks with S-tier projects from FAANG or MANGO or whatever else you want to call it.
CS was viewed as a golden ticket at least as far back as 2000. Even then, there were people doing the degree who had no interest in the subject, but thought it would get them a good salary.
There have always been ups and downs in the economy. This isn't the first recession where programmers were hit harder than other jobs. There have been other recessions where programmers were one of the few places where everyone was hiring.
You can substitute every single possible profession for programmers above and it still holds true. Even things like groceries (everyone needs to eat) have ups and downs: in some recessions people go to restaurants less, so grocers do well; in others, people switch to low-margin staples and profits go down.
When I attended back in the late 90s, there was a view that once Y2K was over and that crisis was dealt with, the industry would collapse and there wouldn’t be any jobs.
What’s happening now reminds me a lot of that.
This just reminds me of that old meme where a product manager says something like "Why am I paying you $200k/yr when I can just copy code from StackOverflow?" and the engineer responded "You pay me that because I make sure the code you copied is the right code". I have a feeling we're in a similar situation here for AI. Sure, anyone can create AI slop code, but to fully take advantage of it, you need someone that understands the whole chain of design and development to make it fully work and integrate with existing systems.
I think we're in for an era where many folks will be filtered out, and those who know and understand code will be in high demand.
I think AI emphasizes brain-dead computer science and script-kiddy culture. It lowers the bar just enough to make bad ideas easy to implement quickly, but good ideas still take longer to produce and argue for. Maybe it's a skill issue on my part, but I've watched my team use AI to rebuild a model I maintain, one for estimating performance changes based on trace following. The model isn't accurate and was built to bypass working on the real model. They've spent someone's full-time work for 4 months at this point, but the thing they wanted modeled took 1 day of just adding it to the real model.
The managers and everyone are so excited that the person did it with AI, but I just get really confused, because it seems like they made something worse that has less value, since it cannot actually correctly simulate the thing we want to test. Maybe I'm being petty and salty, but I think this is time wasted by any measure, and net-negative value, yet the team wants to emphasize that we are using AI. There have been some productive uses, but the productivity trap doors are about the same as with normal development; people just seem more willing to take the trap-door ideas now.
I was just at the largest career expo for high schoolers in the greater NYC area yesterday. It’s anecdotal, but the two most asked-about majors at the fair were #1 mechanical engineering and #2 computer science. I gave away all the materials we had, and I had left thinking “this will be plenty”.
So let’s just wait a bit before we say it hit a wall.
It is easy to forget sometimes in the excitement, but nobody has been using (2026) AI for 20 years. We're all still new. I am sure that in the next year, something will be found that is fairly exciting, and something we could all be doing right now, but it's simply that nobody has thought of it yet. Or something that is today common practice will become generally considered an anti-pattern and common practice will have some replacement for it that, again, nothing stops us from doing it today but nobody has thought of it yet, because we're all newbs.
(One candidate example for this is the discussion I've seen in the last few days about not trying to negate something, to say "Don't do X", but instead stay positive because eventually the negation gets lost in the context window and you're better off just not putting the idea in the LLM's mind at all, where "Don't do X" comes to be seen as an LLM antipattern.)
One of the consequences of none of us having used AI for long enough is that we don't know how to onboard developers in an age of AI. This will be, by necessity, transient. Eventually we're going to max out what a person can do and we'll need more people. The supply of existing engineers will be limited. We will be forced to discover how to onboard new engineers.
But at the moment we've got our hands full, and we don't know how to do it.
The irony is, the best time to join a field is often exactly when enrollment dips, and the worst can be precisely when it is most popular. Start a college programming program today and the odds that in 4 years we'll have onboarding figured out and have developed some sort of need for fresh developers are pretty decent.
But I don't know what to do about the fact that the standard CS curriculum was already of debatable relevance to me in the late 90s, and I don't know what relevance it will have in four years, except to guess that it is very likely to be even less. I do know that we are again affected by the fact that nobody has been doing this for 20 years, as I mentioned above. There is no body of "wisdom" for an AI-powered world to draw on to construct a new curriculum. Universities would be inclined to do the obvious thing and chase our current practices with AI, but those aren't going to be stable enough to build a curriculum on any time soon, and a real fundamentals-based curriculum may involve less AI than people might think.
I know one advantage I have over my younger peers at this point is just a knowledge of what terms to say to the AI to get it to do what I want, words like "event sourced" or "message bus" or "stored procedures", where simply knowing that the concept exists is the bottleneck. I could see a programming curriculum based on touring through a whole whackload of concepts with their pros and cons, or at least, where that is a much larger portion of it.
Ask me in 5 years, though, and I'd almost certainly suggest a completely different curriculum than I would now.
My local post-secondary doesn't teach AI at all. Not even a teaser course or anything.
Technically speaking, they are leftists who publicly oppose AI. They created a new Chief AI Officer position that has no support at all from the university; the officer had to go to politicians for support.
Whereas the college straight up opposes AI.
But what value do any of their degrees have anymore? Suspicious at best.
> they are leftists who publicly oppose AI. They created the new Chief of AI Officer who has no support at all from the university
The "leftist" administration created a position while at the same time speaking out against AI? Doesn't seem realistic.
> Whereas the college straight up opposes AI.
Opposes AI in what way? No courses on it? Does not allow students to utilize it? I have a hard time believing they do not offer a single course on any AI subject. Many colleges are offering it as a post-grad option, at least in Canada.
> But what value is any of their degrees anymore? Suspicious at best.
In general? I don't understand what you are getting at here.
Teaching AI is a rather large field. Are you talking about LLMs/transformers? Are you talking about working with LLMs, which is something that seems to change every 6 months?
> I do believe studying CS can still lead to decent paying careers.
Yes, for countries like India.
With AI, outsourcing becomes much more effective.
With an off-the-shelf tool like Uptime Kuma, you need to invest time learning its configuration and features you probably won't even need.
$5 VPS + your AI subscription?
It's 100 lines of code; how hard is it to maintain? It's absolutely better than paying for an enterprise license of UptimeRobot.
Uhh, dentists, doctors, and EEs are low-effort jobs?
Seriously. I don't see how CS is low effort either; maybe they mean physically low effort.
It started long before tech. It started with manufacturing. Everything has been replaced with financial manipulation and rent-seeking.
So you hire people who “don’t even own their own computers?” You might be the problem then.
> There was a golden age (2010-2020 or so?)
Also during the dot-com era. Pretty much every cycle led to more people getting into the field.
CS hiring has always been very cyclical, and there have been more flush times prior to 2010 where CS more or less guaranteed a good paying job.
People realised that academia is not set up to train people for jobs; it is set up to teach people.
These two roles are at odds with each other.
Care to elaborate? This sounds like a prelude to an argument to funnel more people into vocational schools/more funding for vocational schools.
(Not a criticism! I don't personally feel informed enough to have an opinion on this subject.)
We have an oversupply of CS majors. Lots of people came to the craft chasing the money.
It's probably a good thing that the hype is starting to die and we're seeing a market correction, hopefully back to a saner structure.
https://archive.ph/IbTFW
This just reminds me of that old meme where a product manager says something like "Why am I paying you $200k/yr when I can just copy code from StackOverflow?" and the engineer responded "You pay me that because I make sure the code you copied is the right code". I have a feeling we're in a similar situation here for AI. Sure, anyone can create AI slop code, but to fully take advantage of it, you need someone that understands the whole chain of design and development to make it fully work and integrate with existing systems.
I think we're in for an era where many folks will be filtered out, and those who know and understand code will be in high demand.
I think AI emphasizes brain-dead computer science and script-kiddy culture. It lowers the bar just enough to make bad ideas easy to implement quickly, but good ideas still take longer to produce and argue for. Maybe it's a skill issue on my part, but I've watched my team rebuild, with AI, a model I maintain for estimating performance changes based on trace following. The model isn't accurate and was built to bypass working on the real model. They've spent one person's full-time work for 4 months at this point, but the thing they wanted modeled took 1 day by just adding it to the real model.
The managers and everyone are so excited by the fact the person did it with AI, but I just get really confused because it seems like they made something worse, with less value, because it cannot actually correctly simulate the thing we want to test. Maybe I am being petty and salty, but I think this is time wasted by any measure, and net-negative value, but the team wants to emphasize we are using AI. There have been some productive uses, but the productivity trap doors are about the same as with normal development; people just seem more willing to take the trap-door ideas now.
I was just at the largest career expo for high schoolers in the greater NYC area yesterday. It's anecdotal, but the two most asked-about majors at the fair were #1 mechanical engineering and #2 computer science. I gave away all the materials we had, after arriving thinking "this will be plenty".
So let’s just wait a bit before we say it hit a wall.
It is easy to forget sometimes in the excitement, but nobody has been using (2026) AI for 20 years. We're all still new. I am sure that in the next year, something will be found that is fairly exciting, and something we could all be doing right now, but it's simply that nobody has thought of it yet. Or something that is today common practice will become generally considered an anti-pattern and common practice will have some replacement for it that, again, nothing stops us from doing it today but nobody has thought of it yet, because we're all newbs.
(One candidate example for this is the discussion I've seen in the last few days about not trying to negate something, to say "Don't do X", but instead stay positive because eventually the negation gets lost in the context window and you're better off just not putting the idea in the LLM's mind at all, where "Don't do X" comes to be seen as an LLM antipattern.)
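To make that pattern concrete, here's a toy illustration (the prompts and style rules are invented for the example, not from any particular model's docs): the same instruction phrased as a negation versus as a positive directive. The claim above is that the positive framing degrades more gracefully, because there's no forbidden idea left in the prompt for the model to latch onto if the negation word gets lost.

```python
# Two ways to phrase the same coding-style instructions for an LLM.
# The negative version plants the discouraged patterns in the context;
# the positive version states only what you DO want.

negative_prompt = (
    "Don't use global variables. Don't catch broad exceptions. "
    "Don't write functions longer than 50 lines."
)

positive_prompt = (
    "Keep all state in function parameters and return values. "
    "Catch only the specific exceptions you can handle. "
    "Keep each function under 50 lines."
)

# The positive framing never mentions the discouraged patterns at all,
# so there is nothing to act on if the word "don't" is lost in context.
assert "don't" not in positive_prompt.lower()
assert "global" not in positive_prompt.lower()
```

Obviously this is just phrasing, not code that does anything; the point is the shape of the instruction, not the specific rules.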
One of the consequences of none of us having used AI for long enough is that we don't know how to onboard developers in an age of AI. This will be, by necessity, transient. Eventually we're going to max out what a person can do and we'll need more people. The supply of existing engineers will be limited. We will be forced to discover how to onboard new engineers.
But at the moment we've got our hands full, and we don't know how to do it.
The irony is, the best time to join a field is often exactly when enrollment dips, and the worst can be precisely when it is the most popular. Start a college programming program today, and the odds that in 4 years we'll have onboarding figured out and will have developed some real need for fresh developers are pretty decent.
But I don't know what to do about the fact that the standard CS curriculum was already of debatable relevance to me in the late 90s, and I don't know of what relevance it will be in four years, except to guess that it's very likely to be even less. I do know that we are again affected by the fact nobody has been doing this for 20 years, like I mentioned above. There is no body of "wisdom" for an AI-powered world to draw on to construct a new curriculum. Universities would be inclined to do the obvious thing and try to chase our current practices with AI, but those aren't going to be stable enough to build a curriculum on any time soon, and a real fundamentals-based curriculum may involve less AI than people may think.
I know one advantage I have over my younger peers at this point is just a knowledge of what terms to say to the AI to get it to do what I want, words like "event sourced" or "message bus" or "stored procedures", where simply knowing that the concept exists is the bottleneck. I could see a programming curriculum based on touring through a whole whackload of concepts with their pros and cons, or at least, where that is a much larger portion of it.
Ask me in 5 years though and I'd almost certainly suggest a completely different curriculum than I would now, though.
My local post-secondary doesn't teach AI at all. Not even a teaser course or anything.
Technically speaking, they are leftists who publicly oppose AI. They created a new Chief AI Officer position that has no support at all from the university; the officer had to go to politicians for support.
Whereas the college straight up opposes AI.
But what value is any of their degrees anymore? Suspicious at best.
> they are leftists who publicly oppose AI. They created a new Chief AI Officer position that has no support at all from the university
The "leftist" administration created a position while at the same time speaking out against AI? Doesn't seem realistic.
> Whereas the college straight up opposes AI.
Opposes AI in what way? No courses on it? Does not allow students to utilize it? I have a hard time believing they do not offer a single course on any AI subject. Many colleges are offering it as a post-grad option, at least in Canada.
> But what value is any of their degrees anymore? Suspicious at best.
In general? I don't understand what you are getting at here.
So they don't have a machine learning class?
Teaching AI is tricky; it's a rather large field. Are you talking about LLMs/transformers? Are you talking about working with LLMs, which is something that seems to change every 6 months?
[dead]
[flagged]
We can count on the racists to come out of the woodworks to comment on this.
[flagged]
Every racist claims to be justified in their prejudices against and generalizations of large groups of people.