https://github.com/Picocrypt/Picocrypt/issues/134#issuecomme...
This to me is the crux of the whole thing.
Almost like a knitter throwing away their needles because they saw a loom.
Considering the author is explicitly going into AI research, has an AI-generated profile picture, and claims front-and-centre on their website they are excited about LLMs, I don’t think that analogy works. Or rather, it is like a knitter throwing away their needles to eagerly go work in the loom manufacturing industry.
I don't think many people would be excited at the thought of going from handcrafted artisan knitting to babying machines in the knitting factory. You need a certain type of autism to be into the latter.
So basically, he’s leaving software development because the job market is bad. Instead, he’s joining AI research which (currently) has a more healthy job market. That seems pretty reasonable to me, given that even widely used open source projects are only barely financially viable. Many open source projects end when the author finally gets a girlfriend, this one ends for a new job. Seems like a good outcome to me. Plus truly fascinating presentation.
I like the creativity behind this. And I feel sorry for them that the current wave of AI has led to them abandoning their pet project. Maybe they will pick up the project again once the dust has settled. In the end, at least for me, they are pet projects for exactly that reason: an escape route from the day-to-day business. One should still be proud of what they achieved in their spare time. I don't care if my job requires me to use K8s, Claude or Docker. Or if that's considered "industry standard".
My projects, my rules.
I don't get it.
(I'm not trying to throw shade at the author. I know they have no obligation to maintain an open source project. I'm simply having a hard time grasping what's happening.)
Seems like the author is abandoning software because, in his opinion, due to the AI explosion employers no longer care about code quality, only quantity.
I don't get it either, because that has always been the case; thus most of his post is borderline nonsense.
Imo, what happened is that he took the opportunity of entering academia to vent some stuff and quit maintaining his project.
I was thinking exactly the same: I also don't get it (even though I totally get that someone may lose motivation to work on a project, and certainly has no obligation to continue; this justification just sounds a bit weird to me).
Could this mean that he has been approached by some "men in black" asking him to insert a backdoor in the code or to stop working on it, together with a gag order? (Actually, I also wondered the same thing a long time ago with TrueCrypt, even though to my knowledge no backdoor has ever been found in further audits...)
Nice post, really. I like the meta conversation.
But in my opinion, it's a bit hypocritical to blame or be mad at LLMs etc. for ruining the fun of coding, and then use AI-generated profile pictures.
Why not draw your Ghibli-styled profile picture yourself? Why use an AI-generated image? Doesn't using image generators ruin the fun of drawing? Vibe-drawing?
> He criticizes "Large tech companies and investors" for prioritizing "vibe coding," but not a specific company or individual.
You could rewrite this generated response to match the artists' point of view as well.
lol
I second this, plus their responses are petty.
As an example:
> you're not helping anything or making some resonant statement with that thing above or your avatar
> Sorry for breathing and producing CO2.
That's not what the commenter argued, and that response is incredibly petty. It's a way to defuse the argument entirely (is that a straw-man? no idea).
Wow.
That was strangely...something. Simultaneously not what I expected and yet just nailing the vibe of vibe slop frustration.
It's a nice message and I sympathize with the frustration, but the critique falls flat with the author's decision to pivot into AI research.
The difference I perceive is the split between being the one designing the software, which is what he likes to do, and letting an LLM design the software, without the developer actually understanding what's going on, which is what he dislikes.
(This short comment exchange between us is also a meta commentary on that. Our comments are much shorter than all the AI summaries, and at the same time we add nuance and clarification to the ideas presented, something the AI summaries don't do.)
> Advancements in intelligent AI and LLMs get me excited.
Large and centered on the author's website: https://evansu.com/
Also the AI ghibli pfp...
What's wrong with it?
It was clearly AI generated. So the author is clearly OK with using AI to generate slop in an area they don’t work in, while simultaneously decrying its use in an area they do work in. If they believe so strongly that AI use is destroying their industry, they should reflect on its effect on other industries too (it is well-documented how artists are being negatively impacted).
I agree with the commenters above that it makes the critique fall flat. The author is saying “This thing is so frustrating and harmful it makes me want to stop working in a field because of it. Oh, by the way, I use this tool myself for other things, and will indeed pivot to contribute directly to them”.
yep, it's a very old internet response, something you rarely see now. creative and interesting, in a meta sense.
Seems like a nice project. I'm still a little bit concerned with the VirusTotal result: https://www.virustotal.com/gui/file/81bbdffb92181a11692ec665...
I'm concerned about your comment.
I envision a future where people can't deduce anything by themselves and will rely on automated, flawed systems.
The problem is not knowing why and how the systems are flawed, and therefore being unable to make nuanced and truly accurate decisions.
That is to be expected for a project that contains encryption code, I'd say. Maybe their userbase isn't big enough to report all the false positives and gain the reputation needed.
When I checked a few years back, even a ”hello world” Go application compiled for Windows was flagged as malware by a malware scanner that I investigated.
I’m not at a computer where I could test that hypothesis right now, but back then my conclusion after testing various executables was that any unsigned Go binary would be flagged as malware.
Just as a rule of thumb, doesn't the fact that only a few unknown vendors flag it—and all of the major vendors do not—indicate something? It would suggest a false positive, wouldn't it?
I don't think this moment will age well. Is this an attempt to create a personal brand story?
LLMs are glorified "LMGTFY" tools. AI assistance doesn't make people experts at anything. Some Gen Z vibe coder isn't getting your job, guys; calm the heck down.
I think people who are afraid that AI coding is going to replace them should try using it a bit seriously. They should be quickly reassured.
What worries me more (on the coding-related impact of AI - because of course the impact on fake news and democracies is much more worrying IMO) is having to deal with code written by others using AI (yes, people can write shitty code on their own, but at a manageable pace).
Tell that to C-level executives. They don't understand this, and until they do, we developers can only be afraid of losing our jobs to a mediocre AI.
I'm not worried that LLMs, as they are now and will foreseeably (i.e. within 18 months) remain, will be a good substitute for high-quality developers like me.
But oh boy have I seen a lot of mediocre coders get away with mediocrity for a long time — there's a big risk that employers won't care about the quality, just the price, for long enough that the developments in AI are no longer foreseeable.
Your last sentence should have been: CEOs calm the heck down.
I’m concerned that AI slop will affect open source projects by tilting the already unfavorable maintainer-contributor balance even more towards low-quality contributions.
Daniel Stenberg (from the curl project) has blogged a bunch about AI slop seeping into vulnerability reports[1], and if the same happens to code contributions, documentation and so forth, that can turn a fun hobby project into something you dread maintaining.
[1] https://daniel.haxx.se/blog/2025/07/14/death-by-a-thousand-s...
What does that stand for?
https://googlethatforyou.com/what-is-lmgtfy-meaning.html
LMGTFY is "let me google that for you".
bing it
doesn't change anyone's mind when it comes to layoffs
> Some genz vibe coder isn't getting your job guys calm the heck down.
Then why isn't the software job market recovering?
Because a lot of devs were getting a free ride off the back of ten years of ZIRP money, and firing people is a surefire way to pump your share options.
The fear is like telling on yourself.
The fear may just be a result of thinking about who is making the decisions. I know I'm good, my peers know I'm good. But how far up the management chain does that knowledge go?
The new-era Socratic dialogue. With a machine.
I am thinking of that quotation that said [paraphrasing] "90% of my skills went to $0, but the other 10% are now worth 1000x"
This LLM-fuelled rant/departure is a thought-provoking expression of frustration from someone who focused on the 90%, not the 10% -- namely someone willing to handcraft software like an artisan.
I think we're in the mass-production era for code and nobody wants a hand-crafted anything anymore. Automated and mass-produced please. "Quantity has a quality all its own"
I think if you believe that 90% of skills went to $0 but the other 10% are worth 1000x, that makes sense.
But even if that's true, the 1000x is going to go to far fewer humans. Maybe you're in the lucky % saved, but a lot of people won't be.
It's interesting to consider. I don't have any takes one way or the other, I'm just observing. I have no idea how all of this works out.
what exactly is the 90% that he focused on, and the 10% he didn't?
Is the 90% just hand crafting? I don't believe there can be no place for hand crafting things at all, because there is not any other business or human endeavor in history which has been 100% mass produced without any place for artisans in the market.
Even Automobile production has its place in the market for craftsmanship.
(income × 0 × 0.9) + (income × 1000 × 0.1) = income × 100
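A quick sanity check of that arithmetic in Python (the 90/10 split and the 1000x multiplier are the hypothetical numbers from the quote above, not data):

```python
# Back-of-the-envelope check of the 90%-to-$0 / 10%-to-1000x claim.
income = 1.0  # normalize current income to 1

devalued = income * 0.9 * 0       # 90% of skills now worth nothing
amplified = income * 0.1 * 1000   # remaining 10% now worth 1000x

print(devalued + amplified)  # prints 100.0, i.e. 100x the original income
```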
Really looking forwards to the 10000% pay raise
>I think we're in the mass-production era for code and nobody wants a hand-crafted anything anymore.
Not all people like IKEA furniture.
>Quantity has a quality all its own
Do you understand the meaning of this Stalin-attributed quote?
I wouldn’t want that kind of quality in planes, cars, surgical robots, power plants etc.
I think it’s a good time to introduce fines for severe damages by software bugs.
> What the fuck bro
QFT
I think this is just a bit doomerish honestly.
Yes, the AI hype is real, and yes there's a desire to cut costs by using AI within companies. However, I think the maintainer (Evan Su) has a bit of a narrow view on this matter. Evan is still a student in university.
This doesn't mean his perspective or opinion should be disregarded; it's more that I think he's declaring quite a career-defining absolute for himself before really having a solid foot in the industry. Frankly, this rant seems fueled more by intense doom-scrolling on LinkedIn than by first-hand experience.
To be fair, the job market is terrible. Not nearly as attributable to AI as people think, I suspect - I'd pin it on interest rates. Still, I'd consider other things if I had anywhere to go and he's not made a bad choice.