Still not clear to me what "AI" is supposed to mean now. My sense is that it's a marketing term for "LLM". Is that accurate? Do people now consider any ML project to be AI?
You should read the post. You might find the “domain” discussion interesting.
That's what I was alluding to. I don't think it defines AI, do you? Those pieces look like classical ML plus an LLM to me. Is that AI? From a technical standpoint, is it clearly defined?
I thought this was going to be satire. Software engineer job titles are already pretty bogus (Senior Principal Distinguished Engineer, anyone?), and the AI trend has only created more jobs with nebulous descriptions around "doing AI".
I wanna just be a webmaster again.
Yeah, same. I was actually disappointed when I saw that they were taking the titles seriously.
Should’ve included “Member of Technical Staff”
Seems about right. My official title at work is "AI Engineer". What does that mean exactly?
- I'm not a researcher, and I'm not fine-tuning or deploying models on GPUs.
- I have a math/traditional ML background, but my explanation of how transformers, tokenizers, etc. work would be hand-wavy at best.
- I'm a "regular engineer" in the sense that I'm following many of the standard SWE/SDLC practices in my org.
- I'm exclusively focused on building AI features for our product; I wear a PM hat too.
- I'm pretty tuned in to the latest model releases and capabilities of frontier models, and consider being able to articulate that information part of my job.
- I also use AI heavily to produce code, which, helpfully, is a pretty good way to get a sense of model capabilities.
Do I deserve a special job title...maybe? I think there's definitely an argument that "AI Engineering" really isn't a special thing, and considering how much of my day to day is pure integration work with the actual product, I can see that. OTOH, part of my job and my value at work is very product based. I pay a lot of attention to what other people in the industry are doing, new model releases, and how others are building things, since it's such a new area and there's no "standard playbook" yet for many things.
I actually quite enjoy it since there's a ton of opportunity to be creative. When AI first started becoming big I thought about going the other direction - leveraging my math/ML background to get deeper into GPUs and MLOps/research-lite kind of work. Instead I went in a more producty direction, which I don't regret yet.
We all know the AI part is largely meaningless because of the hype and nonsense, but what defines you as an engineer? When you consider that classical engineers are responsible for the correctness of their work, combining it with AI seems like a joke.
> "When you consider that classical engineers are responsible for the correctness of their work"
Woah hang on, I think this betrays a severe misunderstanding of what engineers do.
FWIW I was trained as a classical engineer (mechanical), but pretty much just write code these days. But I did have a past life as a not-SWE.
Most classical engineering fields deal with probabilistic system components all of the time. In fact I'd go as far as to say that inability to deal with probabilistic components is disqualifying from many engineering endeavors.
Process engineers, for example, have to account for human error rates. On a given production line with humans in the loop, the operators will sometimes screw up. Designing systems to detect these errors (which are highly probabilistic!), mitigate them, and reduce their occurrence rate is a huge part of the job.
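To put toy numbers on that (invented for illustration, and assuming the checks are independent):

    # Toy error budget, invented numbers: operators err on 2% of units,
    # and an inspection step catches 90% of those errors.
    p_error, p_detect = 0.02, 0.90

    residual = p_error * (1 - p_detect)            # slips past one check
    with_second_check = residual * (1 - p_detect)  # independent second check

    print(f"one check:  {residual:.2%}")           # 0.20% defective
    print(f"two checks: {with_second_check:.3%}")  # 0.020% defective

You never get the rate to zero; you engineer it down to an acceptable level.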
Likewise even for regular mechanical engineers, there are probabilistic variances in manufacturing tolerances. Your specs are always given with confidence intervals (this metal sheet is 1mm thick ± 0.05mm) because of this. All of the designs you work on specifically account for this (hence safety margins!). The ways in which these probabilities combine and interact is a serious field of study.
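For a concrete taste of how those tolerances combine, here's a minimal sketch comparing a worst-case stack to the textbook root-sum-square stack (it assumes independent, roughly normal variation, which is the usual simplification; the part dimensions are made up):

    import math

    # Hypothetical stack of three parts, each spec'd nominal ± tolerance (mm).
    parts = [(1.0, 0.05), (2.5, 0.10), (0.8, 0.03)]

    nominal = sum(n for n, _ in parts)

    # Worst case: every part at its extreme at once (rare in practice).
    worst_case = sum(t for _, t in parts)

    # Root-sum-square: independent variations partially cancel, so the
    # combined spread grows like a square root, not linearly.
    rss = math.sqrt(sum(t * t for _, t in parts))

    print(f"{nominal:.2f} ± {worst_case:.2f} mm (worst case)")
    print(f"{nominal:.2f} ± {rss:.2f} mm (RSS)")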
Software engineering is unlike traditional engineering disciplines in that, for most of its lifetime, it's had the luxury of purely deterministic expectations. That luxury doesn't exist in nearly any other type of engineering.
If anything the advent of ML has introduced this element to software, and the ability to actually work with probabilistic outcomes is what separates those who are serious about this stuff vs. demoware hot air blowers.
You're right descriptively, but I also think the parent comment's point is about correctness, not determinism.
In other engineering fields correctness-related-guarantees can often be phrased in probabilistic ways, e.g. "This bridge will withstand a 10-year flood event but not a 100-year flood event", but underneath those guarantees are hard deterministic load estimates with appropriate error margins.
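The arithmetic behind that kind of guarantee is simple, by the way: a "T-year" event has a 1/T chance of occurring in any given year, so over an n-year design life the chance of seeing at least one is 1 - (1 - 1/T)^n (assuming independent years, the standard simplification):

    # Chance of at least one T-year flood during an n-year design life,
    # assuming independent years.
    def flood_risk(T: int, n: int) -> float:
        return 1 - (1 - 1 / T) ** n

    print(f"{flood_risk(100, 50):.0%}")  # ~39%: a "100-year" bridge over 50 years
    print(f"{flood_risk(10, 50):.0%}")   # ~99%: a 10-year event is near-certain

The probabilistic phrasing sits on top of hard numbers, not in place of them.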
And I think that's where the core disagreement between you and the parent comment lies. I think they're trying to say that people pushing AI-generated code are often fuzzy about speccing out the behavior guarantees of their own software. In some ways the software industry has _always_ been bad at this: despite working with deterministic math, surprise software bugs are plentiful. But vibe-coding takes this to another level.
(This is my best-case charitable understanding of what they're saying, but also happens to be where I stand)
Nicely said, I'm going to borrow some language here. I've talked a little to my coworkers about how it's possible the future of SWE looks more like "build a complex system with AI and test it to death to make sure it fits inside the performance envelope you require".
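As a rough sketch of what that could look like (call_model and the thresholds here are placeholders, not any real API): treat the AI component as a black box, sample it repeatedly against a labeled suite, and gate on an empirical pass rate rather than a single green run.

    import random

    # Stand-in for the nondeterministic component under test; a real suite
    # would call the model/feature and check its output, not flip a coin.
    def call_model(case: str) -> bool:
        return random.random() < 0.97  # pretend pass/fail outcome

    def within_envelope(cases, trials=50, required_pass_rate=0.95):
        # One green run proves little for a probabilistic system,
        # so run every case many times and measure a rate.
        results = [call_model(c) for c in cases for _ in range(trials)]
        return sum(results) / len(results) >= required_pass_rate

    print(within_envelope(["refund flow", "summarize ticket", "tag intent"]))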
I will be thinking about this comment for a bit. Thanks for this perspective!
This comment is excellent.
Hard to tell what you're even trying to say here. I am obviously responsible for the correctness of my work. "AI Engineer" does not generally mean "AI-Assisted Engineer"; I thought that was clear from my post.
Who cares? The word "engineer" is meaningless now and anyone can be a self-proclaimed engineer whenever they feel like it. Will anyone double check or even reject you for an engineering job when you are not? Absolutely not! Take a bootcamp, submit plenty of PRs correcting typos, and pass the interview with the help of AI and you basically made it, dreams come true!!
Yeah the term "engineer" has been diluted into oblivion, and we only have ourselves to blame for not protecting it.
In Germany you're not even an engineer if you don't sometimes wear a hard hat or hold a screwdriver.
Agree 100%, even blue collar workers guard their profession. Hell, a friend of mine got rejected for a retail job because she had never worked in retail before. Engineering, on the other hand, has zero gatekeeping - it's a sign-spinner job right now. Just do a few humiliation rituals like daily standup and you're the perfect candidate!
> I pay a lot of attention to what other people in the industry are doing, new model releases, and how others are building things,
What do you think of the recent MIT report that 95% of gen AI projects don't do anything valuable at all?
> What do you think of the recent MIT report that 95% of gen AI projects don't do anything valuable at all?
Worth noting that a project that ends up “doing nothing” isn’t the same as a project that had/created no value.
That holds even for some projects that were, in hindsight, deterministic lemons.
Assuming compute resources continue scaling up and architectures keep improving, AI-driven change now has an everything-everywhere-all-the-time scope. Failing fast is necessarily going to be a substantial part of that.
Sounds kind of aggressive, but the number is probably up there.
"Forward Deployed Engineer" is a bodyshop with LLM.
Pretty sure this title came from Palantir, who got it from the military.
They could just call it "Field Service Tech" like the rest of the universe. I understand using title inflation/deflation to keep pushing the engineer title (and pay expectation) into the dirt, but still, this is dumb.
I also dislike the term. It feels concocted to evoke “tacticool” vibes.
Unless you’re pushing new firmware onto a drone in Ukraine, FDE is stolen valor.