Do I belong in tech anymore?

(ky.fyi)

94 points | by patrikcsak 10 hours ago

68 comments

  • piloto_ciego 3 minutes ago

    Hey OP, I quit my job and said "screw it" at the start of the year for very similar reasons.

    I had a "good" job. It was extremely stable and in the public sector, and the work hypothetically mattered... I was miserable because in practice it didn't. If I had died in my study, the system would have happily churned on, accomplishing nothing without me. There were so many obstacles to accomplishing anything, too. I'm all for "perfect shouldn't be the enemy of good" - but hypothetically we should do something. I went on vacation in November, and when I got back, the latest ServiceNow update had nuked a bunch of the changes I had spent months trying to get done.

    I quit at the start of the year and honestly, it's been great? Not fast, not suddenly lucrative, but I've been taking it slow. I'm literally building little vibe-engineered tools for local companies. I can now do what would have taken me a team to do by myself, it is paying (albeit slowly), it's fun, and I have time to do the things I care about in this life.

    Don't work for the man. Your job cannot love you back, in fact, it actively hates you.

  • dgb23 an hour ago

    > The point of a code review is not simply for good code to make it into a codebase, but to build institutional knowledge as people debate and iterate and compromise, slow as it may be.

    I feel like this is a very profound insight.

    Of course processes like this can become about only the immediate utility. Reviewing then becomes checking work so it can be merged and used.

    But the process is more about us than the code. And we lose the deeper part when we only care about the superficial one.

    • nunez 10 minutes ago

      It's an extension of the principle that you write code for your successor.

  • erentz an hour ago

    The way AI is being used feels like it is proving that, in many orgs, what has always mattered has been the appearance of work, not results of work. Will we wake up in a few years and find out we’ve fired all the doers and are now overloaded with the fakers?

    • dewey an hour ago

      I find that to be a very defeatist take. It always mattered how much value you provide to the business. Writing pretty code or arguing about some implementation detail never really mattered. If you are good at coming up with solutions to problems AI is just one additional tool in your toolbox and personally it allows me to do much more than before.

      There were fakers before, and there will be fakers after.

      • Paul-Craft 35 minutes ago

        Are you willing to wake up at 3 AM when that "valuable" AI-written code pages on-call?

        I agree there is some value in AI tools, but implementation details do matter. People shouldn't be pushing unread code to prod. That's how you end up with security holes and other bugs. That's how you end up dropping millions of orders on Amazon.com.

        • dd8601fn 20 minutes ago

          I think the last ten-plus years have taught us that massive security breaches are more of an insurance-claim problem, plus some $4/mo credit-monitoring payouts.

          And major corporations certainly don’t seem to care that much about leaving massive amounts of money on the table from junior-level tech issues. I see it all the time; I mentioned a few examples from Walmart, Meta, and Amazon recently.

          Everyone talks like these things matter, but the results say everyone is just playing pretend.

          • Paul-Craft 17 minutes ago

            Excuse me? Amazon lost more money in one day than most companies have in revenue, from dropped orders. I would say that matters. Believe it or not, the systems we work on do things that matter in the real world.

            • dd8601fn 10 minutes ago

              I would too. I’m saying businesses don’t seem to. At least not like we assume.

        • dewey 11 minutes ago

          People pushed unread and buggy code to production long before AI.

    • cineticdaffodil 38 minutes ago

      Actually I think we will see a faker takeover and then a doer reconquest. All those leaving now take the recipe with them and are capable of cooking it elsewhere - elsewhere being a place without AI management.

  • nunez 4 minutes ago

    Good article.

    I want to zoom in on the rise of AI notetakers. AI that generates transcripts alongside recorded video that you can watch later? Amazing. I can catch up later and find people async if I need more info; the videos are discoverable/shareable, and anyone who needs to be in the know can be. AI notetakers that give you a summary and nothing else? Useless. These generate vague overviews and tend to miss small but key details.

    I'd rather (and often do) take notes manually than turn on the notetaker.

  • AbbeFaria an hour ago

    I work at MSFT and I feel burnt out too and am in a similar situation where I feel like resigning would be better for my mental health but AI isn’t a big contributing factor. I do have some arguments against speculative uses of AI though.

    Experimenting with speculative uses is fine; technological breakthroughs require a lot of iterations, and some would naturally never make it. But with the enormous amounts of capex that companies are investing, these have to impact the top line and eventually the bottom line as well. I just don’t see that happening now; I could be wrong.

    1. To me, speculative uses of AI like meeting-note summarisers seem to add little value, if any. First off, most meetings are performative work, especially at big companies. Add to this, when someone casually pastes the meeting notes from an AI summary and asks the meeting organiser to “pls check for correctness”, my blood just boils. Are we spending billions of dollars of capex for this?

    2. Every team builds their own “agent” for diagnosing incidents which is announced to huge fanfare but people rarely end up using it irl.

    3. Devs and PMs chasing “volume” of work. You prompt GPT for an issue and it is bound to give you pages of text that you can use to show how much of output you can churn. I have seen excessively verbose design docs that only the writer (and prompter) could understand and all this was accepted because “Hey, I used AI for this and it must be good”.

    There are legit uses of AI, and I do have a $20 Claude subscription which I like and use. But at big companies they are shoving AI into every nook and cranny hoping it shows up in the top line and bottom line, and so far it doesn’t add up.

    A lot of these uses are driven by fear, by repeated exhortations from upper management to shove AI into every nook and cranny when they are just as clueless as we are. People’s mortgages, their children’s education, and their retirement (in short, their whole livelihoods) are at stake, even more so when companies will happily lay off workers without a second thought. So people have to use AI even when it adds questionable value, if any.

    I am not resistant to change and am not an AI Luddite. I am happy to use AI to become a better developer but most current use cases seem to add questionable value.

    • jimmydoe 35 minutes ago

      CEO see performative work happening as the cut is still not deep enough.

      • Paul-Craft 15 minutes ago

        Can you add in the missing words that make this comment make sense, please?

  • zby 23 minutes ago

    This report lists failures of some AI systems. They look consequential - but the company does not seem to care. This is very strange - how can it be? I really like AI products; they help me all the time. But I know I need to take their failure modes into account and be careful. Lots of organisations don't seem to do that calculation. Will competition root them out? I don't know. I am enthusiastic about AI, but ever since the LangChain situation I can see that what gets adopted is often something with a lot of flaws. The more careful developers who notice the flaws and try to find true workarounds fail, because it takes time to do the design well. It is not a new thing - there were Betamax mourners for decades - but it seems the hype machine is now more and more powerful.

  • arcfour 26 minutes ago

    While I certainly relate to some of your points, and I'm not an AI maximalist by any means, a few thoughts:

    > You join a meeting with a coworker. Your coworker has enabled an AI tool to automatically take notes and summarize the meeting. They do not ask for consent to turn it on. The tool mischaracterizes what you discuss.

    Asking for consent to what is more or less meeting transcription (already enabled, presumably) seems a little odd. If you don't like it, why not just talk to the coworker and ask them not to use it? Offer to take notes yourself, perhaps.

    > A team lead adds an AI chatbot to a Slack channel. Anyone can tag the bot to answer questions about the company’s products. Coworkers tag the chatbot many times a day. You never see someone check that the bot’s responses are correct.

    Why would that happen in the Slack channel? Presumably you'd be googling it or reading documentation to do this, not posting in the channel.

    > An engineer adds 12,000 lines of code affecting your app’s authentication. They ask that it be reviewed and merged same-day. Another engineer enlists a “swarm” of AI agents to review the code. The code merges with no one having read the full set of changes.

    This is an insanely reckless thing to do with or without AI. If this actually happened at your company...I think there were deeper issues than overuse of AI.

    > One of your pull requests has been open for a few days. You ask other engineers to leave a code review. Minutes later, an engineer pastes a review that was generated by an AI tool. There are no additional thoughts of their own.

        Again, I think you should communicate with your coworkers on this. Possibly even bring it up in 1-on-1s with your manager. Not "I want to discourage use of AI," but "copying and pasting AI responses shows a lack of respect for others' time" and "a lack of due diligence." Show them a horror story of an AI deleting someone's PROD database, etc. It's a useful but imperfect tool, not a replacement for thought.

  • coinfused 10 hours ago

    I think a lot of people relate with this but kind of sit with this silently for reasons the author mentioned:

    “Would initiating these discussions result in interpersonal stress? Should I just let things slide? Would I become known as a ‘difficult’ coworker for pushing back on AI use? Does any of it really matter? Does anyone really care?”

  • ej88 37 minutes ago

    "The psychic toll of AI" -- It's sad, but each of these scenarios (barring the AI notetaker, which I haven't found to be an issue personally, but ymmv) is indicative more of the culture of the company than of the tool itself. From my experience, the most frontier companies seem to have the best AI-use culture.

    I work at a very 'AI-pilled' company, but:

    - Everyone reads and reviews every PR and leaves human comments

    - Documentation is written well and tended to by humans

    - There's no 'AI mandate'

    - Whether features are possible is first explored by an agent, but then manually traced by a human through the codebase

    You can treat AI like a very powerful tool to augment you and run your agent swarms at the same time.

    • maplethorpe 10 minutes ago

      Are there any companies that aren't AI-pilled at this point?

  • LVB an hour ago

    Can definitely relate. It's no more complicated than this: I really enjoy designing and writing code by hand, and get very little joy out of agentic processes. I use the tools and see the velocity increase, but the work has just become… bland. I completely get others’ excitement around the tools and the newfound “super powers”, but it hasn’t much resonated with me.

    That’s ok! I was fascinated by coding when many others weren’t and found a great career as a result. A different cohort will love Development 2.0.

  • somesortofthing an hour ago

    Obviously the author's experience is a nightmare but what was this place like pre-AI? I have a hard time believing people who are this willing to hand over all of their thinking to LLMs were doing anything productive beforehand.

    • dgb23 44 minutes ago

      I think you must be right to _some_ degree. The article illustrates that this org doesn’t know why they are doing certain things.

      But there's something psychologically powerful happening in the interaction with AI. I think we overestimate our ability to be rational and underestimate how easily influenced we are.

  • rkagerer 31 minutes ago

    I feel like all this hype around generated code overlooks a distinct opportunity for enterprises which focus on excellent, clean, maintainable, curated code - baked by humans, for other humans.

    • rkagerer 18 minutes ago

      We also haven't really seen how generated jank will stand up over time (like, decades) in terms of maintainability. My prediction is you'll encounter a lot more disposable software. That's fine for making general code more of a commodity (cheap and accessible), but where you get commodities you eventually find demand for more premium flavors of product. Those tend to derive from taste and opinion (attributes which, for example, were major success factors of the iPhone at its peak design).

      The act of software development formalizes paradigms, surfaces unknowns, and forces their resolution. Traditionally the work product gets better over time as you iterate. My own coarse rule of thumb is that on average it takes until version 3 or so - i.e. 3 rewrites - to land at the kind of high-caliber product that stems from really understanding the problem space: having worked in it extensively enough to build a good mental model, uncovered the edge cases, and hammered out an optimal solution.

      While AI is famous for fast iteration, I expect that in cases where the designers wielding the tool lack a deep understanding of what's going on, potentially exacerbated by never having to work directly with the codebase, it may actually turn out to impede their ability to reach that plateau. Not saying this will be true for all use cases, just that the tool makes it seductively easy to fall into that trap.

    • Paul-Craft 28 minutes ago

      What would that look like? In my experience, real production codebases tend to have lots of bugs. Most of them never get prioritized, because features matter more than fixing obscure bugs.

  • Paul-Craft 2 hours ago

    I'm asking myself the same question for a different reason: nobody will even interview me. I've been out of work for a while. Savings are running out. I apparently don't even know how to look for a job anymore.

    • nso an hour ago

      Yeah. Got word I was being laid off in November. Officially because of restructuring, but after having had some conversations it's clear I've been replaced by a junior with a Claude subscription.

      20 years of coding experience. Gone through the sweaty junior years, senior, founding engineer, CTO (and back to software engineering again because it's my preference) -- and now I can't even get an interview with a human.

      Due to unfortunate life events my savings are now all but gone, and I don't even know if I will be able to keep a roof over our heads. It's messed up.

      If anyone is hiring, send me a message. I'm an EU citizen but have residency in and work out of Mexico.

    • baxtr an hour ago

      The best way to find out: just start. You’ll improve along the way. Questions like this (and anxiety) are best fixed by action.

      • Paul-Craft an hour ago

        I mean, I am. How else would I know nobody wants to interview me? :)

        • baxtr an hour ago

          Fair enough :) wasn’t clear to me from your first comment. It’s definitely pretty tough out there right now.

      • lazyasciiart an hour ago

        When someone says “no one will interview me” this is a pretty unhelpful response.

        • baxtr an hour ago

          My response is probably controversial. But I genuinely think it’s generally helpful advice. Ofc I don’t have any other information than the comment about this person.

    • oldmanhorton an hour ago

      I have no advice to offer, I only wish you good luck. I am still lucky enough to be employed, but when this whole parade ends, I have no idea what comes next - my only skill is programming and related knowledge work. I think the only path forward is to try to jump ship to another white or blue collar industry…

      • Paul-Craft an hour ago

        I thought along those lines as well. The only thing I could come up with that would be semi-viable was medical school, and I'm not sure I'd survive residency. I definitely would never be able to pay back the debt, if I had to take any on.

    • idiotsecant an hour ago

      The era of anyone interested in programming for fun being able to make upper 10% incomes is drawing to a close. You'll unfortunately have to join the rest of us who work for money and program for fun. I suggest engineering (the real kind, not software 'engineering')

      • Paul-Craft 20 minutes ago

        Unfortunately, I have a visual-spatial processing disability. You don't want me near anything mechanical, and I can't do visualization-based tasks because I literally can't visualize. That eliminates most engineering jobs.

        There's also the matter of going back to school, and the associated debt I'd have to take. I'd never be able to pay the loans off if I did that.

    • alephnerd an hour ago

      Where do you live, what are your skills, and what is your citizenship status?

      If you are gunning for a remote job, that's not happening anymore except for the top 5% of candidates.

      If you are gunning for a job outside of a Tier 1 tech hub like the Bay, NYC, London, TLV, Beijing, Shanghai, Hangzhou, Singapore, BLR, HYD, etc you will have a hard time.

      If you are not up-to-date with modern stacks and the capacities as well as limitations of AI/ML enhanced workflows, you will have a hard time.

      Edit: can't reply

      > Paul-Craft

      Based on your profile below, I am surprised you aren't finding anything in the Bay. It's a hot market right now. Maybe get your resume reviewed?

      > Most of the job openings for humans are remote and not in big tech

      Absolutely agree about the "not in big tech" part, but remote being the majority of tech hiring is absolutely false in 2026.

      > My "default" resume is by ChatGPT; it's essentially my human-written resume, jazzed up a bit for ATS-friendliness

      Go back to using a human written resume. An LLM generated resume is obvious and a negative signal (you could be a bot)

      Also, make sure your resume is 1 page.

      • Paul-Craft an hour ago

        Huh, weird that you can't reply.

        I'm tailoring my resume to individual postings a good portion of the time. My "default" resume is by ChatGPT; it's essentially my human-written resume, jazzed up a bit for ATS-friendliness. There are no hallucinations in it, and I feel it accurately represents my experience.

        • defrost 41 minutes ago

          > Huh, weird that you can't reply.

          It happens to many; it's happened to me three times so far. The mods rate limit (only X comments per Y time period) people who have been flagged, judged, and found to be a bit prone to getting into rapid back-and-forth exchanges that have crossed guidelines.

          It can generally be reversed on request via hn email, sometimes it's a blessing, sometimes it's not even something that impacts a user very often unless they find themselves in an interesting exchange.

      • Paul-Craft an hour ago

        Bay Area, 9 YoE primarily backend, US citizen. I'm familiar with AI coding tools. I've done real work on real systems.

      • dnnddidiej an hour ago

        At this conversation depth there is no reply button, but you can open the comment by clicking its timestamp ("8 hours ago") and then reply.

      • sublinear an hour ago

        Most of the job openings for humans are remote and not in big tech, but the pay in absolute terms is significantly lower (same wage percentile for the area you live though).

        It's important to understand the world beyond your bubble. If those jobs seem unrealistic as an option, you may need to consider if your cost of living is unrealistic.

        • Paul-Craft an hour ago

          I'm fine with "not big tech," along with a "not big tech" salary. In fact, I prefer "not big tech." My cost of living is not absurd for the Bay Area. I'd even be willing to take a little less than what I made before. After all, less than before is still better than 0. I'm using AI to tailor my resume to every posting, and still not getting calls.

          • adw 43 minutes ago

            You’ve got nine years of experience, so work your network and get referrals. It’s very hard to get mid-career jobs through the front door; most people want someone they trust to vouch for you.

            • Paul-Craft 19 minutes ago

              I've tried that. They don't have anything for me.

          • sublinear an hour ago

            > not absurd for the Bay Area

            Yeah I was implying you might need to move to optimize for cost of living, but I don't know your situation and am not really asking. It's actually surprising sometimes to hear how long this took to affect some tech workers. You're lucky it's now that housing prices have stabilized (everyone else has stopped moving), and not a few years ago.

            Remote work doesn't necessarily mean you aren't still tethered to some radius. Otherwise I'd be living in Monaco or something haha.

  • brewcejener 10 hours ago

    Thank you for writing this. I didn't realize it, but I feel a lot more of this than I thought.

  • baCist an hour ago

    I see this as a temporary phase driven by AI hype.

    In the long run, strong senior specialists — in design, development, and other IT fields — will likely be more valuable than ever. Meanwhile, those who rely entirely on AI without developing fundamentals may never reach that level.

    AI isn’t really capable of creating truly complex solutions or top-tier UI/UX — it mostly recombines existing ideas.

    So it’s probably better to focus on your craft and avoid burnout — that’s what will matter.

  • cbreynoldson an hour ago

    No comment on the ethics; however, I think when people's instincts to survive kick in, many of these larger goals get sidelined. There's a growing belief that it's now or never as far as accumulating wealth, securing a house, etc. go because people think once AGI comes their chances of having the lives they want will diminish. The bay area has only gotten more expensive to live in, and that's where all of the AI folks are, so no surprise.

    I think in general, if it were cheaper to live, we would see a shift in priorities, what people focus on, etc. More art, less grift.

    Genuinely good people get caught up in rat races trying to reach their ceiling while they can. If they didn't feel that pressure, maybe they'd be doing something else.

    • serial_dev an hour ago

      I genuinely enjoy software development, but if I could provide for my family, I’d also enjoy selling croissants at a local bakery or filling up shelves at the supermarket.

    • MikeNotThePope an hour ago

      I don't think the now or never thinking is healthy, but I certainly understand the motivation. I myself have never really fit into a career path climbing the corporate ladder, and entrepreneurship is a skill that takes time to develop. When you're oscillating between stability and bleeding money, it's natural to want to go all in on an opportunity when it presents itself.

    • sublinear an hour ago

      You can just... not live in California. Most other places are doing just fine and experiencing the usual moderate economic instability that happens every decade or two along with the rest of the world.

      If we do consider the ethics, there's a lot of contradictions built into why someone would want to live there so badly to do the kind of work the blog post is concerned with.

      Their efforts are better rewarded moving their passion into an open source project while keeping a job in tech that they don't care so much about and are qualified for. This is a normal part of growing up. Some people switch careers while others stay in it while decoupling their passions from their paycheck.

      • Paul-Craft an hour ago

        I actually considered that, myself. The thing is, California is where the jobs are for me. If I move out of California, I may never be able to come back. That could cost me a lot.

  • imiric an hour ago

    This resonates a lot with me.

    Long breaks help. Take your mind off of things that bothered you. Do things you enjoy. Which may include tech work, but on your own terms.

    I wouldn't be surprised if you decide to not go back. The status quo of most organizations is grim. But there are still people who care about the same things as you. You can seek them out and work together, much like you did 15 years ago. This is more difficult now among the noise, but you can tune that out. The industry will never recover altogether, but this current period is a blip of high insanity, which will subside in a few years.

    Good luck!

  • coffeebeqn an hour ago

    The worst part so far has been that some people have Claude write tickets and don’t check what the very detailed piece-of-crap ticket says. Just tell me the few pieces of true knowledge you have, rather than a full page of AI slop with multiple errors in it that causes me to waste hours trying to figure out what’s true.

    • dfee an hour ago

      i never got along with tickets, anyway.

  • spaqin 35 minutes ago

    Another problem the author may be facing: if they decide to get back into the tech market and find a new job, it may be difficult, with tech still moving forward - not in a meaningful way, as computers still compute as before, but enough that lack of experience with a new tool or framework will make them unattractive compared to other candidates.

    Otherwise, if they decide to go into another field, they will be starting from scratch; it will pay only a small fraction, and whatever lifestyle they were used to will have to change.

  • casey2 an hour ago

    This happened once with open sores; now this behavior has been turned up to 11. People take dependencies they don't even understand, full of incorrect code and vulns (intentional or not), delegate everything, and take no responsibility.

  • bad_username 31 minutes ago

    > Generative AI tools, ... supercharge the spread of disinformation and fascism, ... and concentrate wealth in fewer hands

    People caught up in this line of beliefs generally tend to be more neurotic and unhappy about most things.

