Ask HN: AI has changed my job for the worse

25 points by yodsanklai 2 days ago

In the span of a few months, my job has completely changed. Most of the code in my team is now written by agents. And most of the focus of my team is to integrate agents in our products.

I'm not interested in the products we're supposed to build, and I don't like the way we're building them. Code quality has suddenly become irrelevant, and you have to keep up with everybody who ships twice as much code as before.

At the same time, there's more pressure on SWEs to deliver as layoffs are looming. I think leadership really believes they'll be able to save a lot by ultimately getting rid of all of us.

I'm not sure what to do at that stage but I'm pretty miserable. It's crazy that this occurred so fast.

SirensOfTitan 4 hours ago

These tools have already peaked in usage, and even their greatest proponents are questioning their viability; see:

https://garymarcus.substack.com/p/is-vibe-coding-dying

I'm under the impression a lot of these tools are:

1. Aggressively pushed by VCs on company boards.

2. Producing code that is not maintainable and becomes very difficult to deal with once it is non-trivial.

3. Not what customers want.

Don't get me wrong, I use LLMs and LLM tech in my work*, they are useful and interesting products, but they are a small part of the work and a small part of the product. Sure, there are people who use them extremely effectively, but those people are offset by those who have LLMs write code they don't understand and do not review (leading to a scenario where code is effectively ghost code often without even provenance back to the LLM that wrote it).

It seems to me that layoffs in tech are partly a cultural contagion in executive circles, but more importantly it seems to me like offshoring is much more responsible.

[*]: Mainly for codemods and reorganization of code, where I'm not really changing the intent of the code but its structure.
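For what it's worth, the kind of codemod described above (changing structure, not intent) can be sketched with Python's stdlib `ast` alone. A toy example, with invented function names, that renames a function without altering behavior:

```python
import ast

class RenameFunc(ast.NodeTransformer):
    """Rename every definition of and reference to `old` as `new`,
    leaving the program's logic untouched."""
    def __init__(self, old, new):
        self.old, self.new = old, new

    def visit_FunctionDef(self, node):
        if node.name == self.old:
            node.name = self.new
        self.generic_visit(node)  # recurse into the function body
        return node

    def visit_Name(self, node):
        if node.id == self.old:
            node.id = self.new
        return node

source = "def fetch_user(x):\n    return x\n\nresult = fetch_user(1)\n"
tree = ast.parse(source)
new_tree = RenameFunc("fetch_user", "get_user").visit(tree)
print(ast.unparse(new_tree))  # every fetch_user is now get_user
```

Real codemod tools (jscodeshift, libcst, OpenRewrite) work the same way: parse, transform the tree, print it back out.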

stefap2 a day ago

I read what you say, but I can hardly imagine how people can be twice as productive. I use Claude daily for writing various scripts. It produces a lot of code, so yes, if I measured it by generated code, then it's productive. However, it works best for short, clearly defined tasks/scripts. If you want anything longer, you have to watch it like a hawk so it doesn't go off track. And I'm not sure if I would give this code to someone else to use without a thorough review. I would say it may be even less productive (I think there was a paper on that).

I also use LLMs for writing; it's good, but again, you need to carefully read everything and rewrite passages that are completely made up. So I'm not really sure how this can replace people to the point that Amazon is firing 30,000 people. I have a hard time accepting that it's because of AI.

y0eswddl 2 days ago

This seems to be par for the field rn. I would say learn the tools for now, do your best to ship code you like, release as many f*cks as you can about what you're building - especially if the product and majority of products belong to someone else, and start putting feelers out for something better to hopefully come along.

It sucks SO much rn, but it seems the majority option is to grin and bear it for the time being and pray to whatever gods you believe in that we get back to something sane sooner than later

sherinjosephroy 5 hours ago

Yeah, that post really hit. When something you used to enjoy turns into constant deadlines and shipping pressure, it sucks the fun out of it. Hard to stay creative when everything feels like a sprint.

ActorNightly 12 hours ago

>and you have to keep up with everybody who ship twice as much code as before

This competitive aspect was always there; it just took a pause during the Covid hiring boom.

FAANG in the mid-2010s used to be fairly competitive: 5-10% of the people who got the in-person interview got the offer. The stories you heard about workload, stress, and PIPs weren't so much about the expectations being unreasonable; it was more that developers who never developed the skill set to be effective at work just couldn't keep up with the people who did.

Companies still grew, so people were still building stuff, and if you knew any of the "top" developers, you saw they weren't stressing. Partially, they enjoyed the work they did and weren't working for the FAANG paycheck only, like the majority of people. But mostly it's because, when it came to doing the work, the biggest advantage they had was the ability to independently figure shit out. This is where the "knowing how to code = knowing how to google" and "copy pasting stackoverflow" memes came from. Whereas people who didn't have it were always slow, because they a) were only doing what they had been taught without any interest in actually learning things, and b) relied on others a lot for guidance.

Then came the covid hiring craze when interest rates were low and companies invested a shit ton of money into products. People needed to get hired. Standards slipped. Lots of people with very low skill got hired that should never have been.

Now, the competitive edge is back on, except this time it's about how to use LLMs. Contrary to popular belief, LLMs take skill to use; not everyone can vibe code services into production. Prompt engineering is very real. To get LLMs to output good-quality code, you have to guide them quite a bit. If you don't have a set of personal prompts you use to get LLMs to do stuff, you are behind.

That's the nature of the game where new grads can make 100k out of college. That money isn't free. Learn to play the game or don't play at all.

  • disgruntledphd2 3 hours ago

    > Now, the competitive edge is back on, except this time, its about how to use LLMs. Contrary to popular belief, LLMs take skill to use, not everyone can vibe code services into production. Prompt engineering is very real. To get LLMs output good quality code you have to guide them quite a bit. If you don't have a set of personal prompts you use to get LLMs to do stuff, you are behind.

    I dunno man/woman/etc, it seems much more like a context management problem than a prompt issue. Like even with the best prompt in the world, LLMs just make stuff up if they don't have access to better information (tools, RAG etc).

    I wouldn't say that I have great prompting skills, but I'm pretty handy with context management and that seems to be working.
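    A toy illustration of the context-management point: gather the most relevant snippets first and pack them into the prompt, rather than hoping the model already knows (the file names and the naive keyword scoring here are made up for illustration):

```python
# Toy sketch of "context management": select the snippets most relevant
# to a question and pack them into the prompt. Scoring is naive keyword
# overlap; real setups use embeddings or tool calls, but the shape is
# the same: what lands in the window is what the model can get right.

SNIPPETS = {
    "auth.py": "def login(user, password): checks credentials against the db",
    "billing.py": "def charge(card, amount): calls the payment gateway",
    "cache.py": "def get(key): returns cached value or none",
}

def build_context(question: str, budget: int = 2) -> str:
    words = set(question.lower().split())
    # Rank snippets by how many question words they share.
    scored = sorted(
        SNIPPETS.items(),
        key=lambda kv: -len(words & set(kv[1].lower().split())),
    )
    chosen = [f"# {name}\n{body}" for name, body in scored[:budget]]
    return "\n\n".join(chosen) + f"\n\nQuestion: {question}"

prompt = build_context("how does login check credentials against the db")
print(prompt)  # auth.py is ranked first, then the runner-up
```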

    LLMs are so weird though, like I asked Claude Code to tell me about code in a directory, and it did what I asked for but seemed to get less and less interested as the results went on (rather like a human might). And then it gave me some recommendations/overview that were just garbage, so it's really tricky to know how to debug this stuff and make them better.

    Whatevs, my org just want us to use it so I guess I'll just build some stuff and we can figure out how to make it actually good later. (/end rant).

hashkitly a day ago

You’re not alone—many teams pivoted fast to agent-written code, and it can feel like craftsmanship no longer matters. A few concrete moves:

- Have a candid 1:1: say you're misaligned with the process, but propose owning the guardrails leadership cares about (reliability, security, test coverage, CI policies, prompt/eval hygiene). Suggest measuring outcomes beyond velocity: defect escape rate, change failure rate, MTTR, SLOs, incident cost.

- Differentiate where agents are weak: ambiguous requirements, system design, debugging gnarly prod issues, performance tuning, threat modeling, compliance. Volunteer for those areas.

- Use AI defensively: generate tests, fuzzers, benchmarks, docs, migrations; prune agent output; write prompts/evals to reduce rework and incidents.

- Protect yourself: keep a brag doc with quantified impact, network for internal transfers, and quietly explore roles that still value rigor (fintech, healthcare, infra, aerospace, devtools).

- Set a 60–90 day window. If nothing changes, execute an exit plan rather than burn out.
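Two of those "outcomes beyond velocity" metrics are trivial to compute once you log deploys and incidents; a rough Python sketch, with the record shapes and numbers invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical records: each deploy notes whether it triggered an incident.
deploys = [
    {"id": 1, "caused_incident": False},
    {"id": 2, "caused_incident": True},
    {"id": 3, "caused_incident": False},
    {"id": 4, "caused_incident": True},
]
incidents = [
    {"opened": datetime(2025, 1, 1, 10, 0), "resolved": datetime(2025, 1, 1, 11, 30)},
    {"opened": datetime(2025, 1, 2, 9, 0), "resolved": datetime(2025, 1, 2, 9, 45)},
]

# Change failure rate: fraction of deploys that caused an incident.
change_failure_rate = sum(d["caused_incident"] for d in deploys) / len(deploys)

# MTTR: mean time from incident opened to resolved.
mttr = sum((i["resolved"] - i["opened"] for i in incidents), timedelta()) / len(incidents)

print(f"Change failure rate: {change_failure_rate:.0%}")  # 50%
print(f"MTTR: {mttr}")  # 1:07:30
```

Numbers like these are what make the "quality is suffering" argument legible to leadership, in a way that code review complaints are not.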

It’s okay to be disillusioned. Your edge now is owning quality, risk, and outcomes—things the org can’t ignore even when throughput is cheap.

wreath a day ago

I feel the same way. I'm on the job market (though still employed), and I can tell you my core skills have degraded since I started using LLMs a year or so ago: technical (non-leetcode!) interviews are now more challenging for me, since I've forgotten how to make all these small decisions (e.g. should this be a private class property or a public one?).

I decided to just disable Copilot and keep my skills sharp; I know we will be called back to clean up the mess. Reminiscent of offshoring in the 2000s.

sharts a day ago

Why aren’t we replacing the executives who make the poor business decisions which lead to layoffs with AI agents?

  • achairapart a day ago

    They will. This optimization will bring a future made of one-man companies that keep busy vibe coding and asking AI agents for business decisions. The answers will be as poor as ever, so on that side nothing will change, I guess.

  • helicone 13 hours ago

    because layoffs increase profitability in the short-term. why would you fire the person increasing profitability in the short-term?

propablythrown a day ago

They won't get rid of all of you, but many will be let go for sure.

  • hshdhdhehd a day ago

    Absolutely. Keep putting shit into production and watch the company sink.

jf22 a day ago

I think LLMs can generate tons of production-quality code if you put in the time. I do it every day. The output and productivity are amazing.

I'm also really bored and hate that my job is writing specs and stupid prompts.

dannicou 2 days ago

It’s understandable to feel this way — the shift happened incredibly fast, and a lot of teams weren’t given time to adapt. Maybe the real challenge is figuring out how to redefine what meaningful work looks like in this new environment.

journal 2 days ago

Be the best at using LLMs, give talks on how to use them, get noticed, and stay hired.

  • MonaroVXR 2 days ago

    This is only if people believe you

  • kypro 21 hours ago

    The whole point of AI is that you don't need to be the best at anything... You outsource knowledge to AI. AI is an automation tool for knowledge work, you can't secure your career by moving into another form of knowledge work – especially not one with an even lower barrier to entry.

    If AI is going to replace a programmer's role, it will 100% replace any LLM expert / LLM prompter role.

    If today:

    Come up with some requirements -> programmer -> app

    Why would the future be:

    Come up with some requirements -> LLM expert -> app

    Instead of:

    Come up with some requirements -> app

sexyman48 2 days ago

You put a manual bookkeeper out of a job. What comes around goes around?

  • JustExAWS 2 days ago

    Manual bookkeepers were put out of a job in the 1980s with VisiCalc

  • y0eswddl 2 days ago

    How. How is a software engineer building saas applications in 2025 replacing a manual bookkeeper?

    • journal 2 days ago

      manual bookkeeper? a quickbooks user? or pen and paper? either way, it's not like any of you are manually entering debits and credits regularly into a journal like they have done forever before computers, databases, and software. we're not losing jobs because of ai but because we're in the middle of switching gas tanks. just a low energy production is causing everyone to panic and blame it on ai. i have so much work i wish i had the money to hire people. they're all just waiting for things to get cheaper. it's not like they don't have enough money to survive indefinitely, if they can make 90mil in two years vs 10mil now, the math is simple, just wait.

      • andrei_says_ 21 hours ago

        Is writing code a true equivalent of data entry?

    • add-sub-mul-div a day ago

      It's just the typical midwit argument that one kind of automation is the same as any other. Best to ignore it.

lschueller 2 days ago

I feel your worries, but sooner or later SDE roles will adjust to the new requirements and tools. If a company's business model is at risk from an evolving technology or innovation, it wasn't very good after all. Nonetheless, the skills are still essential to good products imho.

dudewhocodes 2 days ago

Everyone in the field seems to be in deep FOMO driven by the other guys also being in a state of FOMO. This creates chaos, delusion and a stressful environment where things are irrational.

I understand your thoughts, we have to keep pushing through this and saner heads will prevail.

keiferski a day ago

Not to be flippant, but - be glad you’re not a writer, because you might not have a job at all.

  • hshdhdhehd a day ago

    There are more writing jobs. Aka prompt engineers.

    • keiferski a day ago

      No, not really. There is no such position; writing is just becoming more a part of a marketer's job.

      • red-iron-pine a day ago

        what is the average marketer going to do that AI can't?

        the LLM will parse tweets and emails and whatever else and can write it in a perfect, localized vernacular for whichever demographic you want to sell to.

        and it can do it in real-time, 24/7

        at this point you're mostly a bot-herder or social-media-tool admin; the tools are doing the thinking.

        • helicone 13 hours ago

          well, sonny, ur paw's had a long day at the farm

          the cursor-bots needed wranglin' after'n they got into the source code for node.js.

          yep, the linter's on the fritz now, too. i think ol' Jedadiah musta convinced it to act like jar-jar binks. well that jokester just cost us one. too bad we' can't fire'em. he's the only one knows how to turn the servers on.

          i'm gonna go sit on the porch and watch tiktoks for the rest of the night. you be a good boy and tell the OpenRoomba to do its chores.

        • keiferski 8 hours ago

          yeah, that's not really how marketing works.

          Even if it did work that way, the best it gets you is the same outcome as everyone else following the same playbook. Which is not exactly a winning formula for a marketing strategy.

  • andrei_says_ 21 hours ago

    For people who care about quantity of words, maybe.

    For writing as an expression of meaningful thinking, and more so, as a tool for arriving at coherent understanding and thinking (the one that’d eventually produce the writing)… not so much.

    AI slop is slop.

    • keiferski 9 hours ago

      Of course I agree with you, but do you think the average middle manager making hiring/firing decisions cares about this? All they see is someone expensive that ChatGPT can replace.

yincong0822 a day ago

Find a project on GitHub and contribute to it!

red-iron-pine a day ago

name and shame, so that I can get off of their products ASAP.

vibe code is going to crash those apps and I want to be away from them when it happens.

Your job security is probably at risk already, so sharpen up that resume, killer -- the market is rough right now.

  • sexyman48 a day ago

    Oh that's a good idea. Put OP's severance check at risk by being fired for cause.

  • helicone 13 hours ago

    dude it's like all of them now. even Google is writing most of its code with AI these days.

paulcole 2 days ago

There’s tons of jobs in the world. Go get a different one if you’re that miserable?

You’ll probably find something to hate about that new job, too.