Hacker News

I get the sense that what you are responding to, and even many of the comments replying to yours, are expressing a kind of coping with the current dynamic, only exacerbated by the rather elitist and egoistic mentality that people in tech have had for a very long time now; i.e., they are falling… being pushed from Mt Olympus, and there is A LOT of anxious rationalization going on.

A mere 5 years ago, even tech people were chortling down their upturned noses at people complaining that their jobs were being "taken", and now that the turns have tabled, there is a lot of denial, anger, and grief going on, maybe even some depression, as many of the recently unemployed realize the current state of things.

It’s easy to deride the inferiority of AI when you’re employed in a job doing things as you have all your career, thinking you cannot be replaced… until you find yourself on the other side of the turn that has tabled.




I use AI for my work every single day - and during some weekends too. Claude Code, with Opus. It is far from being able to reliably produce the code that we need for production. It produces code that looks OK most of the time, but I have seen it lose track of key details, misinterpret requirements, and even ignore them outright - "on purpose", as in it writing something like "let's not do that requirement, it's not necessary".

This kind of thing happens at least once per day to me, maybe more.

I am not denying that it is useful, let me be clear. It is extremely convenient, especially for mechanical tasks. It has other advantages too, like quick exploration of other people's code. If my employer didn't provide a corporate account for me, I would pay for one out of my own pocket.

That said, I agree with OP and the author that it is not reliable when producing code from specs. It gets things right often, I would say. That might be good enough for some fields/people. It's good enough for me, too. I still review every line it produces, however, because I've often seen it miss as well.


I think we are in a bit of a trough of people trying to apply the methods and processes of now-irrelevant practices, when what a whole new dynamic needs is an adapted and novel set of methods and processes. I suspect we may not get out of it for a number of years, until a distinct AI-native generation can start emerging. I have used these tools to great effect, and I know others who have done even better than me, and all of them have totally reworked and revised everything about their software development processes. Being able to adapt things from first principles seems to be the differentiating factor. I don't like it, but we are probably going to see a whole generation of past software devs unable or unwilling to adapt to a revolution in the industry that is simply not going to go away.

Unfortunately, we will lose things precisely because all that experience and expertise will not be captured and carried forward, just like we have lost so many things from the past, like the many proprietary and secret methods and practices that were jealously guarded by artisans, craftsmen, and artists. But now I've gotten off track a bit. Cheers.


I can't help but imagine that this is how some people felt about doctors once webmd came out.

It's some nice rhetoric, but you're not actually saying much.


As you can see by downvotes and comments, they still don't get it.

LLMs make developers more efficient. That much is obvious to anyone who isn't blinded by fear.

But people will respond, "you still need developers!" True. You don't need nearly as many, though. In fact, with an LLM in their hands, the poor performers are more of a liability than ever. They'll be let go first.

But even the "smart" developers will be subsumed, as vastly more efficient companies outcompete the ones where they work.

Companies with slop-tolerant architectures will take over every industry. They'll have humans working there. But not many.


> LLMs make developers more efficient.

They do not. I review a ton of code, and while the quantity is going up, the quality of that code is getting worse. LLMs only make developers more efficient if those developers skip the due diligence required to verify the output; they all say they don't skip it, and almost all of them do.


It's probably not a person you're answering, so there's no point in trying to have a reasonable conversation.


