
>At one point, we wanted a command that would print a hundred random lines from a dictionary file. I thought about the problem for a few minutes, and, when thinking failed
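For context, the quoted task is a few lines in Python. This is just one way to do it; the file path, function name, and sample size below are illustrative, not from the article:

```python
import random

def random_lines(path, n=100):
    """Sample n distinct lines uniformly from a file (fewer if the file is short)."""
    with open(path) as f:
        lines = f.read().splitlines()
    return random.sample(lines, min(n, len(lines)))

# Typical use on a Unix system with a dictionary file installed:
# print(*random_lines("/usr/share/dict/words"), sep="\n")
```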

Wow, this guy's working on rocket science, everyone. Watch out! We might get replaced!



"Don't be snarky."

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

https://news.ycombinator.com/newsguidelines.html

Edit: you've unfortunately been breaking the site guidelines a ton lately (examples:

https://news.ycombinator.com/item?id=38215766

https://news.ycombinator.com/item?id=38215679

https://news.ycombinator.com/item?id=38082131

https://news.ycombinator.com/item?id=37800148)

We have to ban accounts that post like that, so if you'd please stop posting like that, we'd appreciate it. I don't want to ban an 8-year-old account, but we don't have much choice if the account is frequently posting like that.


The author concedes this point in the next sentence

> everybody looks them up anyway. It’s not real programming


> Wow, this guy's working on rocket science, everyone. Watch out! We might get replaced!

The issue here is that it doesn't matter what you or I think. What matters is what the board members, CEOs, and CTOs think at the companies that start cargo cults. You know, the mainstream tech companies that everyone else follows. If they start shrinking development budgets (I mean shrinking the workforce) under the guise of 'with GPT, you're a 5x-10x coder now, so you'll only handle integrations and people problems', you or I will still be affected, although not replaced.

So don't think about this in black-and-white terms. The issue is far more nuanced than the capabilities of GPT or the replacement of a full-time dev job.


> If they start shrinking development budgets

The outcomes aren't controlled by a reality-warping field emanating from these CEOs and CTOs. If dev budgets shrink without actually being justifiable, the software suffers. Those who are made redundant still have their skills and can offer competing products.


Do you honestly believe that the skills of software devs will never be eclipsed by AI tools?


It'll be interesting to see what happens. But I won't take the opinion of somebody who struggles with a task that I'd expect first-year undergraduates to accomplish as the key source of insight here.


What about the opinions of the CEOs of OpenAI, Anthropic, pivotal deep-learning researchers, etc.?


People with a vested interest in being proven correct?


Who else would be more qualified to weigh in though?


Only by a true AGI, but better non-general AI tools will change how software is developed. Instead, I foresee a tragicomedy of LLMs stuck in endless repetitive e-mail threads as more and more APIs are replaced by bad chatbots to keep customers away.


I honestly wouldn't expect it to be the first application, when the technology eventually does get developed, and we have the power budget necessary to actually operate one of these units.

Until then, it's not a question of "eclipsed"; it's a labor market, so it's a question of "efficiency", and the large language model people have a very long way to go to produce something that's generally as capable as a programmer.

So, I guess if your craft is "scripting" you might be able to get by with GPT-4, but imagining that your child is never going to need to "program" is eager nonsense.


You don’t think the technology will improve at all in the next ~16 years? It will just be stuck at being able to write simple scripts even in 2040?

I fully expect human programming to be completely obsolete by then.


Just like human-driven cars.

I mean, it could happen. I don't know it can't, and I won't be totally shocked if it does.

I'm not totally sold either, though.


> will improve at all in the next ~16 years?

Improve "at all?" Well, of course it will, but you're just moving the goalposts here. I'm telling you I don't think it's going to be anywhere near the level of AGI that can solve most programming tasks with less effort an energy cost than a human in that time.

> It will just be stuck at it being able to write simple scripts even in 2040?

Pretty much. You can improve the technology, but you have some massive gaps to cover, and electricity use is going to be one of them. Short of a massive improvement in transistor technology or a move to an entirely new computing platform, I don't see it happening in that time.

> I fully expect human programming to be completely obsolete by then.

It seems that you mostly _want_ that to be true, so much so that you've failed to complete the analysis. What's worse is, I'm just hedging your bet. If I'm wrong, no big deal; if you're wrong, you're in for a world of pain and problems. I get that this is Hacker News and untamed expectations are en vogue here, but I'm content to be the hipster on this issue.


Transformer large language models are about 7 years old. So we are 1/3 of the way from 2017 to 2040, and we’ve gone from hardly being able to string sentences together to being able to write entire scripts coherently. GPT-4’s output is often better than mine, to be honest, and I’ve been programming almost my whole life.

GPT-4’s capabilities are quite close to a human’s even now, especially when asking it about areas that I haven’t specialized in. And now that it has vision capability, it can see what it is doing.

With twice the time remaining that has elapsed, clearly there’s plentiful time for its capabilities to increase and for it to get faster and cheaper. And it will not be a linear improvement but an exponential one.

I don’t want programmers to become obsolete. I just consider it very unlikely that we will have anything to offer over one of these agents in the medium term. Why would you want to spend $100,000s on a human if you can get something in less time from an AI for $1000s? Human programmers will be attacked on three fronts: price, quality, and time. The quality aspect is the only arguable one: price and time are already lost.


Are you saying that you honestly do?


Yes


A bit maybe



