Hacker News
OpenAI has hired an army of contractors to make basic coding obsolete (semafor.com)
40 points by dkobia on Jan 27, 2023 | 47 comments


> OpenAI’s existing Codex product, launched in Aug. 2021, is designed to translate natural language into code.

For the sake of argument, let's say they succeed at this. They won't, but let's say they do. What does that mean?

It means that instead of specifying the function of software precisely via programming languages developed over many decades of study, thought, and research, you get to specify the function of the software in English, the shittiest programming language I could possibly imagine. If you'd rather build software in English than in a programming language, then you don't really want to build software at all.

Good luck with that.


> If you'd rather build software in English than in a programming language, then you don't really want to build software at all.

To be clear, 99.99% of the market for the results of software does not, in fact, "want to build software at all".

Normals don't want to build an app, they want "an app for that" and can't find one. Those are very different wants.

Sounds as though OpenAI is going after the more common want. If they succeed, they won't need luck.

// Effectively, an OpenAI "ProdGPT" could play PdM (product manager), with immediate build/release iterations that help the person wanting an app iteratively clarify their thoughts (especially the "I'll know it when I see it" types). Now the user could just converse and converge from skeleton concept to "that's it!" in one ProdGPT conversation, because the iteration loop is fast. The conversation to figure out what someone wants is the hard part; success would shorten that loop.


These Normals usually want a "make it good" button, but they can't come up with a definition of what "good" means, or for whom. I have no doubt that coders will be replaced with AI soon. Sooner than anyone here anticipates. However, I would charge something like $100 per minute for the job of translating the wishes of these oblivious idiots into requests to the AI. But that's not going to happen. There will be gazillions of jobless coders who will charge $7.50 per hour just to have a job, and they will continue to produce shit-code with AI, but now cheaper than ever.


Heh, this sounds like the description of a genie.

"You will get exactly what you wish for" --warning label on the lamp.


> You will get exactly what you wish for

AI ToS: offload liability and responsibility onto humans proffering their free will to silicon genies.


> Normals don't want to build an app

How did it become acceptable to use such vile "us and them" language in polite conversation?


We’ve been using it for decades: “Users”.


> Normals don't want to build an app, they want "an app for that" and can't find one. Those are very different wants.

And then you guys cry foul when these "Normals" cut your job. You'll think it's because of their newfound reliance on AI, but in reality, it's that you're a horrible person to work with compared to an AI model.


It is the same argument as the program-by-diagram stuff that has been around for 30 years.

Getting API arguments to line up is accidental complexity that might be automated.

Understanding business processes and how to construct systems to connect people and electronic systems in a way that makes sense is still a human endeavor.


Program-by-diagram works for some game engines, for example Unreal Engine's Blueprints. I'm fairly sure that current AI will run into an issue that plagues business requirements: meetings. At best it'll be an outlet that actually generates some MVP after the conversation.


>you get to specify the function of the software in English, the shittiest programming language I could possibly imagine.

Would your view change if the language wasn't English? German or Mandarin or something?


> If you'd rather build software in English than in a programming language, then you don't really want to build software at all.

Every software build I've been a part of is conceived first and fleshed out in English. The programming language then used is an intermediary annoyance to most parties involved.

Ultimately, the end result is all that most people care about.


I've worked with a lot of people, and I am one of the most structured thinkers I have worked with. Still, I sometimes only realize how incomplete my understanding was when I actually code the solution. If my thoughts had been that structured beforehand, they would probably have looked a lot like the code they resulted in. I am not paid to do what the PM tells me to. I am paid to think his thoughts through to the end.


> Every software build I've been a part of is conceived first and fleshed out in English.

I'd bet a lot of money you're wrong about this. It's conceived mostly in some non-verbal thought-like pseudo-language inside someone's brain, and then it's poorly translated into English. It's fleshed out in code. That's when the real details were figured out. Maybe not while actually typing in code -- if very experienced programmers were involved, they fleshed it out using the mental models they learned from their years of programming, in their head, and then they poorly translated that to English, and then they sat down and typed it correctly again without English being involved.


Yeah I think my post was lacking nuance, because of course a lot of things come out during the build. What I mean is at some point all software starts out as an idea, generally being discussed with others. Typically you will get to scope docs before building anything as well.

Then again, not everyone is going to follow the same process.


I can convey in speech in 8 hours what would take you a week to type.

Writing code is low-level plebeian work. I can convey thousands of lines of functions in a handful of sentences of plain English describing a task or a complex high-level function.


> I can convey in speech in 8 hours what would take you a week to type.

No, you can't. Your speech will not cover every edge case; it will not cover exactly how data flows through the system; it will not specify the appropriate data structures and algorithms to use; it will be full of holes and inconsistencies. How do I know this? Because if you could say it in English to that level, then you could simply type it in a programming language even more easily.

Maybe it would take me a week to weed through your imprecise exposition to try to understand what you're actually trying to do, but it certainly does not take a week to type code; typing takes maybe 5% to 10% of my time. About 70% of my time is spent fleshing out the actual precise problem to solve, with the remaining 20% to 25% spent coming up with a good solution to that problem. Your 8-hour speech would not even precisely specify the problem you're trying to solve.

> Writing code is low-level plebeian work. I can convey thousands of lines of functions in a handful of sentences of plain English describing a task or a complex high-level function.

If a handful of plain-English sentences result in thousands of lines of code, either those plain-English sentences were unhelpful nonsense like "make me Facebook for dogs", or your code is massively over-complicated.

Honestly, what I'm hearing from you is "I don't know how to write code". That's OK. What's not OK is turning your nose up at those people who actually do. You sound like one of those fresh MBA "idea guys" who has a "billion dollar idea" and just needs someone to type the code for him for a generous 20%.

Just imagine someone saying "I could easily write a novel if I wanted to, but I'm not going to type it, because typing is for plebs. I'm an ideas man. I'll say what happens in the story, and I'll get some peon to actually type it into a novel." The person who says that is not "too good to be an author", they are a wannabe-author.


I was with you until you said “make me Facebook for dogs”. I think a sufficiently pattern-recognizing AI could probably make a somewhat useful/accurate application from that prompt, because most social media apps follow a set of patterns. Users (dogs) have certain attributes that go on their profile, they can share things with their friends, notifications should work. Where it gets tricky is the details, but maybe AI could get you a rough version at least by knowing social media app conventions. This is also how a motivated programmer would go about it: instead of asking the “idea person” a million questions, they would just assume things.

I say this as someone who has looked at AI generated code and was not impressed though, so this is probably still a long way off.


The real issue with prompting a la "Facebook for dogs" is how ridiculously underspecified such a prompt is. You see the problem with this very clearly in text to image models: you might have a specific idea of your dog in your head, but no matter how often you prompt "cute white Maltese dog with a tuft of fur across its eyes", you will never get something that would pass as your dog. (A more obvious challenge would be to try to generate an image of yourself if you are not a celebrity). The amount of words needed to replicate every detail exactly would amount to at least a short essay. You give an AI the prompt "Facebook for dogs" and it could take that in SO many different directions.

Another problem that I'm personally more concerned about is that this way of coding will allow for so many security flaws to be built into the code. Even if it builds something that matches your expectations for the prompt, what's stopping it from generating something vulnerable to attacks that leak all your users' data? Humans already struggle enough with this, and we're feeding all our flawed codebases into these models.


I'm not sure why you think all that can't be done by AI. I'm not a coder, or at least I wouldn't call myself one. But yes, I've done the whole thing of writing a function in a few hundred lines and then spending a thousand making sure it's used correctly.

The thing is, I don't see why AI couldn't do all that too. If it has decisions to make that affect the outcome, it can prompt me for input on them. Writing is slow. Not sure why I would bother with it if I can bang through the whole process by interacting at a concept/theory level instead.

I look at AI like a new compiler. I mean, I could write machine-level assembly, and if I were a wizard who knew it inside and out, I'm sure it would be super efficient. But we basically don't do that for 99% of cases; we write in some other language and use a compiler. AI is like a compiler for my plain-ass basic English into code. It might be a bit rough around the edges now, but I'm sure it'll be suitable for the vast majority of coding in a very short amount of time.


You can describe what the input and output of a function should be, but unless you can describe the mechanism by which one turns into the other, you aren't communicating what code does. This is the disconnect between programmers and non-programmers; people operate at a level of abstraction that is fundamentally incompatible with computer software. Programming skill is about learning how to think about an abstract concept and realize it as a discrete concept, and try as we might we've been unable to come up with a way to get computers to intuit what we really mean when we speak in nonspecific terms.

"Writing down code" is, perhaps, plebeian work. Understanding a problem as expressed by humans and being able to connect that to the code to write, though, is where the real work is. Based on my experiences with ChatGPT and Copilot, AI isn't there yet. It's good at the plebeian work, which is great, but it's hot garbage at the actually difficult thing.


> I can convey thousands of lines of functions in a handful of sentences of plain English describing a task or complex high level function.

Amateur, I can do much better. "Build a search engine that's better than Google at everything". Voila, the AI will start spinning up and come back with a very large, very complex system, and all it took was ten words.


Can't wait for all the consulting jobs on AI-generated software that "we can't get the AI to correctly add any more new features to", where nobody knows anything about the current code.


That's why OpenAI will also sell support contracts to debug and fix what their tools produce. Some occasional broken code isn't necessarily a problem if it can lead to more sales.

Vertical integration.


we already have this sort of company

it's called infosys/accenture/...


Yes, but those companies still have to hire people to do the breaking and fixing.

OpenAI could become a fully automated business ouroboros.


Personally, I think I'd rather be unemployed at that point.


Good argument. Though it's possible that rewriting the code anew with the AI could be faster and cheaper than maintaining the code...


Imagine the possible nightmares of 30 years of layer upon layer upon layer of AI-written legacy code in an international business O_O

We're not talking about millions of lines of code anymore, that'll be hundreds of billions of lines of code.

Then you'll get private cloud AIs that solve enterprise-scale AI spaghetti code for an enterprise budget...

Some consultant will come in to tame the AI because employees just kept stacking more and more and more code on top of itself to add new features. Features interact with one another, and the behaviour of older code changes. You end up with some kind of incomprehensible singularity spaghetti.

Will be fun to be an AI-driven product maintainer in the future!


That sounds incredible. We should freeze AI progress right when that becomes possible. We would never advance beyond that point. I would be paid handsomely to fix the spaghetti code.


Then the hourly rates will really hit new highs. I'd like some of that pile of cash.


Do you remember Microsoft FrontPage or Dreamweaver? I was thinking about going into computer science around 1997-8. When I saw all these WYSIWYG website creators, I thought forget it, programming is a dead-end career.


I suspect people who dunk on this haven't actually tried using Copilot / ChatGPT for coding in their workflow in any meaningful way.

For me, I love engineering, and I'm very experienced at it. But I use Copilot/ChatGPT constantly now, every single day, for writing code that I know how to do, but is very "grunt" level code that I'd rather not have to spend time writing.

It's incredibly solid at that! I would not go back to not having this tech. I would happily pay $200+/month for access without blinking. It saves me much more than that in an hourly rate. Not because the code it's writing is hard code that actually would take me hours to write in theory. But because, as a human, I'm going to take a lot longer to write that simple code because I really don't want to have to do it. It's below my pay-grade. I'd rather spend my time on the more interesting / difficult code.

I fully believe this effort will work.


What code for example?


Prompt: "write a powershell script to copy a file from one folder to another. Rename the file to 'newfile' and append the date it was copied to the end of the file name." Just an example of "grunt-level" code that ChatGPT can easily handle.
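For a sense of scale, here is a rough bash equivalent of what such a prompt might produce (a hypothetical sketch, not actual ChatGPT output; the function name and arguments are my own):

```shell
#!/usr/bin/env bash
# Sketch: copy a file into a destination folder, renaming it to
# "newfile" with the copy date appended, keeping the extension.
copy_with_date() {
  local src="$1" dest_dir="$2"
  local ext="${src##*.}"           # original file extension
  local stamp
  stamp="$(date +%Y-%m-%d)"        # date the copy is made
  cp -- "$src" "$dest_dir/newfile_$stamp.$ext"
}
```

A few lines either way, this really is grunt-level work: the only "decisions" are the date format and what to do with the extension, and the prompt leaves both unspecified.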


AI> Tell me about your problem

Person> I need to find files on my computer I don't need and delete them all.

AI> sudo rm -rf /

Person> Crap, what is going on with my computer? Let's unplug it. Plugged back in. Now it won't turn on... Gotta call IT support now.


Please pardon my cynicism for a moment. No doubt AI will be able to replace a majority of software engineers at some unspecified point in the future. However, I suspect an embarrassingly large number of companies will jump the gun and attempt to pivot to AI generated code prematurely. The results will crash and burn, possibly literally, and some market segments will be hit with lawsuits so hard that they'll effectively be barred from using AI to write code for several decades. Meanwhile, the less critical industries will crank out buggy AI crap that works 90% of the time because it will still be more profitable than hiring engineers. And everything will suck until the AI overlords are finally advanced enough to completely take over.


Good. I want computer communism. This is how I get it.


This can't come fast enough. What to type is the hard part, not actually typing it.

Most of programming is just typing stuff you already know. If there is a machine that will do that for me, my god.


Job security for actual coders. Using English to communicate precise program instructions will make slow, buggy code. This will be fixed with actual code.


Not sure how I feel about an American company that has just been flooded with capital deciding to hire programmers outside of America “to cut costs”. If their reasoning were solely based on talent, I think that would be perfectly acceptable.


The contractors can use the money better than we US residents can.


Tell that to the hundreds of thousands of US IT professionals who lost their jobs recently.


They can go on welfare. Poorer countries don’t have adequate welfare systems.


In the future will your job "interview" be used to train AI?


Nothing new here if you already read that one dude on Twitter who described the job when he was interviewing for it.


As long as there is somewhere free to extract boilerplate code from, I don’t really care where that is.



