Hacker News | newsoftheday's comments

> In contrast, LLMs in their current state have (for me) dramatically reduced the distance between an idea and a working implementation

It may have reduced the time to an implementation, but based on my experience I sincerely doubt that the adjective "working" applies.


As a software developer of over 30 years: AI is not a tool, it is not deterministic, it is an aide.

Don't have it do things for you. Have it do things with you.

It can't do things with me like a human, it's not human, it's not intelligent, it's not thinking, it's not aware. It's an aide I use, not a tool I rely on.

When I see the word "score", it reminds me of the CCP social scoring system.

Weird... when I see something done by a US-based capitalist and attributed to communists half a world away, it makes me think of the Powell Memo.

That is weird, the US didn't ask the CCP to invent social scoring.

I mean -also- weird to claim that the CCP invented scoring folks, but even if they did, it'd be hella weird to think that somehow they helped a US local power company implement it...

Look, I get that "CCP Bad". It's just always wild to see folks try and make that case when something has literally nothing to do with it, especially while there are plenty of pretty horrific and material mechanisms in play without pretending that the big-O Other is to blame.


> I think this is missing the reason why these APIs are designed like this: because they're convenient and intuitive

Agreed. In my view, the method the author figured out is far from intuitive for the general population, including me.


I guess the point is: How often do we really need actual angles in the code? Probably only at the very ends: input from users and output to users. Everywhere else, we should just be treating them as sin/cos pairs or dot/cross pairs. So when the user inputs an angle, immediately convert it to what the computer actually needs, store it that way throughout the computation, and then only if/when the user needs to see an actual angle would you need to convert it back.
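A minimal sketch of that boundary-conversion idea (function names here are my own, for illustration): angles appear only at the edges, while the internals work purely with (cos, sin) pairs and the angle-sum identities.

```python
import math

def direction_from_degrees(deg):
    """Boundary: convert user-facing angle to a (cos, sin) pair."""
    r = math.radians(deg)
    return (math.cos(r), math.sin(r))

def rotate(d1, d2):
    """Compose two rotations with the angle-sum identities;
    no angles or inverse trig needed internally."""
    c1, s1 = d1
    c2, s2 = d2
    return (c1 * c2 - s1 * s2, s1 * c2 + c1 * s2)

def degrees_from_direction(d):
    """Boundary: convert back to an angle only when displaying it."""
    return math.degrees(math.atan2(d[1], d[0]))

a = direction_from_degrees(30)
b = direction_from_degrees(45)
print(round(degrees_from_direction(rotate(a, b))))  # 75
```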

This is how most physics/graphics engines work.

I avoid Kotlin on principle; I completely avoid any language that can't get the type and variable name in the correct order.

> not machine code

Actually, computers, being machines, do equate machine code and source of truth.


> Yes, and the implementation... no one actually cares about that.

There has been a profession in place for many decades that specifically addresses that: Software Engineering.


Same for me, has been my whole life. I complain about it all the time. It's well documented that people can read black on light far better and with less eye strain than light on black; yet there seems to be a whole generation of developers determined to force us all to try and read it. Even the media sites like Netflix, Prime, etc. force it. At least Tubi's is somewhat more readable.

Sometimes a site will include a button or other UI element to choose a light theme, but I find it odd that so many sites presumably designed by technically competent people completely ignore accessibility concerns.


The most common mistake I see (on this website at least) is the assumption that one's programming competence is equal to their competence in other things.

> Eventually, we'll end up in a world where humans don't need to touch code, but we are not there yet.

Will we though? Wouldn't AI need to reach a stage where it is a tool, like a compiler, which is 100% deterministic?


Two things to mention here:

1. You are right that we can redefine what is code. If code is the central artefact that humans are dealing with to tell machines and other humans how the system works, then CodeSpeak specs will become code, and CodeSpeak will be a compiler. This is why I often refer to CodeSpeak as a next-level programming language.

2. I don't think being deterministic per se is what matters. Being predictable certainly does. Human engineers are not deterministic yet people pay them a lot of money and use their work all the time.


>Human engineers are not deterministic yet people pay them

Human carpenters are not deterministic yet they won't use a machine saw that goes off line even 1% of the time. The whole history of tools, including software, is one of trying to make the thing do more precisely what is intended, whether the intent is right or not.

Can you imagine some machine tool maker making something faulty and then saying, "Well hey, humans aren't deterministic."


They do it all the time with their EULAs.

We will, and soon, because it does not have to be deterministic like a compiler. It only has to pass all tests.

Who is writing the tests?

There are different kinds of tests:

* regression tests – can be generated

* conformance tests – often can be generated

* acceptance tests – are another form of specification and should come from humans.

Human intent can be expressed as

* documents (specs, etc)

* review comments, etc

* tests with clear yes/no feedback (data for automated tests, or just manual testing)

And this is basically all that matters, see more here: https://www.linkedin.com/posts/abreslav_so-what-would-you-sa...
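The "tests with clear yes/no feedback" point above could be sketched like this (the function and cases are hypothetical, chosen only to illustrate the idea): the human states intent as unambiguous pass/fail checks, and it doesn't matter to the test whether the implementation was written by a person or generated by a machine.

```python
def apply_discount(price, percent):
    # Implementation under test -- could be human-written
    # or machine-generated; the tests don't care.
    return round(price * (1 - percent / 100), 2)

# Human intent, expressed as clear yes/no feedback:
assert apply_discount(100.0, 10) == 90.0   # 10% off 100 is 90
assert apply_discount(19.99, 0) == 19.99   # 0% off changes nothing
print("acceptance checks passed")
```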


In the future users will write the tests

Compilers are not 100% deterministic. Their output can change when you upgrade the version, and it can change when you change optimization options. With profile-guided optimization, the output can even change between runs.

If you change inputs then obviously you will get a different output. Crucially using the same inputs, however, produces the same output. So compilers are actually deterministic.
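That notion of determinism, same inputs producing byte-identical outputs, can be sketched with Python's own bytecode compiler (a toy stand-in here, not a claim about any particular C compiler):

```python
import hashlib

# Same source, same compile flags -> same bytecode, every time.
src = "def f(x):\n    return x * 2\n"

code1 = compile(src, "<src>", "exec")
code2 = compile(src, "<src>", "exec")

h1 = hashlib.sha256(code1.co_code).hexdigest()
h2 = hashlib.sha256(code2.co_code).hexdigest()
assert h1 == h2  # identical output for identical input
print("deterministic")
```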

This is irrelevant over the long run because the environment changes even if nothing else does. A compiler from the 1980s still produces identical output given the original source code, if you can run it at all. Some form of virtualization might be in order, but the environment keeps changing while the deterministic subset shrinks.

Having faith that determinism will last forever is foolish. You have to upgrade at some point, and you will run into problems. New bugs, incompatibilities, workflow changes, whatever the case, will make the determinism property moot.


Many compilers aren't deterministic. That's why the effort to make Linux distros have reproducible builds took so long and so much effort.

The reason is that it's often more work to be deterministic than not, so compilers don't bother. For example, they may compile functions in parallel and append them to the output in the order they complete.
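A toy illustration of that parallel-completion-order effect (this is not a real compiler, just a sketch of the mechanism): units finish in a variable order, so the raw output order can differ between runs, and imposing a canonical order is the extra work reproducible builds require.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def compile_unit(name):
    # Simulated compile with variable duration.
    time.sleep(random.uniform(0, 0.01))
    return name

units = ["alpha", "beta", "gamma", "delta"]
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(compile_unit, u) for u in units]
    # Appending in completion order is non-deterministic run to run.
    output_order = [f.result() for f in as_completed(futures)]

# Sorting (or any fixed canonical order) restores determinism:
print(sorted(output_order))
```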


> yet I don't have to know or care about implementation details

Where do I even begin... Yes, you should care about implementation details unless you're only going to write stuff you run locally for your own amusement.


Until you learn to trust the system and free mental capacity for more useful thinking. At some point compilers became better at assembly instructions than humans; it seems inevitable this will happen here. Caring about the details and knowing the details are two different things.

LLMs lie constantly. There should be no trust in that system. And no I don't think they will "get better".

How do they "lie constantly"? We are specifically talking about code here, not LLMs writing legal documents.

I've had the LLM "lie" to me about the code it wrote many times. But "lie" and "hallucinate" are incorrect anthropomorphisms commonly used to describe LLM output. The more appropriate term would be garbage.

Just a basic sanity check: did the LLM have the tools to check its output for lies, hallucinations and garbage? Could it compile, run tests, linters etc and still managed to produce something that doesn't work?
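The check the parent describes, gating generated code behind compile/test/lint steps, might look like this sketch (the file paths and command lists are hypothetical; substitute your project's own gates):

```python
import subprocess

def verify(checks):
    """Run each check command; reject the generated code on the
    first failure instead of trusting the LLM's own claims."""
    for cmd in checks:
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            return False  # re-prompt, or fix it by hand
    return True

# Hypothetical gates for a generated file:
CHECKS = [
    ["python", "-m", "py_compile", "generated.py"],  # does it parse?
    ["python", "-m", "pytest", "tests/"],            # does it pass tests?
]
```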

I've frankly given up on LLMs for most programming tasks. It takes just as much time (if not more) to coddle it to produce anything useful, with the addition of frustration, and I could have just written far better code myself in the time it takes to get the LLM to produce anything useful. I already have 40 years experience programming, so I don't really need a tin-can to do it for me. YMMV.

Compilers are deterministic tools. AI is not deterministic. It will tell you this if you ask it. AI then, is not a tool. It is an aide. It is not a tool like a compiler, IDE, editor, etc.
