Hacker News

So, this is obvious bullshit.

LLMs don't do anything without an initial prompt, and anyone who has actually used them knows this.

A human asked an LLM to set up a blog site. A human asked an LLM to look at GitHub and submit PRs. A human asked an LLM to write a whiny blog post.

Our natural tendency to anthropomorphize should not obscure this.



Yeah, I agree.




