
Same here. It's biased sampling; also, my prompt generalized from GPT-4 to Google's own model, Bard, and it sampled training data directly, without having to go through the state where the model produces a repeating token. At least back then.

This should be good food for the lawsuits. Some lawsuits were based on a hallucinated acknowledgement by the model that it had used some particular materials, which was clearly nonsense. This is somewhat more solid ground, provided that copyrighted material can actually be sampled and an owner is interested in a class action.
