My daughter likes to give me substitution ciphers to solve, and sometimes it’s just a single word. For example, “cormorant”, substituted so it appears as “avtfvtpwz”. If I ask ChatGPT to list every 9-letter word whose second and fifth characters are the same, whose third and sixth characters are the same, and whose remaining characters are all distinct, it can’t get it right. It hallucinates, confidently listing all sorts of words that don’t fit the criteria at all.
I can ask it to paraphrase the rules and it clearly understands them; it just can’t get anywhere near the right answer. The same goes for the other AI chat models I’ve tried. Any idea why this seemingly simple question is a limitation?
Because of the way the model (i.e. the projection surface) was constructed, the strings it returns look plausible. But you’re still just seeing the numbers-back-to-language translation of a vector guessed by statistical inference: the model samples what is likely to come next, and at no point does it actually check each candidate word against your constraints. It only produces output that statistically resembles an answer to this kind of question.
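The check itself, by contrast, is trivial to do deterministically: normalize every word to its letter-repetition pattern (the standard “pattern word” trick for substitution ciphers) and filter a dictionary. A minimal sketch in Python, assuming a newline-delimited word list at /usr/share/dict/words (any dictionary file works):

```python
def pattern(word: str) -> str:
    """Map a word to its letter-repetition pattern,
    e.g. 'cormorant' -> 'ABCDBCEFG'."""
    mapping: dict[str, str] = {}
    out = []
    for ch in word.lower():
        if ch not in mapping:
            # Assign the next unused pattern letter to each new character.
            mapping[ch] = chr(ord("A") + len(mapping))
        out.append(mapping[ch])
    return "".join(out)

# "avtfvtpwz" has the same pattern as "cormorant": ABCDBCEFG.
target = pattern("avtfvtpwz")

matches = []
with open("/usr/share/dict/words") as f:  # assumed word-list location
    for line in f:
        w = line.strip().lower()
        if len(w) == 9 and w.isalpha() and pattern(w) == target:
            matches.append(w)

print(sorted(set(matches)))
```

Run against a decent dictionary, this turns up “cormorant” along with any other 9-letter words sharing the pattern, and every hit is guaranteed to satisfy the constraints, because each word was actually tested against them. That exact-verification step is precisely what the language model never performs.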