I don't need to. LLMs are probabilistic systems; they are not designed to reason. It's actually the opposite: nobody can explain some of the emergent behaviour they exhibit. Would you let one of them control air traffic based on "black magic"? Sometimes I have the feeling we have forgotten what the scientific method is...
I asked GPT to compute some hard multiplications, and the reasoning trace seems valid and gets the answer right.
https://chatgpt.com/share/6999b72a-3a18-800b-856a-0d5da45b94...