Over just a few months, ChatGPT went from accurately answering a simple math problem 98% of the time to just 2%, study finds

https://lemmy.world/post/1866990



Why is “98%” supposed to sound good? We made a computer that can’t do math good

It’s a language model: text prediction. It doesn’t do any counting or reasoning about the preceding text, it just completes it with what seems like the most likely continuation.

So if enough of the internet had said 1+1=12, it would repeat that in kind.
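That point can be made with a toy sketch (the corpus and counts here are made up, and real models predict token probabilities rather than doing a lookup, but the failure mode is the same): whatever completion dominates the training data wins, right or wrong.

```python
from collections import Counter

# Hypothetical training data where the wrong answer is more common.
corpus = ["1+1=2"] * 3 + ["1+1=12"] * 7

def complete(prompt, corpus):
    # Return the most frequent continuation of the prompt seen in "training".
    continuations = Counter(
        text[len(prompt):] for text in corpus if text.startswith(prompt)
    )
    return continuations.most_common(1)[0][0]

print(complete("1+1=", corpus))  # prints "12" — the majority answer, not the true one
```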

Someone asked it to list the even prime numbers (there is exactly one: 2)… it then went on a long rant about how to calculate even primes, listing hundreds of them…
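For contrast, a few lines of actual computation settle the question (the bound of 10,000 is arbitrary, just for illustration — any even number above 2 is divisible by 2 and so not prime):

```python
def is_prime(n):
    # Trial division up to sqrt(n); fine for small numbers.
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

even_primes = [n for n in range(10_000) if n % 2 == 0 and is_prime(n)]
print(even_primes)  # [2]
```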

ChatGPT knows nothing about what it’s saying, only how to put likely-sounding words together. I’d use it for a cover letter, or something like that… but for maths… no.