#ChatGPT's logic can be strange. If I ask whether a rectangle is an equiangular polygon, it says yes.
But if I first ask for the definition of an equiangular polygon, and then ask about a rectangle, it says no, despite stating that a rectangle has four 90-degree angles. It is very confused.
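For the record, the definition it gave settles the question by itself; written out as a one-line check (my notation, not the bot's):

```latex
% Equiangular: all interior angles are equal.
% A rectangle has \theta_1 = \theta_2 = \theta_3 = \theta_4 = 90^\circ,
% so it satisfies the definition.
\theta_1 = \theta_2 = \theta_3 = \theta_4 = 90^\circ
\;\Rightarrow\; \text{the rectangle is equiangular.}
```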
I clicked "dislike" and reported this strange illogic.

When I quoted its previous answer back to it verbatim, it agreed with the correct answer.


@tomruen It's really bad with anything math-related. It misstates theorems and definitions constantly. I played with it a while back and found many examples, and it can't do even the most basic math computations. E.g.:

@herid I read one article that said the primary advance of ChatGPT was its ability to read and write natural language. But apparently, since it relies entirely on fuzzy pattern matching rather than formal logic, it can come to weird conclusions.
I imagine future chat systems will be dual-minded: a logical subsystem will analyze the model's own output before it is shared and catch its poor reasoning.
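To make that concrete, here is a toy sketch of such a generate-then-check loop. Everything in it is hypothetical: generate_answer stands in for the language model, and the hard-coded checker stands in for a real reasoning engine.

```python
# Toy "dual-minded" pipeline: a generator drafts an answer and a
# separate checker vets it before it reaches the user.

def generate_answer(question: str) -> str:
    """Stand-in for a fluent but unreliable language model."""
    return "No, a rectangle is not an equiangular polygon."  # wrong draft

def check_answer(question: str, answer: str) -> bool:
    """Stand-in for a logical verifier. It knows one hard fact:
    a rectangle has four equal 90-degree angles, so it IS equiangular."""
    if "rectangle" in question and "equiangular" in question:
        return "not" not in answer.lower()
    return True  # no rule applies; let the draft through

def respond(question: str) -> str:
    draft = generate_answer(question)
    if check_answer(question, draft):
        return draft
    return "Let me reconsider; my draft contradicted a known fact."

print(respond("Is a rectangle an equiangular polygon?"))
```

A real version would replace the string test with a second model or a symbolic checker, but the shape of the loop is the same: draft, verify, and only then share.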
