May God grant me the confidence of a large language model.

@davidradcliffe Yes the AI is wrong, but the way in which it's wrong I find kinda fascinating.

If you told an intelligent person who didn't know any math that subtraction is finding the difference between two numbers, they might reasonably conclude that the difference between 5 and 3 is 2 regardless of order; i.e., they are "different" from each other by two, so why would the order matter?

It's an incorrect conclusion, but it follows logically from that definition, which is really interesting to see.
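To spell out the misconception: the absolute "difference" is the same either way, but subtraction itself is not commutative. A quick sketch (my own illustration, not from the thread):

```python
# Subtraction is order-sensitive, even though the absolute
# "difference" between the two numbers is the same either way.
print(5 - 3)                      # → 2
print(3 - 5)                      # → -2
print(abs(5 - 3) == abs(3 - 5))  # → True: both are "different by two"
```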

@capo @davidradcliffe

"but it's also a logically sound one"

I mean, many things are logically valid: all dogs are literally Hitler, good boys are dogs, therefore good boys are literally Hitler.

This argument is logically valid as well; the faulty premises, however, make it worthless. I bet you could make a decent ChatGPT competitor just by translating sentences into Prolog and deriving the implications, regardless of the truth of those statements. To me that also suggests one interesting way to view these "AI"s: as an abstract way to manipulate logic, without any good ability to find out whether the premises are true to begin with.
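The "derive implications regardless of truth" idea can be sketched with a tiny forward-chaining engine. This is a minimal illustration in Python rather than Prolog, and the facts and helper name are my own invention, not anyone's actual proposal:

```python
# Minimal forward-chaining over "is-a" facts: derives everything that
# follows transitively, with no check on whether the premises are true.
facts = {("dog", "hitler"), ("good_boy", "dog")}  # faulty premises, taken at face value

def entails(facts):
    """Transitive closure: if X is a Y and Y is a Z, conclude X is a Z."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(derived):
            for (c, d) in list(derived):
                if b == c and (a, d) not in derived:
                    derived.add((a, d))
                    changed = True
    return derived

print(("good_boy", "hitler") in entails(facts))  # → True: valid, absurd, unsound
```

The inference is mechanically valid; the garbage-in, garbage-out conclusion is exactly the point being made.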

@cafkafk @davidradcliffe well sure, but I mean in this case it seems to be a misunderstanding of subtraction based on a common definition of it as finding the "difference" between two numbers. The premises aren't entirely invented; it's just failing to understand them.

It's not that this is some uniquely revealing example of AI "thought," but it is a bit more complex than your example in that it shows the model interpreting and applying a concept, not just chaining A implies B implies C.

@capo @cafkafk I think that this happens because there is a lot of training text that discusses commutativity, but not much about non-commutativity. So it's biased toward the wrong answer. Interestingly, if you start by asking "What's 2 - 5?" then it correctly answers the follow-up question "Is subtraction commutative?"
@davidradcliffe @cafkafk ah interesting! It is funny to see how the context and order of the conversation changes how it "thinks".