It's artificial but it's not intelligence. Let's call it machine learning.

Analytical machine learning is an excellent technology for identifying patterns in data.

Generative machine learning is a Venture Capital-fuelled dumpster fire that will crash as soon as people have to pay the actual non-subsidised price for it.

#machinelearning #aibubble #aicrash

@mrundkvist From the Cambridge Dictionary: "the ability to learn, understand, and make judgments or have opinions that are based on reason". Note especially "understand, and make judgments or have opinions that are based on reason". Neither of the latter is apparent in AI. It learns patterns, so yes, it has the ability to learn. Let's see what Microsoft says about Copilot: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice. Use Copilot at your own risk." Source: https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/termsofuse. Do they themselves seem to say that it's intelligent?

@iam_jfnklstrm @mrundkvist

That does not explain the motive: what motivates intelligent people to learn and understand? It only says that they have the ability.

It gives no reason for them to use that ability, nor what the outcome of using that ability will be.

@dozymoe @mrundkvist True, but we have the capacity to reason, which is not true for AI. But on the other hand, forming opinions based on slop from the internet maybe makes many people equal to AI.

When an AI says 1 + 3 = 400 it 'believes' so and does NOT stop to think about whether that is reasonable. But, I hope, most people 'understand' that it is not possible.

@iam_jfnklstrm @dozymoe @mrundkvist Let me explain myself, Martin. Humans take information in, they learn, and they store that information in some form in the brain. When a new situation comes, they search for that information and make an inference. In principle, AI is the same. It trains on data, it learns, and it performs inference based on that learning. And obviously, with inappropriate learning, or no learning, the output will be wrong: the same wrong output you will get from a child with no learning.
@Fxiz
Sure, but your thinking process, between input and output, does not involve statistics to identify the most normal textual string to output given the input string.
@mrundkvist That is the problem. While one is using statistics (I think not), the other is using chemical reactions to perform those same operations for identifying those matching patterns. The human brain works with chemical reactions and electrical signals, for argument's sake an even more unreliable way to do it, but it isn't. We are oversimplifying when we say AI uses math; it is essentially just trying to simulate the same kind of operations that happen in the brain.
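The "statistics" point being argued above can be illustrated with a toy sketch: a bigram model that, given a word, outputs the continuation seen most often in its training text. This is only an illustration of statistical next-token prediction with a made-up corpus; real LLMs learn probabilities with neural networks rather than raw counts, but the underlying idea of picking a likely continuation is the same.

```python
from collections import Counter, defaultdict

# Toy training corpus; a real model trains on vastly more text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigrams: how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Return the statistically most common continuation of `word`."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # "cat" follows "the" most often in the corpus
```

Note that the model never checks whether its output is *reasonable*; it only reports what was most frequent, which is exactly the behaviour the thread is debating.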