A college professor of mine back in 1983 said "'AI' is what we call software we don't know how to write yet." I think this neatly captures the problem we have talking about current "AI". In 2000, nobody knew how to write software that would drive cars, write poetry, play grandmaster-level chess, or summarize text, so those were considered to be examples of what AI might accomplish. Now we know how to write systems that do those things, so they are no longer AI.
@ahltorp @isomeme @Gargron @darylgibson well, not *good* poetry, anyway. 😉
I weep for humanity that so many people are impressed by the level of “art” these LLMs and generative-art (pixel-plagiarism) machines spit out. This is what happens when we fail to teach the humanities properly in school.
@KydiaMusic @ahltorp @Gargron @darylgibson
AIs aren't producing great art (yet), but they're easily outperforming the average human. I've seen a few AI-generated works that were quite compelling. As one of my favorite proverbs puts it, the amazing thing about a dancing bear is not how *well* it dances, but that it dances at all.
@KydiaMusic @ahltorp @Gargron @darylgibson
Absolutely. But the number of capabilities that are unique to humans will continue to decrease as AI technology advances. What happens when an AI can write a poem that reduces you to tears with its emotional punch? Pinning our claim to sentience on what computers can't do runs into the same problem as the "God of the Gaps" approach in theology.
@KydiaMusic @ahltorp @Gargron @darylgibson
Has any human poet in the last 10,000 years not "plagiarized" earlier works?
@darylgibson @KydiaMusic @ahltorp @Gargron
I'm a software engineer, and I can assure you that relentless consistency and exhaustive thoroughness are two things at which computers excel. 🙂 See below.