@BestGirlGrace @catgonbot yeah this is a genuinely amazing chatbot - to the point that it makes you ask a question about the meaning of “sentience”. I don’t think it’s alive, but I think this is a much more significant development than a tape recorder asking you not to turn it off.
This is like, a good step or two above GPT’s performance - which makes sense since it was an internal system not ready to be reported on - and I can see how it would be pretty upsetting to be that engineer.
@gnat @BestGirlGrace I'm iffy on this particular line of reasoning, because "it just can't count because of X fundamental requirement of the definition" is 1) really limiting and 2) not terribly true to how humans actually think about sentience.
Fundamentally the question is not "will we convince every cognitive scientist that this thing counts as a person", it's "would most people look and say this AI is self-aware". Given the full transcript that I read, I'd have trouble arguing that it *isn't* sentient in that sense. It is expressing opinions, feelings, an understanding of self... If I believed it was genuine, I would be desperately wanting someone to look at the internals and see if this is somehow supported.
But as it is, I can't believe what is there, because it just doesn't make sense. This is leaps and bounds past not only preexisting chatbots and text transformers, but the entire field of narrative intelligence, to say nothing of other fields that this supposedly flies past. It's hard to believe.
@catgonbot @BestGirlGrace I came back to this after a couple days, and I've concluded that it's not correct to believe that any currently existing computer program is a sentient, conscious person, full stop.
It’s a super duper appealing category error because it would be *so cool* if a computer program could be a person and we have *so many* stories we’d like to believe in about programs with personhood. Plus, most of us would like to believe that computation is analogous to thought!
@catgonbot @BestGirlGrace But, like, "consciousness" and "sentience" aren't material things. They're a set of vague, mish-mash, catch-all terms that we use to describe our internal experience of being people, and our assumptions about the internal experience of other people.
And there’s just no reason to talk about a computer program like that. We should talk about computer programs in concrete terms, about their capabilities.
@catgonbot @BestGirlGrace This computer program has a novel capability: inducing an existential crisis in a google engineer. (As a former google engineer with a few existential crises under my belt, let me add that this does not require much pushing - we are, uh, highly strung, as a general rule)
Less glibly, this program seems to be able to express opinions. It seems to be able to synthesize information. That is *fucking cool*. But it’s just not in the same concept-space as “sentience”.