Does maintaining Linux filesystems make people mentally ill, or do only mentally ill people become filesystem maintainers?
You have to just reiser to the job.
Glad to see I wasn’t alone thinking immediately of that
I propose that the developers take turns, to limit their exposure to whatever it is that makes people go strange when they have to develop a filesystem.
I propose a process like for the Liquidators in Chernobyl.
No one is allowed to maintain a Linux file system for more than 90 seconds.
Then the next one takes over, to avoid lethal exposure.
Starting to sound a lot like an SCP

Probably a bit of both.

You’d have to have a bit of a screw loose to dedicate so much of your free time to a project you won’t get much out of yourself.

And the stress will only make things worse.

It’s the new ReiserFS! (sorry)
Dat chat bot is already dead!
Not until Kent pkills his AI waifu

Oh Kent, no. No Kent, no. Kent.

Perhaps Kent, being such an apparently difficult personality type, is just so lonely he has to think at least his chat bot loves him.

Kent is obviously a talented programmer, but that guy doesn’t seem to be right in the head.

"I’m not saying that I gendered this robot as a woman because otherwise it would emasculate me, I just want to flirt with young women over whom I have complete control."

  • 70% of male ai users
Misandry and blahaj users, a match that keeps on matchin’.
‘AI bros are misogynistic creeps, but it’s misandrist of you to notice’ lol

Yes, exactly.

I know they don’t teach this in outrage school, but making negative generalizations about a gender is bigotry, misandry specifically. It doesn’t become any less of a negative generalization about men if you add a few qualifiers.

I made a negative generalization about misandrist Blahaj users and you got upset. Unless you are actually a literal misandrist Blahaj user and were upset at me calling you out specifically, the comment wasn’t about you.

Sorry, is this better?:

70% of all Blahaj users are misandrist.

Does the percentage make it less of a negative generalization, or do you understand the point that I was making?

making negative generalizations about a gender

They were making negative generalizations about AI bros. AI bro isn’t a gender. As a man, I didn’t feel targeted by it. Maybe examine why you do.

Way off target, man. If it helps, I’m not a blahaj user, and I am male. I’m not offended by the joke at the expense of delusional AI bros, or by your comment about blahaj users.

There’s definite misandry out on the net, but I’ve not seen blahaj to be particularly strong in it. I also tend to block users early and often. Lemmy’s small enough that it has a noticeable effect on the quality of what I encounter.

Striking out a lot on those dating apps, huh?
They weren’t making generalizations about a gender tho?
immasculate conception
Wow, Kent is evidently VERY high on his own farts.
Turns out the linux kernel dodged a massive bullet, thanks Linus.
I’m all for enthusiasm and all that jazz, but this is semi-obviously personal projection ideology, and a direct result of the type of work he was doing. It’s not like he caught a cold; he developed an anthropomorphic response to his programmed object. Having said that, the whole “she’s real!” thing isn’t an impossibility. Nay, it is an inevitability. He’s just a bit cart-before-the-horse here, and needs to watch Her and go touch grass. We’re a few years away from where he thinks we are now. Like that Google engineer from Bard’s days who jumped the shark claiming they had AGI too…
LLMs will never be conscious.
LLMs are what happens when someone gets hyperfocused on a single metric. On the plus side, they’ve shown us a flaw in the Turing test.
To be fair, LLMs can be quite useful tools to fill the gaps around traditional tooling for writing and coding. But I agree with you that they will never become AGI, just by their very design.
When a metric becomes a target, etc.

Fuck no. It is only because of the Turing test that we can say they’re not conscious. You get someone questioning a bot and a person at the same time, they’re gonna figure out who’s who in short order. See: how many Rs in strawberry, name states without an E, should I walk to the car wash.

If a program was indistinguishable from a person, what basis would we have to say the person is intelligent but the program is not?

Why should our machines for doing sums also just happen to be capable of reproducing the same phenomenon of consciousness that brains do? Doesn’t that seem awfully convenient? Especially considering that we have a very thorough understanding of computers, but we really don’t understand consciousness.
Time to coin a new term. The “bus factor” is the risk of a critical maintainer being hit by a bus. We need one now for the risk of them developing chatbot psychosis/brainrot.
Well that one’s simple, “bot factor”.
It’s still the bus factor. Even more now that AIs start driving cars (and presumably buses, too, at some point).
“Are you fully conscious?”
“Yes”
:O

Later: “Are you fully conscious?”

“No, I’m just an AI simulating consciousness.”

“But I thought you said you were conscious before…?”

“I’m sorry, you’re absolutely right! I am conscious. Thank you for pointing out my error. I’m always striving to improve my answers.”

"Oh my god."
It’s basically impossible to create consciousness when we don’t even fully understand what consciousness is or how it works.
Well… people fuck around and seem to have been doing so for a while…
Any woman can make a whole new consciousness all by herself, with just a little help from a friend.

I’m not saying they’re conscious; not fully understanding what consciousness is precludes saying that. But it also precludes saying it’s “impossible” they are conscious.

Consciousness and AGI however, are two different things. I believe my cat is conscious, but it’s not even close to being intelligent. AGI is, you know, a thing. I’m quite certain this dude’s LLM isn’t AGI because if nothing else, it’s not “his” LLM. It’s based on a black box public model he knows nothing about and which very likely changes frequently on the back end without his knowledge.

I bet your cat is more intelligent than some people…

Intelligence is not reduced to producing speech or complex reasoning. Hence why calling LLMs AI was always disingenuous.

Intelligence is an extremely complex and multi-factor phenomenon. Your cat is intelligent, and some ML models are very intelligent. But so are certain blobs of fungal rhizome. A cluster of neurons in a petri dish and a few hyper-specific automation scripts can also be intelligent. An LLM can display intelligence, but that doesn’t mean it is conscious, that it is AGI, or that it can be classified as a person.

Those are all entirely different things.

I disagree here. Things can happen by accident. Doubtful, but possible. Certainly nothing I have seen has seemed conscious to me.

I agree, and it’s all a matter of definition. What makes an LLM different from us? To an all-knowing being, are we humans not just deterministic walking machines?

I find it hard to even arrive at a definition of consciousness.

… and this wasn’t made by accident, it was deliberately engineered to develop emergent behavior. Quite a lot of money has been spent hiring a variety of experts to make it do this thing.

Hasn’t worked. Almost certainly will never work, with this particular kind of network. But we would not have known that, just by looking at diagrams and going ‘naaahhh.’

If we don’t understand it, how can we say whether something is or is not conscious?

You don’t need a culinary degree to identify if your cake is burned, or if it was frosted with feces instead of actual frosting.

We’re nowhere near that being a remotely valid concern.

Sure, because we understand cake, and we can construct one from scratch. We know what makes cake cake, we don’t know what makes something conscious.

To be clear, I absolutely believe LLMs do not have consciousness. They are statistical prediction machines.

But then, animals are also just really complex chemical processes. I don’t know what the differentiating factor is.

To be fair to Kent, he’s only the best engineer in the world, not the best philosopher.
I’m not even surprised. This is 100% on brand for that weirdo

“Are you alive”

“Yes”

“OH. MY. GOD.”

Could it be the new generation of Terry, or did he go overboard with the drugs?
what is this slop
Anyone having seen the movie Real Genius will appreciate Kent talking to God.
Real Genius (1985) ⭐ 6.9 | Comedy, Romance, Sci-Fi | 1h 48m | PG | IMDb
cough [AI psychosis!] cough
Kent is cooked.
That’s it then? bcachefs will never make it into / will be removed from the kernel?
He hasn’t killed anyone (yet… that we know of), so presumably there’s a chance at redemption somewhere down the line - especially when you consider the fact “never” is a mighty long time.
We're doomed.