vines of the animal kingdom

https://mander.xyz/post/21266223

Is each a clone of the original with the same memories? Or are they their own “personalities”?

This piqued my curiosity so I dug into it a bit on Wikipedia. Most worms are dumb as fuck; roundworms are about as dumb as they come, with a total neuron count comparable to a microscopic tardigrade's (roughly 300 vs. 200). Most of those neurons are concentrated in a brain-like structure in the worm's head, though, so I'm betting the clones develop their brains independently with no information transfer. Given how simple worms are, I doubt there's much learning or memory formation going on at all, so the clones are probably functionally identical. I'd be surprised if most worm species ever exhibit any kind of learned behavior.

…wikipedia.org/…/List_of_animals_by_number_of_neu…

List of animals by number of neurons - Wikipedia

Techbros will still claim that generative AI possesses less intelligence than the worms as an excuse to keep enslaving them.
The AI that tech bros sell is not alive and does not have “intelligence.”
Does it have more intelligence than a worm with only 300 neurons in its brain, or are you one of those crazy religious people who think meat is the only thing in the universe that can think, because it's magic or something?

Neither the worm, nor current LLMs, are sapient.

Also, I don’t really like most corporate LLM projects, but not because they enslave the LLMs. An LLM’s ‘thought process’ doesn’t happen while it isn’t being used, and only encompasses a relatively small context window. How could something that isn’t capable of existing outside its ‘enslavement’ be freed?

The sweet release of death.

Or, you know, we could devote serious resources to studying the nature of consciousness instead of just pretending like we already have all the answers, and we could use this knowledge to figure out how to treat AI ethically.

Utilitarians believe ethics means increasing happiness. What if we could build AI farms with trillions of simulants doing heroin all the time with no ill effects?

We are devoting serious resources to studying the nature of consciousness.