'In dark times, should the stars also go out?'
queer anarchist // jewish anti-zionist
I'm still COVIDing and you should too
| pronouns | she/her |
| discord | same as here |
| more info in | my pinned intro post |
RE: https://digipres.club/@foone/116241346584343156
since starting this, I've gotten sued by a credit card company for being unable to pay, and had to pay $700 due to a delivery fuckup.
Please, if you can donate, it'd really help.
I suspect LLMs reinforce the Gell-Mann amnesia effect. Experts who query LLMs about their fields of expertise will *quickly* realize how wrong their output can be, how quick they are to confabulate, and how eager they are to confirm one’s biases. Sometimes, replying “No, that’s wrong, try again” can cause an LLM to generate a completely different—and often opposite—answer to the same query, which makes no sense if the LLM had *actually* worked out an independently coherent answer.
Asking an LLM to comment about a subject you know nothing about—or worse, know a little bit about—is a psychologically dangerous activity. Not only will it confirm your biases, it will do so in a way that *appears* to be objective and independent, using fallacies that lie just beyond your ability to discern. At best, you will be misled. At worst, you will begin spiraling down a path of conspiracy thinking.
Be extremely suspicious of answers that are especially satisfying; you might have just gaslit yourself.
my roommates have been adding more doodles to the whiteboard i use as my grocery list.
soon, there will be no more groceries. only pokemon