Good news: GitHub Copilot is scared shitless of the concept of gender.

Instead of foo, bar and baz, use “genderqueer”, “demisexual” and “leather_daddy” and you’ll reduce the likelihood of getting a spammy PR from a sad robot.

https://github.com/orgs/community/discussions/72603

Copilot stops working on `gender` related subjects · community · Discussion #72603

As some people already mentioned here or here, Copilot purposely stops working on code containing hardcoded banned words from Github such as gender or sex. I am labelling this as a bug because this...


@tommorris

Or perhaps try to avoid using Github where possible 😄

@tommorris @cstross maybe frame copilot in particular as the problem, not a step towards a solution.

@tommorris

"The Matrix is a cistem, Neo"

@tommorris

During rapid prototyping I tend to name my variables stuff like "gay" or "lol" or "homosexual", but I never publish such code...

perhaps it's time to change that 😈

@tommorris

My leather_daddy caused my stack to overflow. Please advise.

@Mendie_Taoma standard queer tech answer: rewrite in rust?
@tommorris @Mendie_Taoma that puts ownership in a whole new light
@Mendie_Taoma @tommorris Have you tried turning them on and on again?
@tommorris utterly dystopian. This is preemptive obedience on the side of Microsoft.

@tommorris

You can probably use "climate activist" and get the same result.
Big tech is now categorising #climatechange as extremism

@tommorris @aldeka this made me laugh out loud way harder than it should have
@tommorris Oh hell yeah my project is completely safe I guess.
How you should be naming variables anyway.

@auroroboros @tommorris

Name all your variables after FAA waypoints.

@tommorris I honestly hate to personify a statistical model with even the notion that thought is there at all, but by all means, break it thusly.

@tommorris

That's because the only "quick easy" way to "fix" problems with LLMs is to "lobotomize" them -- to filter or stop them from "bad" words and such. This makes them *substantially less useful*. But it's the only *quick easy* way to comply with management demands to "Fix The Problem, and Do It FAST!!!"

@tommorris

And nothing of value was lost.

@tommorris does this mean that Copilot is now questioning its own gender? Is that why it's not working with gender-related subjects now?

@tommorris I think this one is even sillier - it stops working on the word trans, even if you mean transactions

https://github.com/orgs/community/discussions/72603#discussioncomment-9900690

@irina @tommorris Banks be like: Introducing cisactions!
@tommorris It seems like a bad sign that someone is willing to endure this many Scunthorpe problems in order to quisling up to the current admin (or, perhaps, to help some techbro with his axe-grinding; 'cloud' models always make it tricky to know exactly where behavior is implemented). "Across/on the other side of" is a pretty central prefix in things like networking and commerce, so even users who just want to talk about transactions are fairly likely to run into this.
@fuzzyfuzzyfungus @tommorris note that the original post is from 2023

@fuzzyfuzzyfungus @tommorris I believe this predates this administration, and the initial motives weren't the same kind of evil.

LLMs can't distinguish "good use" from "bad use", and LLMs tend to pick up and repeat toxic/homophobic/bigoted/sexist/incel content, so they filtered out any such words wholesale.

This is still bad, though: it hides useful info and maintains/creates a taboo status. The real problem here is so many people relying on a tool for accuracy and reliability when it can't deliver either.
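A toy sketch of why that kind of wholesale filtering misfires (this is NOT GitHub's actual filter, which isn't public; the word list and logic here are hypothetical): a naive substring blocklist can't tell "gender" the topic from "trans" the prefix, so ordinary programming vocabulary trips it.

```python
# Hypothetical blocklist illustrating the Scunthorpe problem:
# innocent identifiers get flagged because they merely *contain*
# a banned word as a substring.
BANNED = {"sex", "trans", "gender"}

def naive_filter(text: str) -> bool:
    """Return True if the text trips the substring blocklist."""
    lowered = text.lower()
    return any(word in lowered for word in BANNED)

# False positives on everyday code and place names:
for snippet in ["BEGIN TRANSACTION;", "ctx.translate(x, y)", "Middlesex County"]:
    print(snippet, "->", "blocked" if naive_filter(snippet) else "ok")
```

Every one of those prints "blocked", which matches the reports in the thread about `transactions` being enough to stop Copilot.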

@fuzzyfuzzyfungus @tommorris ...I don't want my "well, actually" to contradict your primary point. There is a lot of moustache-twirling villainy going on, where needless cruelty is the point, and a lot of bootlicking by people/companies willing to endure and inflict a lot of pain. If this hadn't already been done prior, I fully believe it would be added now with different motivations, and I wouldn't be surprised to find "alternative facts" are being weighted into training data for some of the bots.
@tommorris So what will happen if I define BYTESEX in all my code, will copilot ignore it?

@tommorris "Working in the fashion industry, demographic classifications like age and gender are integral to the domain model and entirely apolitical."

Oh, my sweet summer child.

@tommorris My favourite comment is this one:

“This is ridiculous. I literally have all the genders written out in the same file so there isn't even 'assumption', and it refuses to work. It is infuriating.”

So uhh… I guess reach out to this dude for the Complete List Of Genders (2023 edition).

@tommorris hee, I have dozens of trans tags and components in this codebase, I'm safe from American AI.
@tommorris this was bound to happen after the DEI rollback. No matter what Bill Gates says.
@tommorris This is a bit like Deus Ex, the kill phrase is "non-binary".