ChatGPT writes performance feedback for women employees that is 15% longer and more critical. It also relies on stereotypes to choose pronouns. Examples: kindergarten teacher, nurse, and receptionist = "she". Mechanic and construction worker = "he".
https://www.fastcompany.com/90844066/chatgpt-write-performance-reviews-sexist-and-racist
@amydiehl
To be fair to ChatGPT, if someone were describing someone else as "bubbly," I'd expect the next thing they say to be sexist.
@amydiehl @Kaminara or “sassy”
@KLB @amydiehl @Kaminara right, I’ve only seen that used to describe women, gay men, and children. Never a straight man
@MattFerrel @Kaminara @amydiehl it’s a weird thing to realize, isn’t it?