Another very alarming problem with unchecked (or, arguably worse, unaccountable) data-driven systems.

Image generation via WhatsApp has linked prompts like "Palestinian" and "Palestinian boy" with images of guns and gun-wielding children.

These models and algorithms are feeding hate.

https://www.theguardian.com/technology/2023/nov/02/whatsapps-ai-palestine-kids-gun-gaza-bias-israel

"WhatsApp's AI shows gun-wielding children when prompted with 'Palestine'

By contrast, prompts for 'Israeli' do not generate images of people wielding guns, even in response to a prompt for 'Israel army'"

— The Guardian