There's no systemic racism, but weirdly, every time we train an AI on public data sets it becomes very racist.

@aidenbenton
and this even applies to AI algorithms trained to diagnose chest X-rays:

https://www.nature.com/articles/s41591-021-01595-0

tl;dr: the AI has a higher rate of underdiagnosis (missed diagnoses) in POC/marginalised populations
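For anyone curious what "higher rate of underdiagnosis" means concretely: the paper looks at how often the model predicts "no finding" for patients who actually have an abnormality, broken down by demographic group. A rough sketch of that metric (the false-negative rate per group), with completely made-up toy data:

```python
# Sketch of a per-group underdiagnosis metric: the false-negative rate,
# i.e. the fraction of truly-positive cases the model labelled negative.
# The records below are invented for illustration, not from the paper.

def underdiagnosis_rate(labels, preds):
    """Fraction of true positives (label 1) predicted as negative (0)."""
    missed = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    positives = sum(labels)
    return missed / positives if positives else 0.0

# toy records: (group, true_label, model_prediction)
records = [
    ("group_a", 1, 1), ("group_a", 1, 1), ("group_a", 1, 0), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 1, 0), ("group_b", 1, 1), ("group_b", 0, 0),
]

# bucket labels and predictions by group
by_group = {}
for g, y, p in records:
    ys, ps = by_group.setdefault(g, ([], []))
    ys.append(y)
    ps.append(p)

for g, (ys, ps) in sorted(by_group.items()):
    print(g, round(underdiagnosis_rate(ys, ps), 2))
# group_a → 0.33, group_b → 0.67: the model misses twice as many
# real findings in group_b, which is the kind of gap the paper measures
```

The paper's analysis is much more careful than this (multiple labels, intersectional subgroups, operating-point effects), but the core quantity is this per-group miss rate.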

Underdiagnosis bias of artificial intelligence algorithms applied to chest radiographs in under-served patient populations - Nature Medicine

Artificial intelligence algorithms trained using chest X-rays consistently underdiagnose pulmonary abnormalities or diseases in historically under-served patient populations, raising ethical concerns about the clinical use of such algorithms.
