Read the full story here: https://civio.es/sanidad/2025/07/03/mole-or-cancer-the-algorithm-that-gets-one-in-three-melanomas-wrong-and-erases-patients-with-dark-skin/

Would you agree that #ArtificialIntelligence should not discriminate? The case of #QuantusSkin shows that it can be deadly when systems fail to work for all people. To prevent this from happening, we need better education, more research, and clear rules for the use of algorithms.

Do you suspect an algorithm has treated you unfairly? Or are you uncertain? You can use our reporting form to share your experience: https://algorithmwatch.org/en/report-algorithmic-discrimination/

Mole or cancer? The algorithm that gets one in three melanomas wrong and erases patients with dark skin

The Basque Country is implementing Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence, developed by the Asisa subsidiary, for its “poor” and “dangerous” results. The algorithm was trained only on data from white patients.

Civio

Spanish NGO @civio's recent investigation reveals the case of #QuantusSkin, an AI system for the early detection of skin cancer. The Basque Health Service, Osakidetza, plans to roll it out in 2025, promising faster diagnoses.

But the reality is alarming: Quantus Skin misses nearly 1 in 3 melanoma cases and was trained almost entirely on images of white patients. For people with darker skin, that means even lower accuracy, delayed treatment, and a higher risk of deadly outcomes.