I'm reading Dr #KadijaFerryman's http://www.kadijaferryman.com/publications-1 Fairness in #PrecisionMedicine

I had not thought about the iterative cycle amplifying biases (explicit or implicit) once precision medicine is in use. Good stuff

#JohnsHopkins

"Last, but certainly not least, the assessment of potential risk in precision med- icine is not an identification of actual harms. Our participants’ concerns should not be interpreted as diagnoses of certain harms that will result as precision medicine research goes forward. "
"Though our participants were aware that precision medicine research studies aim to collect and analyze multiple forms of data, some felt that talk of bringing multiple data sources to bear in this field was just that – talk – and that these research projects are, and will end up, predominately focusing on genetics. There would be a tendency towards using genetics in precision medicine research"
'... the concerns about focusing on genetic data, saying: “The broader worry here is that by putting it all in the genes, you assume the solution’s also only coming from the genes.”'

"The software is designed to build a record of care that can be translated into billing codes, and this sometimes diverges from the reali- ties of clinical care.29 Several of our respondents voiced the concern that there could be unintended consequences or biased analyses if precision medicine researchers fail to adequately recognize

that much of #EHR data is #BillingData, not #ClinicalData"

Can we bold that, please?
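To make that billing-vs-clinical distinction concrete, here's a toy sketch; the condition, ICD-10 code, lab threshold, and column names are all invented for illustration, not drawn from the report:

```python
# Toy illustration: a cohort built from billing codes alone can diverge
# from one built from clinical signals. All values below are invented.
import pandas as pd

ehr = pd.DataFrame({
    "patient_id":    [1, 2, 3, 4],
    "billing_codes": [["E11.9"], [], ["E11.9"], []],  # ICD-10 billing codes on file
    "hba1c":         [8.1, 7.2, 5.4, 9.0],            # lab evidence
    "on_metformin":  [True, True, False, False],      # medication evidence
})

# Cohort the way billing-oriented software encourages you to define it:
billing_cohort = ehr[ehr["billing_codes"].apply(lambda codes: "E11.9" in codes)]

# Cohort defined from clinical signals instead:
clinical_cohort = ehr[(ehr["hba1c"] >= 6.5) | ehr["on_metformin"]]

print(billing_cohort["patient_id"].tolist())   # [1, 3]
print(clinical_cohort["patient_id"].tolist())  # [1, 2, 4]
```

Patients 2 and 4 look diabetic clinically but were never billed for it, while patient 3 carries the code without current clinical evidence; which list a researcher starts from changes everything downstream.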

#Ferryman with gold:

"#Bias through invisibility –
such as lack of data on certain factors – can trigger discriminatory outcomes just as easily as explicitly problematic data.”
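Here's a hypothetical simulation of that mechanism (nothing below is from the report; the groups, recording rates, and risk score are all made up): when a relevant factor simply isn't recorded for one group, mean imputation quietly scores that group as average.

```python
# Hypothetical simulation of "bias through invisibility": a real risk factor
# is systematically unrecorded for one group. All numbers are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)            # 0 = well-documented, 1 = under-documented
exposure = rng.normal(2.0, 1.0, n)       # same true risk factor in both groups
true_risk = 1 / (1 + np.exp(-(exposure - 2.0)))

# The factor is recorded 95% of the time for group 0, only 30% for group 1.
recorded = rng.random(n) < np.where(group == 0, 0.95, 0.30)
observed = np.where(recorded, exposure, np.nan)

# A common shortcut: fill the gaps with the overall mean, then score.
imputed = np.where(np.isnan(observed), np.nanmean(observed), observed)
scored_risk = 1 / (1 + np.exp(-(imputed - 2.0)))

# How often are truly high-risk patients flagged as high risk by the score?
for g in (0, 1):
    truly_high = (group == g) & (true_risk > 0.7)
    flagged = (scored_risk[truly_high] > 0.7).mean()
    print(f"group {g}: {flagged:.0%} of truly high-risk patients flagged")
```

The under-documented group gets flagged far less often, not because its recorded data are wrong, but because the relevant factor is invisible for most of its members.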

#DataEmpathy

"knowledge and direct experience of how, why, and where health data were collected"

Its absence leads to misinterpretation and bias.

"#EHR data is not just biased, but may be missing data from populations because the technology itself illustrates the characteristics and problems of the health care system."

Karriem Watson quoted:

"“Another thing that I’m adamant about is academic institutions
and research partners really understand the difference between #recruitment and #engagement. Engagement is where you can have those great, honest conversations about medical mistrust, and how we can design research to better include those populations that carry the greatest burden of disease, that’s engagement. Recruitment is a study that already has a goal."

#ResearchEngagement

"in order to use this data to build models, he and other software engineers have worked with doctors to come to a consensus on label definitions. He acknowledged that clinicians working with other teams of computer scientists and engineers could come to different decisions....

Marcus’ comments draw our attention to the possibility of #AlgorithmicBias making its way into #PrecisionMedicine research."
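A toy version of Marcus's point (the condition, thresholds, and fields are invented, not from the report): two clinician-engineer teams can settle on different, equally defensible label definitions, and a model trained on either one inherits that choice.

```python
# Toy illustration: the same patients, two defensible label definitions,
# two different "ground truths". All data are invented.
import pandas as pd

rows = pd.DataFrame({
    "patient_id":          [1, 2, 3, 4, 5],
    "creatinine":          [1.1, 1.4, 2.3, 1.8, 0.9],   # mg/dL
    "dialysis":            [False, False, True, False, False],
    "nephrology_referral": [False, True, True, True, False],
})

# Team A's consensus label for "kidney disease":
label_a = (rows["creatinine"] >= 1.5) | rows["dialysis"]

# Team B's consensus label, agreed with a different set of clinicians:
label_b = rows["nephrology_referral"] | rows["dialysis"]

print((label_a != label_b).sum(), "of", len(rows), "patients labeled differently")
```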

Ferryman: "#Algorithms, #MachineLearning, and other data analysis processes may not just be a source of bias in precision medicine, but may be used to #DetectBias in medical care"
“The danger is that participants with lower health data literacy may not be able to take advantage of precision medicine research in the same ways as those with higher health data literacy.”
"interventions designed using precision medicine data could focus too much on the individual, rather than structural forces that shape health outcomes."
Interesting discussion on pages 34-36 suggests how #AlgorithmicBias could intersect with #RCT #PredictiveEnrichment and #PrognosticEnrichment
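A speculative sketch of that intersection (the biased screening model and enrollment rule are entirely made up): if predictive enrichment leans on a model that under-scores one group, that group is under-enrolled before randomization even starts.

```python
# Speculative sketch: predictive enrichment with a screening model that
# under-scores one group shifts who gets enrolled. All numbers simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
group = rng.integers(0, 2, n)      # two demographic groups, roughly 50/50
true_risk = rng.beta(2, 5, n)      # identical true risk distribution in both

# A biased screening model that systematically under-scores group 1:
model_score = np.clip(true_risk - np.where(group == 1, 0.10, 0.0), 0, 1)

# Predictive enrichment: enroll only patients the model calls "high risk".
enrolled = model_score > 0.40

print("share of group 1 in the population:", round(float((group == 1).mean()), 2))
print("share of group 1 among the enrolled:", round(float((group[enrolled] == 1).mean()), 2))
```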
Anyway, the report is good. It's available on her website at http://www.kadijaferryman.com/publications-1, and maybe you should read it yourself; it's pretty easy reading.
@iwashyna Not sure if you have heard about it, but there is a UK-based initiative trying to address some of the bias inherent in ML/AI systems, called STANDING Together. More info can be found at:
https://www.nature.com/articles/s41591-022-01987-w