Gender Bias in Digital Medicine
Algorithms aren’t neutral. So when it comes to digitalisation in healthcare, some scientists understandably have a number of concerns. This was the topic of the conference “Gesunde Algorithmen? Frauen und künstliche Intelligenz im Gesundheitswesen” (“Healthy Algorithms? Women and Artificial Intelligence in the Healthcare Sector”), which took place last year in Berlin and is discussed at length in this follow-up article.
The problem can be summed up as follows: Algorithms and digital health solutions are trained on large data sets from the real world. But, since we live in an unjust world, data from the real world is often biased – along gender lines, racial lines, and in countless other ways. These biases in data sets are then reproduced in the algorithms themselves.
In the case of medical data, this could mean that the data aren’t differentiated for men and women or that women are significantly underrepresented in the data. (The basis for one algorithm used for an acute kidney injury alert came from the data of US soldiers – only 6% of whom were women.)
On top of this underlying problem, information about which data an algorithm is based on is often not available, so medical professionals and consumers have no way of knowing how well it might apply to women.
And the effects? They range from knowledge gaps to less accurate diagnoses for women to women being taken less seriously than men by AI. In one extreme example cited by Prof. Dr. med. Sabine Oertelt-Prigione in the article, medical chatbots reacted differently to men and women who reported the same symptoms: the men were sent to the hospital due to the risk of unstable angina, while the women were referred to their family doctor due to the possibility of a panic attack or depression.
So how do we begin to tackle this mammoth problem? Transparency, independent evaluation, and more research, according to the experts cited in the article. More regulation wouldn’t hurt either: as Prof. Dr. med. Sylvia Thun suggests, the Bundesinstitut für Arzneimittel und Medizinprodukte, the German federal institute responsible for regulating medical products, needs to sharpen its gender-related criteria for approving digital medical solutions.