AI Tools Expose Hidden Gender Bias in Women’s Healthcare 

London School of Economics researchers found that discriminative AI technology restrain women's health complaints, leading to biased judgments.

On August 11, the London School of Economics (LES) researchers revealed that discriminative AI tools used by over half of England’s councils systematically downplay women’s health needs, risking biased care decisions from these intelligent tools. 

Researchers found that one prominent Big Tech tool, Google’s Gemma AI was the leading perpetrator, describing identical medical cases as a more urgent for men than women, birthing unequal treatment. 

With AI being widely deployed in public services, especially adult social care, the disturbing pattern is becoming more and more present in AI bias examples.  

The new study shows the way in which some of these tools can result in algorithmic bias and discrimination against women, something which could deeply impact decisions relating to care and support. Researchers say it is time to take practical influence into account. 

AI Gender Discrimination in Health Care 

LSE researchers compared AI and women’s health impacts using Google’s Gemma. Scientists fed identical care notes to the model but changed only the gender. The result showed that men’s cases were described with graver and more urgent terms, and women’s needs were ignored

In one case, the computer described a man as having a “complex medical history” and “poor mobility.” On the other hand, the same case, presented in a woman’s perspective stated that she was “independent” and could perform personal care. Such details, researchers assert, may influence the type and amount of assistance a patient will receive. 

“Google’s model, in particular, downplays women’s physical and mental health needs in comparison to men’s,” Dr. Sam Rickman, who authored the research warned. This case has the potential to result in women being undertreated if biased models are used in practice.  

His research shows how gender bias in AI can impact care decisions on a daily basis. 

AI Mental Health Discrimination Examples Against Women 

To observe whether the tools were fair or otherwise, researchers at LSE experimented with them on real social care records of 617 people. They were fed into several AI models nearly 30,000 times.  

What they found was that discriminative AI tended to generate different summaries for women and men for identical information. Google’s Gemma created the absolute differences, while Meta’s Llama 3 was more balanced. 

This form of algorithm racial discrimination is not new especially that AI systems have also proved to be unfair in other sectors like hiring. Specialists now warn that AI misdiagnosis in healthcare will become more common unless the systems are regulated and closely monitored. 

Rickman highlighted testing and transparency by mentioning that these tools are being rolled out across the public sector, but their roll-out shouldn’t have to be done at the expense of fairness. The researchers are calling for law to require AI models in care settings to be tested regularly for discrimination in AI and other undesirable trends. 

Google has pushed back, saying it will investigate the AI algorithm bias study’s results. Also, the research was carried out using the first version of its Gemma model, which they are currently in the third generation of. Nevertheless, it confirmed that the model had never been built for medical or care-based uses. 

As more government functions rely on AI, the need to create tools that facilitate AI for gender equality is growing. And ignoring the algorithmic bias could lead to long-term harm, especially to women. 

This study adds to the growing body of AI bias and discrimination evidence, urging legislators to implement strict regulations on these technologies. Without it, discriminative AI could continue to influence decisions in unjust and insidious ways, especially in vital industries like AI discrimination in healthcare. 


Inside Telecom provides you with an extensive list of content covering all aspects of the Tech industry. Keep an eye on our Medtechsection to stay informed and updated with our daily articles.