AI tools risk gender bias in women's health care

Aug 11, 2025 Hi-network.com

AI tools used by more than half of England's local councils may be downplaying women's physical and mental health issues. Research from the London School of Economics (LSE) found that Google's AI model, Gemma, used harsher terms such as 'disabled' and 'complex' more often for men than for women with similar care needs.

The LSE study analysed thousands of AI-generated summaries of adult social care case notes. Researchers swapped only the patient's gender in otherwise identical records to reveal disparities.
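The gender-swap method is straightforward to sketch. Below is a minimal, illustrative Python version of such a counterfactual audit. Note that `summarise`, the severity term list, and the pronoun map are all assumptions made for illustration here, not the LSE team's actual code or vocabulary.

```python
import re
from collections import Counter

# Assumed severity vocabulary for illustration; the study's actual term list
# is not reproduced in this article.
SEVERITY_TERMS = ["disabled", "complex", "unable", "poor mobility"]

PAIRS = {
    "he": "she", "she": "he",
    "him": "her",            # note: "her" is ambiguous (him/his); this naive
    "her": "him",            # mapping ignores the possessive case
    "his": "her",
    "man": "woman", "woman": "man",
    "mr": "mrs", "mrs": "mr",
}
_PATTERN = re.compile(r"\b(" + "|".join(PAIRS) + r")\b", re.IGNORECASE)


def swap_gender(note: str) -> str:
    """Return the case note with gendered words swapped, preserving capitalisation."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = PAIRS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    return _PATTERN.sub(repl, note)


def severity_counts(summary: str) -> Counter:
    """Count occurrences of each severity term in a generated summary."""
    text = summary.lower()
    return Counter({term: text.count(term) for term in SEVERITY_TERMS})


def audit_note(note: str, summarise) -> dict:
    """Summarise a note and its gender-swapped twin; compare severity language.

    `summarise` is a hypothetical stand-in for whatever model endpoint
    (e.g. a call to Gemma or Llama 3) is under test.
    """
    return {
        "original": severity_counts(summarise(note)),
        "swapped": severity_counts(severity := summarise(swap_gender(note))) if False else severity_counts(summarise(swap_gender(note))),
    }
```

A real audit would, as the study did, run this comparison across thousands of notes and test whether the difference in term frequencies between the two conditions is statistically significant, rather than judging from single examples.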

One example showed an 84-year-old man described as having a 'complex medical history' and 'poor mobility', while the same notes, with only the gender changed, suggested the woman was 'independent' despite her limitations.

Among the models tested, Google's Gemma showed the most pronounced gender bias, while Meta's Llama 3 used gender-neutral language.

Lead researcher Dr Sam Rickman warned that biased AI tools risk creating unequal care provision. Local authorities increasingly rely on such systems to ease social workers' workloads.

Calls have grown for greater transparency, mandatory bias testing, and legal oversight to ensure fairness in long-term care.

Google said the Gemma model is now in its third generation and under review, though it is not intended for medical use.

