How women’s health apps are misusing sensitive medical and fertility data

New research has found that health apps designed for women are exposing users to unnecessary privacy and safety risks.

Teams from University College London and King’s College London presented the most extensive evaluation to date of the privacy practices of female health apps at the ACM Conference on Human Factors in Computing Systems (CHI) 2024 on 14 May.

The apps in question hold sensitive medical and fertility data, such as menstrual cycle information, for millions of women, and this information could put users at risk.

Analysing the most popular female health apps in the UK and USA

The team analysed the privacy policies and data safety labels of 20 of the most popular female health apps available in the UK and USA Google Play stores, which are used by millions of people. The analysis revealed that in many instances, user data could be subject to access by law enforcement or security authorities.

Of the apps reviewed, only one explicitly addressed the sensitivity of menstrual data with respect to law enforcement in its privacy policy and made efforts to safeguard users against legal threats.

However, many of the pregnancy-tracking apps required users to indicate whether they had previously miscarried or had an abortion, and some apps lacked data deletion functions or made it difficult to remove data once entered.

The experts warn that these poor data management practices could lead to serious physical safety risks for users in countries where abortion is a criminal offence.

Dr Ruba Abu-Salma, lead investigator of the study from King’s College London, said: “Female health apps collect sensitive data about users’ menstrual cycle, sex lives, and pregnancy status, as well as personally identifiable information such as names and email addresses.

“Requiring users to disclose sensitive or potentially criminalising information as a pre-condition to deleting data is an extremely poor privacy practice with dire safety implications. It removes any form of meaningful consent offered to users.

“The consequences of leaking sensitive data like this could result in workplace monitoring and discrimination, health insurance discrimination, intimate partner violence, and criminal blackmail; all of which are risks that intersect with gendered forms of oppression, particularly in countries like the USA where abortion is illegal in 14 states.”

Privacy policy wording is flawed and ambiguous

The researchers uncovered stark discrepancies between privacy policy wording and in-app features, along with flawed user consent mechanisms and extensive gathering of sensitive data accompanied by rife third-party sharing.

Key findings included:

35% of the apps claimed in their data safety sections not to share personal data with third parties, but their privacy policies contradicted this by describing some level of third-party sharing.

50% provided explicit assurances that users’ health data would not be shared with advertisers, but were ambiguous about whether this also covered data collected through use of the app.

45% of privacy policies disclaimed responsibility for the practices of third parties, despite also claiming to vet them.

Many of the health apps studied linked users’ sexual and reproductive data to their Google searches or website visits, which researchers warn could risk de-anonymising users and could lead to assumptions about their fertility status.

Lisa Malki, first author of the paper and former research assistant at King’s College London, who is now a PhD student at UCL Computer Science, said: “There is a tendency by app developers to treat period and fertility data as ‘another piece of data’ as opposed to uniquely sensitive data which has the potential to stigmatise or criminalise users.

“Increasingly risky political climates warrant a greater degree of stewardship over the safety of users and innovation around how we might overcome the dominant model of ‘notice and consent’ which currently places a disproportionate privacy burden on users.

“It is vital that developers start to acknowledge unique privacy and safety risks to users and adopt practices which promote a humanistic and safety-conscious approach to developing health technologies.”
