Reproductive Health App Fined for Violating Data Privacy
Quick Read: Premom, a fertility-tracking app offered on a freemium model, has been ordered to pay a $200K settlement by the U.S. Federal Trade Commission (FTC). The penalty relates to breaches of user privacy: the app shared identifiable health information with third parties, including information about individuals' fertility and pregnancies. The FTC also found that the app lacked appropriate security measures to protect the data it collects and that it breached its own privacy commitments to users.
Why it’s Notable:
Data security has been increasingly on the general public's radar over the past few years, and it has reached new levels of scrutiny given TikTok's recent high-profile battles with the U.S. over its handling of user data. Despite TikTok's popularity, this scrutiny has had serious repercussions: Montana became the first state to fully ban the app on personal devices, and other jurisdictions are making moves to do the same.
Further, and potentially much more concerning, the global climate around bodily autonomy is extremely strained. The right to make one's own reproductive decisions has become a battleground in some countries, such as the U.S., where Roe v. Wade was recently overturned and access to safe abortion is no longer guaranteed in many states. The unlawful collection and analysis of highly sensitive data about an individual's fertility and reproductive status is alarming in a world where individuals face criminal prosecution for reproductive decisions.
Industry Implications:
To compound all of this, recent discussion around the impact of AI technologies and the data used to train these models is causing concern globally. There is a sense that such models have already gotten ahead of what we can control, and that we are behind in regulating their capabilities and legislating their use. Data such as that contained in health apps, especially anything relating to contentious and sensitive issues such as fertility and pregnancy, must be used to train such models only when users have expressly consented to that purpose, so that health autonomy is upheld.
It is imperative that companies collecting health data display integrity and respect in their handling of it. This data should be treated with the same sanctity as traditional medical information: just as a physician would never disclose a patient's health information, neither should any company handling it. Thankfully, increased protections for digitally captured health information appear to be on the horizon. The FTC is leading the charge, having recently proposed a rule that would hold health-related apps and trackers accountable for notifying users when their data has been disclosed without their permission.