Issue 3: Data privacy, transparency and trust in FemTech
A long read on the unique data privacy challenges for FemTech, fear of ‘menstrual surveillance’, Clue’s new crowdfunding drive, and what might be next for regulatory compliance in the sector.
Welcome to FutureFemHealth, the newsletter that brings you a weekly dose of news and inspiration about innovation in women’s health and FemTech.
I’m Anna, and whether you’re a founder, an investor, someone who works in women’s health / FemTech, or simply a fan, I’m glad you’re here to join me.
This week we’re talking about data privacy in FemTech.
Data privacy is all about trust.
And trust in FemTech apps is fragile: several investigations over the years have shown that many are sharing sensitive data with third parties, failing to communicate their data practices clearly to users, and leaving themselves open to security breaches.
Coupled with this, there is growing consumer awareness of the importance of data privacy - especially since Roe v Wade was overturned - which has led to a real fear of using these apps and to calls for women to delete them.
As a result, data privacy is the biggest challenge to trust and reputation facing FemTech. And now there are signs that it’s becoming a higher priority for regulators too.
In early April 2023, John Edwards, the UK’s Information Commissioner, said that the UK’s Information Commissioner’s Office (ICO) would be “going after providers of women’s health apps and auditing them, and getting them to change any practices that are non-compliant.”
In this issue we’ll get into all of that and how we got to where we are now.
At the end I’ve dropped some of the resources that I’ve found useful to learn about this topic too.
Thanks for reading!
Anna
Why is data privacy so uniquely important in FemTech?
FemTech - or health products and services that particularly serve women - includes period trackers, fertility trackers, pregnancy apps, fitness apps and much more.
It’s estimated that a third of women in the US now use at least one of these apps.
In order for an app to be effective and useful to the user, it relies on collecting a lot of data about that person.
And that data is extremely personal and intimate.
The data ranges from the typical name, date of birth and address, through to details about someone’s menstrual cycle, how often they’ve had sex, how they feel, and their pregnancy status.
To give a particular example, quite feasibly your app will know you’re pregnant before you even do and certainly before you’re able to tell a partner, parent or workplace. Your app will know about the heavy bleeding you’ve experienced that you may not want to discuss with anyone else. Your app will know that you had sex with someone new last week. Your app will know that you forgot to take your pill.
So even for someone familiar with entering information in an app, it takes a level of courage to give your most personal and intimate details over to an app.
You need to trust that it is being held and stored safely and not shared with others unless there is a valid reason to do so (and you have agreed to it being shared).
So, why is trust so low in FemTech apps?
Roe v Wade 2022
The issue of data privacy in FemTech apps came to a head last year after the constitutional protection for abortion was overturned in the US.
There were calls for women to delete their period tracking apps because of the fear that information logged in these apps could be used to prove women had committed the ‘crime’ of abortion. For example, if data showed dates that a woman was pregnant followed by a date that she wasn’t.
This data could be viewed if a phone was seized by authorities and the app was accessed, if the app companies themselves were subpoenaed for the data, or even potentially if data was sold.
While some of these risks are somewhat out of the control of FemTech apps themselves, they do highlight the need for FemTech apps to ensure that only essential data is collected and steps are taken to anonymise data where possible.
Poor data privacy practices
But long before Roe v Wade was overturned, there were already concerns about data privacy and FemTech - this time related to poor data privacy practices.
The main concerns have been around collecting more data than is necessary, sharing data with third parties (and even potentially selling it to third parties), poor data security measures, and a lack of transparency about data privacy (i.e. not being open about how data is collected, used and shared).
To be clear, in some cases it may be fine and necessary to share data with third parties, but it needs to be for good reason (and anonymised where possible), and users need to know and be given the option to opt-out easily and control their data.
This 2022 investigation by ORCHA Health is a good example of how the lack of control and transparency for users is often the heart of the issue - rather than just the data sharing itself.
But that wasn’t the first time this issue had been raised.
Back in 2019, Privacy International conducted an analysis of 36 menstruation apps and they found that 61% were automatically transferring data to Facebook when a user opened the app. They also found that some of those apps were routinely sending Facebook incredibly detailed and sometimes sensitive personal data. This was happening even if people were logged out of Facebook or didn’t have an account.
It’s also not just Facebook and Google that data is being shared with. In one shocking example a pregnancy tracking app called Ovia shared data from its users with their employers.
Poor data practices and weak security also leave FemTech apps open to hackers too. Health data is highly valuable. While not specifically FemTech, in Finland, thousands of mental health records were hacked and stolen, published online and offered for a ransom payment. Victims were understandably distressed at such sensitive and personal data being shared in public.
A failure to address issues
What’s most shocking is that poor practices are continuing - even when they have been highlighted time and again.
A 2022 study published in JMIR found that “many of the most popular women’s [mobile health] apps on the market have poor data privacy, sharing, and security standards. Although regulations exist, such as the European Union General Data Protection Regulation (GDPR), current practices do not follow them.”
Even as recently as last month, a study in Australia found that an unnecessarily high amount of data was being collected, leaving it open to being sold to third parties.
The same apps are also mentioned in multiple studies. Flo Health is one of the most popular period tracker apps, yet there have been repeated concerns about its data privacy practices. In January 2021 it settled with the Federal Trade Commission over sharing user data with marketing and analytics services at Facebook and Google - at least two years after it had been highlighted as one of the apps sharing data with Facebook.
So, why have mistakes in data privacy happened - and why aren’t they being addressed?
While the majority of FemTech apps have been developed from a place of wanting to help and serve, it’s clear that it could actually be quite easy to unintentionally fall short with data privacy.
That could be because:
Early-stage founders rush product development to get an MVP out and skip through the steps of thinking about what data is being collected. They don’t fully take into account the user and the level of control and privacy they need, nor how to be transparent with users about their rights.
There is a lack of understanding of data privacy and data-sharing agreements - for example, not realising that Facebook may collect data if you use its developer tools, or that a third-party marketing agency you work with may get access to all of your data.
Regulations aren’t fit for purpose right now - the regulations that do exist aren’t straightforward or complete enough for what’s needed. In the UK, data protection compliance falls under the General Data Protection Regulation (GDPR), which requires robust data security measures and user consent for data collection and processing. In the US things are less clear. The big gap is that the Health Insurance Portability and Accountability Act (HIPAA) protects your medical information only when it is held by a doctor, medical provider or similar. Data in an app isn’t treated as coming from an official place of healthcare - it’s treated almost like social media data, so it isn’t as protected.
Opportunistic hackers are preying on early stage companies where security might not be as strong as it could be.
There is often a lack of budget to involve a specialist to develop a bespoke privacy policy - instead a privacy policy is downloaded online, or copied from a competitor. So your privacy policy doesn’t truly fit your product and/or lacks the clarity your users need.
Intentional data sharing for profiteering - some FemTech apps share data as part of their monetisation. Clearly this one isn’t a mistake on the part of the apps, but often it is not made clear to users.
There has undoubtedly been a lack of consumer awareness, understanding and interest in data privacy too. A Deloitte study reported that less than 10% of people read the terms and conditions and user agreements for the websites and mobile applications that they use.
Even the Privacy International investigation noted that often users were agreeing to everything in the privacy policies of the apps - but would probably be shocked if they really understood how much intimate personal data was being processed and transferred to third parties.
What should FemTech be doing?
So it’s not just a case of making sure your data privacy policy is right, but you’ve also got to make sure that your users know exactly what they’re signing up to.
Cybersecurity consultant Natasha Singh has suggested that best practice falls into three pillars:
transparency - be clear with users about how data is collected, processed, used and shared.
control and privacy - ensure users can choose what data they share, they can easily access and edit it, or request that data is deleted.
security by design - prioritise security as you build your app, for example by ensuring data can be anonymised and by questioning which data fields are truly needed and which aren’t.
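To make the ‘security by design’ pillar a little more concrete, here’s a minimal sketch of what data minimisation and pseudonymisation might look like before an analytics event leaves a user’s device. The field names and the allowlist are purely hypothetical, not taken from any real app:

```python
import hashlib

# Illustrative allowlist: the only fields the analytics service genuinely needs
ALLOWED_FIELDS = {"app_version", "screen_viewed", "country"}

def pseudonymise_user_id(user_id: str, salt: str) -> str:
    """Replace the raw user id with a salted hash so the analytics
    provider cannot trivially link events back to a named account."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

def minimise_event(event: dict, salt: str) -> dict:
    """Strip everything except allowlisted fields and swap the user id
    for a pseudonym before the event is sent anywhere."""
    slim = {k: v for k, v in event.items() if k in ALLOWED_FIELDS}
    slim["user"] = pseudonymise_user_id(event["user_id"], salt)
    return slim

raw_event = {
    "user_id": "u-1234",
    "app_version": "3.2.1",
    "screen_viewed": "calendar",
    "country": "GB",
    "cycle_day": 14,          # sensitive - should never leave the device
    "symptoms": ["cramps"],   # sensitive - should never leave the device
}

safe_event = minimise_event(raw_event, salt="per-install-random-salt")
```

Note that a salted hash like this is pseudonymisation rather than true anonymisation - exactly the kind of distinction worth explaining to users in plain language.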
And together these three pillars build trust.
As a real-life example, I really like Clue’s simple and user-centric approach to explaining its terms and conditions. It also gets ahead of the inevitable questions about selling data with this blog post about how it actually makes money.
Clue has recently taken things a step further on the pillar of ‘transparency’ and announced it is to crowdfund, inviting its own users to invest in the company. Part of the reason for the decision, co-founder Carrie shared, is to foster a sense of transparency and trust with its users.
“[Because] there was a mounting fear around reproductive surveillance of women through their health data, we talked a lot about our position as a company, and the role we had to play,” Carrie told Forbes.
It’s a smart move - by creating closer relationships with its users, Clue helps them feel that the company is being open with them.
How can FemTech get ahead on this issue?
From a reputational perspective, here are a few ways to get ahead of this issue:
Go through some of the previous analyses of FemTech apps (i.e. Privacy International and others). How would your app have held up if it had been included? Are there things you would change? It’s a great opportunity to test-run your privacy policies.
Check every data point you collect - really question why you collect it and how you use it. Make sure there is a purpose for every data point you are collecting, and that you can articulate that purpose.
Team up with a strategic copywriter for your privacy policy copy (not just a lawyer) - especially if you work in period tracking, it’s really important your privacy policy is clear. Yes, it can feel like a big job, but it is critical. Test your privacy policy on users too - do they understand it? Does it make sense?
Consider how you can educate your users about reading your privacy policy. There’s a responsibility on FemTech to explain and educate its users on stuff like this. People are interested and are eager to know and understand more now. Explaining some of the terminology (’de-identified’ vs ‘anonymised’ for example), some of your data processing processes and even just why you collect the data that you do and how you use it are great first steps.
Build your community - find ways to build a community around your product more generally. How can you build greater relationships with your users and develop trust through transparency?
What’s next for data privacy in FemTech?
Since Roe v Wade, consumers are getting more educated about data and are already demanding more from the apps they use.
The growth of FemTech will undoubtedly lead to more and more attention on the sector, and we may therefore see closer scrutiny from regulators, as indicated by the ICO statement mentioned at the top of this article.
By getting ahead on this issue, founders have an opportunity to give themselves a competitive advantage by offering their users the control, transparency and security that they want and need.
We can build a better picture of women’s health
The flip side of large-scale data collection is the greater good that can be done with the data from FemTech apps too.
Because women’s health has been underserved and under-researched for so long, there is an incredible opportunity with all of this new data to pool it together to build a better picture of women’s health.
That is already happening - for example, Apple’s women’s health study and Natural Cycles’ work on the impact of Covid-19 on periods.
But it’ll only be as good as the data that is collected.
A final word then from Clue, which I think summarises the potential of using the data from FemTech:
“Women’s health has historically been massively under-funded and under-researched, resulting in inadequate understanding and care of female bodies. There are so many very basic questions that we don’t know the answers to today, and that can only be answered with the kind of longitudinal cycle dataset that we are now starting to build, for the first time in history.”
There’s incredible opportunity to do good with the data from FemTech. Let’s just hope we don’t trip ourselves up by breaking the trust that we need to get there.
__
Recommended resources:
Here are some of the podcasts and articles I’d recommend if you’d like to find out more about this topic:
FemTech Focus [PODCAST] - Data Privacy of FemTech Apps in a post Roe v Wade America (15 July 2022)
An excellent conversation between host Brittany Barreto and Privacy International’s Laura Lazaro Cabrera where they discuss the significance of the Roe v Wade ruling and also cover a brief history of data privacy in FemTech, including Privacy International’s own investigation from 2019.
Legally FemTech [PODCAST] - Femtech and data privacy in the US: Impact of the Leaked Supreme Court Abortion decision (11 May 2022)
A solo episode from Legally FemTech’s Bethany Corbin, this one takes you through how data can be shared with authorities and Bethany also provides some super useful practical advice for founders looking to improve their data privacy standards.
Wired [ARTICLE] - The most popular period-tracking apps, ranked by data privacy (20 July 2022)
A useful overview of some of the best and worst practices in data privacy for period-tracking apps.
ORCHA Health [REPORT] - 84% of period tracker apps share data with third parties (21 July 2022)
An investigation from 2022 about how period tracker apps share data, and why user control and transparency are so important.
Privacy International [REPORT] - Privacy International’s 2020 investigation into data privacy in FemTech apps (4 Dec 2020)
A long read on what Privacy International found when it asked five period tracker apps to release the data they held on their users.
Privacy International [REPORT] - Privacy International’s 2019 investigation into data privacy in FemTech apps
The original investigation into the data that period trackers share with Facebook.