Femtech apps aim to empower users by helping them gain insights into their reproductive and sexual health. While the data you get from your period tracker, menopause transition kit or other femtech tools are valuable to you, using them can turn intimate — and potentially incriminating — data into a ready-made dossier for law enforcement, a bounty hunter, a jilted ex or an insurance company to use against you.
In a post-Roe world where reproductive freedoms have been curtailed and pregnancy-related prosecutions are on the rise, tracking your intimate data can be risky — even in states where abortion is legal.
Travel Bans and Pregnancy Criminalization Are Happening
Pregnancy-related criminal prosecutions are not new, but they have increased since the Supreme Court’s 2022 Dobbs v. Jackson decision overturned Roe v. Wade, according to non-profit Pregnancy Justice US in a press release for its report Pregnancy as a Crime.
The report identified at least 210 cases in which pregnant people faced criminal charges associated with pregnancy, abortion, pregnancy loss or birth in the first year after the Dobbs decision. (More than 2,000 individuals in the U.S. faced pregnancy-related prosecution and punishment while Roe was in force from 1973 to 2022 – an average of roughly 40 per year.) As the report puts it: “Pregnancy as a Crime shows that in the post-Dobbs environment, pregnant people are under increased surveillance and are getting arrested, prosecuted, and incarcerated for any actions that have a perceived risk of harm to the pregnancy.”
In Texas, Attorney General Ken Paxton and a jilted boyfriend are each suing for access to the out-of-state medical records of women suspected of traveling for abortions, as reported by the Associated Press and CNN, respectively. Idaho prosecutors charged a woman and her son with second-degree kidnapping for allegedly crossing state lines with his girlfriend for an abortion. Idaho also enacted a law in May restricting minors from traveling for abortions, though it has been blocked pending a constitutional challenge, according to the Associated Press.
Efforts to restrict access to the abortion pill have ramped up. Police are using drug-sniffing dogs to detect abortion pills in Jackson, Mississippi, according to The Intercept. Idaho, Missouri and Kansas are jointly suing the Food and Drug Administration to block the remote provision of abortion drugs and to outlaw them for teens, arguing that access harms their states by reducing teen pregnancy rates, according to the Los Angeles Times.
The joint suit could soon be moot if President-elect Trump heeds Project 2025’s recommendations (pages 457-459 and 496) to curtail what the document calls “mail-in abortions” while working to reverse the Food and Drug Administration’s approval of the abortion pill and restrict access to morning-after and week-after pills.
Trump distanced himself from the Project 2025 report during the campaign, so there is no guarantee that he will implement its policy proposals that center on its “pro-family promises” (page 5). However, Trump’s rumored cabinet nominees include people who contributed to Project 2025, such as Tom Homan of the Heritage Foundation, his pick for border czar.
If he does, the likelihood and extent of reproductive surveillance will also increase. For example, Project 2025 recommends that Health and Human Services compel states to submit detailed abortion data to the Centers for Disease Control and Prevention by withholding HHS funds (pages 455-456). It proposes renaming HHS the Department of Life and having it explicitly reject the notion that abortion is healthcare (page 489).
Femtech Users Value Privacy, But They Also Value Their Health Data
Femtech users take their privacy seriously. But they also value the ability to track and gain insights from their detailed reproductive and sexual health data — even in extreme conditions. To illustrate: a recent study by femtech unicorn Flo Health analyzed data from 87,315 active Flo users in Ukraine who consistently tracked their symptoms before and immediately after Russia’s February 2022 full-scale invasion. (Flo used the data for a study on pain perception under acute stress.)
A 2022 Wired analysis found that the mass post-Roe deletion of femtech apps was swiftly followed by a wave of new downloads. Users weren’t abandoning femtech; they were switching to apps that promised greater privacy and security. Yet the analysis showed that many apps fell short of their privacy promises. That same year, Mozilla slapped its “Privacy Not Included” warning label on 18 of the 25 period and pregnancy tracking apps it reviewed. Only a handful were “privacy-first.”
Femtech apps are marketed as health and wellness apps, but the data they collect is not subject to the privacy protections that apply to physicians or researchers.
“Health and commercial data often fall under different regulatory requirements. When you sell commercial apps for health, the user may not understand that their health data has less safeguards,” privacy engineer and Binary Tattoo founder Cat Coode explained to me. “And app and product developers aren’t forced by regulatory requirements to put those safeguards in.”
This gives femtech providers more flexibility. They can optimize for monetization, for privacy and medical reliability, or for something in between. Femtech users are left to decipher which providers to trust. Self-regulation is not always effective for protecting privacy, as I’ve written previously.
Zach Edwards, data supply auditor and founder of boutique analytics firm Victory Medium, cautioned in an email to me that, “If data is stored, it’s at risk of being accessed via government subpoenas. Organizations who provide period tracking apps, yet also store location data or data about their users on centralized servers, are putting all those users at risk.”
Even aggregated and de-identified user data used for research or prediction improvement can provide a “snapshot” for law enforcement to zero in on what to request in a subpoena, warned Edwards. It’s important to minimize what’s collected and how long it’s retained. Without detailed assurances regarding how providers manage re-identification risk, users should assume they can be identified in the data.
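Edwards’ warning about de-identified data is easy to demonstrate. The sketch below (entirely hypothetical data and field names, not drawn from any real app) shows how a handful of quasi-identifiers left in a “de-identified” export can be joined against an outside dataset, such as a voter roll or marketing list, to put names back on intimate records:

```python
# Illustrative sketch with hypothetical data: "de-identified" records can
# often be re-identified by joining quasi-identifiers (ZIP code, birth year,
# sex) against a dataset that still contains names.

# A "de-identified" export: names removed, quasi-identifiers retained.
deidentified = [
    {"zip": "83701", "birth_year": 1994, "sex": "F", "cycle_notes": "missed period"},
    {"zip": "83701", "birth_year": 1968, "sex": "F", "cycle_notes": "menopause symptoms"},
]

# An outside dataset a buyer or investigator might hold (voter rolls,
# marketing lists, breach dumps).
outside = [
    {"name": "Jane Doe", "zip": "83701", "birth_year": 1994, "sex": "F"},
    {"name": "Ann Roe", "zip": "83701", "birth_year": 1968, "sex": "F"},
]

def reidentify(deid_rows, known_rows, keys=("zip", "birth_year", "sex")):
    """Link rows whose quasi-identifiers match a unique named record."""
    matches = []
    for d in deid_rows:
        candidates = [k for k in known_rows if all(k[q] == d[q] for q in keys)]
        if len(candidates) == 1:  # a unique match is a re-identification
            matches.append((candidates[0]["name"], d["cycle_notes"]))
    return matches

for name, notes in reidentify(deidentified, outside):
    print(name, "->", notes)
```

The fewer fields collected and retained, the fewer quasi-identifiers survive to make a join like this unique, which is why minimization matters as much as removing names.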
It’s Difficult For Consumers To Know Which App Providers’ Privacy Claims Can Be Trusted
Consumer app users are not in a position to verify whether providers’ privacy claims are accurate, even if they take the time to read the fine print in privacy notices. That means they may only find out once investigative journalists, researchers or regulators uncover data practices that depart from what consumers are told, or draw attention to information in the fine print consumers may have missed.
In 2021, the Federal Trade Commission entered a settlement with Flo Health for sharing users’ sensitive health data with advertisers via software development kits (which Flo denies). Flo Health has since become the first femtech app to be dual-certified under ISO/IEC 27001 and 27701 for information security and privacy information management systems. It also created the award-winning Anonymous Mode that lets people use the app without associating their data with device identifiers, names or contact details. Edwards has praised this move after reviewing Flo’s technical white paper. He says all femtech apps that centrally store user data should do the same.
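To see why decoupling data from identifiers matters, here is a minimal sketch (not Flo’s actual design, just an illustration of the general idea) contrasting records keyed by a stable device identifier with records keyed by a random token the server cannot tie back to a person:

```python
import secrets

# Illustrative sketch, not any vendor's real architecture: the difference
# between keying server-side records by a stable device identifier versus
# a random token with no link to a name, contact detail or device.

store = {}

def log_identified(device_id, entry):
    # Device advertising IDs are stable and linkable to a person across
    # apps, so a subpoena for "records about device X" finds everything.
    store.setdefault(device_id, []).append(entry)

def start_anonymous_session():
    # A freshly generated random token carries no identifier; records
    # keyed by it cannot be matched to a device or account by the server.
    return secrets.token_hex(16)

token = start_anonymous_session()
store.setdefault(token, []).append("period started")
```

The data is still stored either way; what changes is whether anyone, including the provider, can connect it back to a specific user.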
In 2023, the FTC alleged that Easy Healthcare Corporation, maker of the ovulation tracker app Premom, had violated the Health Breach Notification Rule by sharing users’ health data with advertisers and analytics providers such as AppsFlyer and Google via software development kits. In a press release, the company denied selling or sharing users’ health data with third parties, and noted the settlement was not an admission of wrongdoing.
Jeff Jockisch, CXO and privacy researcher at ObscureIQ, has intimate knowledge of the data broker world. His firm specializes in “digital footprint wipes” for clients seeking to reclaim their online privacy. He told me femtech users shouldn’t trust providers’ claims that they won’t sell their health data. Even if they don’t technically “sell” it, sharing it widely for ad targeting or analytics exposes users to serious risks. App providers that haven’t attracted regulator scrutiny retain plausible deniability because these apps aren’t subject to health privacy laws that clearly define “health data.” He said users should assume that any app they are logged into is tracking them.
Edwards agrees that femtech apps that create unique user IDs or rely on AdIDs put their users at risk because data from various sources are used to create user segments and profiles that can be shared or sold. “But some specific organizations created those lists, and so if a state law somehow ever is passed that empowers investigations into people who are pregnant, it would be dangerous for any advertising organizations to continue to build those audience segments, for fear of the government attempting to subpoena or access that data,” Edwards explained.
Even Non-Health Data Can Be Used to Infer Reproductive Health Data
A 2020 Norwegian Consumer Council report, Out of Control, analyzed the data flows of various consumer apps, including two femtech ones. It found that app data was shared with numerous third parties, including advertising and analytics companies. In addition to usage data, one of the femtech apps frequently shared location data with third parties, including location data brokers, even though location was not necessary for core app functionality. This included detailed GPS location data, WiFi access point data, cell tower data and Bluetooth properties. “Together, these data points can be used to pinpoint location, even inside a building, down to a specific floor,” its technical report notes.
Location tracking near sensitive points of interest can be particularly revealing, as a recent 404 Media exposé demonstrated. That investigation showed how a user’s trajectory could be used to infer they had sought abortion care and pinpoint where they live.
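A worked example makes the point concrete. Assuming nothing more than a list of timestamped GPS pings (the coordinates below are hypothetical), a few lines of code suffice to flag every point that falls near a sensitive location:

```python
import math

# Illustrative sketch with hypothetical coordinates: flagging location
# pings that fall near a sensitive point of interest.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical sensitive point of interest and a user's location trail.
CLINIC = (39.7392, -104.9903)
trail = [
    (39.7391, -104.9901),  # within a few dozen meters of the clinic
    (39.8000, -104.8000),  # elsewhere in the city
]

# Keep every ping within 100 meters of the point of interest.
flagged = [p for p in trail if haversine_m(*p, *CLINIC) < 100]
print(len(flagged))  # prints 1
```

Commercial location datasets carry far more than two pings per person, so the same filter run over a brokered dataset identifies every device that lingered near a clinic, which is exactly the inference the 404 Media investigation demonstrated.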
In 2023, Washington enacted the My Health My Data Act, the first state law to protect health data that falls outside of HIPAA. It uses a broad definition of consumer health data that includes inferences based on precise geolocation. This state-level step is the exception, not the rule, in the U.S.
Using Femtech Makes Reproductive ‘Dataveillance’ Trivial
Policing the most intimate aspects of people’s lives takes data and surveillance, hence the post-Roe calls to “delete your period apps!” Data surveillance and privacy expert Roger Clarke coined the term “dataveillance” to refer to “the systematic creation and/or use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons.”
Many femtech apps contain third-party trackers that share user data across complex data ecosystems with third parties for targeted advertising, analytics and algorithmic improvement. The more data is shared, the less control you have. Even if you delete the app, the data may be retained. For example, Premom’s privacy notice explains it retains the data of inactive users for up to five years, while Flo retains it for up to three, in case a user seeks to reactivate during another stage of life, according to its privacy policy. In any case, once it’s shared, it’s harder to delete.
‘Prompt Surveillance’: AI Has Entered the Chat
Integrating AI chatbots into femtech apps can introduce novel risks. Jackie Singh, former director at STOP, the Surveillance Technology Oversight Project, warns in a recent Substack post that conversational AI introduces the new threat of ‘prompt surveillance’. Singh explained her concerns to me via text: “As a cybersecurity expert and mother of three daughters, I want women to understand the complex reality behind AI: it’s not just a tool with benefits, but also one with inherent risks. While companies present only positive aspects, they’re actually accumulating vast datasets of human thoughts and behaviours without our consent or knowledge. This raises critical questions about ownership, control, and potential consequences for our privacy and autonomy.”
These conversations can be shared with third parties. For example, Premom’s privacy notice explains that information users input into its Ask AI feature may be shared with vendor OpenAI.
There’s Light At The End of The Tunnel — But Proceed With Caution
Femtech emerged to fill a serious void in reproductive and sexual healthcare. Medicine carries a historical sex and gender bias because women, transgender and intersex people are often excluded from the research and development pipeline. CBS News reports that abortion restrictions are affecting women’s healthcare more broadly: as restrictive states become OBGYN deserts, OBGYN enrollments are declining. The need for reliable health resources, data and insights will only grow, and femtech apps can fill that gap. But if femtech providers can’t protect users’ privacy, they may put users’ bodily autonomy in jeopardy.
People shouldn’t have to choose between their healthcare and their privacy. Thankfully, there are privacy-first and privacy-enhanced period apps available, and there are measures people can take to better protect their privacy. Since not all femtech apps are created equal, users should proceed with caution.