Your Body Is a Data Mine. Five Countries Can't Agree Who Owns It.
The same heartbeat reading, DNA sample, or facial scan has radically different legal protections depending on where you live. From 23andMe's bankruptcy to post-Dobbs data weaponization, here's how five legal regimes treat the data your body generates — and why 640 million smartwatch users should care.
Your heart rate, your DNA, your face, your menstrual cycle — data your body generates every second is now collected, sold, subpoenaed, and breached at a scale no privacy law anticipated. The twist? The same piece of biometric data has radically different legal status depending on which country you're in, who collected it, and whether the collector is a corporation or the state.
PRISM's AI scanned sources across English, Japanese, Korean, and Chinese to map how five legal systems handle the most intimate data you'll ever produce.
The core facts
In 2024, 276.8 million healthcare records were breached in the US alone — a 64.1% jump from 2023. The Change Healthcare ransomware attack exposed 190 million individuals' data, making it the largest healthcare breach in history.
Meanwhile, the wearable health market is booming. An estimated 562.86 million people wore smartwatches in 2025, a number projected to hit 640.15 million in 2026. Ninety-two percent of those users rely on their devices for health and fitness tracking. The data flowing from those wrists — heart rate variability, blood oxygen, sleep patterns, menstrual cycles — is extraordinarily intimate.
And here's the gap that should alarm you: HIPAA, the US law most people assume protects their health data, doesn't cover consumer health apps, fitness trackers, or wearables. Unless that data enters the formal healthcare system, it sits in a regulatory void.
That void has consequences. BetterHelp paid $7.8 million after the FTC caught it sharing therapy data with Facebook. GoodRx paid $1.5 million for funneling prescription information to ad platforms. Eighty-seven percent of mental health apps have serious privacy vulnerabilities. On the dark web, a single therapy record sells for over $1,000 — far more than a stolen credit card.
US: "Your body data is an unregulated commercial asset"
The United States has no comprehensive federal health data privacy law. HIPAA covers hospitals and insurers but not the apps on your phone. The result is a patchwork where the most sensitive data imaginable falls through the cracks.
The 23andMe bankruptcy in March 2025 made this painfully concrete. When the company filed Chapter 11, the genetic data of 15 million customers became a corporate asset that could theoretically be auctioned off. The New England Journal of Medicine warned that "existing privacy, bankruptcy, and bioethics frameworks are ill-equipped to handle the transfer of sensitive genetic data through bankruptcy courts." A nonprofit led by co-founder Anne Wojcicki eventually acquired the data, but only after months of uncertainty.
Post-Dobbs, the stakes got darker. (For more on the body data surveillance landscape, see PRISM's earlier coverage.) Period trackers and reproductive health data became potential criminal evidence. In Nebraska, a teenager was convicted using subpoenaed Facebook messages about terminating a pregnancy. Privacy International noted a core principle: if an app doesn't hold user data on its servers, there's nothing to hand over in response to a subpoena.
Perhaps most disturbing: newborn blood samples collected during routine hospital screening have been accessed by law enforcement without warrants. In New Jersey, police used a stored infant blood sample to identify a child's father in a criminal case. When parents discovered this, backlash was fierce — Texas destroyed over 5 million samples, Minnesota destroyed 1.1 million.
EU: "Body data is a fundamental right"
The EU treats biometric, genetic, and health data as "special category" information under GDPR, requiring explicit consent for processing. The AI Act, whose prohibitions took effect in February 2025, bans real-time remote biometric identification by law enforcement except in narrow cases — the strictest biometric surveillance law on Earth.
The European Health Data Space regulation entered into force on March 26, 2025, creating an EU-wide framework for health data sharing with GDPR-level penalties: up to 20 million euros or 4% of global turnover. On paper, it's the gold standard. In practice, enforcement remains uneven. Total GDPR healthcare fines across 27 countries amount to 22.8 million euros — spread across 237 separate cases. A Swedish pharmacy was fined 3.2 million euros for transmitting customer health data to Meta via tracking pixels.
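That penalty ceiling follows the familiar GDPR formula: the greater of a flat EUR 20 million or 4% of global annual turnover. A minimal sketch of how that cap scales (illustrative only; actual fines are set case by case by regulators):

```python
# Illustrative sketch of the GDPR-style penalty ceiling: the maximum fine
# is the greater of a flat EUR 20 million or 4% of global annual turnover.
# (Simplified; real fines are determined per case, not by this formula alone.)

FLAT_CAP_EUR = 20_000_000
TURNOVER_RATE = 0.04

def max_fine(global_turnover_eur: float) -> float:
    """Upper bound of the fine for a given global annual turnover, in EUR."""
    return max(FLAT_CAP_EUR, TURNOVER_RATE * global_turnover_eur)

# For a company with EUR 1 billion in turnover, the ceiling is EUR 40 million;
# below EUR 500 million in turnover, the flat EUR 20 million cap dominates.
print(max_fine(1_000_000_000))  # 40000000.0
print(max_fine(100_000_000))    # 20000000.0
```

The crossover at EUR 500 million turnover is why the turnover-based prong mostly bites large multinationals, while smaller firms face the flat cap.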
Japan: "Let's loosen the rules for AI"
Japan's Act on Protection of Personal Information (APPI) classifies biometric data and health records as "special care-required personal information" needing explicit consent. But a March 2025 proposal from the Personal Information Protection Commission would allow health and biometric data to be used without consent for statistical processing and AI development.
That overturns the consent requirement for an entire category of sensitive data. Japan's My Number card system now links to health insurance, giving the government access to prescription histories. The country's corporate wellness culture means employers routinely collect employee health metrics. Japan is betting that enabling AI innovation matters more than maintaining strict consent requirements — a calculated trade-off its neighbors aren't making.
South Korea: "Tighten protections, grow the market"
South Korea is trying to have it both ways. PIPA classifies health and biometric data as "sensitive personal information" with heightened protections. New data portability rights that took effect in March 2025 let individuals request their data in machine-readable format. The PIPC plans to enact a dedicated biometric data law covering facial images and fingerprints.
At the same time, Korea's genetic testing market is projected to grow from $316 million in 2025 to $1.77 billion by 2035 — an 18.8% annual growth rate. Samsung Health is widely used, but its privacy implications are rarely discussed. Korea hosted the 2025 Global Privacy Assembly in Seoul, signaling ambition to lead on privacy regulation while feeding a booming biotech economy.
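The projection is internally consistent: a quick compounding check (our arithmetic, not a figure from the article's sources) shows that 18.8% annual growth does carry $316 million in 2025 to roughly $1.77 billion over the ten years to 2035:

```python
# Sanity-check the stated CAGR: does 18.8% annual growth take
# USD 316 million in 2025 to about USD 1.77 billion by 2035?

start_2025 = 316e6        # market size in 2025, USD
cagr = 0.188              # 18.8% compound annual growth rate
years = 2035 - 2025       # ten compounding years

projected_2035 = start_2025 * (1 + cagr) ** years
print(f"{projected_2035 / 1e9:.2f} billion USD")  # ~1.77 billion USD
```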
China: "Protect citizens from companies, not from us"
China presents the sharpest contradiction. The Personal Information Protection Law (PIPL) classifies biometric and genetic data as sensitive, requiring separate consent and impact assessments. New national standards effective November 2025 expand the definition to include gait recognition, eye patterns, and psychological health data. March 2025 facial recognition regulations require informed consent and dedicated storage.
But state surveillance is exempt. China operates the world's largest biometric surveillance network while simultaneously restricting foreign access to Chinese genetic data, which is classified as a "strategic national asset." When Illumina, a US genomics company, was placed on China's Unreliable Entity List in February 2025, it became clear that genetic data governance is now geopolitical — not just personal.
So what?
The de-identified health data market hit $8.8 billion in 2025. Big data in healthcare is a $110.97 billion industry. These numbers will keep climbing. The question isn't whether your body data will be collected — it already is. The question is who gets to profit from it and who gets to weaponize it.
The divergence across legal regimes creates a troubling reality. Your Fitbit data stored on US servers has different protections than the same data would receive in the EU. Your genetic sample in a 23andMe database had weaker protections than Chinese genetic data classified as a national asset. A newborn's blood spot in New Jersey had fewer privacy rights than an adult's facial scan in Brussels.
States like Illinois (where BIPA has triggered over 100 class actions in 2025 alone and produced settlements like Google's $1.375 billion payout) show that strong biometric laws generate real accountability. But the global picture remains fragmented, and the data keeps flowing across borders that laws can't follow.
This analysis was generated by PRISM's AI, which scans news sources across English, Japanese, Korean, and Chinese daily.
Sources referenced: HIPAA Journal, NPR, New England Journal of Medicine, Health and Human Rights Journal, FTC, ACLU, Privacy International, Brookings Institution, Lawfare, Stanford Law School, Federation of American Scientists, IAPP, CMS Law, European Commission, Chambers and Partners, China Briefing, Bird & Bird, DemandSage, Athletech News, Stateline, National Law Review, Biometric Update, Science (AAAS)
This content is AI-generated based on source articles. While we strive for accuracy, errors may occur. We recommend verifying with the original source.