

By Lea Chapoton, Volume 108 Staff Member

In the wake of 2022’s Dobbs v. Jackson Women’s Health Organization[1] decision and the ensuing barrage of state laws limiting abortion access, online discussions surged with strategies for maintaining reproductive freedom in potentially hostile circumstances. One popular piece of advice urged deletion of period tracking apps because companies could sell the personal data collected to third parties, or worse, provide the data to law enforcement as proof of a pregnancy.[2] Just one year later, a mother and daughter pleaded guilty to violating Nebraska’s anti-abortion laws after Facebook Messenger chats discussing terminating the daughter’s unplanned pregnancy were revealed to law enforcement.[3] We all cast wide digital shadows, giving companies and other users glimpses into our lives. Abortion is only one example of information that we casually communicate through online channels; from dating apps to LinkedIn profiles, the Internet is filled with nooks and crannies we are encouraged to fill with personal details and data. Yet this personal information holds the potential for profound violations of our intimate privacy, and it is often held by companies financially incentivized to profit from it rather than protect it.[4] And, critically, the current legal solutions to such privacy violations are falling behind.

Danielle Keats Citron’s 2022 book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, compellingly weaves personal stories of individuals harmed by online intimate privacy violations with discussion of the legal systems that failed to protect them from harm or adequately address their grievances. Citron, a law professor and digital privacy scholar, introduces the concept of “intimate privacy,” which concerns the “extent to which others have access to, and information about, our bodies; minds . . .; health; sex, sexual orientation, and gender; and close-relationships.”[5] We often fail to notice how frequently we let companies peer into our intimate lives, in part because of intentional website and app design choices that make it easier to accept privacy policies and harder to assess data practices.[6] This becomes especially concerning in the realm of dating apps,[7] mental health apps,[8] and wearable technology like smart watches.[9] Worse, some technologies are designed precisely with surveillance in mind, such as “cyberstalking apps” that give installers of the app unprecedented access to another’s location, photos, texts, calls, and more.[10]

Citron identifies three main perpetrators of privacy violations: (1) “Spying, Inc.,” or companies that profit off collecting and selling consumer data;[11] (2) “privacy invaders,” or individuals who leak photos and information about others online without their consent;[12] and (3) government actors who deploy extensive online surveillance in the name of law enforcement and national security goals.[13] Perhaps unsurprisingly, women, people of color, and the LGBT community are disproportionately harmed by these practices, as they often have the most to lose when privacy is breached.[14] Beneath all these issues, argues Citron, is an inadequate, outdated, and underpowered system of laws and jurisprudence.


The current legal options for victims of intimate privacy violations online are sparse. Civil and criminal avenues alike are often limited in scope and remedies, making justice inaccessible to victims or attainable only at great emotional and financial cost.[15] The anonymity of many online communities further complicates litigation by making it difficult to identify individual perpetrators.[16] Perhaps most significantly, the online platforms hosting harmful content are rendered legally untouchable by Section 230 of the Communications Decency Act.[17] Section 230, adopted by Congress in 1996, protects content hosts from civil liability for user-generated content.[18] While it was originally intended to ensure that online “Good Samaritan” platforms would not be punished for moderating harmful content, the law has since been applied as a near-total liability shield against most civil actions, aside from certain intellectual property claims.[19] This lack of legal recourse in the event of a criminal or tortious privacy violation is a pressing concern for anyone who has an online presence—which is to say, almost everyone.

Reform is urgently needed to address these legal weaknesses; however, the path forward will be challenging because it requires a change in the current approach to data privacy problems. One major obstacle to policymaking on this issue, as Citron describes it, is that policymakers often tackle online privacy issues in a “piecemeal” manner rather than attacking problems on a broad scale.[20] Recent state laws reflect this trend. For example, California passed the “Delete Act” in October 2023, requiring online data brokers to allow Californians to request deletion of their personal data from the broker’s records with a single button click.[21] While a positive step for empowerment of online users, this law is mainly reactive in scope and offers few preventive measures to stop data from being collected in the first place. Similarly, the Minnesota legislature passed the Student Data Privacy Act in 2022, forbidding schools and technology providers from selling or sharing data gathered from students’ school-issued devices.[22] Unlike the “Delete Act,” this law represents a preventive action. Yet this mechanism is still limited because it protects only specific citizens in specific spaces. Both laws also address privacy issues only at the state level, meaning that the amount of protection individuals have varies widely depending on location. It is time for a national, holistic solution to intimate data privacy violations.


Citron offers numerous suggestions for where to start. First, legislatures and courts must address the barriers plaintiffs face in bringing online privacy violation claims.[23] This includes making procedural changes to ensure satisfactory solutions, such as providing for injunctive relief and awards of reasonable attorneys’ fees.[24] Crucially, Citron also argues for changes to Section 230.[25] She suggests new statutory language requiring that online platforms, to maintain their liability shield, must take “reasonable steps to address unlawful uses of its service that clearly create serious harm to others.”[26] The “reasonable steps” standard, consistent with many tort law concepts, allows courts to consider the policies and practices of individual platforms in relation to industry norms and best practices.[27] Citron further suggests that Congress could authorize a federal agency—such as the FTC—to issue guidance on what “reasonable steps” look like, helping both platforms and courts.[28] In the big picture, Citron hopes that these reforms can eventually build toward treating intimate privacy as a civil right,[29] and recognizing platforms and government actors as “data guardians” charged with protecting that right.[30]

The problems presented by intimate data privacy are vast and will require significant, coordinated policymaking to meaningfully address. Encouragingly, there are signs that changes to U.S. law are on the way.[31] The proposed American Data Privacy and Protection Act, for example, synthesizes many of the reforms suggested by Citron and saw strong, bipartisan support in the House Energy and Commerce Committee in 2022.[32] Unless and until such legislation passes, The Fight for Privacy will remain an essential read for policymakers and ordinary Internet users alike. Citron’s message is clear: the stakes of intimate privacy violations online are too high to continue accepting discrete and piecemeal reform. Deleting period tracking apps, speaking in code on Facebook Messenger, and other self-protection mechanisms can only help so much until deeper change occurs. Until the goal of national and comprehensive reform is reached, the fight for privacy continues.


[1] 142 S. Ct. 2228 (2022).

[2] Tatum Hunter & Heather Kelly, With Roe Overturned, Period-Tracking Apps Raise New Worries, Wash. Post (June 24, 2022), []; see also Rina Torchinsky, How Period Tracking Apps and Data Privacy Fit into a Post-Roe v. Wade Climate, NPR (June 24, 2022), [] (describing the risks presented by period tracking apps after Dobbs and alternatives for cycle tracking).

[3] Margery A. Beck, Nebraska Mother Sentenced to 2 Years in Prison for Giving Abortion Pills to Pregnant Daughter, Associated Press (Sept. 22, 2023), [].

[4] Sara Jacobs, Americans’ Most Private Data is Under Threat. Here’s How to Protect It., MSNBC (June 24, 2023), [] (“The information you leave behind in period and fertility tracking apps, ride-sharing apps, search engines, browsing history, location data and more can be sold and shared without your consent with advertisers, data brokers or even law enforcement.”); Danielle Keats Citron, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age 13 (2022) (ebook) (“The data-brokerage industry generates 200 billion dollars annually.”).

[5] Citron, supra note 4, at viii.

[6] Id. at 4–5.

[7] See, e.g., id. at 10–11 (detailing a Catholic priest’s resignation after an online data broker report revealed he was using Grindr, the gay dating app).

[8] Id. at 9 (“According to a study conducted in 2019, 29 of the 36 most popular free apps for depression and quitting smoking allowed advertisers or marketing services to access some subscriber information.”).

[9] Id. (“When then-FTC chief technologist Latanya Sweeney analyzed twelve health apps and two wearable devices in 2014, she found that subscriber information was sent to no fewer than 76 different companies.”).

[10] Id. at 33–35. The security firm Kaspersky detected 518,223 cyberstalker technology infections globally in 2019. Id. at 35.

[11] See generally id. at 1–23 (describing the extensive ways online entities collect and sell user data, as well as the dangers of current practices).

[12] See generally id. at 23–48 (discussing online privacy invasions perpetrated by individuals including nonconsensual pornography and deepfakes).

[13] See generally id. at 49–63 (describing governments’ online surveillance methods and the implications of these broad powers for civilians).

[14] See, e.g., id. at viii (citing that women and gay or bisexual men are more likely to be victims of nonconsensual pornography than straight men).

[15] Id. at 93–96.

[16] Id. at 90.

[17] Id. at 83–86.

[18] Id. at 86 (“[N]o provider or user of an interactive computer service shall be treated as a publisher or speaker of any information provided by another information content provider.”) (internal quotation marks omitted).

[19] Id.

[20] Id. at 82.

[21] Johana Bhuiyan, Californians Can Scrub Personal Info Sold to Advertisers with First-in-US Law, Guardian (Oct. 10, 2023), [].

[22] Minnesota Has New Privacy Protections for Students Using School-Issued Tech Devices, Council of State Gov’ts (Sept. 15, 2022), [].

[23] Citron, supra note 4, at 133–35. These barriers include allowing plaintiffs to sue under pseudonyms and incentivizing attorneys to take on claims pro bono. Id.

[24] Id. at 135–36.

[25] Id. at 148.

[26] Id. at 149.

[27] Id. at 151–53.

[28] Id. at 155.

[29] Id. at 106–10.

[30] Id. at 147–48.

[31] See, e.g., Muge Fazlioglu, US Federal Privacy Legislation Tracker: Introduced in the 118th Congress (2023-2024), Int’l Ass’n of Privacy Pros. (2023), [] (listing over forty privacy-related bills proposed in the current legislative session); Jonathan M. Gaffney, Chris D. Linebaugh & Eric N. Holmes, Overview of the American Data Privacy and Protection Act, H.R. 8152, Cong. Rsch. Serv. (Aug. 31, 2022), [] (summarizing the American Data Privacy and Protection Act, a proposed bipartisan bill featuring a “comprehensive federal consumer privacy framework” including a private right of action, state law preemption, and broad data security protections for consumers).

[32] Gaffney et al., supra note 31.