
STAY ON TARGET: MODERN ADVERTISING, THE TORT OF PUBLIC DISCLOSURE, AND THE CASE FOR EXPANDED PRIVACY PROTECTIONS

By: Jake Knanishu

Abstract: Advances in advertising delivery and targeting technology pose a new, unique, and rapidly developing threat to individual privacy. Now, you can safely assume any advertisement you see on television or on a webpage has in some way been targeted at you—and so can anyone else who happens to see the ad. The finer points of this interaction among an advertiser, you, and a secondary audience prevent this invasion of privacy from being actionable under traditional common-law theories. The novelty, scope, and probabilistic nature of the harm suggest that regulation should be handled by an administrative agency, but which one? And how? Although not quite an “unfair or deceptive act or practice” in the familiar sense, this seems to be a modern twist on exactly the kind of thing the FTC was commissioned to police. Regardless of whose responsibility it becomes, something must be done before this practice gets too far out of hand.

1. Ads Everywhere

Today, ads are increasingly ubiquitous, and you can safely assume any advertisement you see on television or on a webpage has in some way been targeted at you. Facebook will now play advertisements in the middle of videos, but only videos the company’s algorithms have deduced you are likely to watch.[i] And if all goes according to Elon Musk’s plan, companies offering “free to use” services will soon be able to beam ads right into your brain.[ii]

But at least when ads are beamed into your brain, they’ll be as private as they are intrusive. For now, the ads you see are also often visible to others around you. What can a friend or coworker infer from the ads that play on your Spotify on the way to lunch? What can children infer about their parents from the ads that play on TV?

Targeted advertising is not just another government invasion of privacy carried out in the name of something like national security. At least when “Big Brother is watching,”[iii] we only have to live with the indignity of surveillance. Now, Alexa is listening, and with every purchase you make,[iv] every place you take your phone,[v] every second of screentime you give to every webpage you visit,[vi] you tell a little secret about yourself to all the data miners, data brokers, and search-engine optimizers of the world. And after they study all those little secrets, they make some very educated guesses about your big secrets. Then they play those secrets back to you, on every screen everywhere in your life, and they don’t care who else sees. In fact, the more people who see, the better. That’s the first rule of advertising.

2. The Tort of Public Disclosure

It’s also illegal—almost. This practice so offends traditional notions of privacy that one might expect common-law protections to prohibit it. The tort of “public disclosure of a private fact” comes close, but the reason it likely doesn’t cover targeted advertising is helpful in understanding how difficult this phenomenon is to regulate.

As defined in the Restatement (Second), this tort exists to prevent someone “giv[ing] publicity to a matter concerning the private life of another . . . if the matter publicized is of a kind that (a) would be highly offensive to a reasonable person, and (b) is not of legitimate concern to the public.”[vii] A targeted ad is publicity—how much publicity depends on the context in which the ad plays, but it is publicity nonetheless. The information a targeted ad publicizes is typically private, whether it’s spending patterns, internet-browsing history, GPS data, or any number of other data points. And if that information reflects, say, a sudden interest in bankruptcy lawyers or erectile-dysfunction medication, its publicity would clearly be “highly offensive to a reasonable person”[viii] and “not of legitimate concern to the public.”[ix]

The argument isn’t a slam-dunk, but targeted advertising falls at least near the scope of behavior this particular tort attempts to prevent. Presumably, at least some specific cases are egregious enough to merit judicial remedy. Consider, for example, a scenario in which a young woman discovers she’s pregnant. She “tells” no one yet—especially not her parents—but she researches pregnancy on the internet and perhaps purchases some prenatal vitamins or pregnancy-related books at a nearby store. Two weeks later, her parents discover in the mail some inserts for maternity clothes addressed to their daughter.[x] Is it a tort? How private must the information be, and how public must an ad or ad campaign make that information?

3. Standing

Unfortunately, courts are not keen to reach those questions. Public disclosure of a private fact is one of what are known as the four “privacy torts,” the others being intrusion upon seclusion, appropriation of name or likeness, and false light.[xi] And although no one has yet litigated targeted advertising as public disclosure exactly, plaintiffs suing for these other torts in this context have consistently had their claims rejected outright.

The issue has been a failure of standing, which requires a plaintiff to demonstrate a “concrete” and “particularized” harm that is “fairly traceable” to the defendant’s conduct and that is redressable by a judicial decision.[xii] “In Shibley v. Time, Inc., an Ohio court dismissed a plaintiff’s suit against magazine publishers that sold subscription requests to direct mail advertisers,”[xiii] and “in Dwyer v. American Express Co., an Illinois appellate court rejected a plaintiff’s privacy suit that objected to American Express’s sale of consumer profiles that were derived from their spending habits.”[xiv] Although not in so many words, these courts expressed skepticism that the Shibley[xv] and Dwyer[xvi] plaintiffs presented genuine injuries in fact. 

Furthermore, when courts have indeed recognized standing under these other tort theories, it has been in cases such as Remsburg v. Docusearch, Inc.,[xvii] in which an individual purchased his intended victim’s personal information from a data broker in order to more easily stalk and murder her. But even that court was clear in its opinion that “a private citizen has no general duty to protect others from the criminal attacks of third parties.”[xviii]

4. Where’s the Harm?

Targeted advertising, as public disclosure, presents a closer case. The secondary audiences to the ads targeted at you are likely people you know personally, making the reputational harm more concrete. Because ads are for specific products and services, the harm in this case is also more particular—these are not just general profiles for sale, but specific data points being communicated. And it is trivial to trace an ad to the source publicizing it.

However, such a case would likely still fail for lack of standing. Although these four torts are known as “the privacy torts,” three of them, including public disclosure, do not really protect privacy per se. They provide remedies for harms to an individual’s reputation. With that in mind, consider the nature of the injury targeted ads cause. However aggressive or obvious an ad campaign is, it still only publicizes clues about the intended audience’s private life. The harm to one’s reputation occurs not when the ad plays, but when the secondary audience infers something from it. In light of the decisions discussed above, the harm here is likely either too abstract or too probabilistic to get most courts’ attention. These privacy interests are just not of the kind that can be meaningfully enforced through private action.

5. Administrative Necessity

But clearly something must be done. Because of only a narrow failure of standing, millions of would-be torts occur every day without remedy. We obviously can’t start punishing people for inferring things from ads. Likewise, targeted advertising is probably here to stay. As a practice, it predates Big Data,[xix] and it has only become a significant problem because of recent technological changes.

China is attempting a quasi-legislative solution to curb the rampant privacy problems posed by the ongoing explosion of targeted ads and their siblings, personalized experiences and recommended content. Coming into effect October 1 of this year, the country’s new Personal Information Security Specification requires, among other things, that companies offer users the option to turn off any “personalized” experiences that utilize certain categories of personal data.[xx]

The rest of the world will be watching closely to see how that plays out, but it does seem like a logical response. Another possibility would be, rather than requiring companies to give users an on-off switch for personalized experiences, to require them to weigh the sensitivity of the information used to target ads and recommend content against the probability that those features will appear on platforms that expose the sensitive information to secondary audiences.
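For illustration only, here is a minimal sketch of how such a balancing rule might be expressed inside an ad-serving system, assuming a regulator assigned sensitivity weights to categories of personal data and exposure weights to the surfaces where ads appear. Every category, weight, threshold, and function name below is a hypothetical placeholder invented for this example, not an existing legal or industry standard.

```python
# Hypothetical sketch of the sensitivity-vs-exposure balancing test described above.
# All categories, weights, and the threshold are invented for illustration.

SENSITIVITY = {            # how revealing the targeting data is (0 = benign, 1 = highly sensitive)
    "sports_scores": 0.1,
    "maternity": 0.9,
    "bankruptcy": 0.9,
    "medication": 1.0,
}

EXPOSURE = {               # likelihood a secondary audience sees the surface where the ad plays
    "personal_phone": 0.1,
    "shared_tv": 0.7,
    "public_display": 1.0,
}

RISK_THRESHOLD = 0.5       # illustrative cutoff a regulator might set


def may_serve_targeted_ad(category: str, surface: str) -> bool:
    """Permit a targeted ad only if sensitivity times exposure stays below the threshold."""
    risk = SENSITIVITY.get(category, 1.0) * EXPOSURE.get(surface, 1.0)
    return risk < RISK_THRESHOLD


if __name__ == "__main__":
    print(may_serve_targeted_ad("sports_scores", "shared_tv"))  # True: low-sensitivity data
    print(may_serve_targeted_ad("maternity", "shared_tv"))      # False: sensitive data on a shared screen
```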

Regardless of the contours and principles that define the regulation, it must occur at the agency level. The problem is simultaneously too young, too vast, too nebulous, and too dynamic for a direct, comprehensive legislative solution, and standing issues eliminate the possibility of private enforcement. The burden must be on companies themselves to use our information responsibly and respectfully, and the government must be responsible for monitoring companies’ behavior. However, most agencies, such as the Federal Communications Commission, are too industry-specific to police this phenomenon effectively as a whole.[xxi]

6. An Unfair Practice?

Barring the establishment of a new agency dedicated specifically to information privacy, the Federal Trade Commission (FTC) must be the source of our salvation. The FTC is already “the leading advocate for consumer privacy,”[xxii] but because its authority is limited to combatting “unfair or deceptive acts or practices,”[xxiii] the agency hasn’t been very aggressive in this arena. It sees itself as limited to intervening only when it catches companies lying or being opaque about their privacy practices.[xxiv] However, “unfair or deceptive acts or practices” are a much broader category than just a lack of transparency.[xxv]

If the Commission announced tomorrow that it believes it has the authority to regulate targeted advertising and personalized experiences, courts might well agree. Stepping back from the exact statutory language, one of the FTC’s core responsibilities is to protect consumers from sellers’ informational advantages,[xxvi] and that same kind of asymmetry is at work in the practice of targeted advertising. You don’t know when and how a company is going to show you a particular ad, but the company does. However, if the FTC declines to claim this authority, or if courts deny it, Congress must give it to someone, even if it means commissioning a new agency altogether. With every day these unfair practices go ignored, they become more sophisticated, more entrenched, and—of course—more deeply invasive.


[i] Complement Your Video Strategy, Facebook for Business, https://www.facebook.com/business/ads/video-ad-format (last visited Oct. 11, 2020).

[ii] Neuralink, https://neuralink.com/applications/ (last visited Oct. 11, 2020).

[iii] George Orwell, 1984 1 (1949).

[iv] Max Freedman, How Businesses Are Collecting Data (And What They’re Doing with It), Business News Daily (June 17, 2020), https://www.businessnewsdaily.com/10625-businesses-collecting-data.html.

[v] Privacy and Terms, Google, https://policies.google.com/technologies/location-data?hl=en-US (last visited Oct. 11, 2020).

[vi] David Nield, Here’s All the Data Collected from You as You Browse the Web, Gizmodo (Dec. 6, 2017, 11:30 AM), https://gizmodo.com/heres-all-the-data-collected-from-you-as-you-browse-the-1820779304.

[vii] Restatement (Second) of Torts § 652D (Am. L. Inst. 1977). 

[viii] Id.

[ix] Id.

[x] See Kashmir Hill, How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did, Forbes (Feb. 16, 2012, 11:02 AM), https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#19dbac456668.

[xi] Restatement (Second) of Torts §§ 652B, 652C, 652E (Am. L. Inst. 1977).

[xii] Theodore Rostow, What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers, 34 Yale J. on Reg. 667, 679-80 (2017).

[xiii] Id. at 680.

[xiv] Id.

[xv] Shibley v. Time, Inc., 341 N.E.2d 337, 339-40 (Ohio Ct. App. 1975).

[xvi] Dwyer v. Am. Exp. Co., 652 N.E.2d 1351, 1357 (Ill. App. Ct. 1995).

[xvii] Remsburg v. Docusearch, Inc., 816 A.2d 1001 (N.H. 2003).

[xviii] Id. at 1006.

[xix] Lawrence R. Lepisto & Terry C. Wilson, Marketing Strategy in the 1980’s: Major Changes Ahead for Marketers 136 (2015), https://link.springer.com/chapter/10.1007/978-3-319-10966-4_31 (“As markets get smaller and more defined, marketers must continue to develop more precise segmentation approaches. The use of demographics, psychographics, and family decision working roles must be used with greater accuracy, insight, and skill.”).

[xx] China Publishes New Specification on Personal Data Security, CMS (Mar. 11, 2020), https://www.cms-lawnow.com/ealerts/2020/03/china-publishes-new-specification-on-personal-data-security?cc_lang=en.

[xxi] See, e.g., Communications Act of 1934, 47 U.S.C.A. § 151 (West 2012) (limiting the FCC’s regulatory authority to companies engaged in communications and only for several specific purposes, none of which involve consumer privacy).

[xxii] Rostow, supra note xii, at 681.

[xxiii] Federal Trade Commission Act, 15 U.S.C.A. § 45 (West 2012).

[xxiv] Rebecca Lipman, Online Privacy and the Invisible Market for Our Data, 120 Penn St. L. Rev. 777, 789 (2016) (“In 2014, [the FTC] released a long report on data brokers . . . . The name of the report is telling: ‘Data Brokers: A Call for Transparency and Accountability.’ The FTC can only call for transparency and accountability, they cannot mandate it without supporting legislation. The press release for the report highlights this fact by providing a long list of policies the FTC ‘encourages’ Congress to consider enacting.”).

[xxv] See, e.g., Fed. Trade Comm’n v. Partners In Health Care Ass’n, Inc., 189 F. Supp. 3d 1356, 1364 (S.D. Fla. 2016) (explaining that “To establish liability under [S]ection 5 of the FTCA, the FTC must establish that (1) there was a representation; (2) the representation was likely to mislead customers acting reasonably under the circumstances, and (3) the representation was material” and “Proof of intent to deceive is not required to establish liability.”).

[xxvi] About the FTC, Fed. Trade Comm’n, https://www.ftc.gov/about-ftc (last visited Oct. 30, 2020).
