The Privacy Concerns of Palantir, Police Departments & the Judicial Process

DALE RAPPANEAU, Class of 2022

Perilous to us all are the devices of an art
deeper than we possess ourselves.[1]

From his office suite eight floors above the rest of his security team, Peter Cavicchia, a former U.S. Secret Service agent, worked as JPMorgan’s official spy.[2] His duties entailed monitoring the bank for insider threats—employees going rogue or otherwise sabotaging the institution. To do this, Cavicchia and his team of 120 security engineers accessed and absorbed every piece of available data on the bank’s employees. From an employee’s “printer and download activity” to their company-issued smartphone “GPS locations,” from “phone conversations” to “emails and browser histories,” Cavicchia got it all.[3] He then fed that data into an algorithmic product known as Metropolis, which analyzed the information and told Cavicchia which employees warranted “further scrutiny” or even “physical surveillance after hours by bank security personnel.”[4] Neither JPMorgan nor Metropolis set any limits on what or whom Cavicchia could access, so for four years Cavicchia “had unprecedented access to everything, all at once, all the time, on one analytic platform.”[5] Unsurprisingly, all that power went to his head, and in an ironic yet predictable twist, Cavicchia himself went rogue, using Metropolis to spy on the senior executives who had originally authorized him to spy on others.[6]


The Metropolis software is one of many algorithmic products produced by a company known as Palantir—and it is by far the company’s most benign. Founded in 2004, Palantir was seeded by the C.I.A.’s venture-capital arm, which invested in the company to produce software capable of analyzing large data sets “in order to identify connections, patterns and trends that might elude human analysts.”[7] This initial investment produced Gotham, an analytic platform designed for the national security and defense sectors. Palantir claims the product “integrates and transforms data, regardless of type or volume, into a single, coherent data asset,” and that as “data flows into the platform, it is enriched and mapped into meaningfully defined objects—people, places, things, and events—and the relationships that connect them.”[8] The product is so adept at analyzing data and mapping individuals that rumors claim Palantir’s software assisted the U.S. military in tracking down Osama bin Laden.[9] Rumor or not, Gotham’s ability to analyze data sets and track individuals is remarkable,[10] making it an enticing tool for American police departments.

The problem is this: how does Palantir’s software actually work? And if a police department uses it to supplement how it catches suspected criminals, as both the LAPD[11] and the NYPD[12] have done, how does the court system analyze the algorithm’s outputs to ensure the results are ethical, accurate, and constitutional?

Unfortunately, these questions remain unresolved. According to Paromita Shah, associate director of the National Lawyers Guild’s National Immigration Project, there is an institutional effort to keep them unresolved because “Palantir lives on that secrecy.”[13] According to Shah, “prosecutors and immigration agents have been careful not to cite the software in evidentiary documents,”[14] making it difficult for civil liberties lawyers to bring a case challenging the constitutionality of Palantir’s use. From the perspective of a police department, this need for secrecy makes sense; if the public at large knew that police departments have a comprehensive analytic platform that “frequently ensnar[es] in the digital dragnet people who aren’t suspected of committing any crime,” they would—or at least should—be livid.[15] Especially when that system “can now identify more than half the population of U.S. adults.”[16]

However, one court has hinted at how the legal system might treat Gotham’s results as admissible evidence. In In re Polyurethane Foam Antitrust Litigation, indirect and direct purchasers brought a class action against flexible polyurethane foam manufacturers, alleging a long-term price-fixing conspiracy.[17] The litigation involved an expert report from Dr. Robert Gordon, whose analysis used Gotham to “produce a visual representation of the events and entities” in the case.[18] Specifically, Dr. Gordon used Gotham to show a web of communication between the defendant manufacturers during the “three-month periods in which price increase letters were issued.”[19] The defendants challenged Dr. Gordon’s reports, arguing that he knew nothing about the foam industry yet claimed expertise because he had used Gotham to produce a mapped result. The court punted on the issue, stating it “need not examine these reports in detail” because other available evidence allowed it to reach a decision. But that hesitation to accept Gotham’s outputs—that need to rely on other evidence to reach a decision—suggests that courts are skeptical of the algorithm. And they should be, because despite the prevalence of Palantir’s products in the financial, military, and law enforcement sectors, little is known about how those products work.

Even Palantir itself projects a distorted view of how its products analyze data and account for privacy concerns. In August 2020, Palantir filed an S-1 registration statement in preparation for its October IPO. In that filing, the company claimed its products have “Privacy-Enhancing Technologies.”[20] Under this heading, the company said its products “provide highly granular access restrictions” to allow for “precision data management.”[21] The products “track the provenance and version history of all data in the system so that users can assess the reliability of the data and review and correct inaccuracies.”[22] In this way, Palantir’s products provide “users with well-curated, up-to-date data” that “reduces the risks of erroneous conclusions.”[23] This sounds like a faint echo of the principles published by the EU’s High-Level Expert Group on artificial intelligence—accuracy, reliability, risk management[24]—but the “user” in this case is not the individual or citizen to whom a Palantir product such as Gotham is applied. Rather, the “user” is the customer who purchases the ability to run surveillance software on others. It is to those users that Palantir seeks to guarantee a trustworthy, secure, and private product.[25]

At this time, Palantir claims it does not store or sell the data integrated by its products and that it maintains a robust security system to keep its data banks unbreached.[26] But as Cavicchia proved during his time at JPMorgan, the fact that Palantir neither stores nor sells data means little in the face of a licensed customer who goes rogue with an analytic platform. In 2013, Crime Intel Officer David Gamero of the LAPD said the department uses Palantir’s Gotham “on a daily basis. It’s one of the first databases [he] log[s] into.”[27] If Cavicchia managed within four short years to gain unbridled access to every data point on every JPMorgan employee and senior executive, what can the LAPD achieve over seven years of daily use? And what happens if an officer, like Cavicchia, decides to use Gotham to monitor an ex-lover, a spouse, a child, or a coworker? As In re Polyurethane Foam signals, courts hesitate to rely on the algorithm’s outputs in their legal decisions, but as Shah suggested, Palantir’s customers are wise enough not to mention that they relied on its products to reach a conclusion. Secrecy allows the customer to continue its unbridled use of Gotham, regardless of whether that use falls within the company’s or law enforcement’s intended purpose for the product. Thus, customers circumvent the issue punted on by the court in In re Polyurethane Foam, and we are left hoping authorities use the algorithm for altruistic purposes.

Unfortunately, many in the industry are skeptical that Palantir will ever become a safe and transparent tool, including Palantir’s co-founder, Alex Karp. “Every technology is dangerous,” he told the New York Times, “including ours.”[28] Yet Palantir continues to function on two basic assumptions: (1) that its algorithm has the right to access any detail of an individual’s life, and (2) that its clients will limit that access to legitimate, ethical purposes. But this expectation of access and client-focused approach to oversight does not work, as JPMorgan’s rogue incident demonstrates. Entrusting clients to construct proper barriers between Palantir’s products and inappropriate data hands too much power to the clients themselves. And if those clients are police departments that know not to mention in court that they relied on Palantir, then this hands-off approach fails before it even begins, because there is no accountability. There is no safeguard of privacy against those who may use this all-access, all-consuming algorithm to create a roadmap of an individual’s life. Instead, we are left with a culture of secrecy and surveillance, and an algorithmic device so much deeper than we, the surveilled, possess ourselves.

[1] J.R.R. Tolkien, The Two Towers, Ch. 11, The Palantír (1954).

[2] Peter Waldman et al., Peter Thiel’s Data-Mining Company Is Using War on Terror Tools to Track American Citizens. The Scary Thing? Palantir Is Desperate for New Customers, Bloomberg Businessweek (Apr. 19, 2018).

[3] Id.

[4] Id.

[5] Id.

[6] Id. (“It all ended when the bank’s senior executives learned that they, too, were being watched . . . .”).

[7] Michael Steinberger, Does Palantir See Too Much?, N.Y. Times (Oct. 21, 2020).

[8] Palantir, Palantir Gotham.

[9] Id.

[10] Palantir, Fighting Child Pornography: Palantir at the Netherlands National Police Services Agency (Mar. 2014) (describing how the Netherlands National Police Services Agency, utilizing Palantir’s software, “arrested a child pornographer after tracing a photo to its origins across two continents.”).

[11] Palantir at the Los Angeles Police Department, YouTube (Jan. 25, 2013) (in which Charlie Beck, LAPD Chief of Police, describes a crime analyst putting a suspect’s first name and physical description into the Palantir system to “work up a suspect that became a really viable piece of information . . . [Palantir] can take the smallest clue and turn it into something big.”).

[12] See Brennan Cent. for Just. at NYU Sch. of Law v. New York City Police Dept., 2017 NY Slip Op. 32716(U) (Sup. Ct.), at 2 (in which a purchase order revealed that the NYPD was using Palantir Gotham, and the department argued that disclosing records of how it used Palantir Gotham “would reveal non-routine techniques and procedures.”).

[13] Waldman et al., supra note 2.

[14] Id.

[15] Id. (“People and objects pop up on the Palantir screen inside boxes connected to other boxes by radiating lines labeled with the relationship: ‘Colleague of,’ ‘Lives with,’ ‘Operator of [cell number],’ ‘Owner of [vehicle],’ ‘Sibling of,’ even ‘Lover of.’ If the authorities have a picture, the rest is easy.”)

[16] Id.

[17] In re Polyurethane Foam Antitrust Litig., 314 F.R.D. 226 (N.D. Ohio 2014).

[18] Id. at *5.

[19] Id.

[20] U.S. Sec. & Exch. Comm’n, S-1 Registration Statement for Palantir Technologies, Inc., at 168–69.

[21] Id.

[22] Id. at 169.

[23] Id.

[24] High-Level Expert Group on A.I., Ethics Guidelines for Trustworthy AI.

[25] S-1 Registration Statement for Palantir Technologies, Inc., supra note 20, at 5 (“Creating a Trustworthy Operational Foundation for Data. Data is only as valuable as it is trustworthy. Our software provides data transparency and accountability through integration, versioning, orchestration, provenance, and security. These capabilities provide the conditions necessary for our customers to build a data foundation that they can trust.”).

[26] Steinberger, supra note 7.

[27] Palantir at the Los Angeles Police Department, supra note 11.

[28] Steinberger, supra note 7.

