Palantir says it performs comprehensive human rights analysis on its work. The company has publicly embraced the UN Guiding Principles on Business and Human Rights, the Universal Declaration of Human Rights, and the OECD Guidelines for Multinational Enterprises. Yet a detailed examination by the Electronic Frontier Foundation reveals a stark gap between those stated commitments and what Palantir’s tools actually enable when deployed by Immigration and Customs Enforcement.
The tension matters because Palantir’s ELITE system—the tool ICE uses for immigration enforcement—appears to power something far broader than the company claims. According to sworn testimony in Oregon and a leaked ELITE user guide, ICE agents use the system to determine where to conduct deportation sweeps, pulling from multiple data sources including Medicaid information from the Department of Health and Human Services to identify locations for raids aimed at mass detentions. Palantir describes ELITE’s role narrowly as helping ICE surface likely addresses of specific people with final removal orders or serious criminal charges. That description does not align with how the tool operates in practice.
- The Medical Data Connection: ICE uses Medicaid information through Palantir’s ELITE system to identify locations for mass detention raids.
- The Targeting Pattern: Nearly one in five ICE arrests involve Latine individuals with neither criminal history nor removal orders.
- The Accountability Gap: Palantir’s human rights framework lacks specific mechanisms to address documented misuse of its surveillance tools.
The EFF sent Palantir a detailed letter asking how the company’s human rights framework extends to its ICE work. The questions were specific: What human rights due diligence did Palantir conduct when first contracting with ICE? Did it perform the “proactive risk scoping” it advertises? What has it done in response to reports of misuse? Palantir’s response, while respectful, largely sidestepped accountability. Instead of engaging with the specifics, the company pointed to internal frameworks and legal compliance.
Is Legal Compliance the Same as Human Rights Protection?
But legality and process are not the same as human rights protection. Palantir’s response leans heavily on the Privacy Act of 1974 and inter-agency data sharing agreements, arguing that any data sharing is governed by legal requirements. Yet the company does not identify a specific lawful basis for using Medicaid data in dragnet sweeps, nor does it explain how its software enables that access. Even if a legal theory exists, converting sensitive medical information into fuel for mass detention operations is difficult to reconcile with commitments to privacy by design, equity, and the rights of affected communities.
The real-world impact is documented. Reporting shows that many people detained by ICE had no criminal record and no final removal order. An overwhelming percentage of those detained were from Central and South America. Nearly one in five ICE arrests were street arrests of a Latine person with neither a criminal history nor a removal order. These facts raise obvious questions about discriminatory impact and racial profiling. Palantir’s response does not meaningfully engage those questions, despite the company’s stated commitments to non-discrimination and due process.
How Does Facial Recognition Fit Into Palantir’s Civil Liberties Framework?
There is also the facial recognition issue. The EFF’s letter asked Palantir to explain how it honors its civil liberties commitments in light of reports linking Palantir-owned systems to facial recognition tools used to identify and target people observing or recording law enforcement. One incident involved an ICE officer scanning protesters’ faces and threatening to add their biometrics to a “nice little database.” Palantir denied involvement in any such database. But that narrow denial sidesteps the broader question: if ICE claims it has this capability, what has Palantir done to ensure its tools are not used to chill protected speech or facilitate targeting of people engaged in First Amendment activity?
Palantir also argues that audit logs and internal frameworks demonstrate its commitment to human rights. But audit logs alone do not protect human rights. History shows that authoritarian regimes kept extensive logs of their abuses. Logs and oversight frameworks are useful only if they trigger reassessment and lead to changes in design, access, or contract enforcement when credible reports of abuse emerge. The company did not specify what reports of misuse it has received, what changes it made, or on what timeline. Instead, it offered generic assurances without engaging specifics.
What Does Continuous Human Rights Due Diligence Actually Require?
Under the UN Guiding Principles, human rights due diligence is not a one-time approval at contract signing—it is continuous. Complaints, media reports, leaks, litigation, and sworn testimony should trigger review. If Palantir has a process for that work, it had every opportunity to describe it. Confidentiality may sometimes limit disclosure, but it cannot substitute for accountability. This challenge extends beyond Palantir to broader questions about data privacy regulation in the surveillance technology sector.
The gap between Palantir’s stated values and its ICE contract reflects a broader pattern: voluntary corporate human rights policies often function as weak accountability mechanisms. Companies can tout principles, publish policies, and answer criticism with polished statements while changing very little on the ground. When the record includes violent raids, dragnet detentions, use of sensitive medical data, discriminatory targeting, retaliation against observers, and deaths tied to immigration enforcement operations, pointing to a values page is insufficient.
The EFF has called on Palantir to reconsider its contract with ICE and with all agencies whose work predictably violates human rights. For now, the contradiction is clear: a company cannot claim to champion human rights while powering systems that enable mass detention based on medical data and discriminatory targeting. The question is whether Palantir will align its actions with its promises, or whether those promises remain merely words on a policy page. This case highlights the urgent need for stronger data protection frameworks that go beyond voluntary corporate commitments.
