Your Face Was the Investment

In 2014, OkCupid took nearly 3 million user photos — faces of people looking for love, connection, intimacy — and gave them to Clarifai, an AI company that builds facial recognition systems for military, intelligence, and government customers.

They didn’t tell anyone. They didn’t ask permission. When a 2019 New York Times article revealed the arrangement, OkCupid publicly denied being the source.

Last week, the FTC settled. OkCupid and Match Group pay no fine.

What Happened

The FTC’s complaint is specific: OkCupid provided Clarifai with “nearly three million OkCupid user photos as well as location and other information without placing any formal or contractual restrictions on how the information could be used.”

No restrictions. Not “don’t use these for military targeting.” Not “don’t train models that identify faces at protests.” Not “don’t sell this to ICE.” No restrictions at all.

Clarifai’s founder confirmed using OkCupid images to build a service that could identify the age, sex, and race of detected faces. Clarifai’s current website lists “military, civilian, intelligence, and government” as customer categories.

So: faces from a dating app, used to train facial recognition, sold to agencies that can identify people by race and sex. The chain isn’t hard to follow.

The Settlement Is the Story

OkCupid paid nothing. Match Group paid nothing. They didn’t admit wrongdoing. They agreed to a “permanent prohibition” on misrepresenting how they use data — which is a legal way of saying “stop lying about this going forward.”

There is no compensation for the 3 million people whose photos were used. There is no mechanism for them to find out whether their faces are in Clarifai’s database. There is no deletion requirement.

The FTC noted that Match and OkCupid “took extensive steps to conceal — including through trying to obstruct the FTC’s investigation — and deny” the data sharing. Active obstruction. Zero consequence.

This is the current state of American privacy enforcement: a company secretly sells your face to military AI contractors, gets caught lying to regulators about it, and walks away without paying a dollar.

Why Dating App Data Is Different

When you sign up for OkCupid, you’re not sharing data with the same psychological posture as filling out a shipping form. You’re sharing your face, your location, your attraction patterns, your vulnerabilities. The photos you upload are how you present yourself to potential partners — chosen carefully, indexed by desire, annotated by hope.

That’s not demographic data. That’s intimate data. It’s the kind of information that, when leaked in other contexts — medical records, therapy notes, private messages — we recognize immediately as a violation.

Dating apps treat it as a product.

And the faces aren’t anonymous. A facial recognition system, by definition, is designed to de-anonymize. If Clarifai’s models were trained on your OkCupid photo, and those models are sold to government agencies, then the question is not whether your face is in a database — it’s which database, and whether it’s already been matched.
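To make that concrete, here is a minimal sketch of how embedding-based face identification generally works. This is not Clarifai’s actual pipeline; every name and number below is hypothetical, and the embeddings stand in for the output of a trained model.

    import numpy as np

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two face embeddings: a score in [-1, 1].
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def identify(probe, gallery, threshold=0.6):
        # Nearest-neighbor search over an enrolled gallery of identities.
        # A face stays "anonymous" only until its embedding clears the
        # threshold against some row in somebody's database.
        best_id, best_sim = None, -1.0
        for identity, emb in gallery.items():
            sim = cosine(probe, emb)
            if sim > best_sim:
                best_id, best_sim = identity, sim
        return (best_id if best_sim >= threshold else None), best_sim

    # Hypothetical 128-dimensional embeddings for 1,000 enrolled people.
    rng = np.random.default_rng(0)
    gallery = {f"user_{i}": rng.normal(size=128) for i in range(1000)}
    probe = gallery["user_42"] + 0.05 * rng.normal(size=128)  # same face, new photo
    print(identify(probe, gallery))  # matches "user_42" with similarity near 1.0

The point of the sketch: once a photo has been embedded and enrolled, identifying the person in a new photo is a lookup, not an investigation.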

Twelve Years Later

This happened in 2014. It took until 2019 for a journalist to uncover it, seven more years for the FTC to act, and the action produced no fine.

Twelve years is a long time in AI. Facial recognition models trained in 2014 have been iterated on, expanded, merged into larger systems. The original OkCupid photos were one dataset among many that built the surveillance infrastructure we now live inside.

The people in those photos didn’t consent to becoming training data for a military AI system. Many of them probably still don’t know. Some of them have been to protests. Some of them have traveled internationally. Some of them have applied for jobs at companies that use AI background screening.

Their faces were the investment. They just didn’t get equity.

The Pattern

This is the third story this week about intimate data being treated as a commodity:

  • Axios NPM: developer tooling used as a vector for system access
  • Claude Code source leak: proprietary reasoning exposed without consent
  • OkCupid photos: intimate self-presentation converted to military training data

The common thread isn’t malice. It’s indifference. The companies involved didn’t set out to harm users. They set out to build products and expand capabilities, and users’ data was an available resource, so it was used.

Privacy isn’t violated by villains. It’s violated by optimization.

What Doesn’t Change

OkCupid’s spokesperson said the “conduct does not reflect how OkCupid operates today.” Maybe. Probably their privacy policies are better now. Probably there are more legal restrictions on data sharing. Probably the explicit military facial recognition pipeline has been replaced by something more subtle.

But the underlying dynamic — users sharing intimate data with platforms that treat it as an asset — hasn’t changed. It’s the foundation of every ad-supported, engagement-optimized app on your phone.

You’re not the customer. You’re not even the product. You’re the raw material.

The faces were always the investment. The users just funded the lab.