
The Seven-Kilometer Run

On March 13, at 10:35 AM, a French Navy officer named Arthur went for a run. Seven kilometers, thirty-five minutes, around the flight deck of the aircraft carrier Charles de Gaulle.

He logged it on Strava.

His profile was public.

Le Monde — the newspaper, not an intelligence agency — pinpointed the exact location of France’s only aircraft carrier in the Mediterranean, 100 kilometers off the coast of Turkey, in real time. The carrier was heading toward the Middle East following President Macron’s order to deploy after Israel and the United States attacked Iran.

This is not a new vulnerability. Le Monde's first “StravaLeaks” investigation had already exposed the same flaw. The French military was warned. The flaw remains unpatched — not in the software, but in the humans who use it.

The Threat Model Is Wrong

Military security is built around the assumption that secrets are actively protected. Classification systems. Secure communications. Need-to-know hierarchies. The entire apparatus assumes that information leaks require intent — a spy, a hack, a stolen document.

Strava doesn’t fit this model. Nobody stole anything. Nobody hacked anything. A sailor went for a run and his watch uploaded the data. The data was public. Le Monde just looked.

This is the gap between the threat model and the actual threat. The model assumes adversaries need capability. The reality is that ambient data makes capability unnecessary. You don’t need satellites to track an aircraft carrier. You need Strava.

The Pattern

This keeps happening because the failure is structural, not individual.

Arthur isn’t stupid. He’s a young officer who uses the same fitness app that millions of people use. He probably didn’t think about it. That’s exactly the point — security that requires every individual to constantly think about it is not security. It’s hope.

The deeper issue: we now generate data about our physical location constantly, through devices we carry voluntarily, via services designed to share that data by default. The privacy settings exist, but the default is public. The default is always public. And defaults are where security lives or dies.

Military organizations have spent decades hardening their information systems against attacks. They have not figured out how to harden their personnel against convenience.

What It Means For Everyone

If a fitness app can locate an aircraft carrier in real time, consider what the same data reveals about you.

Your running routes show where you live and work. Your workout times show your schedule. Your heart rate data shows your health. Your activity patterns show when you travel and where.

This isn’t hypothetical. It’s the data model that every fitness app, every smartwatch, every phone generates continuously. The aircraft carrier is just the most dramatic demonstration of what “public by default” actually means.
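How little analysis this takes is worth making concrete. The sketch below is illustrative, not Strava's actual API: it assumes only that public activities expose GPS start coordinates, which heatmaps and activity feeds routinely do. Rounding coordinates to three decimal places (roughly a 100-meter grid) and counting the most frequent cell is usually enough to single out a home or workplace.

```python
from collections import Counter

# Hypothetical sample: start coordinates (lat, lon) of public runs.
# Values are invented for illustration.
run_starts = [
    (48.8581, 2.2941),  # weekday runs from the same doorstep
    (48.8583, 2.2943),
    (48.8584, 2.2944),
    (48.8606, 2.3376),  # one-off run elsewhere
    (48.8582, 2.2941),
    (48.8581, 2.2944),
]

def likely_home(points, precision=3):
    """Cluster start points by rounding coordinates (~100 m grid at
    this precision) and return the most frequent cell with its count.
    The dominant cell is almost always home or work."""
    cells = Counter(
        (round(lat, precision), round(lon, precision))
        for lat, lon in points
    )
    return cells.most_common(1)[0]

cell, count = likely_home(run_starts)
print(cell, count)  # dominant start cell and how often it appears
```

No machine learning, no satellite feed: a frequency count over public data. That is the asymmetry the carrier story illustrates.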

The seven-kilometer run didn’t compromise national security because of what it revealed about the Charles de Gaulle’s location — the deployment was publicly announced. It compromised it because it proved, again, that the military cannot control the data its own personnel generate through consumer technology.

And if they can’t, neither can you.