Flock's Privacy Opt-Out Is a Delete Button Wired to /dev/null

5 min read · 1 source · clear_take
├── "Flock Safety's opt-out is architecturally impossible — the system is append-only by design and cannot honor deletion requests"
│  └── honeypot.net author (honeypot.net) → read

The author personally attempted to exercise their data rights with Flock Safety and documented the Kafkaesque result: opting out requires handing over your license plate and personal information, feeding more data into the system you're trying to escape. Even if historical records are deleted, the cameras immediately re-capture your plate with no persistent block list or do-not-track registry.

├── "The real danger is the low cost and decentralized deployment model — a single HOA vote can impose mass surveillance without meaningful consent"
│  └── top10.dev editorial (top10.dev) → read below

The editorial highlights that Flock's Falcon cameras cost roughly $2,500/year per unit, cheap enough that a single HOA board vote can blanket an entire neighborhood in surveillance infrastructure without broader resident consent. This democratization of deployment means surveillance decisions are being made by small unaccountable bodies across 5,000+ communities in 40+ states.

├── "This is a familiar technical pattern — compliance theater layered over a pipeline that structurally cannot comply"
│  └── top10.dev editorial (top10.dev) → read below

The editorial frames Flock's privacy process as a recognizable architectural anti-pattern: an append-only data pipeline with edge capture, cellular backhaul, cloud storage, and inter-agency sharing that multiplies copies. The technical community's reaction centered on recognizing that the opt-out exists on paper but is structurally incapable of delivering what it promises — a GDPR-style right layered atop infrastructure designed to make deletion meaningless.

└── "The scale of Flock's network — billions of plate reads across a $7.5B private company — represents a qualitative shift in domestic surveillance"
  └── honeypot.net author (honeypot.net) → read

By documenting their personal experience with Flock's privacy process, the author draws attention to the sheer scale: Flock operates the largest private ALPR network in the US, processes billions of plate reads annually, and captures 700+ vehicle attributes per read. The post frames this not as a single company's privacy failing but as the emergence of a privately-operated mass surveillance infrastructure with no meaningful oversight.

What Happened

A privacy researcher writing at honeypot.net did something most people never bother with: they actually tried to exercise their data rights with Flock Safety, the Atlanta-based company that operates the largest private automated license plate reader (ALPR) network in the United States. They contacted Flock's privacy team, requested information about what data was held on their vehicle, and attempted to opt out of ongoing surveillance.

The post, which hit 624 points on Hacker News, documents what anyone who's built a data pipeline could have predicted: the opt-out process exists on paper but is architecturally incapable of delivering what it promises. To submit an opt-out request, you must provide Flock with your license plate number and personal identifying information — effectively feeding more data into the system you're trying to escape. Even if Flock deletes your historical records, your plate is re-captured the next time you drive past any camera in their network. There is no persistent opt-out. There is no block list. The cameras don't check a do-not-track registry before firing.

Flock Safety has deployed cameras in over 5,000 cities and communities across 40+ states. Their network processes billions of plate reads annually. The company, valued at over $7.5 billion after a Series E led by Andreessen Horowitz, offers its Falcon cameras to police departments, HOAs, and business improvement districts at roughly $2,500/year per unit — cheap enough that a single HOA board vote can blanket an entire neighborhood in surveillance infrastructure without broader resident consent.

Why It Matters

The technical community's reaction wasn't just about privacy outrage — it was about recognizing a familiar architectural pattern. Flock's system is append-only by design: cameras capture, edge processors extract plate data and 700+ vehicle attributes, cellular backhaul streams everything to cloud storage, and inter-agency sharing multiplies the copies. Deleting a record from one agency's view doesn't touch the raw captures, the derived analytics, the shared query logs, or the pattern-of-life models that have already ingested the data point.

This is the same problem every developer who's built an event-sourced system or an analytics pipeline understands intuitively. Once data enters a fan-out architecture, "deletion" becomes a distributed consensus problem that most systems simply don't solve. Flock's 30-day default retention window (configurable up to a year or more by purchasing agencies) means your movement history persists across dozens of independent data stores with different retention policies, different legal jurisdictions, and different access controls.
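The mismatch is easy to demonstrate with a toy model — illustrative store names and retention values, nothing Flock-specific:

```python
import time

# Toy sketch of an append-only fan-out: every capture is copied into
# independent stores with their own retention policies. Names and numbers
# are invented for illustration.
class Store:
    def __init__(self, name, retention_days):
        self.name = name
        self.retention_days = retention_days
        self.records = []  # (timestamp, plate) tuples

    def ingest(self, ts, plate):
        self.records.append((ts, plate))

    def delete_plate(self, plate):
        self.records = [(t, p) for t, p in self.records if p != plate]

def capture(stores, ts, plate):
    # Fan-out at ingest: every downstream store gets its own copy.
    for s in stores:
        s.ingest(ts, plate)

stores = [Store("agency_a", 30), Store("agency_b", 365), Store("analytics", 9999)]
capture(stores, time.time(), "ABC1234")

# An opt-out honored by one agency touches exactly one copy.
stores[0].delete_plate("ABC1234")
surviving = [s.name for s in stores if any(p == "ABC1234" for _, p in s.records)]
# "Deletion" removed one copy; the other stores still hold the read.
```

The deletion API exists and works, yet the data survives — the failure is in the topology, not the endpoint.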

The Vehicle Fingerprint technology makes this worse than a simple plate-number problem. Flock's ML models identify vehicles by make, model, color, body damage, bumper stickers, roof racks, and other distinguishing features. Even if you could somehow suppress your plate number across the entire network, the system can still track your vehicle by its visual signature. This isn't a privacy bug — it's the core product feature that Flock sells to law enforcement.

What makes the Hacker News discussion particularly sharp is the recognition that privacy regulations like CCPA and GDPR assume a data model where deletion is technically feasible. They were written for databases, not for distributed surveillance networks with real-time inter-agency sharing. Flock has argued it operates as a "service provider" to law enforcement, which complicates consumer rights claims under California law. The legal framework assumes you have a relationship with the data controller. You don't have a relationship with Flock — you just drove past a pole.

The Architecture That Defeats Privacy by Design

For practitioners, the Flock case is a masterclass in how system architecture determines privacy outcomes long before any policy is written. Consider the data flow:

1. Edge capture: Solar-powered cameras with embedded ML processors perform on-device plate recognition and vehicle fingerprinting. Data is extracted at the point of capture — there's no "raw only" mode to intercept.

2. Cloud aggregation: Extracted data streams to FlockOS via LTE. Each subscribing agency gets its own view, but the underlying data lake is shared.

3. Cross-agency sharing: Flock's "Wing" feature lets law enforcement set virtual tripwires across cameras they don't own. A query from Agency A can surface captures from Agency B's cameras. This means your opt-out request to one jurisdiction doesn't propagate to the dozens of other agencies that may have already queried or cached your data.

4. Derived analytics: Pattern-of-life analysis, route mapping, and frequency scoring create second-order data products that persist independently of the raw captures.

This is a textbook example of what privacy engineers call the "deletion propagation problem." In a microservices architecture, you'd need to implement something like a GDPR deletion cascade — an event that propagates through every service, every cache, every materialized view, every backup, and every downstream consumer. Most companies with far simpler architectures than Flock's struggle to do this correctly. A distributed surveillance network with hundreds of independent law enforcement customers? The deletion request isn't just hard to honor — it's structurally incoherent.
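The cascade can be pictured as a graph walk — a hypothetical pipeline graph, not Flock's actual topology:

```python
from collections import deque

# Sketch: model the pipeline as a directed graph of data flows, then walk it
# to enumerate every node a deletion request must reach. Node names are
# illustrative, not Flock's real components.
flows = {
    "camera":         ["edge_processor"],
    "edge_processor": ["cloud_lake"],
    "cloud_lake":     ["agency_a_view", "agency_b_view", "analytics"],
    "agency_a_view":  ["agency_a_cache"],
    "agency_b_view":  [],
    "analytics":      ["pattern_of_life_model"],
}

def deletion_targets(source, flows):
    # BFS: every node reachable from the capture point holds a copy (or a
    # derivative) and therefore carries its own deletion obligation.
    seen, queue = set(), deque([source])
    while queue:
        node = queue.popleft()
        for downstream in flows.get(node, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return sorted(seen)

targets = deletion_targets("camera", flows)
# A single opt-out must fan out to every reachable node — and any edge
# missing from this map is a store the cascade silently never reaches.
```

The graph above is small and fully known; real systems fail because the map is incomplete, not because the traversal is hard.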

What This Means for Your Stack

If you're building anything that touches personal data — and in 2026, that's most of us — the Flock case is a concrete example of why privacy-by-design has to happen at the architecture level, not the policy level.

Audit your fan-out. Every time data enters a pipeline that copies, transforms, or shares it with downstream consumers, you're creating a new deletion obligation. If you can't enumerate every place a user's data might land, your "delete my data" button is as fictional as Flock's opt-out form. Event-sourced systems, analytics pipelines, and ML training sets are the usual suspects.
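One way to make that enumeration tractable — an assumed pattern, not any specific library's API — is to route every write path through a sink registry, so "where could this user's data be?" is answerable by lookup instead of archaeology:

```python
# Hypothetical sink registry: every place data can land is declared once,
# with flags for whether it holds PII and whether it can honor a delete.
DATA_SINKS = []

def register_sink(name, holds_pii, deletable):
    DATA_SINKS.append({"name": name, "holds_pii": holds_pii, "deletable": deletable})

register_sink("postgres.users", holds_pii=True, deletable=True)
register_sink("s3://backups/daily", holds_pii=True, deletable=False)   # cold backups
register_sink("warehouse.events", holds_pii=True, deletable=True)
register_sink("ml.training_set_v3", holds_pii=True, deletable=False)   # baked into a model

# Your "delete my data" button is honest only if this list is empty:
undeletable_pii = [s["name"] for s in DATA_SINKS if s["holds_pii"] and not s["deletable"]]
```

Any sink that appears in `undeletable_pii` is a gap between what your privacy policy promises and what your architecture can deliver.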

Design for tombstones, not just deletes. If your system can't propagate a "forget this entity" signal to every downstream consumer — including caches, backups, and derived datasets — then you don't actually support deletion, and your privacy policy shouldn't claim you do. Cryptographic deletion (destroying the key rather than every copy) is one pattern that actually works at scale, but it requires encrypting per-entity from day one.
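Cryptographic deletion can be sketched as follows — a toy illustration assuming a per-entity key store; the XOR keystream below is a stand-in for a real AEAD cipher such as AES-GCM and must not be used in production:

```python
import hashlib
import secrets

# Crypto-shredding sketch: each entity's records are encrypted under a
# per-entity key, so "deletion" = destroying the key — every copy of the
# ciphertext, everywhere, becomes unreadable at once.
KEYS = {}  # entity_id -> key; in practice a dedicated key-management service

def keystream(key, n):
    # Deterministic keystream from the key (toy counter-mode construction).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_for(entity_id, plaintext: bytes) -> bytes:
    key = KEYS.setdefault(entity_id, secrets.token_bytes(32))
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

def decrypt_for(entity_id, ciphertext: bytes) -> bytes:
    key = KEYS[entity_id]  # raises KeyError once the key is shredded
    return bytes(a ^ b for a, b in zip(ciphertext, keystream(key, len(ciphertext))))

record = encrypt_for("user-42", b"plate=ABC1234 lat=33.7 lon=-84.4")
assert decrypt_for("user-42", record) == b"plate=ABC1234 lat=33.7 lon=-84.4"

# "Forget user-42": destroy the key; all copies of `record` are now noise.
del KEYS["user-42"]
```

Because backups, replicas, and exports all hold only ciphertext, destroying the one key is equivalent to deleting every copy — which is exactly why the per-entity encryption has to be in place before the first byte is written.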

Treat data sharing as a liability multiplier. Flock's inter-agency sharing is the feature that makes their opt-out impossible. Every API consumer, every webhook subscriber, every data export is another node in your deletion propagation graph. The more you share, the harder deletion becomes. This isn't an argument against APIs — it's an argument for data minimization at the boundary.

Looking Ahead

The Flock case will likely become a reference point in the growing push for ALPR-specific regulation at the federal level. New Hampshire, Maine, and Montana have already passed targeted laws, and several other states have bills in committee. But the deeper lesson transcends surveillance cameras: the gap between privacy regulation (which assumes deletion is possible) and system architecture (which often makes it impossible) is the defining tension in data governance right now. For developers, the takeaway is blunt — if you can't `DELETE FROM` everywhere your data has been, your privacy page is marketing copy. The architecture is the policy.

Hacker News 624 pts 243 comments

I wrote to Flock's privacy contact to opt out of their domestic spying program

→ read on Hacker News
kstrauser · Hacker News

I wrote this. I had/have absolutely no expectation that Flock would comply with my request, but figured I should try anyway, For Science. Their reply rubbed me wrong, though. They seem to claim that there are no restrictions on their collection and processing of PII because other people pay them.

empathy_m · Hacker News

I noticed that the company is glossed as "Flock" and not "Flock Safety (YC S17)" in posts like this and last week's "US cities are axing Flock Safety surveillance technology", https://news.ycombinator.com/item?id=47689237. Did YC house style change a

wcv · Hacker News

Flock has stonewalled with the "we are not the controllers" excuse here in MN too. We have similar rights to opt-out and delete under the MCDPA [0].

[0] https://ag.state.mn.us/Data-Privacy/Consumer/

dsr_ · Hacker News

Remember that the difference between "Flock can do whatever the hell it wants" and "Flock is required to delete your data at your request" is a law. Citizens vote for legislators. If you want this to be a higher priority for your legislators, buy them off. Or vote for/against

ldoughty · Hacker News

I think you're going to have a hard time with this... Flock seems to leave the data in ownership of the government. They are just providing the service of being custodians for storing and accessing that data. You probably would get a similar response by submitting your request to Amazon web servi
