The author actually attempted to exercise their privacy rights with Flock Safety and documented the result: polite corporate boilerplate followed by the realization that 'opting out' doesn't mean what a reasonable person would think it means. Their firsthand experience demonstrates that the opt-out process is designed to satisfy regulatory checkboxes rather than genuinely empower individuals to stop their data from being collected.
The editorial argues that Flock's key innovation was making cameras cheap enough for HOAs to purchase individually, then networking them into a shared law enforcement database — building a surveillance network 'one neighborhood at a time, without the kind of public debate that would accompany a city council voting to install cameras on every block.' This bottom-up model means decisions about mass surveillance are made by HOA boards rather than elected officials subject to public accountability.
The submission, which drew 553 upvotes, frames the story as a 'domestic spying program' and highlights that Flock captures plate data from every vehicle that passes its cameras, regardless of suspicion, across all 50 states. The implicit argument is that scale changes the nature of the technology: what might be acceptable as a single neighborhood camera becomes mass surveillance when networked into a national database queryable by law enforcement.
The editorial draws a sharp contrast between traditional ALPR systems — operated by police departments and subject to municipal oversight, public records requests, and departmental policy — and Flock's private model where a $7.5 billion company controls the infrastructure. By routing surveillance through private HOA purchases rather than government procurement, the system sidesteps the accountability mechanisms that theoretically constrain law enforcement data collection.
A blog post on honeypot.net, titled "I wrote to Flock's privacy contact to opt out of their domestic spying program," went viral on Hacker News this week, drawing 553 upvotes. The author did something most people never bother with: they actually tried to exercise their privacy rights with Flock Safety, a company that operates one of the largest automated license plate reader (ALPR) networks in the United States.
Flock Safety, founded in 2017 and valued at over $7.5 billion after its 2024 Series F, deploys solar-powered ALPR cameras to neighborhoods, HOAs, business districts, and — increasingly — directly to law enforcement agencies. The company now operates cameras in more than 5,000 cities and towns across all 50 states, capturing tens of millions of plate reads per day from every vehicle that passes, regardless of whether the driver is suspected of anything. The data is retained for 30 days by default, though law enforcement partners can extend retention periods significantly.
The author's experience trying to opt out followed a depressingly predictable pattern: polite initial contact, corporate boilerplate in response, and the eventual realization that "opting out" doesn't mean what a reasonable person would think it means.
The core issue isn't that Flock Safety exists — ALPR technology has been around for decades. The issue is the scale and the access model. Traditional ALPR systems were operated by police departments, subject (at least in theory) to municipal oversight, public records requests, and departmental policy. Flock's innovation was making the cameras cheap enough for HOAs to buy, then networking every camera into a shared database that law enforcement can query.
This is a surveillance network that was built bottom-up, one neighborhood at a time, without the kind of public debate that would accompany a city council voting to install cameras on every block. Your neighbor's HOA board — the same people who send passive-aggressive letters about lawn height — decided to plug your street into a nationwide tracking system. You weren't consulted.
The privacy contact exercise reveals the structural problem with consent-based frameworks applied to passive surveillance. When a camera photographs your car on a public road, you haven't "consented" to anything, but the data exists regardless. Flock's privacy policy technically offers opt-out mechanisms, but the practical effect is negligible. The camera still captures your plate. The image still enters the system. A flag in a database row is the only difference between "opted out" and "opted in," and that flag is only as durable as Flock's internal enforcement.
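To make that concrete, here is a minimal Python sketch of the anti-pattern. Everything in it (PlateRead, ingest, lookup, the opted_out flag) is hypothetical rather than Flock's actual schema; the point is that the record is written no matter what, and the opt-out exists only as a read-time filter.

```python
import datetime
from dataclasses import dataclass

@dataclass
class PlateRead:
    plate: str
    camera_id: str
    seen_at: datetime.datetime
    opted_out: bool = False  # the "flag in a database row"

DATABASE: list[PlateRead] = []
OPT_OUT_LIST: set[str] = set()

def ingest(plate: str, camera_id: str) -> None:
    # The camera captures every plate; opt-out status is merely annotated.
    DATABASE.append(PlateRead(
        plate=plate,
        camera_id=camera_id,
        seen_at=datetime.datetime.now(datetime.timezone.utc),
        opted_out=plate in OPT_OUT_LIST,  # the record is stored either way
    ))

def lookup(plate: str) -> list[PlateRead]:
    # "Privacy" is enforced only here, at query time, by policy.
    return [r for r in DATABASE if r.plate == plate and not r.opted_out]
```

Every opted-out read still sits in DATABASE. One policy change, one data-sharing agreement, or one unfiltered query path, and it is visible again.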
Privacy advocates, including the Electronic Frontier Foundation and the ACLU, have repeatedly flagged ALPR networks as one of the most potent mass surveillance tools in the domestic landscape. A 2024 investigation by the Associated Press found that some Flock customers had access to data from cameras they didn't own or operate, effectively creating a panopticon effect where any participating agency could track a vehicle's movements across jurisdictions. The practical result: if you drive through three towns that have Flock cameras, law enforcement in any of those towns — and potentially federal agencies via data-sharing agreements — can reconstruct your route, timing, and patterns without a warrant.
The HN discussion, which drove the story to the front page, focused heavily on the gap between Flock's marketing ("public safety," "solving crime") and the operational reality (mass collection on everyone, warrantless access, minimal oversight). Several commenters with law enforcement experience noted that ALPR data is routinely used for investigations that have nothing to do with the violent crimes featured in Flock's case studies — things like tracking people's movements during custody disputes, identifying attendees at political rallies, or simply building dossiers on individuals who haven't been charged with anything.
If you're a developer, there are several layers of relevance here.
If you're building anything that touches vehicle data, location data, or IoT sensor networks, Flock is a masterclass in how to architect a system where privacy controls are an afterthought bolted onto a collection-first design. The lesson: if your data pipeline captures everything and filters later, your "privacy controls" are policy decorations, not technical guarantees. Real privacy requires not collecting the data, or collecting it ephemerally with cryptographic enforcement of retention limits — not a boolean column in Postgres.
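As one sketch of what cryptographic enforcement of retention could look like, the snippet below uses crypto-shredding: each day's records are encrypted under a per-day key, and expired data is "deleted" by destroying the key, which makes even backed-up ciphertext unreadable. It assumes the third-party cryptography package, and all names here are illustrative, not anyone's production design.

```python
import datetime
from cryptography.fernet import Fernet  # pip install cryptography

DAY_KEYS: dict[datetime.date, bytes] = {}            # in practice, a KMS or HSM
CIPHERTEXTS: list[tuple[datetime.date, bytes]] = []  # what backups would hold

def store(record: bytes) -> None:
    # Encrypt each record under a key scoped to the day it was captured.
    today = datetime.date.today()
    key = DAY_KEYS.setdefault(today, Fernet.generate_key())
    CIPHERTEXTS.append((today, Fernet(key).encrypt(record)))

def enforce_retention(max_age_days: int = 30) -> None:
    # Destroying a day's key renders that day's ciphertexts unrecoverable,
    # including copies that live on in backups or replicas.
    cutoff = datetime.date.today() - datetime.timedelta(days=max_age_days)
    for day in [d for d in DAY_KEYS if d < cutoff]:
        del DAY_KEYS[day]  # the cryptographic "delete"
```

The retention limit becomes a property of the system rather than a promise: once the key is gone, no policy reversal can resurrect the expired reads.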
If you're integrating with third-party data providers, ask the hard question: where did this data come from, and did the subjects have a meaningful way to prevent its collection? The regulatory landscape is shifting. States like California (via CCPA/CPRA), Montana, and Vermont have begun restricting ALPR data retention and sharing. Illinois's BIPA has been used to challenge biometric surveillance systems in court. If your product depends on data from surveillance networks with questionable consent models, you're building on a foundation that legislation could yank out from under you.
If you're an engineering leader at a company that sells to government or law enforcement, the Flock story is a preview of the reputational and regulatory risks ahead. The companies building the infrastructure of mass surveillance are going to face the same public reckoning that social media platforms faced over data harvesting — the only question is when. Engineers who helped build these systems without pushing back on collection scope or retention policies will one day look back on that work the way early Facebook engineers now look back on applying "move fast and break things" to election integrity.
The trajectory here is clear: ALPR networks will get denser, the data will get richer (Flock's newer cameras capture vehicle make, model, color, and distinguishing features — not just plates), and the legal frameworks will lag behind deployment by years. The opt-out exercise documented in this blog post isn't just one person's frustrating afternoon with a privacy@ inbox. It's an existence proof that the consent model for passive surveillance infrastructure is broken by design. For developers, the takeaway is to build systems that earn trust through architecture — data minimization, cryptographic deletion, zero-knowledge proofs where possible — rather than through privacy policies that promise what the technology doesn't enforce.
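For a flavor of what data minimization can mean in this domain, the sketch below matches plates against a hotlist at the edge and discards everything else, so non-matching reads never enter any database. The salted-HMAC hotlist is an illustrative design under assumed names, not a documented Flock feature.

```python
import hashlib
import hmac

SALT = b"per-deployment-secret"  # rotated out-of-band; illustrative only

def plate_token(plate: str) -> str:
    # Keyed hash so raw plates never need to be stored or transmitted.
    return hmac.new(SALT, plate.encode(), hashlib.sha256).hexdigest()

# Hotlist holds tokens only for plates with an active legal basis (e.g. a warrant).
HOTLIST: set[str] = {plate_token("ABC1234")}

def on_plate_read(plate: str) -> str | None:
    # Match at the camera; a non-match is dropped on the spot, never stored.
    token = plate_token(plate)
    return token if token in HOTLIST else None
```

The design choice worth noting: privacy here is a property of what the system never stores, not of what a policy promises to delete.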
I noticed that the company is glossed as "Flock" and not "Flock Safety (YC S17)" in posts like this and last week's "US cities are axing Flock Safety surveillance technology", https://news.ycombinator.com/item?id=47689237. Did YC house style change a
Flock has stonewalled with the "we are not the controllers" excuse here in MN too. We have similar rights to opt out and delete under the MCDPA [0].

[0] https://ag.state.mn.us/Data-Privacy/Consumer/
Remember that the difference between "Flock can do whatever the hell it wants" and "Flock is required to delete your data at your request" is a law. Citizens vote for legislators. If you want this to be a higher priority for your legislators, buy them off. Or vote for/against
I think you're going to have a hard time with this... Flock seems to leave the data in ownership of the government. They are just providing the service of being custodians for storing and accessing that data. You probably would get a similar response by submitting your request to Amazon Web Services
I wrote this. I had/have absolutely no expectation that Flock would comply with my request, but figured I should try anyway, For Science. Their reply rubbed me wrong, though. They seem to claim that there are no restrictions on their collection and processing of PII because other people pay them