You Can't Opt Out of Flock's 40,000-Camera Surveillance Network

5 min read 1 source clear_take
├── "The data processor vs. data controller distinction creates a deliberate accountability vacuum in ALPR surveillance"
│  └── speckx (honeypot.net) → read

Documented firsthand how Flock Safety's opt-out process leads nowhere by design. Flock claims to be a data processor and redirects privacy requests to their customers (HOAs, police departments), while those customers say Flock handles the data — creating a circular runaround where nobody takes responsibility.

├── "Existing privacy frameworks like CCPA/CPRA are fundamentally inadequate for mass public-space surveillance by private companies"
│  └── top10.dev editorial (top10.dev) → read below

The editorial argues that Flock Safety's legal position — that license plates scanned on public roads may not constitute personal information under CCPA/CPRA — exposes a structural gap in privacy law. Current frameworks were not designed to handle private companies operating 40,000+ cameras across 5,000 cities with no meaningful individual recourse.

└── "The scale and normalization of private ALPR networks represents a mass domestic surveillance apparatus hiding in plain sight"
  ├── @Hacker News community (Hacker News, 581 pts)

The post's 581 points and 232 comments indicate strong community resonance with the idea that Flock Safety's network — serving 2,000+ law enforcement agencies and thousands of private entities — constitutes a surveillance infrastructure that most people don't realize exists or understand the scope of. The engagement suggests this concern has been building for some time.

  └── speckx (honeypot.net) → read

Framed the opt-out attempt as exercising 'what most people assume is a basic right' against a company that captures plate numbers, timestamps, locations, direction of travel, and vehicle descriptions for every passing car. The mundane, bureaucratic nature of the failed opt-out was presented as more alarming than any dramatic leak would be.

What happened

A user writing on honeypot.net documented their attempt to exercise what most people assume is a basic right: opting out of a private company's surveillance program. They wrote to Flock Safety's privacy contact requesting removal from the company's automated license plate reader (ALPR) network — a system that captures, processes, and stores the plate number, timestamp, location, direction of travel, and vehicle description of every car that passes one of their cameras.

Flock Safety, founded in 2017 in Atlanta and now valued at $3.5 billion after a Series E led by Andreessen Horowitz, operates over 40,000 cameras across more than 5,000 US cities. Their customers include over 2,000 law enforcement agencies, plus thousands of HOAs, businesses, and private communities. The post rocketed to 581 points on Hacker News, suggesting it struck a nerve that has been raw for a long time and is only getting rawer.

The core revelation wasn't dramatic. There was no leaked memo, no whistleblower. It was something worse: the mundane, bureaucratic reality that the opt-out process is designed to lead nowhere.

The data processor shell game

Flock Safety's standard response to privacy requests relies on a legal distinction that most non-lawyers wouldn't catch. They position themselves as a data processor, not a data controller. In privacy law, this distinction matters enormously: the controller decides what data is collected and why; the processor merely handles it on the controller's behalf. Flock's argument is that the HOA, city, or police department that purchased the camera is the controller, and individuals should direct their requests there.

This creates a perfect accountability vacuum: Flock tells you to contact their customer, the customer says Flock handles the data, and nobody takes responsibility. Under California's CCPA/CPRA framework, Flock has further argued that license plates scanned on public roads may not constitute "personal information" in certain interpretations, or that public safety exemptions apply.

The practical result: there is no mechanism to prevent your plate from being scanned, no reliable way to get your data deleted before the retention window expires, and no single entity that will acknowledge responsibility for the collection. The 30-day default retention period sounds reasonable until you realize that (a) law enforcement customers can set their own, longer retention periods, and (b) the metadata about your movements has already been cross-referenced, alerted on, and potentially shared across the Flock network the moment you drove past.
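The retention problem is easy to see in code. Here is a minimal sketch (a hypothetical policy engine, not Flock's actual system, whose internals are not public) of why a "30-day default" guarantees little once customers can override it:

```python
from datetime import datetime, timedelta

DEFAULT_RETENTION_DAYS = 30

# Hypothetical per-customer overrides; customer IDs are illustrative.
customer_retention = {
    "hoa-1234": None,   # uses the 30-day default
    "pd-5678": 365,     # law enforcement customer with a longer window
}

def purge_cutoff(customer_id: str, now: datetime) -> datetime:
    """Return the timestamp before which scans become eligible for deletion."""
    days = customer_retention.get(customer_id) or DEFAULT_RETENTION_DAYS
    return now - timedelta(days=days)

now = datetime(2025, 6, 1)
print(purge_cutoff("hoa-1234", now).date())  # 2025-05-02, the 30-day default
print(purge_cutoff("pd-5678", now).date())   # 2024-06-01, a one-year override
```

The "default" constrains only the customers who never change it, and nothing in this design touches data that has already been alerted on or shared before the cutoff arrives.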

Why developers should care about the infrastructure angle

The technical architecture of Flock's system is what makes it qualitatively different from traditional surveillance. Each Flock Falcon camera is solar-powered and LTE-connected. No wiring. No permits. No infrastructure approval. A neighborhood HOA board can vote to install cameras at every entrance and have them operational in days, feeding data into a searchable cloud platform accessible to law enforcement partners across the network.

This is mass surveillance deployed with the operational friction of installing a Ring doorbell, but with the data aggregation capabilities of a federal agency. Flock's "Vehicle Fingerprint" technology goes beyond plate recognition — it captures make, model, color, and distinguishing features like bumper stickers and roof racks, meaning vehicles can be tracked even with unreadable or missing plates.
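The plateless-tracking claim follows from simple attribute matching. A toy illustration (Flock's actual Vehicle Fingerprint algorithm is proprietary; the fields and data here are invented) shows how a sighting with no readable plate still clusters with plated sightings of the same vehicle:

```python
from collections import defaultdict

# Invented scan records for illustration.
scans = [
    {"plate": "8ABC123", "make": "Toyota", "model": "Tacoma",
     "color": "white", "features": {"roof rack"}},
    {"plate": None, "make": "Toyota", "model": "Tacoma",
     "color": "white", "features": {"roof rack"}},        # plate unreadable
    {"plate": "6XYZ789", "make": "Honda", "model": "Civic",
     "color": "blue", "features": set()},
]

def fingerprint(scan):
    """Hashable attribute tuple standing in for a 'vehicle fingerprint'."""
    return (scan["make"], scan["model"], scan["color"], frozenset(scan["features"]))

# Group sightings by fingerprint: the plateless scan lands with the plated one.
groups = defaultdict(list)
for s in scans:
    groups[fingerprint(s)].append(s)

print(len(groups[fingerprint(scans[1])]))  # 2: plateless sighting linked to a known plate
```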

For developers, the Flock model illustrates a pattern worth studying (and worrying about). The law enforcement API enables integration with CAD and RMS systems. Data cross-references with NCIC stolen vehicle databases. The "FlockOS" platform supports geofences, hot lists, and automated alerts. Their "Total Analytics" platform enables real-time data sharing across the entire network. This is not a camera company — it's a data platform with cameras as the collection layer.
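The hot-list-plus-geofence pattern described above can be sketched in a few lines. This is a hypothetical reconstruction of the pattern, not Flock's API (which is not public); the function names, fence format, and data are all illustrative:

```python
# Plates flagged for alerting, e.g. mirrored from a stolen-vehicle list.
hot_list = {"8ABC123", "7DEF456"}

def geofence_hit(lat: float, lon: float, fence) -> bool:
    """Axis-aligned bounding box check; real systems use polygons."""
    (lat_min, lat_max), (lon_min, lon_max) = fence
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

def alerts_for(scan, fences):
    """Evaluate one scan against every rule; every camera runs this on every car."""
    out = []
    if scan["plate"] in hot_list:
        out.append(("hot_list", scan["plate"]))
    for name, fence in fences.items():
        if geofence_hit(scan["lat"], scan["lon"], fence):
            out.append(("geofence", name))
    return out

fences = {"downtown": ((33.74, 33.78), (-84.42, -84.36))}
scan = {"plate": "8ABC123", "lat": 33.75, "lon": -84.39}
print(alerts_for(scan, fences))  # [('hot_list', '8ABC123'), ('geofence', 'downtown')]
```

The point of the sketch is the shape of the system: collection is unconditional, and policy lives entirely in the rules layer, which the data subject never sees.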

The Hacker News discussion surfaced a recurring developer concern: when you make surveillance infrastructure this easy to deploy and this easy to network, you create capabilities that would constitutionally require a warrant if a government agency built them directly. The private-to-public data pipeline is the loophole. An HOA's camera data, voluntarily shared with local police via Flock's platform, becomes accessible to the broader law enforcement network without any individual ever being suspected of a crime.

The legal patchwork is not keeping up

There is no federal law governing ALPR use in the United States. The regulatory landscape is a patchwork that ranges from Virginia's strict 7-day retention limit to states with no restrictions at all. New Hampshire banned private ALPR use entirely in 2007. Montana requires warrants for ALPR data access. Most states have done nothing.

The Supreme Court's 2018 decision in *Carpenter v. United States* — which ruled that accessing historical cell phone location data requires a warrant — is the most relevant precedent. Multiple legal scholars and advocacy organizations (EFF, ACLU) have argued that the same logic applies to ALPR data that tracks vehicle movements over time. A single plate scan is an observation; a network of 40,000 cameras logging your movements across months is a search.

But Carpenter addressed government access to data held by a third party (cell carriers). Flock's model adds another layer: the data is collected by a private company, owned by a private customer, and shared with government voluntarily. Each step in the chain has a plausible legal defense. The aggregate effect is a warrantless tracking system covering significant portions of the US road network.
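The aggregation argument is concrete enough to demonstrate. With invented data, sorting one plate's scans by time turns isolated observations into a pattern of life:

```python
# Invented scan log: (plate, ISO timestamp, camera location).
scans = [
    ("8ABC123", "2025-03-01T08:02", "camera-gym"),
    ("6XYZ789", "2025-03-01T08:10", "camera-mall"),
    ("8ABC123", "2025-03-01T08:31", "camera-clinic"),
    ("8ABC123", "2025-03-01T17:45", "camera-church"),
]

def trail(plate, records):
    """One plate's sightings in time order: a movement trail, not a sighting."""
    return [loc for p, ts, loc in sorted(records, key=lambda r: r[1]) if p == plate]

print(trail("8ABC123", scans))
# ['camera-gym', 'camera-clinic', 'camera-church']
```

Each row, taken alone, is the "observation" side of the argument; the query is the "search" side, and it costs one line.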

What this means for your stack

If you're building anything that touches location data, vehicle data, or surveillance integrations, Flock's architecture is both a technical reference and a cautionary tale.

First, the data processor vs. controller distinction is not just a legal nicety — it's an architectural decision with real consequences for who bears privacy liability. If your system processes data on behalf of customers and those customers share it with law enforcement, your privacy policy's fine print doesn't insulate you from the public backlash when someone tries to opt out and discovers they can't. Design your data governance so that deletion requests actually result in deletion, with a clear chain of responsibility that doesn't require a law degree to navigate.
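What "a clear chain of responsibility" might look like in code: a minimal sketch (a hypothetical design, not anyone's shipping system) where the processor executes the deletion across all controllers and returns an auditable receipt, instead of bouncing the requester between parties:

```python
from dataclasses import dataclass, field

@dataclass
class ScanStore:
    # Each record is (controller_id, plate); controller IDs are illustrative.
    records: list = field(default_factory=list)

    def delete_subject(self, plate: str) -> dict:
        """Delete every record for a plate; return per-controller deletion counts."""
        outcome: dict = {}
        kept = []
        for controller_id, p in self.records:
            if p == plate:
                outcome[controller_id] = outcome.get(controller_id, 0) + 1
            else:
                kept.append((controller_id, p))
        self.records = kept
        return outcome  # one receipt entry per data controller affected

store = ScanStore([("hoa-1234", "8ABC123"),
                   ("pd-5678", "8ABC123"),
                   ("pd-5678", "6XYZ789")])
print(store.delete_subject("8ABC123"))  # {'hoa-1234': 1, 'pd-5678': 1}
print(len(store.records))               # 1
```

The receipt is the point: the requester gets one answer from one party, and the controllers get notified rather than consulted.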

Second, frictionless deployment is a feature that can become a liability. Flock's solar-plus-LTE architecture is genuinely clever engineering. It's also why cameras proliferate faster than communities can debate whether they want them. If you're building infrastructure that collects data about non-consenting third parties, the ease of deployment is inversely proportional to the democratic oversight it receives.

Third, network effects in surveillance data are qualitatively different from network effects in social products. When one more city joins Flock's network, every existing customer's tracking capability expands. The value to law enforcement increases superlinearly. This is the same dynamic that makes the system powerful and the same dynamic that makes meaningful privacy protection nearly impossible once the network reaches critical mass.
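A back-of-envelope model of the superlinear claim (illustrative only; real data sharing is gated by per-customer settings): if each jurisdiction can query every other jurisdiction's cameras, reachable query paths grow with n·(n−1), not n:

```python
def cross_jurisdiction_links(n_cities: int) -> int:
    """Directed query paths in a fully shared network of n jurisdictions."""
    return n_cities * (n_cities - 1)

for n in (10, 100, 1000):
    print(n, cross_jurisdiction_links(n))
# 10 90
# 100 9900
# 1000 999000
```

Doubling the network roughly quadruples its reach, which is why each marginal city is worth more to the network than the last one was.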

Looking ahead

The 581-point Hacker News thread suggests a growing cohort of technically literate people who understand exactly what's happening and are running out of patience with the legal runaround. Legislative responses will likely accelerate — the Virginia model of strict retention limits is the minimum viable reform, and federal ALPR legislation has been proposed (if not yet passed) in multiple Congressional sessions. For Flock Safety, now a $3.5 billion company backed by a16z, the question isn't whether regulation is coming but whether the installed base of 40,000+ cameras will create enough institutional inertia to water it down. For the rest of us, the lesson is older than software: the absence of a "no" is not a "yes," and the absence of a law is not the absence of a problem.

Hacker News 624 pts 243 comments

I wrote to Flock's privacy contact to opt out of their domestic spying program

→ read on Hacker News
kstrauser · Hacker News

I wrote this. I had/have absolutely no expectation that Flock would comply with my request, but figured I should try anyway For Science. Their reply rubbed me wrong, though. They seem to claim that there are no restrictions on their collection and processing of PII because other people pay them

empathy_m · Hacker News

I noticed that the company is glossed as "Flock" and not "Flock Safety (YC S17)" in posts like this and last week's "US cities are axing Flock Safety surveillance technology", https://news.ycombinator.com/item?id=47689237. Did YC house style change a

wcv · Hacker News

Flock has stonewalled with the "we are not the controllers" excuse here in MN too. We have similar rights to opt-out and delete under the MCDPA [0].

[0] https://ag.state.mn.us/Data-Privacy/Consumer/

dsr_ · Hacker News

Remember that the difference between "Flock can do whatever the hell it wants" and "Flock is required to delete your data at your request" is a law. Citizens vote for legislators. If you want this to be a higher priority for your legislators, buy them off. Or vote for/against

ldoughty · Hacker News

I think you're going to have a hard time with this... Flock seems to leave the data in ownership of the government. They are just providing the service of being custodians for storing and accessing that data. You probably would get a similar response by submitting your request to Amazon web servi
