Fingerprint.com documents that Firefox's IndexedDB implementation exposes a stable, unique identifier that persists across private browsing sessions and Tor Browser circuits. They demonstrate that any website can silently link what should be completely isolated browsing identities, breaking the fundamental assumption that no browser-level artifact survives across identity boundaries.
The editorial notes that Fingerprint.com sells commercial browser identification services while also responsibly disclosing browser privacy bugs. This mirrors their 2022 Safari IndexedDB disclosure (CVE-2022-22594), establishing a pattern where a company that profits from fingerprinting also acts as a privacy watchdog — a tension the community should be aware of when evaluating their findings.
The story, submitted by danpinto, garnered 616 points and 167 comments, largely because of how accessible the exploit is. The editorial emphasizes this framing: unlike theoretical side-channels or timing attacks requiring deep expertise, this is a simple read of an IndexedDB value that should be ephemeral but isn't, making it trivially exploitable by any website.
Researchers at Fingerprint.com — the same browser fingerprinting company that uncovered Safari's IndexedDB cross-origin leak in 2022 — have published a new vulnerability in Firefox's IndexedDB implementation. The bug exposes a stable, unique identifier that persists across private browsing sessions and Tor Browser circuits, allowing any website to silently link what should be completely isolated browsing identities.
The discovery hit Hacker News with a score north of 600, and for good reason: this isn't a theoretical side-channel or a timing attack that requires a PhD to exploit. It's a straightforward identifier leak. A website opens an IndexedDB database, reads a value that should be ephemeral but isn't, and now it knows that the person browsing via Tor circuit A is the same person who visited via Tor circuit B — or via a regular private browsing window, or across browser restarts.
Fingerprint.com, which sells commercial browser identification services, has a track record of responsibly disclosing these findings. Their Safari IndexedDB vulnerability (CVE-2022-22594) forced Apple into a rapid patch cycle in early 2022. This Firefox disclosure follows a similar pattern: find the leak, document it thoroughly, publish after coordination with the vendor.
To understand why this is severe, you need to understand what Tor Browser promises. Tor doesn't just route traffic through onion relays — it partitions browser state so that each circuit (roughly, each new identity) starts clean. Cookies, cache, DOM storage — all of it is supposed to be wiped or isolated when you click "New Identity" or open a new circuit. The entire threat model of Tor Browser assumes that no browser-level artifact survives across identity boundaries. This vulnerability breaks that assumption at the storage layer.
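The partitioning model above can be sketched as a toy data structure (names and shape are illustrative only, not Tor Browser internals): state lives in per-circuit maps, so a fresh circuit id starts with nothing.

```javascript
// Toy model of circuit-scoped storage: all state is keyed by the current
// circuit, so "New Identity" (a fresh circuit id) starts clean.
class PartitionedStorage {
  constructor() {
    this.partitions = new Map(); // circuitId -> that circuit's state
  }
  forCircuit(circuitId) {
    if (!this.partitions.has(circuitId)) {
      this.partitions.set(circuitId, new Map());
    }
    return this.partitions.get(circuitId);
  }
}

const storage = new PartitionedStorage();
storage.forCircuit("circuit-A").set("visited", true);

// A new identity sees none of circuit A's state:
console.log(storage.forCircuit("circuit-B").has("visited")); // → false
```

The vulnerability is equivalent to some value living *outside* the `partitions` map, where every circuit can read it.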
IndexedDB is the browser's built-in NoSQL database, and it's everywhere. Service workers use it. PWAs depend on it. Analytics libraries stash data in it. The API is powerful enough that developers routinely store megabytes of structured data client-side. But that power comes with complexity, and complexity is where privacy bugs hide.
The core issue appears to be in how Firefox's storage engine manages IndexedDB's internal metadata. Every IndexedDB database has internal bookkeeping — version numbers, schema state, storage quotas — that the browser maintains outside the database contents themselves. If any of this metadata generates or exposes a stable identifier that isn't properly scoped to the browsing context, it becomes a supercookie: invisible to the user, immune to "Clear Site Data," and readable by any origin that knows where to look.
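A minimal model of why an unscoped metadata value becomes a supercookie (everything here is hypothetical stand-in code, not Firefox's actual storage engine): if the value is not derived from the browsing context, every context observes the same identifier, and comparing it links sessions.

```javascript
// Simulated browser-internal metadata that is NOT scoped per context.
const internalMetadata = { installId: "7f3c9a12" }; // hypothetical value

// What a page in any context (normal, private, Tor circuit) can observe.
function readStableId(context) {
  // Properly partitioned storage would derive this from `context`;
  // the bug is that every context sees the same global value.
  return internalMetadata.installId;
}

// Two sessions that should be unlinkable...
const privateSession = readStableId({ mode: "private" });
const torCircuitA = readStableId({ mode: "tor", circuit: "A" });

// ...are trivially joined by comparing the identifier.
console.log(privateSession === torCircuitA); // → true: same browser, linked
```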
This is particularly damaging because IndexedDB is a standard Web API. Unlike canvas fingerprinting or WebGL renderer strings — which require active probing and produce probabilistic identifiers — an IndexedDB metadata leak produces a deterministic, stable identifier. There's no ambiguity: if two sessions share the same identifier, they're the same browser. Period.
This isn't the first time a browser storage API has betrayed user privacy, and it won't be the last. The history is instructive:
Safari's IndexedDB leak (CVE-2022-22594, January 2022): Fingerprint.com discovered that Safari's WebKit engine leaked IndexedDB database names across origins. If Google created a database called `myAccount-abc123`, any other website could enumerate that database name and extract the Google user ID. Apple patched it within weeks.
HSTS supercookies: Browsers that remembered HSTS (HTTP Strict Transport Security) headers per-site inadvertently created a bitmap that could encode a unique identifier. Visit 32 domains with specific HSTS configurations, and you've written a 32-bit supercookie. Most browsers now partition HSTS state.
Favicon cache tracking: Researchers demonstrated that favicon caches could be abused as a persistent tracking vector because browsers cached favicons separately from other site data and didn't clear them with cookies.
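The HSTS technique above is worth seeing concretely. This sketch models only the bit logic (domain names are made up, and the real read/write mechanics — forcing an https visit to set HSTS, probing http upgrades to read it — are abstracted into a simple set):

```javascript
// 32 tracker-controlled domains, one per bit of the identifier.
const TRACKER_DOMAINS = Array.from(
  { length: 32 },
  (_, i) => `t${i}.tracker.example`
);

// "Write": visit domain i over https iff bit i of the ID is 1, so the
// browser records an HSTS entry only for those domains.
function writeSupercookie(id) {
  const hstsState = new Set();
  TRACKER_DOMAINS.forEach((domain, i) => {
    if ((id >>> i) & 1) hstsState.add(domain);
  });
  return hstsState; // persists in the browser's HSTS store
}

// "Read": probe each domain over plain http; an automatic upgrade to
// https reveals that HSTS was previously set, i.e. bit i is 1.
function readSupercookie(hstsState) {
  return TRACKER_DOMAINS.reduce(
    (id, domain, i) => (hstsState.has(domain) ? id | (1 << i) : id),
    0
  );
}

const id = 0x12345678; // any 32-bit value survives the round trip
console.log(readSupercookie(writeSupercookie(id)) === id); // → true
```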
The pattern is consistent: every persistent storage mechanism in the browser is a potential supercookie until proven otherwise, and the proof usually comes after someone exploits it. Browser vendors play whack-a-mole with storage-layer leaks because the web platform keeps adding new storage surfaces (Cache API, Storage Buckets, OPFS) faster than privacy reviews can audit them.
For regular Firefox users in private browsing mode, this vulnerability is concerning but bounded. Private browsing is generally understood as "don't leave traces on my own machine" — it was never designed to defeat a determined remote adversary.
For Tor Browser users, the calculus is entirely different. Tor users include journalists communicating with sources, activists in authoritarian regimes, whistleblowers, and security researchers. These users depend on circuit isolation as a safety mechanism, not a convenience feature. A website that can link two Tor sessions can potentially:
- Correlate a user's anonymous browsing with their identified browsing
- Track a user across "New Identity" resets, defeating Tor's primary UX for privacy renewal
- Build behavioral profiles across sessions that were supposed to be unlinkable
- In adversarial environments, provide evidence that two supposedly different visitors are the same person
The Tor Project has historically taken these vulnerabilities seriously. Tor Browser is a hardened Firefox fork that disables or modifies dozens of browser features specifically to prevent fingerprinting and cross-session linkage. That this one slipped through suggests the IndexedDB internals weren't fully covered by Tor's fingerprinting resistance patches — a gap that likely extends to other Gecko storage subsystems.
If you're a web developer, this vulnerability has two practical implications.
First, audit your own IndexedDB usage for unintended state leakage. If your application creates IndexedDB databases with predictable, user-specific names — say, `user-${userId}-cache` — you may be leaking authenticated user identifiers to other origins through similar cross-context mechanisms. The fix is straightforward: use opaque, random database names and don't encode user identity into storage keys.
Second, if you're building privacy-sensitive applications, stop trusting browser storage APIs as isolation boundaries. The browser's same-origin policy and context isolation are necessary but not sufficient for real privacy guarantees. Design your system so that even if browser-level isolation fails, your application-layer isolation holds. This means: don't rely solely on the browser to separate user contexts; implement server-side session isolation; and assume that any client-side identifier can leak.
For teams building browser extensions, privacy tools, or applications that interact with Tor Browser, this is a reminder to test against the actual Tor Browser threat model, not just Firefox's. The Tor Project maintains a design document specifying exactly which APIs should be restricted or isolated. If your extension touches IndexedDB, verify that it respects Tor Browser's first-party isolation and doesn't inadvertently create cross-circuit linkage.
Mozilla will patch this — they always do, and their security response team is competent. The deeper question is architectural: Firefox's storage engine (and Chromium's, for that matter) wasn't designed with privacy partitioning as a first-class concern. It was bolted on later through features like Total Cookie Protection and State Partitioning. Every bolt-on creates seams, and every seam is a potential leak. Until browsers rebuild their storage layers with partition-first architectures — where isolation is the default and sharing requires explicit opt-in — we'll keep finding supercookies in the gaps between intent and implementation. Fingerprint.com, a company whose business model depends on finding these gaps, will keep looking. The question is whether browser vendors can close them faster than adversaries can exploit them.
Being fingerprinted across Tor is different from being deanonymized: it essentially just "pseudonymizes" you, so you now have an identifier attached. That is a significant threat, though it is already not hard to pseudonymize someone based on stylometry, something some of the people with the highest threat models have to contend with.
> the identifier can also persist [...] as long as the Firefox process remains running

Make sure to exit Tor Browser at the end of a session. Make sure not to mix two uses in one session.
I learned enough about security years ago to know there's basically zero chance you're secure and an almost 100% chance someone is watching everything you do online. Whether they care is entirely separate.
I question why websites can even access all this info without asking or notifying the user. Why don't browsers make it like phones, where the server (app) has to be granted permission to access stuff?
Very cool research and wonderfully written. I was expecting an ad for their product somewhere towards the end, but it wasn't there! I do wonder though: why would this company report this vulnerability to Mozilla if their product is fingerprinting? Isn't it better for the business (albeit uneth…