A developer going by thereallo published a detailed teardown of the White House's new mobile application, documenting the full decompilation process and cataloging what they found inside the binary. The post, published on thereallo.dev, walks through the methodology step by step — from pulling the app package to disassembling it and tracing its network calls, dependency graph, and data collection behavior.
The analysis hit 504 points on Hacker News, placing it firmly in the top tier of community engagement for the day. That score signals something beyond casual curiosity — developers care deeply about what ships inside government software, and they want receipts.
This isn't the first time someone has decompiled a high-profile government app. COVID contact tracing apps received similar treatment in 2020-2021, and those teardowns shaped public policy. The White House app teardown follows the same tradition: trust but verify, with a disassembler.
Mobile app decompilation is a well-established practice in the security research community. For Android apps, tools like jadx and apktool convert DEX bytecode back into readable Java/Kotlin source. For iOS, Hopper and class-dump can extract Objective-C class hierarchies (and, with more effort, Swift metadata) from Mach-O binaries. The specific toolchain matters less than the methodology: pull the binary, enumerate its dependencies, trace its network behavior, and read the code that handles user data.
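The first step of that methodology is simpler than it sounds: an APK is just a ZIP archive, so you can inventory what's inside with the standard library before ever running jadx. A minimal sketch, assuming a hypothetical APK path; the grouping rules are illustrative, not part of the original teardown.

```python
import zipfile
from collections import Counter

def summarize_apk(path: str) -> dict:
    """Group an APK's entries by type; an APK is a plain ZIP archive."""
    counts = Counter()
    with zipfile.ZipFile(path) as apk:
        for name in apk.namelist():
            if name.endswith(".dex"):
                counts["dex"] += 1          # Dalvik bytecode: what jadx decompiles
            elif name.startswith("lib/") and name.endswith(".so"):
                counts["native_lib"] += 1   # bundled native code
            elif name == "AndroidManifest.xml":
                counts["manifest"] += 1     # permissions, components, entry points
            else:
                counts["other"] += 1        # resources, assets, signatures
    return dict(counts)

# Usage (hypothetical filename):
# summarize_apk("whitehouse.apk")
```

From there, the DEX count and native-library list already hint at how much surface area the decompiler will have to cover.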
What makes government app teardowns particularly interesting is the tension between public accountability and operational security. A government app serves citizens — arguably, every line of code should be auditable. But governments also have legitimate security concerns about exposing their infrastructure details. This teardown sits squarely in that tension.
The typical findings in these investigations follow a pattern: third-party analytics SDKs (often more than you'd expect), advertising identifiers that shouldn't be present in a government app, API endpoints that reveal backend architecture, and certificate pinning (or lack thereof) that indicates the security posture. The community reaction on Hacker News suggests this teardown found enough noteworthy details to sustain hundreds of comments.
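Spotting the SDK part of that pattern can be automated: once jadx has produced a source tree, flag any directory that falls under a well-known analytics or ads package. A sketch under assumptions; the prefix list below is a small illustrative sample, not a complete catalog.

```python
import os

# Illustrative sample of package prefixes for common analytics/ads SDKs.
TRACKER_PREFIXES = {
    "com.google.firebase.analytics": "Firebase Analytics",
    "com.google.android.gms.ads": "Google AdMob",
    "com.facebook.appevents": "Facebook SDK",
}

def find_trackers(decompiled_root: str) -> list:
    """Walk a jadx output tree and flag sources under known tracker packages."""
    hits = set()
    for dirpath, _dirs, files in os.walk(decompiled_root):
        # Convert the directory path back into a Java package name.
        rel = os.path.relpath(dirpath, decompiled_root).replace(os.sep, ".")
        for prefix, label in TRACKER_PREFIXES.items():
            if rel.startswith(prefix) and files:
                hits.add(label)
    return sorted(hits)
```

A real audit would use a maintained signature database (Exodus Privacy publishes one), but even this crude walk surfaces the headline findings in seconds.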
Decompilation serves multiple purposes in the ecosystem. Security researchers use it to find vulnerabilities before attackers do. Privacy advocates use it to verify data collection claims. Competitive analysts use it to understand how rival products are built. And curious developers use it to learn.
For the White House app specifically, the question isn't whether decompilation is appropriate — it's whether the app's internals match its public privacy commitments. Government apps occupy a unique position: they're funded by taxpayers, used by citizens exercising their civic duties, and subject to federal privacy requirements that commercial apps aren't.
The Hacker News discussion around these teardowns invariably focuses on a few key questions: What analytics and tracking SDKs are embedded? Are there any third-party services receiving user data? How does the app communicate with its backend — is it using certificate pinning, proper TLS configuration, and secure API patterns? And perhaps most importantly, does the actual behavior of the app match what the privacy policy describes?
These questions aren't theoretical. In 2021, researchers found that several government COVID apps were sharing data with third-party analytics providers in ways that contradicted their privacy policies. Those findings led to code changes within days. Public scrutiny works.
The tooling for mobile reverse engineering has matured significantly. Modern decompilers produce remarkably readable output, especially for apps written in managed languages like Kotlin or Swift. Network traffic analysis tools like mitmproxy and Charles Proxy can intercept API calls even from apps with certificate pinning, given enough effort. And automated scanners like MobSF can produce a baseline security audit of any APK or IPA in minutes.
For developers building apps — government or otherwise — this means your binary is effectively source-available whether you intend it to be or not. Obfuscation raises the bar but doesn't eliminate the risk. The practical implication: design your app as if someone will read every line, because someone will.
This is particularly relevant for apps that handle sensitive data. The metadata your app collects, the endpoints it calls, the SDKs it bundles — all of this is discoverable. If your privacy policy says you don't track users but your binary contains Firebase Analytics, Google AdMob, or Facebook SDK, someone will notice.
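That discoverability is easy to demonstrate: DEX files store class references as slash-separated path strings, so even without full decompilation a raw byte scan turns up SDK markers. The marker strings here are assumptions for illustration.

```python
# Illustrative class-path markers that bundled SDKs leave in DEX bytecode.
SDK_MARKERS = {
    b"com/google/firebase/analytics": "Firebase Analytics",
    b"com/google/android/gms/ads": "Google AdMob",
    b"com/facebook/appevents": "Facebook SDK",
}

def markers_in_binary(path: str) -> list:
    """Scan raw bytes for class-path strings left behind by bundled SDKs."""
    with open(path, "rb") as f:
        data = f.read()
    return sorted(label for marker, label in SDK_MARKERS.items() if marker in data)
```

Obfuscators rename your classes, but they generally leave third-party SDK identifiers and string constants intact, which is why this kind of scan keeps working.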
If you're shipping mobile apps, this teardown is a prompt to audit your own dependencies. Run your APK through jadx or your IPA through class-dump and see what's actually in there. Most developers are surprised by the transitive dependencies their build system pulls in — analytics SDKs nested inside UI libraries, advertising identifiers initialized by dependencies you didn't choose directly.
Practical steps worth taking: audit your dependency tree with a focus on data collection, verify that your actual network behavior matches your privacy policy, implement certificate pinning if you haven't, and consider running automated security scans as part of your CI pipeline. Tools like MobSF, Semgrep for mobile, and Frida for dynamic analysis are all free and well-documented.
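The "network behavior matches your privacy policy" check above reduces to a diff: collect the hosts your app actually contacts (from a mitmproxy capture, say) and compare them against the hosts the policy discloses. A minimal sketch with hypothetical hostnames.

```python
from urllib.parse import urlparse

# Hypothetical hosts disclosed in the app's privacy policy.
ALLOWED_HOSTS = {"api.example.gov", "cdn.example.gov"}

def unexpected_hosts(request_urls) -> list:
    """Return hosts the app contacted that the privacy policy doesn't cover."""
    seen = {urlparse(url).hostname for url in request_urls}
    return sorted(host for host in seen if host and host not in ALLOWED_HOSTS)
```

Wired into CI against a recorded traffic fixture, this fails the build the moment a new transitive dependency starts phoning home.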
For teams building government or public-sector apps, the bar is higher. Assume your app will be decompiled within days of release. Document your dependency choices. Minimize third-party SDKs. And if you must include analytics, prefer privacy-respecting options like self-hosted Plausible or Matomo over commercial tracking SDKs.
The White House app teardown is part of a broader trend: the public increasingly expects transparency from the software that serves them, and developers have the tools to enforce that expectation. As government digital services expand globally, decompilation-as-accountability will become routine — not adversarial, but a normal part of the civic technology feedback loop. The apps that hold up under this scrutiny are the ones built with the assumption that someone is watching. Because someone always is.
Looks like what you might expect in a standard marketing app from a consultancy. They probably hired someone to develop it, and that shop used its standard app architecture, which includes location tracking code and the rest.
OneSignal cofounder here. Posting since our service was mentioned in this article. For those concerned or curious about location data collection, we wrote an explanation of how it works: https://onesignal.com/blog/youre-in-control-how-location-act...
The OneSignal location tracking code being "compiled in" is expected behavior for anyone who has shipped Expo apps with OneSignal. The OneSignal React Native SDK bundles its full native module including location capabilities regardless of whether you use them. Expo config plugins like with…
Interesting. The site is nearly unusable to me unfortunately. '19 MBP w/ Chrome - scrolling stutters really bad
A bit skeptical of how this article is written, as it seems to be mostly written by AI. Out of curiosity, I downloaded the app, and it doesn't request location permissions anywhere, despite the claims in the article. I've noticed Claude Code is happy to decompile APKs for you but isn't v…