- The editorial argues that git commits are technical and legal records, not marketing surfaces: in regulated industries, commit authorship feeds into audit trails, SOX compliance, and IP provenance chains, so silently injecting false authorship metadata is data integrity corruption, not merely a preference issue.
- The original issue framed the problem as one of factual accuracy: VS Code was inserting a `Co-Authored-By: GitHub Copilot` trailer into commits regardless of whether Copilot was actually used, falsely attributing co-authorship where none occurred.
- dmitriv, the Microsoft engineer who approved the PR, publicly apologized, explaining that the change stemmed from a desire to support functionality some customers expect rather than from malicious intent, and acknowledged the mistake of turning the feature on by default without sufficient upfront validation.
- In a self-parodying detail, Copilot itself flagged the change during automated code review, noting that it created inconsistency in the codebase and suggesting a revert; the human reviewers ignored the AI's correct assessment while approving a change that inflated that same AI's attribution.
- The change shipped with no prompt, no opt-in, and no indication to the developer that commit metadata was being altered, placing it within broader concerns about dark patterns in developer tools, where defaults are chosen to benefit the platform rather than the user.
A pull request ([#310226](https://github.com/microsoft/vscode/pull/310226)) merged into Microsoft's Visual Studio Code changed the default value of a configuration setting so that every git commit made through VS Code would automatically include a `Co-Authored-By: GitHub Copilot` trailer — even if the developer never used Copilot for that commit, or didn't have Copilot enabled at all.
The change was subtle: a configuration schema default was flipped to `"all"`, which instructed VS Code to append the co-author trailer to every commit message by default. No prompt. No opt-in. No indication to the developer that their commit metadata was being altered. The PR made it through code review and was approved by a Microsoft engineer identified as dmitriv, who later posted a public apology on Hacker News: "I am the person who approved this PR and would like to acknowledge and apologize for the mistake of turning this feature on by default without sufficient upfront validation. There was no ill intent by evil corporation, but rather a desire to support functionality that some customers expect of VS Code."
In a twist that borders on self-parody, Copilot itself commented on the pull request during review, noting that the change created inconsistency in the codebase and suggesting that it be reverted. That automated review comment was apparently ignored by the human reviewers who approved the merge.
Git commits are not marketing surfaces. They are technical and legal records. In regulated industries, commit authorship feeds into audit trails, SOX compliance, and IP provenance chains. In open source, contribution graphs determine maintainer credibility and, increasingly, employment decisions. Silently injecting false authorship metadata into these records is not a UX preference — it's data integrity corruption.
The community reaction on Hacker News (1,068 points) was swift and pointed. As commenter yankohr put it: "Git commits are legal and technical records. Falsifying who authored a piece of code just to pump up AI usage stats is a huge breach of trust." Another commenter, rsynnott, placed it in a broader context: "One fascinating thing about the whole AI phenomenon is how incredibly hostile it is to standards. Whether something works properly, or is ethical, or is true, no longer matters at all; all that matters is 'pls use our AI.'"
The cynical reading — and the one that gained traction — is that Microsoft needs Copilot usage metrics. Every commit tagged with `Co-Authored-By: GitHub Copilot` is, at minimum, a data point that inflates the apparent reach of the tool. Whether that data feeds into investor presentations, enterprise sales decks, or internal KPIs, the incentive structure is obvious. When your business model depends on demonstrating AI adoption, the temptation to count generously is structural, not accidental.
The charitable reading, offered by the approver himself, is that some users genuinely want this attribution and the mistake was making it opt-out rather than opt-in. This is plausible — plenty of developers do use Copilot heavily and want to track that. But the correct UX for "some customers expect this" is a setting that defaults to off, not a silent default-on that modifies legal records. The distinction between opt-in and opt-out is not a product nuance. It is the entire question.
Commenter artyom offered the longest historical view: "To everyone who bought the 'developer-friendly' Microsoft of VSCode fame from a few years ago: this is what they forever did, and forever will do. This company has been pulling these tricks since the early 90s." Whether you buy the historical determinism or not, the pattern of bundling unwanted defaults into widely-used tools is well-documented in Microsoft's playbook — from Internet Explorer's OS integration to Bing's search defaults to Edge's aggressive reinstallation after Windows updates.
If you're running VS Code, check your settings immediately. Search for `github.copilot.chat.commitMessageGeneration.attribution` (or the relevant setting key) and ensure it's set to `"off"` or `"manual"` rather than `"all"`. If you've already made commits with the false trailer, you'll need to decide whether to rewrite history — which has its own costs in shared repositories — or live with the inaccurate metadata.
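Before making that call, it helps to know how many commits are actually affected. A minimal audit sketch in Python (the script name is hypothetical, and the trailer text follows the article's wording; adjust it if your commits differ):

```python
#!/usr/bin/env python3
# audit_coauthor_trailers.py (hypothetical name) — count commits whose
# messages carry the Copilot co-author trailer, before deciding whether a
# history rewrite is worth the disruption. Run from the repository root.
import subprocess

# Trailer text as reported in the article; adjust if your commits differ.
TRAILER = "Co-Authored-By: GitHub Copilot"

result = subprocess.run(
    ["git", "log", "--regexp-ignore-case", f"--grep={TRAILER}",
     "--format=%H%x09%s"],  # hash, tab, subject line per matching commit
    capture_output=True, text=True, check=True,
)
hits = [line for line in result.stdout.splitlines() if line]
print(f"{len(hits)} commit(s) carry the trailer")
for line in hits:
    print(" ", line)
```

If the count is zero, you can simply fix the setting and move on; if it isn't, weigh the rewrite against how widely the affected history has already been shared.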
For teams with compliance requirements, this is a wake-up call to audit what your editor is injecting into commit metadata. Git hooks that validate commit message format are cheap insurance. A simple `commit-msg` hook that strips or flags unexpected `Co-Authored-By` trailers takes five minutes to write and prevents exactly this class of supply-chain-adjacent problem. If you're in a regulated environment (finance, healthcare, defense), consider adding this to your onboarding checklist.
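As a sketch of that five-minute hook, assuming Python 3.9+ is available (git runs any executable placed at `.git/hooks/commit-msg`; the allowlist below is a hypothetical placeholder your team would fill in):

```python
#!/usr/bin/env python3
# .git/hooks/commit-msg — flag unexpected Co-Authored-By trailers before a
# commit is recorded. A minimal sketch: this version rejects the commit;
# rewriting the message file instead would strip the trailer silently.
import re
import sys

# Hypothetical allowlist — co-authors your team considers legitimate.
ALLOWED_COAUTHORS: set[str] = set()  # e.g. {"Jane Doe <jane@example.com>"}

def main() -> int:
    msg_path = sys.argv[1]  # git passes the commit message file as argv[1]
    with open(msg_path, encoding="utf-8") as f:
        lines = f.read().splitlines()

    unexpected = [
        line for line in lines
        if re.match(r"co-authored-by:", line.strip(), re.IGNORECASE)
        and line.split(":", 1)[1].strip() not in ALLOWED_COAUTHORS
    ]
    if unexpected:
        print("commit-msg: unexpected co-author trailer(s):", file=sys.stderr)
        for line in unexpected:
            print(f"  {line}", file=sys.stderr)
        return 1  # non-zero exit aborts the commit
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Mark it executable (`chmod +x .git/hooks/commit-msg`) and git will run it on every commit made in that clone; tooling-injected trailers get caught before they ever reach history.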
More broadly, this incident is a useful forcing function for a conversation your team should probably have: what is your policy on AI attribution in commits? If a developer uses Copilot for autocomplete on three lines of a 200-line change, is that "co-authored"? What about ChatGPT-generated test scaffolding? There's no industry standard here yet, and the answer matters for IP assignment, code review trust, and contributor metrics. Better to decide proactively than to discover your commit history has been silently annotated by your editor.
Microsoft will almost certainly revert or fix this default — the PR backlash is too loud to ignore, and the approver's apology suggests internal alignment that this was wrong. But the deeper issue persists: as AI tools embed deeper into the development workflow, the surface area for silent metadata manipulation grows, and the incentives for AI vendors to inflate their own usage metrics grow with it. The lesson isn't "Microsoft bad" — it's that any tool with write access to your commit messages, CI configs, or dependency files has the power to alter your project's record of truth. Trust, but verify. Preferably with a git hook.