Microsoft Reverses Copilot Commit Attribution in VS Code
Microsoft has reversed a controversial change to Visual Studio Code that automatically appended co-authorship attribution to Git commits when developers used AI-assisted features. The modification, introduced in early March, triggered widespread developer backlash because the attribution persisted in commit messages even when users manually removed it or had AI features disabled.

Understanding the Automatic Attribution Issue

According to The Register, VS Code's Git extension was modified to append a "Co-authored-by: Copilot" trailer to commits generated with AI assistance. The problem stemmed from the aggressive default behavior: the attribution line remained in place regardless of whether developers explicitly removed it or had disabled AI features entirely.
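For readers unfamiliar with the mechanism: a "trailer" is a Key: value line in the final paragraph of a commit message, which Git can parse natively. The sketch below shows what a commit carrying the Copilot co-authorship trailer looks like; the trailer key and author name follow the report, while the repository setup and email address are purely illustrative.

```shell
# Minimal sketch: a commit message carrying a Copilot co-authorship trailer.
# (Email address is a placeholder, not the value VS Code emits.)
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty \
  -m "Fix null check in parser" \
  -m "Co-authored-by: Copilot <copilot@example.com>"

# Git treats "Key: value" lines in the last paragraph as trailers
# and can extract them with interpret-trailers:
git log -1 --format=%B | git interpret-trailers --parse
```

Because the trailer lives inside the commit message itself, it becomes part of the immutable commit object, which is why attribution added without consent cannot be undone short of rewriting history.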

This behavior raised immediate concerns within the developer community. GitHub community discussions documented cases where the metadata persisted even after commit message edits, suggesting the change bypassed standard developer consent workflows. For teams managing reproducible histories, code review auditing, or legal provenance requirements, automatic metadata injection into Git history represents a significant operational risk.
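Teams worried about unnoticed attribution in existing history can scan for the trailer with standard Git tooling. A minimal sketch, where the trailer text follows the report and the sample repository and commit subjects are illustrative:

```shell
# Build a throwaway repo with one attributed and one unattributed commit,
# then find only the commits carrying the Copilot trailer.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
commit() { git -c user.name=dev -c user.email=dev@example.com \
             commit -q --allow-empty "$@"; }
commit -m "Add feature"
commit -m "Fix bug" -m "Co-authored-by: Copilot <copilot@example.com>"

# --grep matches the pattern against each line of the commit message,
# so only the second commit is listed:
git log --grep="^Co-authored-by: Copilot" --format="%h %s"
```

Running the same `git log --grep` in a real repository gives a quick audit of how much AI attribution has already entered the history.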

Microsoft’s Response and Timeline

Dmitriy Vasyura, the VS Code reviewer who approved the change, acknowledged the backlash in the GitHub discussion. According to The Register, Vasyura wrote: “There was no ill intent by an evil corporation, but rather a desire to support functionality that some customers expect of VS Code with regard to AI-generated code.”

The reversal is scheduled for VS Code 1.119, which will change the co-authorship trailer from opt-out to opt-in. Developers who want Copilot attribution automatically added to their commits will need to enable the feature explicitly.

Industry Context and Broader Implications

VS Code is not alone in this approach. Anthropic’s Claude Code also adds co-author lines by default, and the community has requested similar opt-in controls there as well. The pattern reveals a tension between AI tool developers seeking to track AI-assisted work and developers’ expectations around commit history integrity.

For practitioners, the incident underscores how automatic modifications to Git metadata erode trust in development tools. Industry patterns consistently show that when tools alter commit history or provenance without explicit, visible consent, developer confidence deteriorates. Projects relying on auditable workflows are particularly sensitive to post-commit metadata changes.

Watch for VS Code 1.119 release notes to confirm the exact opt-in setting and any UI changes that surface AI authorship choices to users. Additionally, monitor whether other integrated development environments or AI coding tools update their default behavior or implement clearer consent flows in response to similar community pressure.

The fix addresses a workflow trust issue but also highlights the ongoing conversation about how AI-assisted development should be transparently tracked in version control systems without compromising developer autonomy over commit metadata.
