GitHub Blog · Policy · 2026-04-13

GitHub Will Use Your Copilot Data to Train AI Models by Default Starting April 24

GitHub announced it will use interaction data from Copilot Free, Pro, and Pro+ users — including code snippets, inputs, outputs, and editor context — to train AI models by default starting April 24, 2026. Users must actively opt out. Enterprise and Business subscribers are exempt.


Starting April 24, 2026, GitHub will begin using Copilot interaction data from its Free, Pro, and Pro+ subscription tiers to train and improve its AI models — unless users explicitly opt out. The announcement, buried in a privacy policy update, has landed poorly with developers who expected opt-in as the default.

The data scope is broad: accepted or modified code suggestions, inputs and prompts sent to Copilot, surrounding code context at the cursor, file names and repository structure, navigation patterns, feature interactions including chat history, and explicit feedback like thumbs-up/down ratings. GitHub says it applies "de-identification techniques" and filters to remove sensitive data, but has provided no specifics on what that means in practice.

The most pointed criticism isn't about personal privacy — it's about organizational IP. Individual users on free or personal plans typically lack the authority to sublicense their employer's proprietary code to a third party. The opt-out is enforced at the user level, not the organization level, so a single team member who doesn't adjust their settings could expose their company's codebase through routine Copilot interactions. GitHub's community post announcing the change received more than 117 thumbs-down votes and hundreds of critical comments within days.

The timing is notable: GitHub's parent company Microsoft is under increasing competitive pressure from Anthropic's Claude Code (now supporting direct organizational billing) and Google's Gemini CLI. Feeding proprietary interaction data back into model training is a meaningful moat — but only if developers trust the process. Right now, they demonstrably don't.

The policy applies only to individuals. Copilot Business and Copilot Enterprise plans, which are purchased and managed at the organization level, are not affected. For teams and companies still on personal Copilot plans, April 24 is a hard deadline to either audit opt-out status or migrate to a business plan.

Panel Takes

The Builder


Developer Perspective

Every developer in a company on a personal Copilot plan needs to check this before April 24. Enforcing the opt-out at the user level instead of the org level is the kind of decision that will generate real legal exposure for companies that miss it. Set a calendar reminder today.

The Skeptic


Reality Check

GitHub's framing of this as a benign improvement initiative is doing a lot of heavy lifting. "De-identification" is not the same as "safe" — re-identification of code context is trivially possible if the training data retains structural patterns. The 117 downvotes tell you how much trust GitHub has left in the developer community.

The Futurist


Big Picture

This is the flywheel moment for GitHub's AI ambitions — using 1.5 billion developer interactions to train proprietary models creates a compounding advantage. But if it triggers mass migration to Claude Code or Gemini CLI, Microsoft will have sacrificed long-term trust for a short-term training data grab.