GitHub Notifies Users: Protect Your Code Before April 24
If you check your email, you likely have a recent notification from GitHub sitting in your inbox. On the surface, it looks like a standard Terms of Service update taking effect on April 24, 2026. But if you read the fine print, there is a massive shift happening behind the scenes regarding your data.
GitHub is changing its default privacy settings, meaning the code you write, the prompts you use, and the context you share with GitHub Copilot will automatically be used to train Microsoft and GitHub's future AI models.
As a developer, I'm all for leveraging AI to ship faster, but not at the expense of leaking proprietary architecture or private project data into a global training pool. Here is a breakdown of what this actually means for you and, more importantly, how to opt out.
What is Actually Happening?
Historically, if you wanted your code used for AI training, you had to opt in. This new update flips the script. Starting April 24th, many users will be opted in by default and will have to actively opt out.
If you do nothing, GitHub's systems will ingest your telemetry, code snippets, and chat histories to refine their Copilot models.
Who is Affected? (The Enterprise Exemption)
There is a catch here that has frustrated much of the developer community. If your company pays for a Copilot Business or Copilot Enterprise seat, you are generally protected. Microsoft knows enterprise clients won't tolerate IP leaks, so those tiers are exempt from this specific data collection.
However, if you are using:
- GitHub Free
- GitHub Pro
- Copilot Individual
...you are the one being opted in by default. If you are building tools, hacking on personal projects, or doing freelance work on an Individual plan, your code is on the menu.
How to Opt-Out (Step-by-Step)
Don't wait until the 24th to fix this. It takes less than a minute to secure your repositories.
- Log into your GitHub account and click your profile picture in the top right corner.
- Select Settings from the dropdown menu.
- In the left-hand sidebar, scroll down to the Code, planning, and automation section and click on Copilot.
- Look for the section labeled Policies or Data Collection (GitHub sometimes shifts the exact UI, but it will be on this page).
- Uncheck the box that says "Allow GitHub to use my code snippets for product improvements" (or similar wording regarding AI training).
- Click Save.

The Open Source Dilemma
There is a real irony here. Open-source developers are the backbone of the GitHub ecosystem. We want AI models to get better at understanding complex, open architectures. But doing so through a silent, default opt-in feels like a breach of trust.
AI tools are incredible assistants, but you should always have the final say over where your code ends up.
Take two minutes today, check your settings, and make sure you are actively choosing how your data is used. Let me know your thoughts.
And don't forget to subscribe to the newsletter for future posts.
