The pattern is older than most enterprise AI policies. An engineer hits a wall on a tricky bug. The chatbot is right there. They paste in a function, ask for help, get a slick answer, and move on with their day. The chatbot vendor now has a copy of proprietary code on its training-eligible side of the wall. Multiply by every engineer in the company, every time they get stuck.
The most-cited example is still Samsung’s: in 2023, engineers there were reported to have pasted internal source code and meeting transcripts into a hosted assistant. The company’s response was to ban the tool internally. But a ban can’t recall data; the information they shared still exists on someone else’s servers, somewhere, in some form.
What’s changed since then is the volume of organizations that have quietly accepted this as normal. The 2023 stories had a tone of scandal. The 2025 ones have a tone of resignation. The 2026 ones, if we’re being honest, mostly aren’t making it into the news at all.
The pitch from the assistant vendors has shifted accordingly. “We don’t train on your data.” “Your conversations are private to you.” Read the terms and you’ll find a half-dozen exceptions and a future-tense clause that lets the vendor change the policy with notice. The data is still on their infrastructure. The keys are still theirs. The leverage is still theirs.
The thing nobody seems to want to say out loud: if your engineers find the assistant useful enough to paste source code into it, the assistant is useful. The instinct is right. The destination is wrong.
Same chatbot. Different perimeter.
The local AI inside Eclipse is the version of this story where the assistant is in the same room as the source code, not on someone else’s servers. Engineers paste in a function. The model answers from your corpus. Nothing crosses the boundary. The instinct that drove them to the chatbot in the first place is now satisfied without a single token leaving your environment.
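The perimeter difference can be made concrete. Here is a minimal sketch, not Eclipse’s actual API: the host names and the `is_inside_perimeter` helper below are hypothetical stand-ins for whatever egress policy a deployment actually enforces. The point it illustrates is that the workflow is identical either way; the only thing that changes is where the request is allowed to go.

```python
from urllib.parse import urlparse

# Hosts considered inside the perimeter. A hypothetical allowlist for
# illustration; a real deployment would define its own.
LOCAL_HOSTS = {"localhost", "127.0.0.1", "assistant.internal"}

def is_inside_perimeter(endpoint: str) -> bool:
    """Return True if a chat request to this endpoint stays in-house."""
    host = urlparse(endpoint).hostname or ""
    return host in LOCAL_HOSTS

# Same request shape, different destination.
hosted = "https://api.vendor.example/v1/chat"  # source code leaves the building
local = "http://assistant.internal/v1/chat"    # source code stays on your hardware

print(is_inside_perimeter(hosted))  # False
print(is_inside_perimeter(local))   # True
```

The design point is that nothing about the engineer’s habit changes: same paste, same question, same answer shape. Only the destination host moves inside the boundary.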
The Hacker News covered the original Samsung incident and the policy reckoning that followed: https://thehackernews.com/
