Mitigating the risks of AI-written code
If you're going to use AI at work, you'd better be sure the code it writes doesn't put your business at risk
If you’ve been reading The AI-Augmented Engineer for a while, you know I’m crazy excited about AI-augmented software development. I’m also clear-eyed about the risks. Big businesses care a lot about speed and leverage, and they care even more about avoiding legal problems.
Today’s issue looks at how professional US teams using AI at work keep AI-generated code from quietly slipping open-source license obligations or patent trouble into their codebases.
Here is what we’ll cover today:
Why this risk matters for engineers and EMs
How startups and mature companies approach it differently
The technical safeguards inside popular tools like Copilot
The role of license and code scanners
Legal, policy, and review workflows that catch problems before they ship
Patent considerations and other IP concerns
Today’s topic was requested by a paid subscriber! If you want access to this edition, the full members-only archive, and priority on future article requests, I’d love to have you join as a paid subscriber.
Some subscribers expense this newsletter to their learning and development budget. If you have such a budget, here’s an email you could send to your manager.
Why AI-generated code poses IP risks
Large Language Models (LLMs) trained on public code can inadvertently produce snippets that are verbatim or close copies of existing open-source code. If a generated snippet comes from code under a restrictive license (for example, GPL), using it might obligate developers to open source their entire project or provide attribution under that license’s terms.
Researchers have found that even top-tier code models produce outputs that are “strikingly similar” to existing open-source code roughly 1% of the time, often without any license notice. In one notable case, an AI tool reproduced a contributor’s GPL-licensed code almost verbatim, with no attribution or license. Yikes!
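To make that risk concrete, here’s a minimal sketch of the kind of similarity check a team could run over an AI-generated snippet before merging it. Everything here is illustrative: the corpus directory, the snippet filename, and the 0.9 threshold are all hypothetical, and real license scanners use far more sophisticated fingerprinting than a plain diff ratio.

```python
import difflib
from pathlib import Path

# Hypothetical local mirror of open-source files whose licenses you track.
CORPUS_DIR = Path("oss_corpus")
# Illustrative cutoff for "strikingly similar"; tune to your tolerance.
SIMILARITY_THRESHOLD = 0.9

def flag_similar_sources(generated: str) -> list[tuple[str, float]]:
    """Return corpus files whose text closely matches the generated snippet."""
    hits = []
    for path in CORPUS_DIR.rglob("*.py"):
        reference = path.read_text(errors="ignore")
        ratio = difflib.SequenceMatcher(None, generated, reference).ratio()
        if ratio >= SIMILARITY_THRESHOLD:
            hits.append((str(path), ratio))
    return sorted(hits, key=lambda hit: hit[1], reverse=True)

if __name__ == "__main__":
    snippet = Path("ai_generated_snippet.py").read_text()
    for filename, score in flag_similar_sources(snippet):
        print(f"Review license of {filename} (similarity {score:.2f})")
```

A crude character-level diff like this only catches near-verbatim copies; it says nothing about license compatibility. That’s exactly why the scanners and workflows covered below exist.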