
AI-generated code introduces significant risk into the development process. A recent Sonatype report found that AI hallucinated 27% of upgrade recommendations for open source projects, while research from Veracode found that AI introduced security vulnerabilities in 45% of 80 coding tasks across 100+ different LLMs. Now, new research from Black Duck is shedding light on another pressing issue related to AI-generated code: IP and licensing risks.
In the company's 2026 Open Source Security and Risk Analysis (OSSRA) report, it analyzed 947 commercial codebases and found that two-thirds of them had license conflicts, the highest percentage in the history of the report. This represents a 12% increase from last year, which also sets a record for the largest jump in the report's history.
One of the codebases that Black Duck audited contained 2,675 distinct licensing conflicts, indicating that the complexity of managing IP has grown exponentially.
"This rise is partly driven by 'license laundering,' where AI assistants generate code snippets derived from copyleft sources (like GPL) without retaining the original license information," the company explained in a blog post. For example, the report reveals that 17% of open source components are entering codebases outside of traditional package managers, through copy-and-pasted snippets, direct vendor inclusions, or AI generation. This presents a challenge, as code that enters this way may be invisible to traditional manifest-based scanning tools.
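To see why snippet-level inclusion slips past manifest-based tools, consider the minimal Python sketch below. It is illustrative only; the fingerprint database and matching approach are hypothetical stand-ins, not how Black Duck's scanner works. The point is that a manifest scanner inventories only declared dependencies, so a copy-and-pasted or AI-generated fragment never appears in its results.

```python
import hashlib
from pathlib import Path

def declared_components(manifest: Path) -> set[str]:
    """What a manifest-based scan sees: only dependencies that are declared."""
    deps = set()
    for raw in manifest.read_text().splitlines():
        line = raw.split("#")[0].strip()           # drop comments
        if line:
            deps.add(line.split("==")[0].lower())  # name without version pin
    return deps

# Hypothetical fingerprint database: hashes of normalized lines from known
# open source projects. Real snippet matchers use far more robust techniques.
KNOWN_SNIPPET_HASHES: set[str] = set()

def fingerprint(line: str) -> str:
    return hashlib.sha256(line.strip().encode()).hexdigest()

def undeclared_snippet_hits(source_dir: Path) -> list[Path]:
    """Files containing known open source fragments. None of these appear
    in the manifest, so a manifest-only scan reports a clean inventory."""
    hits = []
    for path in source_dir.rglob("*.py"):
        lines = path.read_text(errors="ignore").splitlines()
        if any(fingerprint(line) in KNOWN_SNIPPET_HASHES for line in lines):
            hits.append(path)
    return hits
```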
This year's OSSRA report also found that the mean number of vulnerabilities in code has nearly doubled since last year. Eighty-seven percent of the codebases had at least one vulnerability, 78% had high-risk vulnerabilities, and 44% had critical-risk vulnerabilities.
The company explained that it discovered a "zombie component" problem when digging into the research. Ninety-three percent of codebases contained components that hadn't seen active development in two years, 92% contained components that were at least four years out of date, and only 7% of components in use were upgraded to the latest version.
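As a rough illustration of how stale dependencies can be surfaced, the sketch below queries PyPI's public JSON API (https://pypi.org/pypi/&lt;name&gt;/json) for each package's most recent release date and applies a two-year cutoff. The package list and the cutoff are assumptions for the example, not Black Duck's methodology.

```python
# Sketch: flag "zombie" dependencies whose most recent release is older
# than a cutoff, using PyPI's public JSON API. The two-year threshold and
# package list below are illustrative assumptions.
from datetime import datetime, timedelta, timezone

import requests

CUTOFF = timedelta(days=2 * 365)

def latest_release_date(package: str) -> datetime | None:
    resp = requests.get(f"https://pypi.org/pypi/{package}/json", timeout=10)
    if resp.status_code != 200:
        return None
    uploads = [
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in resp.json()["releases"].values()
        for f in files
    ]
    return max(uploads, default=None)

def find_zombies(packages: list[str]) -> list[str]:
    now = datetime.now(timezone.utc)
    return [
        pkg for pkg in packages
        if (last := latest_release_date(pkg)) is not None and now - last > CUTOFF
    ]

if __name__ == "__main__":
    # nose, for example, has had no release since 2015.
    print(find_zombies(["requests", "nose"]))
```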
"These abandoned components are a ticking time bomb. When a vulnerability is discovered in a project that hasn't been touched in years, there's often no maintainer left to fix it. Organizations are left with difficult choices: fork the project, refactor the application, or accept the risk," the researchers wrote.
Black Duck concluded that a key takeaway from this year's report is that there is a growing gap between AI adoption and governance.
"As regulatory pressure mounts from frameworks such as the EU AI Act and Cyber Resilience Act, the 'ship and forget' model of software delivery is no longer viable. Organizations must move toward a model of continuous supply chain transparency, where every component, whether human-written, AI-generated, or open source, is accounted for," Black Duck said.
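In practice, that per-component accounting typically takes the form of an SBOM entry. Below is a minimal sketch that emits one component record in the CycloneDX JSON format; the "origin" property is an illustrative provenance label added for this example, not a standard CycloneDX field or a Black Duck convention.

```python
# Minimal sketch: one component recorded in a CycloneDX-style SBOM.
# The "origin" property is an illustrative provenance label, not a
# standard CycloneDX field.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "left-pad",
            "version": "1.3.0",
            "licenses": [{"license": {"id": "MIT"}}],
            # The same record shape applies whether the code was
            # human-written, AI-generated, or pasted in as a snippet.
            "properties": [{"name": "origin", "value": "ai-generated-snippet"}],
        }
    ],
}

print(json.dumps(sbom, indent=2))
```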