Last week, the Biden administration released its National Cybersecurity Strategy, and buried in the policy language is a tectonic shift that should have every software vendor, open-source maintainer, and enterprise development team sitting up straight. The document explicitly calls for shifting cybersecurity liability from end users to the companies that build and maintain software.
If you’re a developer reading this and thinking “that’s a policy thing, not my problem” — I’d urge you to reconsider. This is the clearest signal yet that the era of shipping software with known vulnerabilities and hiding behind EULAs is ending.
## The Core Shift: You Built It, You Own It
The strategy is organized around five pillars, but Pillar 3 — “Shape Market Forces to Drive Security and Resilience” — is the one that will keep software executives up at night. The key language: the administration wants to “shift liability onto those entities that fail to take reasonable precautions to secure their software.”
This isn’t entirely new thinking. The EU’s Cyber Resilience Act has been moving in a similar direction. But the US putting this in a top-level strategic document signals that legislation will follow. The question isn’t whether software liability laws are coming, but when and how broadly they’ll be applied.
For those of us in the trenches, this means several things. First, “we’ll fix it in the next sprint” is going to carry legal weight it never had before. Second, software bills of materials (SBOMs) are moving from “nice to have” to “legally required.” Third, the security practices you implement today are building (or failing to build) your compliance posture for regulations that are almost certainly coming within the next few years.
## What This Means for Open Source
The strategy acknowledges that open-source software requires special consideration, and this is where things get genuinely complicated. The document suggests that liability should fall on the commercial entities that build products using open-source components, not on the volunteer maintainers of those components.
That sounds reasonable in theory. In practice, the boundary between “commercial use” and “community contribution” is blurry at best. A company that maintains an open-source project as part of its business model — think Elastic, HashiCorp, or Red Hat — occupies a grey zone that policy makers will need to define more precisely.
I’m cautiously optimistic about this approach. The Log4j incident in late 2021 showed us what happens when critical infrastructure depends on under-resourced open-source projects. Placing liability on the commercial consumers of open source creates a financial incentive to actually fund and support the projects they depend on. Whether that incentive translates into meaningful investment remains to be seen.
For open-source maintainers: this is a good time to make sure your project has clear licensing, contribution guidelines, and — critically — documented security practices. Even if you’re not directly liable, being part of a supply chain that’s under regulatory scrutiny means more eyeballs on your processes.
## The SBOM Imperative
If you’re not already generating Software Bills of Materials for your projects, start now. The strategy builds on Executive Order 14028 from 2021, which already mandated SBOMs for software sold to the federal government. The new strategy expands this thinking to the broader market.
Tools like Syft, Trivy, and CycloneDX's cdxgen make SBOM generation straightforward. If you're running a CI/CD pipeline (and in 2023, you should be), adding SBOM generation is a half-day task at most. Here's a basic example with Syft in a GitHub Actions workflow:
```yaml
- name: Generate SBOM
  uses: anchore/sbom-action@v0
  with:
    image: myapp:${{ github.sha }}
    format: cyclonedx-json
    output-file: sbom.json
```

The harder part isn't generating the SBOM — it's acting on it. You need processes for tracking vulnerabilities in your dependency tree, policies for how quickly you patch, and documentation showing you've taken "reasonable precautions." That last phrase is going to be litigated extensively.
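That tracking step can also live in the same pipeline. As a sketch, a follow-up job using Anchore's Grype scan action can consume the SBOM from the previous step and fail the build when it contains known vulnerabilities above a threshold (the `severity-cutoff` value here is an illustrative choice, not a recommendation — tune it to your patch policy):

```yaml
- name: Scan SBOM for known vulnerabilities
  uses: anchore/scan-action@v3
  with:
    sbom: sbom.json        # the artifact produced by the SBOM step above
    fail-build: true       # turn findings into a hard CI failure
    severity-cutoff: high  # illustrative threshold; adjust per policy
```

Failing CI on known-vulnerable dependencies is exactly the kind of automated, documented control that helps demonstrate "reasonable precautions."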
## Secure by Design, Not Secure by Afterthought
The strategy repeatedly emphasizes “secure by design” and “secure by default.” These aren’t new concepts — we’ve been talking about shifting security left for a decade. But having them enshrined in national strategy adds institutional weight.
Practically, this means:
- Default configurations should be secure. If your application ships with debug mode enabled, default passwords, or permissive CORS policies, that’s a liability.
- Memory-safe languages get a boost. The strategy explicitly mentions reducing memory safety vulnerabilities. If you’re starting a new systems project and choosing between C++ and Rust, the regulatory environment just added another point in Rust’s column.
- Vulnerability disclosure programs are becoming mandatory, not optional. If you don’t have a way for researchers to report security issues, you’re already behind.
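To make the first point concrete, here's a sketch of what "secure by default" can look like in an application's shipped configuration. Every key name here is hypothetical — the pattern is the point: the out-of-the-box values are the locked-down ones, and anything permissive requires an explicit, documented opt-in.

```yaml
# Hypothetical application defaults — illustrative key names only.
debug: false            # never ship with debug mode or verbose error pages on
admin_password: null    # no default credential; force one to be set at install
cors:
  allowed_origins: []   # deny cross-origin requests by default; opt in per origin
tls:
  min_version: "1.2"    # refuse legacy protocol versions out of the box
```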
## My Take
I’ve lived through enough security incidents to know that voluntary compliance doesn’t work at scale. Companies that prioritize security do so because of culture and leadership, not because of guidelines. For everyone else, regulation is the only lever that moves the needle.
Is there a risk of overreach? Absolutely. Poorly drafted legislation could punish small vendors disproportionately, create compliance theater that doesn’t improve actual security, or chill open-source contribution. The details matter enormously, and I hope the eventual legislation benefits from genuine technical input rather than just lobbyist influence.
But the direction is right. As someone who has spent decades watching organizations treat security as someone else’s problem, seeing it elevated to a national strategic priority — with real liability implications — feels overdue. We build the software that runs the world’s infrastructure. It’s not unreasonable to ask that we take responsibility for securing it.
Start with the basics: generate SBOMs, automate dependency scanning, document your security practices, and make sure your team understands that “secure by default” isn’t a slogan anymore. It’s becoming the law.
This post is part of my Security in Practice series, covering the evolving intersection of security, policy, and software development.
