Sign Here on the Dotted Line

A look at the draft DHS/CISA Secure Software Development Attestation Form

If you’re in the Federal IT/software space, or even in industry more broadly, by now you’ve likely heard that the Federal government is making a push to require secure software development. It’s a theme that runs through publications from CISA, such as their Secure-by-Design/Default guidance, which we covered here, as well as through the Cybersecurity Executive Order (EO) and National Cyber Strategy, which we also covered.

Previously, the Office of Management and Budget (OMB) issued a memo titled “Enhancing the Security of the Software Supply Chain through Secure Software Development Practices” (M-22-18), which I wrote about in this article.

Their guiding star for secure software development has been NIST’s Secure Software Development Framework (SSDF), which I covered in the same article as 22-18 and which NIST produced in response to the Cyber EO. SSDF is a collection of secure software development practices and activities, drawing from sources such as OWASP’s Software Assurance Maturity Model (SAMM) and Synopsys’ Building Security In Maturity Model (BSIMM), among others.

As part of 22-18, OMB authorized Federal agencies to begin collecting attestations from third-party software suppliers selling software to the Federal government, attesting that they are using secure development practices to create their software and products.

The draft of the form has been released, and we will be taking a look at it in this article.

Secure Software Self-Attestation Common Form

The form opens with a bit of context and history, tying to sources such as the Federal Information Security Modernization Act of 2014, along with efforts by OMB, NIST, and the Cyber EO, all of which require Federal agencies to provide security protections for Federal data.

This means that software producers/suppliers must meet, and attest to meeting, these defined secure software development requirements before a Federal agency can procure their products/software.

It applies to all software that:

  • Is developed after September 14, 2022

  • Is existing software that undergoes major version changes after September 14, 2022 (as defined in the form)

  • Is delivered with continuous changes to the code - e.g. SaaS or software using CI/CD (which increasingly describes most modern software)

It ironically excludes:

  • Software developed by Federal agencies

  • Software that is freely obtained and used directly by the agency (e.g. FOSS/OSS)

This part is of course a bit perplexing, because it gives off a “rules for thee but not for me” mentality, and it also precludes Federally developed software from specifically aligning with SSDF and associated secure software development practices, despite no shortage of Federal agency systems falling victim to malicious actors.

That said, one refreshing aspect is the acknowledgement that software producers utilizing FOSS/OSS components must attest to having taken specific steps to minimize the risk of relying on those FOSS components in their products.

This shifts from the current paradigm where software suppliers simply point to OSS maintainers and say it is their software, so they are responsible for it. As I have covered in a previous article titled “Supplier Misnomer”, OSS maintainers are not your suppliers; they generally owe you nothing, and most OSS is provided “as-is”, meaning you take on the risk and responsibility of integrating it into your products and services. There are plenty of risks associated with OSS that go beyond CVEs as well, which I cover in the article discussing the “Top 10 OSS Risks”.

The guidance also leaves the door open for agencies to provide “agency-specific instructions” beyond the form’s contents. On one hand this is good, because agencies can tailor instructions to software suppliers based on their unique needs, but it also opens the door to a complex patchwork quilt of requirements that differ from agency to agency.

Submitting and Signing the Form

The form can be submitted either by downloading it, filling it out, and returning it to a specific email address, or by completing it directly online.

In a nod to the seriousness of the request, the CEO of the software producer firm, or someone they designate in their stead, must sign the form. I like this because many of us have long stated that the business owns the risk. Cybersecurity is merely here to help the business make risk-informed decisions, such as advising them on whether or not they should sign a form to the Federal government attesting to using secure software development practices. This eliminates the business externalizing the burden onto CISOs/Cyber and instead makes them the signatory (unless, of course, they designate the CISO/Cyber to sign the form on their behalf, which will likely occur in many cases).

Gaps in Compliance/Plans of Action and Milestones (POAMs)

As anyone working in cyber knows, it is rare to entirely meet all compliance and security control requirements, often creating a situation where the firm must accept the risk of gaps and document them for remediation at a future date, if ever.

In the Federal space, which uses the NIST Risk Management Framework (RMF), this is known as a Plan of Action and Milestones (POAM), which documents control deficiencies and projected future remediation dates.

The DHS/CISA form leaves the door open to suppliers not being able to meet specific practices and documenting in a POAM when they plan to meet the deficient practices, along with potential extension and waiver requests.

This puts the agency in a position to accept the risk of using a supplier’s software despite deficiencies in the supplier’s secure software development practices.

Software Bill of Materials (SBOM) and 3rd Party Assessment

In addition to the self-attestation artifact, the form, much like 22-18, leaves the door open for agencies to request other artifacts such as an SBOM. For those unfamiliar, an SBOM is a nested ingredient list for software, showing the various libraries, components, and so on within the software.

There are two leading industry formats: OWASP’s CycloneDX and The Linux Foundation’s Software Package Data Exchange (SPDX). I’ve written extensively about SBOMs and their various formats in articles such as this one, and you can also find more on SBOMs from agency websites such as NTIA’s and CISA’s. For those interested in not just vulnerabilities but actual exploitability, you can find my article about the SBOM companion document, Vulnerability Exploitability eXchange (VEX), here. As the form points out, any SBOMs supplied must meet the minimum elements for SBOMs, as defined by NTIA.
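To make the NTIA minimum elements concrete, here is a minimal sketch in Python of a CycloneDX-style SBOM document and a simplified check for those elements (supplier, component name, version, unique identifiers, dependency relationships, SBOM author, and timestamp). The component, supplier, and timestamp values are invented for illustration, and the check is a rough approximation, not an official validator.

```python
# A minimal CycloneDX-style SBOM document (illustrative values only).
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "metadata": {
        "timestamp": "2023-05-01T12:00:00Z",          # NTIA element: timestamp
        "authors": [{"name": "Example Corp PSIRT"}],  # NTIA element: author of SBOM data
    },
    "components": [
        {
            "type": "library",
            "supplier": {"name": "Example OSS Project"},  # NTIA element: supplier name
            "name": "left-pad",                           # NTIA element: component name
            "version": "1.3.0",                           # NTIA element: version
            "purl": "pkg:npm/left-pad@1.3.0",             # NTIA element: other unique IDs
        }
    ],
    "dependencies": [  # NTIA element: dependency relationships
        {"ref": "pkg:npm/left-pad@1.3.0", "dependsOn": []}
    ],
}

def missing_ntia_elements(doc: dict) -> list:
    """Return a list of NTIA minimum elements absent from a CycloneDX-style dict."""
    missing = []
    meta = doc.get("metadata", {})
    if not meta.get("timestamp"):
        missing.append("timestamp")
    if not meta.get("authors"):
        missing.append("author")
    for comp in doc.get("components", []):
        for field in ("supplier", "name", "version", "purl"):
            if not comp.get(field):
                missing.append("component:" + field)
    if "dependencies" not in doc:
        missing.append("dependency-relationships")
    return missing

print(missing_ntia_elements(sbom))  # → []
```

An agency reviewer (or a supplier’s own pipeline) could run a check along these lines before accepting an SBOM artifact, though real validation would use the CycloneDX or SPDX tooling rather than hand-rolled field checks.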

The self-attestation form also points out, much like 22-18 did, that agencies may request a third-party assessor organization (3PAO), such as a FedRAMP 3PAO, to assess the supplier’s software/services, and it doesn’t require suppliers to submit self-attestations for software that has already been assessed by approved 3PAOs. This alleviates duplicative and cumbersome requirements for some entities, such as Cloud Service Providers (CSPs), that have already gone through FedRAMP for their service offerings and software.

Minimum Attestation Requirements/References

The form goes on to define the minimum attestation requirements and how they relate to both specific Cyber EO sub-sections and SSDF practices and tasks. With regard to the Cyber EO, they almost all relate directly to Section 4, which focuses on software supply chain security, but they span various areas of the SSDF, so let’s take a look at those.

First up is:

Looking at the items laid out and their mapping, they tie to the entire gamut of SSDF groups (e.g. Prepare the Organization (PO), Protect the Software (PS), Produce Well-Secured Software (PW), and Respond to Vulnerabilities (RV)).

SSDF provides several examples for each of these practices and tasks, along with ties to specific identifiers in reference sources such as the EO, BSIMM, and SAMM. There is of course some level of subjectivity, as with any control framework. For example, what are consistent and reasonable steps to minimize the risk of vulnerable products, and who/what determines undue risk?

Nonetheless, these requirements are a great start to get software suppliers thinking critically about their secure software development practices and map to sources such as CNCF’s Catalog of Software Supply Chain Attacks, which I covered in this article.

Second Up:

Again, this requirement includes a fair level of subjectivity, such as “good faith efforts” and “reasonable steps”. However, it does emphasize the need for organizations to maintain trusted source code environments and supply chains, utilizing tooling to vet third-party components and manage the vulnerabilities related to the third-party components they include in their products/services, which are subsequently passed on to the Federal government as a consumer in this case.

This ties back to the point of software suppliers being responsible for the OSS components they include in their products, and the associated vulnerabilities of those components, rather than passing that cost on to the consumer, which is the U.S. Federal government.

3rd on the list:

Third up on the list of attestation requirements is the aspect of provenance data for internal and third-party code incorporated into products and software.

For those not familiar with the concept of Provenance, it is defined below from the Supply Chain Levels for Software Artifacts (SLSA), an emerging framework for software supply chain security.

It is also defined by NIST and used in various special publications, such as 800-161r1 and 800-37r2.

This one, while it may seem simple, isn’t quite so simple in many cases, with organizations particularly struggling to provide provenance information for transitive dependencies, or in other words, the dependencies of their direct dependencies.

This requirement also maps to various SSDF groups and those subsequently map to industry references, such as OWASP’s Software Component Verification Standard (SCVS).

While SSDF doesn’t appear to go as granular as SCVS, you can get a feel for the level of rigor and depth to provenance and pedigree activities by looking at the Pedigree and Provenance Requirements control objective in SCVS.
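To give a feel for what provenance data looks like in practice, here is a pared-down sketch of a SLSA-style provenance statement in the in-toto attestation layout, plus a check that a built artifact actually matches the digest recorded in the statement. The builder ID, build type, and material URIs are invented placeholders, and this is a simplified illustration rather than a complete SLSA implementation.

```python
import hashlib

# Stand-in bytes for a built release archive.
artifact = b"example release contents"
digest = hashlib.sha256(artifact).hexdigest()

# A pared-down SLSA-style provenance statement; all URIs below are
# illustrative placeholders, not real endpoints or repositories.
provenance = {
    "_type": "https://in-toto.io/Statement/v0.1",
    "subject": [{"name": "example-app-1.0.0.tar.gz", "digest": {"sha256": digest}}],
    "predicateType": "https://slsa.dev/provenance/v0.2",
    "predicate": {
        "builder": {"id": "https://ci.example.com/builders/release"},
        "buildType": "https://example.com/buildtypes/make@v1",
        "materials": [  # inputs to the build, including third-party source
            {"uri": "git+https://github.com/example/app@refs/tags/v1.0.0",
             "digest": {"sha1": "0" * 40}},
        ],
    },
}

def subject_matches(statement: dict, blob: bytes) -> bool:
    """Verify the artifact bytes hash to the digest recorded in the statement."""
    recorded = statement["subject"][0]["digest"]["sha256"]
    return hashlib.sha256(blob).hexdigest() == recorded

print(subject_matches(provenance, artifact))  # → True
```

The hard part the article describes, provenance for transitive dependencies, shows up here as the `materials` list: a supplier can usually name its direct inputs, but enumerating the inputs of those inputs is where the struggle begins.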

4th:

4th up on the list focuses on utilizing automated tooling and processes to check for security vulnerabilities. Further detailed below:

Some key aspects of this requirement are automated tooling and processes to identify vulnerabilities in software. This includes 3rd party components, and the potential need to replace or eliminate their use if, for example, they have vulnerabilities that aren’t being remediated by the maintainers.
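The mechanics of automated dependency checking can be sketched as matching a pinned dependency manifest against an advisory feed. The dependencies and advisory records below are invented for illustration; a real pipeline would query a service such as OSV or a commercial SCA tool rather than a hand-maintained list.

```python
# Toy sketch of automated dependency vulnerability checking: compare pinned
# dependency versions against advisories. All advisory IDs here are fabricated.
dependencies = {"requests": "2.19.0", "flask": "2.3.2"}

advisories = [  # (package, fixed-in version, advisory id) -- hypothetical data
    ("requests", "2.20.0", "EXAMPLE-2018-0001"),
    ("django", "3.2.19", "EXAMPLE-2023-0042"),
]

def version_tuple(v: str) -> tuple:
    """Naive semantic-version parse; real tools handle pre-releases, epochs, etc."""
    return tuple(int(part) for part in v.split("."))

def findings(deps: dict) -> list:
    """Return (package, advisory id) pairs for dependencies below the fixed version."""
    hits = []
    for pkg, fixed_in, advisory in advisories:
        if pkg in deps and version_tuple(deps[pkg]) < version_tuple(fixed_in):
            hits.append((pkg, advisory))
    return hits

print(findings(dependencies))  # → [('requests', 'EXAMPLE-2018-0001')]
```

Wiring a check like this into CI is what turns the attestation requirement into a repeatable process: the build fails (or at least flags) when a component with an unremediated vulnerability slips into the dependency graph.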

It also includes the requirement to have a Vulnerability Disclosure Program (VDP), which I touch on in my article about VDPs and Product Security Incident Response Teams (PSIRTs), leveraging FIRST’s PSIRT Maturity Document.

This is a testament to the fact that all software, applications, and services will have vulnerabilities, despite best efforts to mitigate the inclusion of vulnerable components and code. This is why software suppliers need mature vulnerability disclosure programs and PSIRTs to respond to incidents or vulnerabilities when they occur, and to be forthcoming with downstream consumers (such as the Federal government) regarding vulnerabilities in their products.

Software ages like milk and new vulnerabilities are emerging constantly.

Critiques

As with any emerging requirement, there is no shortage of critiques and concerns associated with it.

Some of the most notable critiques come from the former DoD Chief Software Officer (CSO), Jason Weiss, who made a post on LinkedIn pointing out that 9 controls from SSDF are entirely absent, including key activities such as:

  • Tracking security requirements for customer software

  • Routine training on what constitutes secure software development

  • Using forms of risk modeling

  • Using standardized security features over proprietary options

To make it easier to understand some of the items Jason mentioned as missing, I captured them in the below table:

The comments on Jason’s post run the range from those happy with the subset of SSDF controls chosen, to others asking for transparency in how the control/practice tailoring occurred and its associated rationale.

For example, software supply chain leader Chainguard’s co-founder Dan Lorenc stated he always assumed this attestation would be a subset of SSDF, while others, such as OWASP CycloneDX/Dependency-Track creator Steve Springett, stated that some of the omissions are “scary” and expressed surprise that the attestation forms are in a human-readable rather than machine-readable format, which does seem antithetical to the push toward as-Code artifacts and machine readability (e.g. SBOMs and VEX).

I personally found some omissions puzzling, such as the removal of practices related to risk modeling (e.g. Threat Modeling), especially given CISA’s recent Secure-by-Design/Default publication, which emphasizes the role of threat modeling in designing and creating secure software and services.

Many have also pointed out that the self-attestation method creates an all-too-familiar problem we have seen play out in other compliance efforts, such as the DoD’s Defense Industrial Base (DIB) 800-171 security requirements. Many “inaccurate” self-attestations, along with the pilfering of DoD controlled unclassified information (CUI), have now led to the creation of the CMMC framework, which uses 3PAOs.

That said, self-attestations and 3PAO assessments each have their own considerations, benefits, and drawbacks, such as scalability, accuracy, burden, and so on. These have been covered extensively by experts such as James Dempsey at outlets like Lawfare, which I recommend checking out if you want a masterclass on the topic.

Moving Forward

For those with feedback, whether positive or negative, the documentation is in its Request for Comment phase, and the community can provide that feedback to CISA via this link.

While perhaps imperfect, this form is another step in the Federal government’s push to bolster software supply chain security, particularly by using its massive purchasing power to drive systemic change in secure software development across the ecosystem.