Ensuring Software Assurance Process Maturity
Software assurance is the level of confidence that software is free from vulnerabilities, whether intentionally designed into the software or accidentally inserted at any time during its lifecycle, and that it functions in the intended manner.[1] Once an organization becomes aware of the need to meet software assurance goals, the next step is to assess its current development and procurement activities and practices. Such an analysis requires at least two things. The first is a repeatable and objective assessment process. The second is a clear benchmark or target that represents a suitable level of risk management given the nature of the organization and the software’s mission. Performing this assessment periodically provides an ongoing understanding of the maturity of respective software assurance capabilities.
Choosing a methodology for appraising an organization’s ability to meet software assurance goals may seem overwhelming because there are several maturity models available, each with its own focus and level of granularity. For an organization that may be new to the area of software assurance, it can be a challenge to simply find good sources of guidance, much less understand which parts of each model are best suited for its environment and supply chain. Although finding the right maturity model may seem challenging, organizations should not wait for an authority to mandate a software assurance initiative. Such mandates are typically intended to be “one-size-fits-all” and offer limited flexibility. Organizations are best served by tailoring a software assurance strategy to their own supply chains.
Selecting the best maturity model, or model components, for a particular organization to begin addressing assurance goals may also present a time-consuming learning curve. To facilitate an understanding of how multiple maturity models address similar assurance goals, the authors created a model-agnostic framework as part of their participation in the SwA Forum Processes and Practices (P&P) Working Group (WG), which is co-sponsored by DHS, DoD, and the National Institute of Standards and Technology (NIST). This analysis involved mapping maturity models, and their respective practices, within the framework. The agreement among the models provides a valuable reference. This framework evolved into the SwA Checklist, which serves as a model-agnostic harmonized view of software assurance guidance.
The SwA Checklist can help organizations begin a dialogue among the entities in the supply chain that influence and/or support the software throughout the lifecycle. Using the checklist to characterize each of the organizations in a given supply chain provides valuable insight into the credibility or trust deserved by a given piece of software. By leveraging this insight, organizations can verify implicit assumptions that certain practices are taking place and align their activities with assurance goals to mitigate risks within their supply chains. Organizations can also use the checklist to organize evidence for assurance claims while assessing their practices as they perform the activities necessary to complete their baselines. Finally, organizations can use the baseline to engage their senior leadership regarding the areas in which resources are needed to meet assurance goals based upon guidance from the mapped models.
The SwA Checklist provides a consolidated view of current software assurance best practices in the context of an organized SwA initiative. The checklist is currently implemented as a “hot linked” Microsoft Excel spreadsheet that provides a cross-reference of goals and practices with side-by-side mappings to several publicly available maturity models. Organizations can use the mappings to identify where the maturity models agree and diverge, and use this consolidated format to select model components best suited to their environments.
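To make the cross-reference idea concrete, the side-by-side mapping described above can be thought of as a lookup table from checklist practices to the corresponding practice identifiers in each mapped model. The sketch below is purely illustrative: the practice names and model identifiers are hypothetical placeholders, not entries copied from the actual spreadsheet.

```python
# Hypothetical sketch of the checklist's cross-reference structure: each
# checklist practice maps to the counterpart practice identifiers (if any)
# in several public maturity models. All identifiers are placeholders.
MAPPINGS = {
    "Governance: Establish an SwA strategy": {
        "BSIMM": "SM1.1",     # placeholder identifier
        "OSAMM": "SM1",       # placeholder identifier
        "CMMI-ACQ": None,     # None = no direct counterpart mapped
    },
    "Verification: Perform security code review": {
        "BSIMM": "CR1.2",     # placeholder identifier
        "OSAMM": "CR1",       # placeholder identifier
        "CMMI-ACQ": "ATM",    # placeholder identifier
    },
}

def models_in_agreement(practice):
    """Return the models that map a counterpart for the given practice."""
    return sorted(m for m, ref in MAPPINGS[practice].items() if ref is not None)
```

A query such as `models_in_agreement("Governance: Establish an SwA strategy")` shows at a glance which models address a given practice, mirroring how a reader scans a row of the spreadsheet to see where the models agree and diverge.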
Once an organization establishes its assurance goals, selects a maturity model (or model components), and captures its baseline, it can then establish an improvement plan for achieving software assurance goals as it develops and/or acquires secure software. Working with its direct customers (downstream in the supply chain) and suppliers (upstream in the supply chain) to improve software assurance will have a large multiplier effect as the approach spreads to other organizations.
The intended users of the SwA Checklist are organizations that currently are or soon will be acquiring or developing software. Organizations may have many options when developing or acquiring software from various sources. Although vendors and developers may offer software that meets specified functional requirements and provides myriad features, those features count for little if the software cannot protect its data and functions. Developers and acquirers must give significant consideration to the ability of the software to reliably function and protect data and processes over the life of the product. Organizations can use the SwA Checklist to guide their own development or to evaluate vendor capabilities. Organizations can use the baselines they establish to facilitate an understanding of similar assurance goals and practices among several freely available maturity models, which can help guide the selection of the most appropriate model components.
Design of the SwA Checklist
The SwA Checklist is available at no cost at . The SwA Checklist is currently being vetted, and the authors request feedback based upon practical use in the field; a feedback form is available at the same URL. The authors designed the checklist to be understandable by users with various levels of SwA experience (readers are invited to download a copy now and review it while reading this section).
The SwA Checklist contains multiple tabs/worksheets including the following: Intro, SwA Checklist, Sources, BSIMM, CMMI-ACQ, OSAMM, PRM, and RMM. The “Intro” tab serves as the introductory section that also provides pointers to each of the included models. The “SwA Checklist” tab provides the information that enables users to perform their analysis. Content from the included models is organized into five domains: Governance, Knowledge, Verification, Deployment, and Supplier Management. This categorization helps to harmonize terminology and makes it easy for the user to locate specific guidance. Within each domain are three categories containing a short, high-level goal and a set of three corresponding practices. There is a “Status” cell under each practice. Users can click on the cell to open a pull-down menu with pre-defined responses to input their organization’s implementation status for each practice. The range of possible status levels in the pull-down menus includes the following:
• Not Applicable
• Not Started
• Partially Implemented Internally
• Partially Implemented by Supplier(s)
• Partially Implemented Internally and by Supplier(s)
• Fully Implemented Internally
• Fully Implemented by Supplier(s)
• Fully Implemented Internally and by Supplier(s)
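The checklist layout just described — five domains, categories of corresponding practices, and one of eight status values per practice — lends itself to a small data model. The following Python sketch is hypothetical (the function and variable names are this example's, not part of the checklist); it simply validates recorded statuses against the pull-down menu and tallies them into a baseline summary.

```python
from collections import Counter

# The five domains and eight status levels enumerated above, as constants.
DOMAINS = ["Governance", "Knowledge", "Verification", "Deployment",
           "Supplier Management"]
STATUS_LEVELS = [
    "Not Applicable",
    "Not Started",
    "Partially Implemented Internally",
    "Partially Implemented by Supplier(s)",
    "Partially Implemented Internally and by Supplier(s)",
    "Fully Implemented Internally",
    "Fully Implemented by Supplier(s)",
    "Fully Implemented Internally and by Supplier(s)",
]

def summarize_baseline(statuses):
    """Tally recorded practice statuses, rejecting values not on the menu."""
    for status in statuses:
        if status not in STATUS_LEVELS:
            raise ValueError(f"unknown status: {status!r}")
    return Counter(statuses)
```

A summary like this gives the per-status counts an organization might present to senior leadership when arguing where resources are needed, much as the aggregated view of a completed checklist would.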