BSIMM6: Software Security Maturity Model
Authors
Gary McGraw, Ph.D., Sammy Migues, and Jacob West
PART ONE
The Building Security In Maturity Model (BSIMM, pronounced “bee simm”) is a study of existing software security
initiatives. By quantifying the practices of many different organizations, we can describe the common ground
shared by many as well as the variation that makes each unique. Our aim is to help the wider software security
community plan, carry out, and measure initiatives of their own. The BSIMM is not a “how to” guide, nor is it a one-
size-fits-all prescription. Instead, the BSIMM is a reflection of the current state of software security.
We begin with a brief description of the function and importance of a software security initiative. We then explain
our model and the method we use for quantifying the state of an initiative. Since the BSIMM study began in
2008, we have studied 104 initiatives, which comprise 235 distinct measurements—some firms use the BSIMM to
measure each of their business units and some have been measured more than once. To ensure the continued
relevance of the data we report, we excluded measurements older than 42 months from BSIMM6. The current data
set comprises 202 distinct measurements collected from 78 firms. Thanks to repeat measurements, we report not
only on current practices but also on the ways in which some initiatives have evolved over a period of years.
Introduction
History
In the late 1990s, software security began to flourish as a discipline separate from computer and network security.
Researchers began to put more emphasis on studying the ways a programmer can contribute to or unintentionally
undermine the security of a computer system: What kinds of bugs and flaws lead to security problems? How can we
identify problems systematically?
Since then, practitioners have come to know that process alone is insufficient. Software security encompasses
business, social, and organizational aspects as well. We use the term software security initiative (SSI) to refer to all of
the activities undertaken for the purpose of building secure software.
BSIMM6
The purpose of the BSIMM is to quantify the activities carried out by real software security initiatives. Because these
initiatives use different methodologies and different terminology, the BSIMM requires a framework that allows
us to describe all of the initiatives in a uniform
way. Our software security framework (SSF) and
activity descriptions provide a common vocabulary
for explaining the salient elements of a software
security initiative, thereby allowing us to compare
initiatives that use different terms, operate at different scales, exist in different vertical markets, or create
different work products.
We classify our work as a maturity model because improving software security almost always means changing the
way an organization works—something that doesn’t happen overnight. We understand that not all organizations
need to achieve the same security goals, but we believe all organizations can benefit from using the same
measuring stick.
BSIMM6 is the sixth major version of the BSIMM model. It includes updated activity descriptions, data from 78 firms
in multiple vertical markets, and a longitudinal study.
Audience
The BSIMM is meant for use by anyone responsible for creating and executing an SSI. We have observed that
successful software security initiatives are typically run by a senior executive who reports to the highest levels in an
organization. These executives lead an internal group that we call the software security group (SSG), charged with
directly executing or facilitating the activities described in the BSIMM. The BSIMM is written with the SSG and SSG
leadership in mind.
We expect readers to be familiar with the software security literature. You can become familiar with many concepts
by reading Software Security: Building Security In. The BSIMM does not attempt to explain software security basics,
describe its history, or provide references to the ever-expanding literature. Succeeding with the BSIMM without
becoming familiar with the literature is unlikely.
Terminology
Activity: Actions carried out or facilitated by the SSG as part of a practice. Activities are divided
into three levels in the BSIMM.
Domain: The domains are: governance, intelligence, secure software development lifecycle (SSDL)
touchpoints, and deployment. See the SSF section on page 10.
Practice: One of the 12 categories of BSIMM activities. Each domain in the Software Security
Framework has three practices. Activities in each practice are divided into three levels. See the SSF
section on page 10.
Satellite: A group of interested and engaged developers, architects, software managers, testers,
and similar roles who have a natural affinity for software security and are organized and leveraged
by a Software Security Initiative.
Secure Software Development Lifecycle (SSDL): Any SDLC with integrated software security
checkpoints and activities.
Security Development Lifecycle (SDL): A term used by Microsoft to describe their Secure
Software Development Lifecycle.
Software Security Framework (SSF): The basic structure underlying the BSIMM, comprising 12
practices divided into four domains. See the SSF section on page 10.
Software Security Group (SSG): The internal group charged with carrying out and facilitating
software security. According to our observations, the first step of a Software Security Initiative is
forming an SSG.
Governance: Practices that help organize, manage, and measure a software security initiative.
Staff development is also a central governance practice.
Intelligence: Practices that result in collections of corporate knowledge used in carrying out
software security activities throughout the organization. Collections include both proactive
security guidance and organizational threat modeling.
SSDL Touchpoints: Practices associated with analysis and assurance of particular software
development artifacts and processes. All software security methodologies include these practices.
Deployment: Practices that interface with traditional network security and software
maintenance organizations. Software configuration, maintenance and other environment issues
have direct impact on software security.
Governance
STRATEGY & METRICS (SM)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Publish process (roles, responsibilities, plan), evolve as necessary. SM1.1 53%
Create evangelism role and perform internal marketing. SM1.2 51%
Educate executives. SM1.3 46%
Identify gate locations, gather necessary artifacts. SM1.4 85%
LEVEL 2
Publish data about software security internally. SM2.1 46%
Enforce gates with measurements and track exceptions. SM2.2 37%
Create or grow a satellite. SM2.3 38%
Identify metrics and use them to drive budgets. SM2.5 22%
Require security sign-off. SM2.6 37%
LEVEL 3
Use an internal tracking application with portfolio view. SM3.1 19%
Run an external marketing program. SM3.2 9%
COMPLIANCE & POLICY (CP)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Unify regulatory pressures. CP1.1 58%
Identify PII obligations. CP1.2 78%
Create policy. CP1.3 53%
LEVEL 3
Create regulator eye-candy. CP3.1 23%
Impose policy on vendors. CP3.2 14%
Drive feedback from SSDL data back to policy. CP3.3 8%
TRAINING (T)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Provide awareness training. T1.1 76%
Deliver role-specific advanced curriculum (tools, technology stacks, bug parade). T1.5 33%
Create and use material specific to company history. T1.6 22%
Deliver on-demand individual training. T1.7 46%
LEVEL 2
Enhance satellite through training and events. T2.5 13%
Include security resources in onboarding. T2.6 19%
Identify satellite through training. T2.7 8%
LEVEL 3
Reward progression through curriculum (certification or HR). T3.1 4%
Provide training for vendors or outsourced workers. T3.2 4%
Host external software security events. T3.3 4%
Require an annual refresher. T3.4 10%
Establish SSG office hours. T3.5 5%
Intelligence
ATTACK MODELS (AM)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Build and maintain a top N possible attacks list. AM1.1 22%
Create a data classification scheme and inventory. AM1.2 65%
Identify potential attackers. AM1.3 40%
Collect and publish attack stories. AM1.4 10%
Gather and use attack intelligence. AM1.5 59%
Build an internal forum to discuss attacks. AM1.6 14%
LEVEL 2
Build attack patterns and abuse cases tied to potential attackers. AM2.1 8%
Create technology-specific attack patterns. AM2.2 10%
LEVEL 3
Have a science team that develops new attack methods. AM3.1 5%
Create and use automation to do what attackers will do. AM3.2 3%
SECURITY FEATURES & DESIGN (SFD)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Build and publish security features. SFD1.1 78%
Engage SSG with architecture. SFD1.2 76%
LEVEL 2
Build secure-by-design middleware frameworks and common libraries. SFD2.1 31%
Create SSG capability to solve difficult design problems. SFD2.2 50%
LEVEL 3
Form a review board or central committee to approve and maintain secure design patterns. SFD3.1 10%
STANDARDS & REQUIREMENTS (SR)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Create security standards. SR1.1 73%
Create a security portal. SR1.2 64%
Translate compliance constraints to requirements. SR1.3 67%
LEVEL 2
Create a standards review board. SR2.2 35%
Create standards for technology stacks. SR2.3 27%
Identify open source. SR2.4 24%
Create SLA boilerplate. SR2.5 26%
Use secure coding standards. SR2.6 29%
LEVEL 3
Control open source risk. SR3.1 8%
Communicate standards to vendors. SR3.2 14%
SSDL Touchpoints
ARCHITECTURE ANALYSIS (AA)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Perform security feature review. AA1.1 86%
Perform design review for high-risk applications. AA1.2 37%
Have SSG lead design review efforts. AA1.3 28%
Use a risk questionnaire to rank applications. AA1.4 59%
LEVEL 2
Define and use AA process. AA2.1 15%
Standardize architectural descriptions (including data flow). AA2.2 12%
Make SSG available as AA resource or mentor. AA2.3 17%
LEVEL 3
Have software architects lead design review efforts. AA3.1 8%
Drive analysis results into standard architecture patterns. AA3.2 1%
CODE REVIEW (CR)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Use a top N bugs list (real data preferred). CR1.1 23%
Have SSG perform ad hoc review. CR1.2 68%
Use automated tools along with manual review. CR1.4 71%
Make code review mandatory for all projects. CR1.5 31%
Use centralized reporting to close the knowledge loop and drive training. CR1.6 35%
LEVEL 2
Enforce coding standards. CR2.2 9%
Assign tool mentors. CR2.5 26%
Use automated tools with tailored rules. CR2.6 21%
LEVEL 3
Build a factory. CR3.2 4%
Build a capability for eradicating specific bugs from the entire codebase. CR3.3 6%
Automate malicious code detection. CR3.4 4%
SECURITY TESTING (ST)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Ensure QA supports edge/boundary value condition testing. ST1.1 78%
Drive tests with security requirements and security features. ST1.3 85%
LEVEL 2
Integrate black box security tools into the QA process. ST2.1 31%
Share security results with QA. ST2.4 10%
Include security tests in QA automation. ST2.5 13%
Perform fuzz testing customized to application APIs. ST2.6 14%
LEVEL 3
Drive tests with risk analysis results. ST3.3 5%
Leverage coverage analysis. ST3.4 5%
Begin to build and apply adversarial security tests (abuse cases). ST3.5 6%
Deployment
PENETRATION TESTING (PT)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Use external penetration testers to find problems. PT1.1 88%
Feed results to the defect management and mitigation system. PT1.2 60%
Use penetration testing tools internally. PT1.3 60%
LEVEL 2
Provide penetration testers with all available information. PT2.2 26%
Schedule periodic penetration tests for application coverage. PT2.3 22%
LEVEL 3
Use external penetration testers to perform deep-dive analysis. PT3.1 13%
Have the SSG customize penetration testing tools and scripts. PT3.2 10%
SOFTWARE ENVIRONMENT (SE)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Use application input monitoring. SE1.1 47%
Ensure host and network security basics are in place. SE1.2 88%
LEVEL 2
Publish installation guides. SE2.2 40%
Use code signing. SE2.4 32%
LEVEL 3
Use code protection. SE3.2 13%
Use application behavior monitoring and diagnostics. SE3.3 6%
CONFIGURATION MANAGEMENT & VULNERABILITY MANAGEMENT (CMVM)
LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Create or interface with incident response. CMVM1.1 91%
Identify software defects found in operations monitoring and feed them back to development. CMVM1.2 94%
LEVEL 2
Have emergency codebase response. CMVM2.1 82%
Track software bugs found in operations through the fix process. CMVM2.2 78%
Develop an operations inventory of applications. CMVM2.3 40%
LEVEL 3
Fix all occurrences of software bugs found in operations. CMVM3.1 5%
Enhance the SSDL to prevent software bugs found in operations. CMVM3.2 8%
Simulate software crisis. CMVM3.3 8%
Operate a bug bounty program. CMVM3.4 4%
MOST COMMON ACTIVITY PER PRACTICE
[SM1.4] Identify gate locations and gather necessary artifacts
[CP1.2] Identify PII obligations
[T1.1] Provide awareness training
[AM1.2] Create a data classification scheme and inventory
[SFD1.1] Build and publish security features
[SR1.1] Create security standards
[AA1.1] Perform security feature review
[CR1.4] Use automated tools along with manual review
[ST1.3] Drive tests with security requirements and security features
[PT1.1] Use external penetration testers to find problems
[SE1.2] Ensure host and network security basics are in place
[CMVM1.2] Identify software defects found in operations monitoring and feed them back to development
Spider charts are created by noting the highest-level activity for each practice per BSIMM firm (a “high-water mark”)
and averaging these scores over a group of firms to produce 12 numbers (one for each practice). The resulting spider
chart plots these values on 12 activity spokes corresponding to the 12 practices. Note that level 3 (the outside edge)
is considered more mature than level 0 (center point). Other, more sophisticated analyses are possible, of course. We
continue to experiment with weightings by level, normalization by number of activities and other schemes.
[Spider chart: Earth (78), average high-water mark for each of the 12 practices]
The graph below shows the distribution of scores among the population of 78 participating firms. To make this
graph, we divided the scores into six bins. As you can see, the scores represent a slightly skewed bell curve. We
also plotted the average age of the firms in each bin as the orange line on the graph. In general, firms where more
BSIMM activities have been observed have older software security initiatives.
[Bar chart: SCORE DISTRIBUTION. Earth (78), firms per score bucket, with average SSG age (in years) per bucket ranging from 0.9 to 8.9]
We are pleased that the BSIMM study continues to grow year after year (the data set we report on here is more
than 26 times the size it was for the original publication). Note that once we exceeded a sample size of 30 firms, we
began to apply statistical analysis, yielding statistically significant results.
As an example, consider the scorecard of a fake firm that performs 37 BSIMM activities (noted as 1s in the
FIRM columns), including eight activities that are the most common in their respective practices (purple boxes).
On the other hand, the firm does not perform the most commonly observed activity in the other four practices
(red boxes) and should take some time to determine whether these are necessary or useful to its overall software
security initiative. The BSIMM6 FIRM columns show the number of observations (currently out of 78) for each
activity, allowing the firm to understand the general popularity of an activity among the 78 BSIMM firms.
Executive Leadership
Of primary interest is identifying and empowering a senior executive to manage operations, garner resources, and
provide political cover for a software security initiative. Grassroots approaches to software security sparked and
led solely by developers and their direct managers have a poor track record in the real world. Likewise, initiatives
spearheaded by resources from an existing network security group often run into serious trouble when it comes
time to interface with development groups. By identifying a senior executive and putting him or her in charge of
software security directly, you address two management 101 concerns—accountability and empowerment. You also
create a place in the organization where software security can take root and begin to thrive.
The executives in charge of the software security initiatives we studied have a variety of titles, including: Information
Security Manager, Security Director, Director of Application Security Strategy, Director of Enterprise Information
Security, VP of Security Architecture, Chief Product Security and Privacy Officer, CISO, Head of Information
Governance, Global Head of Security Maturation and
Certification, Technical Director, Director of IT Security,
and Manager of Enterprise Information Protection.
We observed a fairly wide spread in exactly where the SSG is situated in the firms we studied. In particular,
28 exist in the CIO’s organization, 16 exist in the CTO’s organization, 10 report to the COO, four report to the
Chief Assurance Officer, two report to the CSO, and one SSG reports to each of the General Counsel, the CFO, and
the founder. Twelve SSGs report up through technology or product operations groups (as opposed to governance
organizations). Three report up through the business unit where they were originally formed. Several companies we
studied did not specify where their SSG fits in the larger organization.
Carrying out the activities in the BSIMM successfully without an SSG is very unlikely.
SSGs come in a variety of shapes and sizes. All good SSGs appear to include both people with deep coding
experience and people with architectural chops. As you will see below, software security can’t only be about finding
specific bugs such as the OWASP Top Ten. Code review is a very important best practice, and to perform code
review you must actually understand code (not to mention the huge piles of security bugs). However, the best code
reviewers sometimes make very poor software architects and asking them to perform an architecture risk analysis
will only result in blank stares. Make sure you cover architectural capabilities in your SSG as well as you do code.
Finally, the SSG is often asked to mentor, train, and work directly with hundreds of developers. Communications
skills, teaching capability, and good consulting horse sense are must-haves for at least a portion of the SSG staff.
For more about this issue, see our SearchSecurity article based on SSG structure data gathered at the 2014 BSIMM
Community Conference: How to Build a Team for Software Security Management.
Though no two of the 78 firms we examined had exactly the same SSG structure (suggesting that there is no
one set way to structure an SSG), we did observe some commonalities that are worth mentioning. At the highest
level of organization, SSGs come in five major flavors: those that are 1) organized to provide software security
services, 2) organized around setting policy, 3) mirroring business unit organizations, 4) organized with a hybrid
policy and services approach, and 5) structured around managing a distributed network of others doing software
security work. Some SSGs are highly distributed across a firm and others are very centralized. If we look across
all of the SSGs in our study, there are several common “subgroups” that are often observed: people dedicated to
policy, strategy, and metrics; internal “services” groups that (often separately) cover tools, penetration testing, and
middleware development plus shepherding; incident response groups; groups responsible for training development
and delivery; externally-facing marketing and communications groups; and vendor-control groups.
In the statistics reported above, we noted an average ratio of SSG to development of 1.51% across the entire group
of 78 organizations we studied. That means roughly one SSG member for every 66 developers when we average the
ratios for each participating firm. The SSG with the largest ratio was 16.7% and the smallest was 0.03%. To remind you
of the particulars in terms of actual bodies, SSG size on average among the 78 firms was 13.9 people (smallest 1,
largest 130, median 6).
Satellite
In addition to the SSG, many software security programs have identified a number of individuals (often developers,
testers, and architects) who share a basic interest in software security, but are not directly employed in the SSG.
When people like this carry out software security activities, we call this group the satellite.
Sometimes the satellite is widely distributed, with one or two members in each product group, and sometimes it is
more concentrated in a handful of groups.
Of particular interest, all 10 firms with the highest BSIMM scores have a satellite (100%), with an average satellite
size of 131 people. Outside of the top 10, 30 of the remaining 68 firms have a satellite (44.1%). Of the 10 firms with
the lowest BSIMM scores, none have a satellite. This suggests that as a software security initiative matures, its
activities become distributed and institutionalized into the organizational structure. Among our population of 78
firms, initiatives tend to evolve from centralized and specialized in the beginning to decentralized and distributed
(with an SSG at the core orchestrating things).
Everybody Else
Our survey participants have engaged everyone involved in the software development lifecycle as a means of
addressing software security.
• Builders, including developers, architects, and their managers must practice security engineering,
ensuring that the systems that we build are defensible and not riddled with holes. The SSG will
interact directly with builders when they carry out
the activities described in the BSIMM. Generally
speaking, as an organization matures, the SSG attempts to empower builders so they can carry out most of the
BSIMM activities themselves with the SSG helping in special cases and providing oversight. In this version of the
BSIMM, we often don’t explicitly point out whether a given activity is to be carried out by the SSG, developers,
or testers. You should come up with an approach that makes sense for your organization and accounts for your
workload and your software lifecycle.
• Testers concerned with routine testing and verification should do what they can to keep a weather eye out for
security problems. Some of the BSIMM activities in the Security Testing practice (see page 52) can be carried out
directly by QA.
• Operations people must continue to design reasonable networks, defend them, and keep them up. As you will
see in the Deployment domain of the SSF, software security doesn’t end when software is “shipped.”
• Administrators must understand the distributed nature of modern systems and begin to practice the principle
of least privilege, especially when it comes to the applications they host or attach to as services in the cloud.
• Executives and middle management, including line of business owners and product managers, must
understand how early investment in security design and security analysis affects the degree to which users will
trust their products. Business requirements should explicitly address security needs. Any sizeable business
today depends on software to work. Software security is a business necessity.
• Vendors, including those who supply COTS, custom software and software-as-a-service, are increasingly
subjected to SLAs and reviews (such as vBSIMM) that help ensure products are the result of a secure SDLC.
We have added, deleted, and adjusted the level of various activities based on the data observed as the project
continues. To preserve backwards compatibility, all changes are made by adding new activity labels to the model,
even when an activity has simply changed levels. We make changes by considering outliers both in the model itself
and in the levels we assigned to various activities in the 12 practices. We use the results of an intra-level standard
deviation analysis to determine which “outlier” activities to move between levels. We focus on changes that
minimize standard deviation in the average number of observed activities at each level.
We also carefully considered, but did not adjust, the following activities: [SM2.1][AM1.4][CR1.1][CR2.2][ST2.4]. We
are keeping a close eye on the Attack Models and Code Review practices, both of which were impacted by culling
aged data.
Level 1 Activities
Governance
Strategy & Metrics (SM)
• Publish process (roles, responsibilities, plan), evolve as necessary. [SM1.1]
• Create evangelism role and perform internal marketing. [SM1.2]
• Educate executives. [SM1.3]
• Identify gate locations, gather necessary artifacts. [SM1.4]
Training (T)
• Provide awareness training. [T1.1]
• Deliver role-specific advanced curriculum (tools, technology stacks and bug parade). [T1.5]
• Create and use material specific to company history. [T1.6]
• Deliver on-demand individual training. [T1.7]
Intelligence
Attack Models (AM)
• Build and maintain a top N possible attacks list. [AM1.1]
• Create a data classification scheme and inventory. [AM1.2]
• Identify potential attackers. [AM1.3]
• Collect and publish attack stories. [AM1.4]
• Gather and use attack intelligence. [AM1.5]
• Build an internal forum to discuss attacks. [AM1.6]
Deployment
Penetration Testing (PT)
• Use external penetration testers to find problems. [PT1.1]
• Feed results to the defect management and mitigation system. [PT1.2]
• Use penetration testing tools internally. [PT1.3]
Level 2 Activities
Governance
Strategy & Metrics (SM)
• Publish data about software security internally. [SM2.1]
• Enforce gates with measurements and track exceptions. [SM2.2]
• Create or grow a satellite. [SM2.3]
• Identify metrics and use them to drive budgets. [SM2.5]
• Require security sign-off. [SM2.6]
Training (T)
• Enhance satellite through training and events. [T2.5]
• Include security resources in onboarding. [T2.6]
• Identify satellite through training. [T2.7]
Intelligence
Attack Models (AM)
• Build attack patterns and abuse cases tied to potential attackers. [AM2.1]
• Create technology-specific attack patterns. [AM2.2]
Level 3 Activities
Governance
Strategy & Metrics (SM)
• Use an internal tracking application with portfolio view. [SM3.1]
• Run an external marketing program. [SM3.2]
Training (T)
• Reward progression through curriculum (certification or HR). [T3.1]
• Provide training for vendors or outsourced workers. [T3.2]
• Host external software security events. [T3.3]
• Require an annual refresher. [T3.4]
• Establish SSG office hours. [T3.5]
Intelligence
Attack Models (AM)
• Have a science team that develops new attack methods. [AM3.1]
• Create and use automation to do what attackers will do. [AM3.2]
Deployment
Penetration Testing (PT)
• Use external penetration testers to perform deep-dive analysis. [PT3.1]
• Have the SSG customize penetration testing tools and scripts. [PT3.2]